Kevin Weil and Bill Peebles exit OpenAI as company continues to shed ‘side quests’
OpenAI is shifting focus from consumer-facing 'moonshots' like Sora to enterprise AI, with key personnel departures and team consolidations.
Read on TechCrunch →

Meta Platforms is developing a series of custom AI inference chips to enhance its data centers, improve energy efficiency, and manage the growing computational demands of its AI workloads, with new chips rolling out through 2027.
Why it matters
This initiative by Meta underscores the critical importance of custom hardware for major tech companies to manage the escalating computational demands of AI. By developing its own inference chips, Meta aims to reduce reliance on external vendors, optimize performance specifically for its AI models, and achieve substantial cost and energy efficiencies. This move is crucial for scaling its AI-driven products and services, potentially influencing the broader AI hardware market and setting a precedent for other big tech players.
Meta is building its own specialized computer chips to power its AI systems more efficiently and cost-effectively. These chips are designed for AI inference, helping Meta handle the massive computing needs of its AI products and services, with new versions rolling out over the next few years to improve performance and save money.
Zoom partners with Sam Altman's World to implement human ID verification in meetings, aiming to combat AI-generated imposters.
Anthropic has launched Claude Design, a new AI-powered product aimed at helping non-designers like founders and product managers quickly create visuals to share their ideas.