Meta Announces Muse Spark, Enhancing AI Performance and Responsiveness
Meta has announced "Muse Spark," a new compact and fast AI model developed by Meta Superintelligence Labs in nine months. Muse Spark supports advanced logical reasoning and multimodal task processing, capable of addressing complex problems in science, mathematics, and health. It is currently integrated into the Meta AI app and Meta.AI web version, with plans for rollout to Facebook, Instagram, Messenger, WhatsApp, and AI glasses in the coming weeks. The model can understand visual information and generate custom websites and mini-games.
- 📰 Published: April 9, 2026 at 12:28
Meta released a press statement explaining that over the past nine months, Meta Superintelligence Labs has built Meta AI's technology stack from scratch, pushing progress far beyond previous development cycles. Muse Spark is the first model in the new Muse series, designed to be small and fast, with capabilities sufficient to reason about complex problems in fields such as science, mathematics, and health.
Muse Spark has now been integrated into the Meta AI assistant in the Meta AI app and Meta.AI web version, supporting advanced logical reasoning and multimodal task processing. It is expected to be rolled out to Facebook, Instagram, Messenger, WhatsApp, and AI glasses in the coming weeks.
Meta stated that the rapid pace and diversity of the real world mean most information does not fit neatly into a text dialogue box. Meta has therefore built multimodal perception into Muse Spark: rather than relying solely on users describing what they see, Meta AI can now both read text input and directly see and understand visual information that users provide.
For example, by simply taking a photo of an airport snack shelf, Meta AI can identify and rank snacks with the highest protein content, without the need to painstakingly check nutritional labels.
Muse Spark also excels at visual programming. Users need only enter a prompt to create customized websites and mini-games. In the future, a user could quickly generate a retro arcade game through Meta AI and invite friends to take on the challenge, or launch an imaginative flight simulator and share these creations with friends. (Editor: Zhang Junmao)