Magic, an AI startup building code generation models and tools to automate software development tasks, has announced a significant funding round backed by key investors including former Google CEO Eric Schmidt.
In a recent blog post, Magic disclosed that it closed a $320 million fundraising round with contributions from prominent investors including Jane Street, Sequoia, Atlassian, Nat Friedman & Daniel Gross, Elad Gil, and CapitalG. The round brings Magic's total funding to nearly $465 million, placing it among the best-funded generative coding startups alongside Anysphere, Codeium, and Augment (where Eric Schmidt is also an investor).
Additionally, Magic revealed a new partnership with Google Cloud to develop two advanced “supercomputers” on the Google Cloud Platform. These supercomputers, named Magic-G4 and Magic-G5, will be powered by Nvidia H100 GPUs and Nvidia’s upcoming Blackwell chips, respectively.
Magic’s CEO, Eric Steinberger, expressed excitement about the collaboration with Google and Nvidia, highlighting the potential of Nvidia’s cutting-edge hardware to improve the efficiency of Magic’s AI models.
Steinberger and Sebastian De Ro founded Magic in 2022, drawing inspiration from the transformative power of AI. The platform offers AI-powered tools to assist software engineers in various coding tasks, acting as an automated pair programmer to analyze and understand coding projects.
One of Magic’s key innovations is its models’ ultra-long context windows, which let them take in vast amounts of input before generating output. The company’s latest model, LTM-2-mini, boasts a 100 million-token context window, far larger than those of other commercial models such as Google’s Gemini family.
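To give a rough sense of that scale, the sketch below estimates how much text fits in such a window. Note this is a back-of-the-envelope illustration: the 4-characters-per-token ratio is a common heuristic, not Magic's actual tokenizer, and real token counts vary by model and content.

```python
# Back-of-the-envelope sizing for a 100 million-token context window.
# Assumption: ~4 characters per token, a common heuristic for English
# text and source code (NOT Magic's actual tokenizer).

CHARS_PER_TOKEN = 4               # heuristic average
LTM_2_MINI_WINDOW = 100_000_000   # tokens, per Magic's announcement

def estimate_tokens(num_chars: int) -> int:
    """Estimate token count from a character count."""
    return num_chars // CHARS_PER_TOKEN

def fits_in_window(num_chars: int, window: int = LTM_2_MINI_WINDOW) -> bool:
    """Check whether text of this size fits in the context window."""
    return estimate_tokens(num_chars) <= window

# A very large codebase: ~10 million lines at ~40 characters per line.
codebase_chars = 10_000_000 * 40
print(estimate_tokens(codebase_chars))   # 100_000_000 tokens exactly
print(fits_in_window(codebase_chars))    # True
```

Under these assumptions, a codebase on the order of 10 million lines fits entirely in a single prompt, which is the kind of whole-project analysis the article describes.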
Thanks to its extensive context window, LTM-2-mini has already demonstrated its capabilities by implementing features like a password strength meter for an open-source project and creating a calculator using a custom UI framework. Magic is currently working on training an even larger version of the model to further expand its capabilities.