The San Francisco-based data analytics company Databricks has raised a stunning $10 billion funding round, which values the company at $62 billion and likely puts it on a path toward an IPO.
The $10 billion investment (which included OpenAI investors Thrive Capital and Andreessen Horowitz) is one of the largest venture rounds in history, matching the largest single investment in the AI space: Microsoft’s $10 billion investment in OpenAI early last year.
The size of the funding round (technically its “Series J”) shows investors’ willingness to bet big on companies they suspect could be central to a new tech paradigm shift. And riding that wave requires more capital than such shifts have demanded in the past.
Databricks’ smart bet on MosaicML
Databricks has for years offered a secure place where enterprises can host their data (and run analytics on that data). In 2023 it acquired the generative AI company MosaicML, with the idea of helping its customers create customized AI models that they could run in the same cloud that’s already hosting their data.
In March, Databricks rolled out one of the first fruits of its MosaicML buy: a new LLM called DBRX. With DBRX, Databricks can offer its roughly 12,000 customers a secure cloud where they can also expose their data to advanced AI models, which it argues helps avoid the security risks of sending proprietary data out through an API to an AI model hosted by another company. In some regulated industries, such as finance and healthcare, avoiding that security risk is a major selling point. This is part of the secret of Databricks’ success.
DBRX’s “mixture of experts”
DBRX, which is available as open source, isn’t as capable as state-of-the-art models such as Google’s Gemini or OpenAI’s GPT-4. But, as Databricks CEO Ali Ghodsi said during a press gathering in March, many enterprises don’t require gigantic models for the kinds of applications they’re looking to build. A financial institution, for example, might be able to use DBRX to look for signs of fraud in its databases, and a healthcare organization could use the AI to look for patterns of disease across thousands of electronic patient records.
Many of today’s LLMs bring their full computational weight to bear even on simple problems, which both burns compute power and slows the delivery of an answer to the user. DBRX addresses this issue by using a “mixture-of-experts” design that divides the model’s brain into 16 specialized “experts.” When a specific type of calculation is requested, a “router” inside the model knows which “expert” to call on. The whole DBRX model contains 132 billion parameters, but because of that division of labor, it uses only 36 billion parameters at any given time, Ghodsi explained. For businesses that want to use AI for day-to-day operations, this style of LLM architecture could lower the barrier to entry.
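For readers who want a concrete picture of the routing Ghodsi describes, here is a minimal sketch of a mixture-of-experts layer in PyTorch. It illustrates the general technique, not DBRX’s actual code: the layer sizes, the feed-forward experts, and the choice of four active experts per token are assumptions made for the example.

```python
# Minimal mixture-of-experts (MoE) sketch. Dimensions and the top-k value
# are illustrative assumptions, not DBRX's real configuration.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, d_model=512, d_hidden=2048, n_experts=16, top_k=4):
        super().__init__()
        self.top_k = top_k
        # The "router": a small linear layer that scores every expert for each token.
        self.router = nn.Linear(d_model, n_experts)
        # Each "expert" is an ordinary feed-forward block.
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.GELU(),
                          nn.Linear(d_hidden, d_model))
            for _ in range(n_experts)
        ])

    def forward(self, x):                          # x: (batch, seq_len, d_model)
        scores = self.router(x)                    # (batch, seq_len, n_experts)
        weights, chosen = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)       # normalize over the chosen experts
        out = torch.zeros_like(x)
        # Only the experts the router selects actually run for a given token,
        # so most of the layer's parameters sit idle on any one forward pass.
        for slot in range(self.top_k):
            for idx, expert in enumerate(self.experts):
                mask = chosen[..., slot] == idx    # tokens routed to this expert
                if mask.any():
                    out[mask] += weights[..., slot][mask].unsqueeze(-1) * expert(x[mask])
        return out

layer = MoELayer()
tokens = torch.randn(2, 8, 512)                    # 2 sequences of 8 tokens
print(layer(tokens).shape)                         # torch.Size([2, 8, 512])
```

The payoff of the pattern is visible in the loop: only the experts the router selects for a token actually execute, which is how a model can hold a very large total parameter count (132 billion for DBRX) while exercising a much smaller slice of it (36 billion) on any single pass.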
The efficiencies built into DBRX and other Databricks models likely improve the economics of deploying AI models for customers.
“These are still the early days of AI,” Ghodsi said in a statement Tuesday. “We are positioning the Databricks Data Intelligence Platform to deliver long-term value … and our team is committed to helping companies across every industry build data intelligence.”