What is open-source AI anyway?
A key artificial intelligence industry body has released a long-awaited definition that could affect how different AI models are viewed — if not regulated. The Open Source Initiative, a public benefit corporation, sets standards for what constitutes open-source systems in the technology industry. Over the past year, the group has investigated a big question: What constitutes open-source AI?
Meta has been one of the leading voices on open-source AI development with its LLaMA suite of large language models. But some critics have argued it isn’t truly open-source because it has licensing rules about how third-party developers can use its models and isn’t fully transparent about its training data.
Now, according to the new definition, an AI system must meet these four requirements to be considered open-source:
- Anyone can use it for any purpose without asking for permission
- Outsiders can study how the system works and inspect its components
- Developers can modify the system
- Users can share the system with others, with or without modifications, for any purpose
Meta took issue with the new definition, maintaining that its models are, in fact, open-source. “There is no single open-source AI definition, and defining it is a challenge because previous open-source definitions do not encompass the complexities of today’s rapidly advancing AI models,” a company spokesperson told TechCrunch.
Still, the definition could help regulators and international organizations differentiate between open-source and closed-source (or proprietary) models. That distinction matters. Recently, California lawmakers faced pushback for advancing a bill requiring AI developers to build a “kill switch” for shutting down their models — something critics called a “de facto ban on open-source development.” (The bill was ultimately vetoed by Gov. Gavin Newsom.)