URL: https://www.technologyreview.com/2024/09/25/1104465/a-tiny-new-open-source-ai-model-performs-as-well-as-powerful-big-ones/

Artificial intelligence


A TINY NEW OPEN-SOURCE AI MODEL PERFORMS AS WELL AS POWERFUL BIG ONES

The results suggest that training models on less, but higher-quality, data can
lower computing costs.

By Melissa Heikkilä

September 25, 2024

Photo Illustration by Sarah Rogers/MITTR | Photos Getty




The Allen Institute for Artificial Intelligence (Ai2), a research nonprofit, is
releasing a family of open-source multimodal language models, called Molmo, that
it says perform as well as top proprietary models from OpenAI, Google, and
Anthropic. 

The organization claims that its biggest Molmo model, which has 72 billion
parameters, outperforms OpenAI’s GPT-4o, which is estimated to have over a
trillion parameters, in tests that measure things like understanding images,
charts, and documents.  

Meanwhile, Ai2 says a smaller Molmo model, with 7 billion parameters, comes
close to OpenAI’s state-of-the-art model in performance, an achievement it
ascribes to vastly more efficient data collection and training methods. 

What Molmo shows is that open-source AI development is now on par with closed,
proprietary models, says Ali Farhadi, the CEO of Ai2. And open-source models
have a significant advantage: their open nature means other people can build
applications on top of them. A Molmo demo is available online, and the models
will be available for developers to tinker with on the Hugging Face website.
(Certain elements of the most powerful Molmo model are still shielded from
view.) 



Other large multimodal language models are trained on vast data sets containing
billions of images and text samples that have been hoovered from the internet,
and they can include several trillion parameters. This process introduces a lot
of noise to the training data and, with it, hallucinations, says Ani Kembhavi, a
senior director of research at Ai2. In contrast, Ai2’s Molmo models have been
trained on a significantly smaller and more curated data set containing only
600,000 images, and they have between 1 billion and 72 billion parameters. This
focus on high-quality data, versus indiscriminately scraped data, has led to
good performance with far fewer resources, Kembhavi says.



Ai2 achieved this by getting human annotators to describe the images in the
model’s training data set in excruciating detail, over multiple pages of text.
The annotators spoke their descriptions aloud instead of typing them; Ai2 then
used AI techniques to convert the speech into text, which made the training
process much quicker while reducing the computing power required. 
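In outline, the workflow described above pairs each image with a transcript of its spoken description. The sketch below is an illustrative reconstruction, not Ai2’s actual pipeline: the `transcribe` callable stands in for whatever speech-to-text model is used, and the record structure is a hypothetical simplification.

```python
from dataclasses import dataclass
from typing import Callable, Iterable

@dataclass
class AnnotatedImage:
    image_path: str
    description: str  # dense, multi-paragraph caption recovered from speech

def build_training_records(
    recordings: Iterable[tuple[str, str]],  # (image_path, audio_path) pairs
    transcribe: Callable[[str], str],       # any speech-to-text function
) -> list[AnnotatedImage]:
    """Turn spoken image descriptions into (image, text) training pairs.

    Speaking is faster than typing, so annotators can produce far richer
    captions per hour; the transcription step recovers the text.
    """
    records = []
    for image_path, audio_path in recordings:
        text = transcribe(audio_path).strip()
        if text:  # drop empty or failed transcriptions
            records.append(AnnotatedImage(image_path, text))
    return records

# Usage with a stub transcriber; a real pipeline would plug in an ASR model.
stub = lambda path: f"A detailed spoken description of {path}."
pairs = [("marina.jpg", "marina.wav"), ("chairs.jpg", "chairs.wav")]
print(build_training_records(pairs, stub))
```

The dependency-injected `transcribe` keeps the sketch agnostic about which speech-recognition model sits underneath.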



These techniques could prove useful if we want to meaningfully govern the data
we use for AI development, says Yacine Jernite, the machine learning and
society lead at Hugging Face, who was not involved in the research. 

“It makes sense that in general, training on higher-quality data can lower the
compute costs,” says Percy Liang, the director of the Stanford Center for
Research on Foundation Models, who also did not participate in the research. 

Another impressive capability is that the model can “point” at things, meaning
it can analyze elements of an image by identifying the pixels that answer
queries.
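Pointing output like this can be consumed programmatically. As a hedged illustration, the sketch below converts point coordinates on a normalized 0–100 scale into pixel positions; the XML-style `<point>` answer format and the 0–100 convention are assumptions for the example, not a documented Molmo API.

```python
import re

# Matches e.g. <point x="50.0" y="25.0" ...> in a model answer (assumed format).
POINT_RE = re.compile(r'<point\s+x="([\d.]+)"\s+y="([\d.]+)"')

def points_to_pixels(answer: str, width: int, height: int) -> list[tuple[int, int]]:
    """Convert normalized (0-100) point coordinates in a model answer
    into pixel coordinates for an image of the given size."""
    pixels = []
    for x, y in POINT_RE.findall(answer):
        px = round(float(x) / 100 * width)
        py = round(float(y) / 100 * height)
        pixels.append((px, py))
    return pixels

# Example: one pointed-at deck chair in a 1024x768 photo.
answer = '<point x="50.0" y="25.0" alt="deck chair">deck chair</point>'
print(points_to_pixels(answer, 1024, 768))  # → [(512, 192)]
```

Pixel positions like these are what would let an agent click the element a query refers to.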

In a demo shared with MIT Technology Review, Ai2 researchers took a photo of
the marina near their Seattle office and asked the model to identify various
elements of the image, such as deck chairs. The model successfully described
what the image contained, counted the deck chairs, and accurately pinpointed
other things in the image as the researchers asked. It was not perfect,
however: it could not locate a specific parking lot, for example. 

Other advanced AI models are good at describing scenes and images, says Farhadi.
But that’s not enough when you want to build more sophisticated web agents that
can interact with the world and can, for example, book a flight. Pointing allows
people to interact with user interfaces, he says. 

Jernite says Ai2 is operating with a greater degree of openness than we’ve seen
from other AI companies. And while Molmo is a good start, he says, its real
significance will lie in the applications developers build on top of it, and the
ways people improve it.

Farhadi agrees. AI companies have drawn massive, multitrillion-dollar
investments over the past few years. But in the past few months, investors have
expressed skepticism about whether that investment will bring returns. Big,
expensive proprietary models won’t do that, he argues, but open-source ones can.
He says the work shows that open-source AI can also be built in a way that makes
efficient use of money and time. 

“We’re excited about enabling others and seeing what others would build with
this,” Farhadi says. 
