Dataconomy

Meet Molmo, the free model that could outshine GPT-4

by Aytun Çelebi
September 26, 2024

The Allen Institute for AI (Ai2) has released Molmo, a family of open-source multimodal models that challenges the dominance of proprietary AI systems. With strong image recognition and the ability to produce actionable outputs, Molmo gives developers, researchers, and startups an advanced yet easy-to-use foundation for building AI applications. The launch marks a notable shift in the AI landscape, narrowing the gap between open-source and proprietary models and broadening access to leading AI technology.

Molmo offers an exceptional degree of image understanding, allowing it to correctly read a wide variety of visual data, from everyday objects to complex charts and menus. Unlike most AI models, Molmo goes beyond passive perception: it lets users interact with virtual and real environments through pointing and other spatial actions. This capability marks a breakthrough, opening the door to sophisticated AI agents, robotics, and many other applications that depend on a granular understanding of both visual and contextual data.

Efficiency and accessibility are central to Molmo's development strategy. Its capabilities come from a dataset of fewer than one million images, in stark contrast to the billions of images processed by models such as GPT-4V and Google's Gemini. This approach makes Molmo highly efficient in its use of computational resources while remaining as powerful as the best proprietary systems, with fewer hallucinations and faster training.

Making Molmo fully open source is part of Ai2's broader strategic effort to democratize AI development. By releasing Molmo's language and vision training data, model weights, and source code, Ai2 enables a diverse array of users, from startups to academic laboratories, to innovate and advance in AI technology without heavy investment or vast computing power.

Molmo tryout on a homepage image
Here is our tryout with Molmo: we simply showed it the main page and checked what it saw.

Matt Deitke, a researcher at the Allen Institute for AI, said: "Molmo is an incredible AI model with exceptional visual understanding, which pushes the frontier of AI development by introducing a paradigm for AI to interact with the world through pointing. The model's performance is driven by a remarkably high quality curated dataset to teach AI to understand images through text. The training is so much faster, cheaper, and simpler than what's done today, such that the open release of how it is built will empower the entire AI community, from startups to academic labs, to work at the frontier of AI development."

Molmo comparison. Source: Allen Institute

According to internal evaluations, Molmo's largest model, with 72 billion parameters, surpassed OpenAI's GPT-4V and other leading competitors on several benchmarks. The smallest Molmo model, at just one billion parameters, is compact enough to run on a mobile device while outperforming models with ten times as many parameters. The models are publicly available to try for yourself.

Tags: AI, image recognition, model, open-source

COPYRIGHT © DATACONOMY MEDIA GMBH, ALL RIGHTS RESERVED.
