REDWOOD CITY, Calif., Oct. 28, 2024 /PRNewswire/ — Drawing on its deep knowledge of the brain and memory, the Tianqiao & Chrissy Chen Institute's (TCCI) internal AI team achieved a major breakthrough in artificial intelligence: its self-developed OMNE Multiagent Framework took the top position on the GAIA (General AI Assistants) benchmark leaderboard (https://huggingface.co/spaces/gaia-benchmark/leaderboard), co-launched by Meta AI, Hugging Face, and AutoGPT. OMNE outperformed frameworks from some of the world's leading institutions, including Microsoft Research. The achievement builds on years of brain research at TCCI: equipping agents with Long-Term Memory (LTM) enables the framework to engage in deeper, slower thinking and enhances the decision-making capabilities of Large Language Models (LLMs) in complex problem-solving.
The milestone marks a major accomplishment for TCCI's AI team since the institute's founder, Chinese tech entrepreneur Tianqiao Chen, announced the "All-In AI Strategy" last year.
OMNE currently boasts an overall success rate of 40.53% on GAIA, surpassing submissions from organizations including Meta, Microsoft, Hugging Face, Princeton University, the University of Hong Kong, the UK AI Safety Institute, and Baichuan. By comparison, GPT-4 equipped with plugins achieved a success rate of only 15%.
GAIA is one of the most demanding benchmarks for general-purpose AI assistants, and topping its leaderboard showcases the depth of TCCI's AI expertise and its ability to push the boundaries of innovation.
OMNE is a multi-agent collaboration framework built on long-term memory (LTM). Every agent shares the same system architecture yet operates independently, autonomously learning a complete world model and thereby forming its own understanding of its environment. Grounding the collaboration in LTM lets the system adapt to individual behavioral changes in real time, optimize task planning and execution, and support personalized, efficient self-evolution.
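TCCI has not published OMNE's source, so as a purely illustrative sketch of the architecture described above, the following Python models identically structured agents that each hold an independent long-term memory. Every name here (LongTermMemory, Agent, collaborate) is hypothetical, and the keyword-match recall merely stands in for whatever retrieval mechanism OMNE actually uses.

    # Hypothetical sketch only; OMNE's implementation has not been published.
    from dataclasses import dataclass, field

    @dataclass
    class LongTermMemory:
        """Persistent store of observations an agent accumulates over time."""
        records: list = field(default_factory=list)

        def write(self, observation: str) -> None:
            self.records.append(observation)

        def recall(self, query: str, k: int = 3) -> list:
            # Naive keyword match stands in for real retrieval (e.g. embeddings).
            return [r for r in self.records if query in r][:k]

    @dataclass
    class Agent:
        """Agents share one architecture but hold fully independent memories."""
        name: str
        memory: LongTermMemory = field(default_factory=LongTermMemory)

        def act(self, task: str) -> str:
            context = self.memory.recall(task)
            # An LLM call would go here; a placeholder decision is returned.
            decision = f"{self.name} handles '{task}' with {len(context)} memories"
            self.memory.write(f"{task}: {decision}")
            return decision

    def collaborate(agents: list, task: str) -> list:
        """Each agent contributes independently; outputs can then be merged."""
        return [agent.act(task) for agent in agents]

    agents = [Agent("planner"), Agent("researcher"), Agent("executor")]
    print(collaborate(agents, "summarize benchmark results"))

In this toy setup the agents share one class definition but never share state, mirroring the identical-but-independent structure the release describes; a real system would replace the placeholder decision string with an LLM call and the keyword lookup with embedding-based retrieval.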
The breakthrough lies in the integration of a long-term memory mechanism, which greatly reduces the search space of Monte Carlo Tree Search (MCTS) and improves decision-making on complex problems. By introducing more efficient logical reasoning, OMNE not only raises the intelligence of a single agent but also significantly strengthens the multi-agent system by optimizing its collaboration mechanism. The design is inspired by studies of the columnar structure of the human cerebral cortex: cortical columns, the basic units of the brain's cognitive and behavioral functions, process information through complex collaboration. By strengthening collaboration both within and between agents, the AI model may gradually exhibit emergent cognitive abilities, build an internal representation model, and drive a leap in the system's overall intelligence.
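OMNE's actual search procedure is likewise unpublished; as a hypothetical sketch of the idea that long-term memory can shrink an MCTS search space, the Python below prunes branches whose remembered mean reward falls below a threshold before running standard UCT selection. The names (prune_with_memory, mcts_select) and the toy reward values are invented for this example.

    # Hypothetical illustration: long-term memory prunes MCTS branches whose
    # remembered mean reward is poor, shrinking the search space up front.
    import math
    import random

    def mcts_select(children: list, total_visits: int, c: float = 1.4) -> dict:
        """Standard UCT selection over the surviving child nodes."""
        return max(children, key=lambda n: n["value"] / (n["visits"] or 1)
                   + c * math.sqrt(math.log(total_visits + 1) / (n["visits"] or 1)))

    def prune_with_memory(children: list, memory: dict, threshold: float = 0.2) -> list:
        """Skip branches that long-term memory remembers as low-reward."""
        kept = [n for n in children if memory.get(n["action"], 1.0) >= threshold]
        return kept or children  # never prune every branch away

    memory = {"action_b": 0.05}  # remembered mean reward of a previously bad action
    children = [{"action": a, "visits": 0, "value": 0.0}
                for a in ("action_a", "action_b", "action_c")]

    for step in range(100):
        candidates = prune_with_memory(children, memory)
        node = mcts_select(candidates, total_visits=step)
        node["visits"] += 1
        node["value"] += random.random()  # placeholder rollout reward

Here the remembered low reward for action_b keeps that branch out of almost every iteration, so simulation effort concentrates on the remaining actions, which is the sense in which memory reduces the space MCTS must explore.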
“We are incredibly proud to see OMNE top the GAIA leaderboard,” said the head of TCCI's AI team. “This achievement demonstrates the vast potential of using long-term memory to drive AI self-evolution and solve real-world problems. We believe that advancing research in Long-Term Memory and AI self-evolution is crucial for the ongoing development and practical application of AI technologies.”
Recruiting: AItalents@cheninstitute.org
SOURCE Tianqiao & Chrissy Chen Institute