Allen Institute for AI

The Allen Institute for AI (abbreviated AI2) is a 501(c)(3) non-profit scientific research institute founded in 2014 by the late Microsoft co-founder and philanthropist Paul Allen. The institute seeks to conduct high-impact AI research and engineering in service of the common good.[1] AI2 is based in Seattle and also has an active office in Tel Aviv, Israel.[2]

History

Oren Etzioni was appointed by Paul Allen in September 2013 to direct the research at the institute.[3] After leading the organization for nine years, Etzioni stepped down from his role as CEO on September 30, 2022.[4] He was replaced in an interim capacity by Peter Clark, the lead researcher of the institute's Aristo project.[5] On June 20, 2023, AI2 announced Ali Farhadi as its next CEO, effective July 31, 2023.[5]

Teams

  • Aristo: Aristo is a flagship project of AI2. Its original goal was to design an artificially intelligent system that could read, learn, and reason from text, and ultimately demonstrate its knowledge by passing an 8th-grade science exam – an objective the team achieved in 2018. It was inspired by Project Halo, a similar effort carried out by the Seattle-based investment company Vulcan.[6] The team's current focus is building the next generation of systems that can systematically reason, explain, and continually improve over time.[7]
  • PRIOR: The PRIOR team seeks to advance the field of computer vision by creating AI systems that can see, explore, learn, and reason about the world.[8] The team released the open embodied AI platform AI2-THOR in 2016, supporting the training of AI agents in simulated environments.[9] In February 2018, the team released the game Iconary as a demonstration of an AI that can understand and produce situated scenes from a limited set of icons.
  • Semantic Scholar: Semantic Scholar is an artificial-intelligence-backed search engine for academic publications, publicly released in November 2015.[10] It uses advances in natural language processing to provide features such as summaries of scholarly papers, contextual information about inline citations, and the ability to create libraries of papers and receive paper recommendations.[11]
  • AllenNLP: The AllenNLP team works on research to improve NLP systems' performance and accountability, and advance scientific methodologies for evaluating and understanding NLP systems. The team produces its own research as well as open-source tools to accelerate NLP research.[12]
  • MOSAIC: The Mosaic project is focused on defining and building common sense knowledge and reasoning for AI systems.[13]
  • AI for the Environment: These teams seek to apply artificial intelligence solutions to the prevention of poaching and illegal fishing in locations around the world, as well as environmental problems like climate modeling and wildfire management. The teams in this group include EarthRanger, Skylight, Climate Modeling, and Wildlands.
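Semantic Scholar, described above, also exposes a public Graph API at api.semanticscholar.org for programmatic paper search. The helper functions below are illustrative, not part of any official client; a minimal sketch of building a search URL and parsing the documented response shape (`{"total": ..., "data": [...]}`):

```python
# Sketch: querying the Semantic Scholar Graph API (paper search endpoint).
# The endpoint and parameters follow the public API docs; the helper names
# here are illustrative assumptions, not an official SDK.
import json
import urllib.parse

BASE = "https://api.semanticscholar.org/graph/v1/paper/search"

def build_search_url(query, fields=("title", "year", "citationCount"), limit=5):
    """Construct a paper-search URL with the requested result fields."""
    params = urllib.parse.urlencode(
        {"query": query, "fields": ",".join(fields), "limit": limit}
    )
    return f"{BASE}?{params}"

def parse_results(payload):
    """Extract (title, year) pairs from a Graph API JSON response body."""
    data = json.loads(payload)
    return [(p["title"], p.get("year")) for p in data.get("data", [])]

# Example with a hand-written sample response (no network call made here):
sample = '{"total": 1, "data": [{"title": "OLMo", "year": 2024}]}'
print(parse_results(sample))  # → [('OLMo', 2024)]
```

Fetching `build_search_url(...)` with any HTTP client returns JSON in the shape parsed above; the API is rate-limited for anonymous use.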

Generative AI

OLMo model family

On May 11, 2023, AI2 announced it was developing OLMo, an open language model aiming to match the performance of other state-of-the-art language models.[14] In February 2024, 1B- and 7B-parameter variants of the model were open-sourced, including code, model weights with intermediate snapshots and logs, and the contents of the Dolma training dataset,[15] making it the most open state-of-the-art model available.[16][17] In November 2024, AI2 released the second iteration, OLMo 2, with the initial release including 7B- and 13B-parameter models.[18][19] In March 2025, AI2 released a 32B variant of OLMo 2, claiming to have released "the first fully-open model (all data, code, weights, and details are freely available) to outperform GPT3.5-Turbo and GPT-4o mini".[20]

Tulu models and post-training recipes

In addition to the fully-open OLMo family of models, AI2 has also developed Tulu, a family of instruction-tuned models and open post-training recipes that build on open-weights base models (e.g., Meta's Llama) to provide fully transparent alternatives to proprietary instruction-tuning methods.[21] AI2 released the first iteration of Tulu in June 2023,[22] with subsequent iterations being released in November 2023 (Tulu 2[23]) and November 2024 (Tulu 3[24]).

References

  1. ^ "About — Allen Institute for AI". allenai.org. Retrieved 2023-05-26.
  2. ^ "AI2 Israel — Allen Institute for AI". allenai.org. Retrieved 2023-05-26.
  3. ^ Cook, John (2013-09-04). "Going beyond Siri and Watson: Microsoft co-founder Paul Allen taps Oren Etzioni to lead new Artificial Intelligence Institute". GeekWire. Retrieved 2023-05-26.
  4. ^ Schlosser, Kurt (2022-06-15). "Oren Etzioni stepping down as CEO of Allen Institute for AI after nine years at research hub". GeekWire. Retrieved 2023-05-26.
  5. ^ a b Bishop, Todd (2023-06-20). "Apple machine learning leader Ali Farhadi named CEO of Allen Institute for Artificial Intelligence". GeekWire. Archived from the original on 2023-06-20.
  6. ^ "Aristo — Allen Institute for AI". allenai.org. Retrieved 2023-05-26.
  7. ^ "Aristo — Allen Institute for AI". allenai.org. Retrieved 2023-05-26.
  8. ^ "PRIOR". prior.allenai.org. Retrieved 2023-05-26.
  9. ^ "AI2-THOR". Allen Institute for AI. Retrieved 2023-05-26.
  10. ^ Rodriguez, Jesus (2021-07-08). "🔹🔸Edge#104: AllenNLP Makes Cutting-Edge NLP Models Look Easy". TheSequence. Retrieved 2023-05-26.
  11. ^ "Semantic Scholar | Product". www.semanticscholar.org. Retrieved 2023-05-26.
  12. ^ "AllenNLP — Allen Institute for AI". allenai.org. Retrieved 2023-05-26.
  13. ^ Dormehl, Luke (2018-04-13). "Forget Cloning, A.I. is the Real Way to Let Your Family Pooch Live Forever". Digital Trends. Retrieved 2023-05-26.
  14. ^ Groeneveld, Dirk; Beltagy, Iz; Walsh, Pete; Bhagia, Akshita; Kinney, Rodney; Tafjord, Oyvind; Jha, Ananya Harsh; Ivison, Hamish; Magnusson, Ian (2024). "OLMo: Accelerating the Science of Language Models". arXiv:2402.00838 [cs.CL].
  15. ^ Soldaini, Luca; Kinney, Rodney; Bhagia, Akshita; Schwenk, Dustin; Atkinson, David; Authur, Russell; Bogin, Ben; Chandu, Khyathi; Dumas, Jennifer (2024). "Dolma: an Open Corpus of Three Trillion Tokens for Language Model Pretraining Research". arXiv:2402.00159 [cs.CL].
  16. ^ AI2 (2023-05-18). "Announcing AI2 OLMo, an open language model made by scientists, for scientists". Medium. Retrieved 2023-05-26.
  17. ^ Wiggers, Kyle (2024-02-01). "AI2 open sources text-generating AI models -- and the data used to train them". TechCrunch. Retrieved 2024-02-03.
  18. ^ "OLMo 2: The best fully open language model to date | Ai2". allenai.org. Retrieved 2025-08-26.
  19. ^ OLMo, Team; Walsh, Pete; Soldaini, Luca; Groeneveld, Dirk; Lo, Kyle; Arora, Shane; Bhagia, Akshita; Gu, Yuling; Huang, Shengyi (2025). "2 OLMo 2 Furious". arXiv:2501.00656 [cs.CL].
  20. ^ "OLMo 2 32B: First fully open model to outperform GPT 3.5 and GPT 4o mini | Ai2". allenai.org. Retrieved 2025-08-26.
  21. ^ Coldewey, Devin (2024-11-21). "Ai2's open source Tülu 3 lets anyone play the AI post-training game". TechCrunch. Retrieved 2025-08-26.
  22. ^ Wang, Yizhong; Ivison, Hamish; Dasigi, Pradeep; Hessel, Jack; Khot, Tushar; Chandu, Khyathi Raghavi; Wadden, David; MacMillan, Kelsey; Smith, Noah A. (2023). "How Far Can Camels Go? Exploring the State of Instruction Tuning on Open Resources". arXiv:2306.04751 [cs.CL].
  23. ^ Ivison, Hamish; Wang, Yizhong; Pyatkin, Valentina; Lambert, Nathan; Peters, Matthew; Dasigi, Pradeep; Jang, Joel; Wadden, David; Smith, Noah A. (2023). "Camels in a Changing Climate: Enhancing LM Adaptation with Tulu 2". arXiv:2311.10702 [cs.CL].
  24. ^ Lambert, Nathan; Morrison, Jacob; Pyatkin, Valentina; Huang, Shengyi; Ivison, Hamish; Brahman, Faeze; Miranda, Lester James V.; Liu, Alisa; Dziri, Nouha (2025-04-14). "Tulu 3: Pushing Frontiers in Open Language Model Post-Training". arXiv:2411.15124 [cs.CL].