Galactica

LLMs Explained,
Galactica

Galactica is a large-scale language model developed by Meta AI in collaboration with Papers with Code. It was trained on the "NatureBook" dataset, which comprises 48 million papers, textbooks, and lecture notes, millions of compounds and proteins, scientific websites, encyclopedias, and more. Galactica's computational performance is comparable to other state-of-the-art language models, making it a promising technology for real-world natural language processing applications. With its efficiency and performance, Galactica represents a significant step forward in developing NLP systems, enabling researchers and practitioners to tackle complex language-related challenges and unlock new opportunities for innovation.


An Overview of Galactica

Galactica is a large-scale language model developed by Meta AI in collaboration with Papers with Code.

Scales up to 120B parameters and improves training time!

120B parameters

The model has multiple architecture variations, ranging from the base architecture with 125M parameters to larger models with up to 120B parameters.
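The family of model sizes can be sketched as a simple lookup. This is an illustrative snippet, not official tooling: the variant names and parameter counts follow the published Galactica paper, and the fp16 memory figure is a rough weights-only approximation (2 bytes per parameter).

```python
# Published Galactica variants and approximate parameter counts
# (from the Galactica paper).
GALACTICA_VARIANTS = {
    "mini":     125_000_000,
    "base":     1_300_000_000,
    "standard": 6_700_000_000,
    "large":    30_000_000_000,
    "huge":     120_000_000_000,
}

def fp16_weight_gib(params: int) -> float:
    """Rough weights-only memory footprint at 2 bytes/parameter, in GiB."""
    return params * 2 / 1024**3

# The largest variant is the 120B-parameter model.
largest = max(GALACTICA_VARIANTS.values())
```

The spread from 125M to 120B parameters is what lets practitioners trade accuracy for hardware cost: the 125M variant fits comfortably on a laptop, while the 120B variant requires multi-GPU serving.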

Outperforms the latest GPT-3 on various NLP tasks

Outperforms GPT-3

On technical knowledge probes such as LaTeX equations, the Galactica model outperforms the latest GPT-3 large language model by 68.2% versus 49.0%.
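A probe like this can be reproduced by prompting a Galactica checkpoint directly. The sketch below uses Hugging Face transformers with the published Hub ID "facebook/galactica-125m" (Galactica uses the OPT architecture, hence `OPTForCausalLM`); the example prompt is hypothetical, and calling `generate()` downloads the model weights, so that part is guarded behind `__main__`.

```python
from transformers import AutoTokenizer, OPTForCausalLM

def galactica_complete(prompt: str,
                       model_id: str = "facebook/galactica-125m",
                       max_new_tokens: int = 40) -> str:
    """Greedy-decode a completion of `prompt` with a Galactica checkpoint."""
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = OPTForCausalLM.from_pretrained(model_id)
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0])

if __name__ == "__main__":
    # Hypothetical LaTeX knowledge probe: ask the model to continue an equation.
    print(galactica_complete("The formula for the area of a circle is: \\["))
```

Swapping `model_id` for a larger variant (up to "facebook/galactica-120b") trades latency and memory for the higher probe accuracy reported above.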

Outperforms BLOOM on various NLP tasks

Outperforms BLOOM

Galactica outperforms BLOOM and OPT-175B on BIG-bench. It also sets a new state of the art on downstream tasks such as PubMedQA and MedMCQA.


  • Introduction

  • Business Applications

  • Model Features

  • Model Tasks

  • Fine-tuning

  • Benchmarking

  • Sample Codes

  • Limitations

  • Other LLMs