GLM

LLMs Explained: GLM-130B

GLM (General Language Model) is pretrained with an autoregressive blank-filling objective and can be finetuned on a variety of natural language understanding and generation tasks. Its largest variant, GLM-130B, with 130 billion parameters, has demonstrated cutting-edge performance on language tasks including question answering, sentiment analysis, and machine translation. Across a wide range of natural language understanding, conditional generation, and unconditional generation tasks, GLM outperforms BERT, T5, and GPT under comparable testing conditions.
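In the blank-filling objective, sampled spans of the input are each replaced with a [MASK] token (Part A), and the model then regenerates the masked spans autoregressively in shuffled order (Part B), with start and end tokens delimiting each span. The Python sketch below illustrates how such a training example could be constructed; the function name and the literal [MASK]/[S]/[E] token strings follow the GLM paper's notation but are illustrative placeholders, not GLM's actual preprocessing code.

```python
import random

def build_blank_filling_example(tokens, spans, rng=random):
    """Sketch of GLM-style autoregressive blank-filling data construction.

    tokens: list of token strings, e.g. ["x1", "x2", ...]
    spans:  list of (start, end) index pairs (end exclusive) to blank out.
    Returns (part_a, part_b_input, part_b_target): Part A is the corrupted
    text; Part B reconstructs the masked spans autoregressively.
    """
    spans = sorted(spans)
    part_a, masked, cursor = [], [], 0
    for start, end in spans:
        part_a.extend(tokens[cursor:start])
        part_a.append("[MASK]")            # each span collapses to one [MASK]
        masked.append(tokens[start:end])
        cursor = end
    part_a.extend(tokens[cursor:])

    rng.shuffle(masked)                    # spans are predicted in random order
    part_b_input, part_b_target = [], []
    for span in masked:
        part_b_input += ["[S]"] + span     # input starts with the [S] token
        part_b_target += span + ["[E]"]    # target ends with the [E] token
    return part_a, part_b_input, part_b_target

if __name__ == "__main__":
    tokens = ["x1", "x2", "x3", "x4", "x5", "x6"]
    a, b_in, b_tgt = build_blank_filling_example(tokens, [(2, 3), (4, 6)])
    print(a)  # ['x1', 'x2', '[MASK]', 'x4', '[MASK]']
```

Because Part B is generated left to right while Part A is encoded bidirectionally, the same objective covers both understanding (short masked spans) and generation (a long span covering the rest of the document).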



An Overview of GLM

GLM's largest variant, GLM-130B, is a large-scale language model with 130 billion parameters, trained on a diverse and extensive corpus of text data.

130B parameters

GLM variants range from 110M to 130B parameters. The largest, GLM-130B, is a pretrained language model with 130 billion parameters, enabling it to capture complex linguistic patterns and nuances.

Bilingual Lang Model

GLM-130B is trained on both English and Chinese and is evaluated on Chinese benchmarks as a bilingual LLM. Its Chinese training data comprises 1.0T from WudaoCorpora plus 250G of additional Chinese corpora.

Outperforms ERNIE TITAN

GLM-130B consistently outperforms ERNIE Titan 3.0, the largest existing Chinese LLM, performing at least 260% better on two abstractive MRC datasets.


  • Introduction

  • Business Applications

  • Model Features

  • Model Tasks

  • Fine-tuning

  • Benchmarking

  • Limitations

  • Other LLMs