
EleutherAI

A grassroots collective of researchers working to open source AI research.

GPT-J

GPT-J-6B, a 6 billion parameter model trained on the Pile, is now available for use with our new codebase, Mesh Transformer JAX.

Mesh Transformer JAX on GitHub >
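
A minimal sketch of running the model locally, assuming the checkpoint is mirrored on the Hugging Face Hub as "EleutherAI/gpt-j-6B" and that transformers >= 4.12 (which added GPT-J support) is installed. This is an alternative to running the Mesh Transformer JAX codebase itself, and the full-precision weights are roughly 24 GB.

```python
# A minimal sketch, assuming the Hub checkpoint id "EleutherAI/gpt-j-6B"
# and transformers >= 4.12; uses Transformers rather than the Mesh
# Transformer JAX codebase.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6B")
model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-j-6B")

inputs = tokenizer("EleutherAI is", return_tensors="pt")
outputs = model.generate(**inputs, do_sample=True, max_length=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```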

GPT-Neo

GPT-Neo 1.3B and 2.7B are now available on the Hugging Face Model Hub! Run the models locally with Transformers, or query them through Hugging Face's on-demand Inference API.

EleutherAI on Model Hub >
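
A minimal sketch of local inference with Transformers, assuming the Hub checkpoint id "EleutherAI/gpt-neo-1.3B" and an environment with transformers and PyTorch installed:

```python
# Generate text with GPT-Neo 1.3B via the Transformers pipeline API.
from transformers import pipeline

generator = pipeline("text-generation", model="EleutherAI/gpt-neo-1.3B")
out = generator("EleutherAI is a grassroots collective of",
                max_length=50, do_sample=True)
print(out[0]["generated_text"])
```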

GPT-Neo

GPT-Neo 1.3B and 2.7B, trained on the Pile, are now available to run with the GPT-Neo framework.

GPT-Neo on GitHub >

The Pile

We are proud to announce the release of the Pile, a free and publicly available 825 GiB dataset of diverse English text for language modeling!

Visit the Pile >
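
The Pile is distributed as zstandard-compressed JSON-lines shards, with one {"text": ..., "meta": ...} object per line. The sketch below streams document texts from a downloaded shard without decompressing it to disk; the filename "00.jsonl.zst" is a hypothetical local path.

```python
# A minimal sketch for streaming documents out of a downloaded Pile shard,
# assuming the zstandard-compressed JSON-lines release format.
import io
import json

import zstandard as zstd  # pip install zstandard

def read_pile_shard(path):
    """Yield the raw text of each document in a .jsonl.zst shard."""
    with open(path, "rb") as raw:
        reader = zstd.ZstdDecompressor().stream_reader(raw)
        for line in io.TextIOWrapper(reader, encoding="utf-8"):
            yield json.loads(line)["text"]

# Print the start of the first few documents.
for i, text in enumerate(read_pile_shard("00.jsonl.zst")):
    print(text[:80].replace("\n", " "))
    if i == 2:
        break
```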

Projects

AlphaFold2 Replication: In Progress
CARP: Completed
CLASP: In Progress
Eval Harness: In Progress
GPT-Neo: Completed
GPT-NeoX: In Progress
Mesh Transformer JAX: Completed
OpenWebText2: Completed
The Pile: Completed