GPT-4's details are leaked.

cross-posted from: https://lemmy.intai.tech/post/72919

Parameter count:

GPT-4 is more than 10x the size of GPT-3. We believe it has a total of ~1.8 trillion parameters across 120 layers. Mixture Of Experts - Confirmed.

OpenAI was able to keep costs reasonable by using a mixture-of-experts (MoE) model. They use 16 experts within the model, each with about ~111B parameters for the MLP. Two of these experts are routed to per forward pass.
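A minimal sketch of what that top-2 routing looks like, using only the numbers claimed in the thread (16 experts, 2 active per token). This is an illustration of generic MoE gating, not OpenAI's actual code; the gating scores here are made up.

```python
import random

# Claimed figures from the thread (not independently verified):
NUM_EXPERTS = 16          # MLP experts per MoE layer
TOP_K = 2                 # experts routed to per forward pass
PARAMS_PER_EXPERT = 111e9 # ~111B parameters each

def route(gate_scores, k=TOP_K):
    """Return the indices of the k experts with the highest gating scores."""
    ranked = sorted(range(len(gate_scores)),
                    key=lambda i: gate_scores[i], reverse=True)
    return ranked[:k]

# Hypothetical gating scores for one token.
scores = [random.random() for _ in range(NUM_EXPERTS)]
active = route(scores)
print(active)  # indices of the 2 experts used for this token

# Sanity check on the headline number: the MLP experts alone
# account for roughly 16 * 111B ≈ 1.78T parameters, consistent
# with the ~1.8T total the thread claims.
print(NUM_EXPERTS * PARAMS_PER_EXPERT)
```

Note that only the 2 routed experts run per token, so the active compute per forward pass is far smaller than the total parameter count, which is the usual cost argument for MoE.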

Related Article: https://lemmy.intai.tech/post/72922
