Contents
Understanding Transformers: A Step-by-Step Math Example — Part 1
This blog is incomplete; here is the complete version of it:
Inputs and Positional Encoding
Step 1 (Defining the data)
Step 2 (Finding the Vocab Size)
Step 3 (Encoding and Embedding)
Step 4 (Positional Embedding)
Step 1 (Performing Single Head Attention)
Step 1 — Defining our Dataset
Step 2 — Finding Vocab Size
Step 4 — Calculating Embedding
Step 5 — Calculating Positional Embedding
Step 6 — Concatenating Positional and Word Embeddings
Step 7 — Multi Head Attention
Step 8 — Adding and Normalizing
Step 9 — Feed Forward Network
Step 10 — Adding and Normalizing Again
Step 11 — Decoder Part
Step 12 — Understanding Mask Multi Head Attention
Let’s do a simplified calculation:
Step 13 — Calculating the Predicted Word
Transformer Architecture explained
How GPT3 Works - Visualizations and Animations
The GPT-3 Architecture, on a Napkin
OpenAI GPT-3: Understanding the Architecture
What are Language Models?
How does language modeling work?
OpenAI GPT-3 Architecture
Why GPT-3 is so powerful?
Building machine learning models/code
How Can We Get Our Hands on the Model?
Limitations of OpenAI GPT-3
I have already written a detailed blog on how transformers work, using a very small sample dataset. It has become my best blog ever, because it elevated my profile and gave me the motivation to write more. However, that blog is incomplete: it covers only about 20% of the transformer architecture and, as readers have pointed out, contains numerous calculation errors. Now that a considerable amount of time has passed since that blog, I am revisiting the topic in this new one.
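As a quick preview of the steps listed in the contents above (embedding, positional encoding, and single-head attention), here is a minimal NumPy sketch. The token ids, dimensions, and random weights are illustrative assumptions, not the hand-worked numbers from the original walkthrough; only the sinusoidal positional-encoding and scaled dot-product attention formulas themselves are the standard ones the steps refer to.

```python
import numpy as np

# Toy setup: four token ids and a tiny embedding size (illustrative values only).
tokens = np.array([0, 1, 2, 3])
d_model = 4
rng = np.random.default_rng(0)

# Word embeddings (random here; the blog fills these in by hand).
word_emb = rng.normal(size=(len(tokens), d_model))

# Sinusoidal positional encoding:
# PE(pos, 2i) = sin(pos / 10000^(2i/d_model)), PE(pos, 2i+1) = cos(pos / 10000^(2i/d_model))
def positional_encoding(seq_len, d_model):
    pe = np.zeros((seq_len, d_model))
    pos = np.arange(seq_len)[:, None]
    i = np.arange(0, d_model, 2)[None, :]
    angle = pos / np.power(10000.0, i / d_model)
    pe[:, 0::2] = np.sin(angle)
    pe[:, 1::2] = np.cos(angle)
    return pe

# Combine word and positional information (element-wise addition, the standard variant).
x = word_emb + positional_encoding(len(tokens), d_model)

# Single-head scaled dot-product attention with random projection weights.
W_q = rng.normal(size=(d_model, d_model))
W_k = rng.normal(size=(d_model, d_model))
W_v = rng.normal(size=(d_model, d_model))
Q, K, V = x @ W_q, x @ W_k, x @ W_v

scores = Q @ K.T / np.sqrt(d_model)           # similarity of every token with every other token
scores -= scores.max(axis=-1, keepdims=True)  # numerically stable softmax
weights = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)
attention_out = weights @ V                   # one context vector per token

print(attention_out.shape)  # (4, 4)
```

The same pattern, repeated with several sets of projection weights and followed by add-and-normalize and a feed-forward network, is what the later steps in the contents expand on in detail.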