- The Illustrated GPT-2 (Visualizing Transformer Language Models) – Jay Alammar
- Fine-tune GPT2 for Text Generation Using Pytorch | Towards Data Science
- Beginner’s Guide to Retrain GPT-2 (117M) to Generate Custom Text Content | by Ng Wai Foong | AI Innovation | Medium
- Fine-tune a non-English GPT-2 Model with Huggingface | by Philipp Schmid | Towards Data Science
| Code | Support Chinese | Framework | Remark |
|---|---|---|---|
| openai/gpt-2: Code for the paper "Language Models are Unsupervised Multitask Learners" | No | TensorFlow 1.x | Official OpenAI repo; Better Language Models and Their Implications |
| minimaxir/gpt-2-simple: Python package to easily retrain OpenAI's GPT-2 text-generating model on new texts | No | TensorFlow 1.x | minimaxir/textgenrnn: Easily train your own text-generating neural network of any size and complexity on any text dataset with a few lines of code; The simplest Python implementation of OpenAI's "fake news" generator GPT-2 - Zhihu |
| yangjianxin1/GPT2-chitchat: GPT2 for Chinese chitchat (implements the MMI idea from DialoGPT) | Yes | PyTorch | Very cool project based on Huggingface; A GPT2 model for Chinese chitchat: GPT2-chitchat - Zhihu |
| rish-16/gpt2client: ✍🏻 gpt2-client: Easy-to-use TensorFlow wrapper for the GPT-2 117M, 345M, 774M, and 1.5B Transformer models 🤖 📝 | No | TensorFlow 1.x | A Singaporean high schooler's open-source lightweight GPT-2 "client": play with GPT-2 in five lines of code - Zhihu |
| Morizeyao/GPT2-Chinese: Chinese version of GPT2 training code, using BERT tokenizer. | Yes | PyTorch | Based on Huggingface |
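As a quick illustration of the retraining workflow that minimaxir/gpt-2-simple advertises, here is a minimal sketch (requires TensorFlow 1.x; `corpus.txt` is a placeholder path for your own plain-text dataset):

```python
import gpt_2_simple as gpt2

# Download the 117M checkpoint once (cached under ./models).
gpt2.download_gpt2(model_name="117M")

sess = gpt2.start_tf_sess()

# Fine-tune on a plain-text corpus; "corpus.txt" is a hypothetical file.
gpt2.finetune(sess,
              dataset="corpus.txt",
              model_name="117M",
              steps=1000)  # increase for better results

# Sample from the fine-tuned checkpoint.
gpt2.generate(sess, length=100, temperature=0.7)
```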
Huggingface
GPT2LMHeadModel
- OpenAI GPT2 — transformers 3.4.0 documentation
- Examples — transformers 2.0.0 documentation (About fine-tune)
- Converting Tensorflow Checkpoints — transformers 3.4.0 documentation
- How to train a new language model from scratch using Transformers and Tokenizers
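A minimal sketch of how GPT2LMHeadModel is typically used, per the docs above (transformers 3.x-era API; the prompt text is arbitrary). Passing `labels=input_ids` makes the model return the language-modeling loss, which is what the fine-tuning examples minimize:

```python
import torch
from transformers import GPT2Tokenizer, GPT2LMHeadModel

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")

# The model shifts the labels internally, so labels == input_ids
# yields the standard next-token cross-entropy loss.
outputs = model(**inputs, labels=inputs["input_ids"], return_dict=True)
print(outputs.loss)

# Sampling a continuation with generate().
generated = model.generate(inputs["input_ids"], max_length=40,
                           do_sample=True, top_k=50)
print(tokenizer.decode(generated[0], skip_special_tokens=True))
```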
TODO: WWM (whole word masking)?!