BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)
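A minimal sketch of how BertViz is typically used from a Jupyter notebook, assuming the Hugging Face transformers package and the standard bert-base-uncased checkpoint (any model that can return attentions works the same way):

```python
# Minimal BertViz sketch (assumes: pip install bertviz transformers; run inside a notebook)
from transformers import AutoTokenizer, AutoModel
from bertviz import head_view

model_name = "bert-base-uncased"  # assumption: any attention-returning checkpoint works
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name, output_attentions=True)

inputs = tokenizer.encode("The cat sat on the mat", return_tensors="pt")
outputs = model(inputs)
attention = outputs[-1]  # tuple of per-layer attention tensors
tokens = tokenizer.convert_ids_to_tokens(inputs[0])

head_view(attention, tokens)  # renders the interactive head view in the notebook
```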
GPT2 for Chinese chitchat (implements the MMI idea from DialoGPT)
Chinese NLP solutions (large models, data, models, training, inference)
RoBERTa pre-trained models for Chinese (RoBERTa for Chinese)
A fast and user-friendly runtime for transformer inference (BERT, ALBERT, GPT2, decoders, etc.) on CPU and GPU.
Human ChatGPT Comparison Corpus (HC3), Detectors, and more! 🔥
Chinese news-title generation with GPT2: a Chinese GPT2 news headline generation project with extremely detailed comments.
Simple text generator built on a PyTorch implementation of OpenAI GPT-2
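As an illustration of the kind of generation such projects expose, here is a short sketch using the Hugging Face transformers pipeline rather than that repository's own scripts (the "gpt2" model name is the standard Hugging Face checkpoint):

```python
# Illustrative GPT-2 text generation via Hugging Face transformers
# (not the linked repository's own entry point)
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
samples = generator(
    "Once upon a time",
    max_new_tokens=40,       # length of the continuation
    num_return_sequences=2,  # draw two samples
    do_sample=True,          # sample rather than greedy-decode
)
for s in samples:
    print(s["generated_text"])
```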
Adan: Adaptive Nesterov Momentum Algorithm for Faster Optimizing Deep Models
Pre-trained Transformers for Arabic Language Understanding and Generation (Arabic BERT, Arabic GPT2, Arabic ELECTRA)
A PyTorch implementation of "Graph Wavelet Neural Network" (ICLR 2019)
Implementations of common NLP tasks, including new-word discovery as well as PyTorch-based word embeddings, Chinese text classification, entity recognition, abstractive summarization, sentence similarity, triple extraction, pre-trained models, and more.
Guide: Fine-tune GPT2-XL (1.5 billion parameters) and GPT-NEO (2.7B) on a single GPU with Hugging Face Transformers using DeepSpeed
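The linked guide ships its own training scripts; as a rough sketch of the same idea with the Hugging Face Trainer, a DeepSpeed JSON config (assumed here to be ds_config.json with ZeRO offloading) is simply passed through TrainingArguments:

```python
# Rough sketch of DeepSpeed-backed fine-tuning with the Hugging Face Trainer.
# Assumes a ds_config.json (e.g. ZeRO stage 2/3 with CPU offload) next to this script;
# launch with the DeepSpeed launcher, e.g. `deepspeed finetune.py`.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, Trainer, TrainingArguments

tokenizer = AutoTokenizer.from_pretrained("gpt2-xl")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained("gpt2-xl")

# Tiny in-memory dataset purely for illustration; a real run would tokenize a corpus.
texts = ["DeepSpeed lets large models fit on one GPU.", "Fine-tuning GPT2-XL needs ZeRO offload."]
enc = tokenizer(texts, return_tensors="pt", padding=True)

class TinyDataset(torch.utils.data.Dataset):
    def __len__(self):
        return enc["input_ids"].shape[0]
    def __getitem__(self, i):
        return {"input_ids": enc["input_ids"][i],
                "attention_mask": enc["attention_mask"][i],
                "labels": enc["input_ids"][i]}

args = TrainingArguments(
    output_dir="gpt2-xl-finetuned",
    per_device_train_batch_size=1,
    gradient_accumulation_steps=8,
    fp16=True,
    deepspeed="ds_config.json",  # hands optimizer/parameter partitioning and offload to DeepSpeed
)

trainer = Trainer(model=model, args=args, train_dataset=TinyDataset())
trainer.train()
```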
MindSpore online courses: Step into LLM
Use BERT, ALBERT, and GPT2 as TensorFlow 2.0 layers; implement GCN, GAN, GIN, and GraphSAGE based on message passing.
PyTorch Implementation of OpenAI GPT-2