Implementation of RLHF (Reinforcement Learning with Human Feedback) on top of the PaLM architecture. Basically ChatGPT but with PaLM
Updated Jan 14, 2024 · Python
A comprehensive paper list for Vision Transformers and attention, including papers, code, and related websites
Implementation of MusicLM, Google's new SOTA model for music generation using attention networks, in Pytorch
Implementation of AudioLM, a SOTA Language Modeling Approach to Audio Generation out of Google Research, in Pytorch
Implementation of Toolformer, Language Models That Can Use Tools, by MetaAI
Implementation of Make-A-Video, new SOTA text to video generator from Meta AI, in Pytorch
Implementation of Alphafold 3 in Pytorch
Awesome List of Attention Modules and Plug&Play Modules in Computer Vision
Implementation of Muse: Text-to-Image Generation via Masked Generative Transformers, in Pytorch
Implementation of Phenaki Video, which uses Mask GIT to produce text guided videos of up to 2 minutes in length, in Pytorch
Implementation of MeshGPT, SOTA Mesh generation using Attention, in Pytorch
Implementation of plug in and play Attention from "LongNet: Scaling Transformers to 1,000,000,000 Tokens"
PyTorch Dual-Attention LSTM-Autoencoder For Multivariate Time Series
Implementation of MEGABYTE, Predicting Million-byte Sequences with Multiscale Transformers, in Pytorch
Implementation of MagViT2 Tokenizer in Pytorch
Unofficial implementation of iTransformer - SOTA Time Series Forecasting using Attention networks, out of Tsinghua / Ant group
Implementation of Band Split Roformer, SOTA Attention network for music source separation out of ByteDance AI Labs
Official PyTorch Implementation for "Rotate to Attend: Convolutional Triplet Attention Module." [WACV 2021]
🦖Pytorch implementation of popular Attention Mechanisms, Vision Transformers, MLP-Like models and CNNs.🔥🔥🔥
Implementation of Recurrent Memory Transformer, Neurips 2022 paper, in Pytorch
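The repositories above all build on the same core primitive: scaled dot-product attention. As a minimal sketch (not taken from any listed repo, names and shapes are illustrative), it can be written in PyTorch as:

```python
import torch
import torch.nn.functional as F


def scaled_dot_product_attention(q, k, v, mask=None):
    """Minimal scaled dot-product attention.

    q, k, v: tensors of shape (batch, heads, seq_len, head_dim).
    mask: optional boolean tensor broadcastable to (batch, heads, seq_len, seq_len);
          positions where mask == 0 are hidden from the query.
    """
    scale = q.shape[-1] ** -0.5                       # 1 / sqrt(head_dim)
    scores = torch.matmul(q, k.transpose(-2, -1)) * scale
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    attn = F.softmax(scores, dim=-1)                  # weights over keys sum to 1
    return torch.matmul(attn, v)


# Usage: one batch, two heads, four tokens, head dimension eight.
q = torch.randn(1, 2, 4, 8)
k = torch.randn(1, 2, 4, 8)
v = torch.randn(1, 2, 4, 8)
out = scaled_dot_product_attention(q, k, v)           # shape (1, 2, 4, 8)
```

A causal mask (lower-triangular, via `torch.tril`) turns this into the autoregressive attention used by the language, audio, and video models listed here.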