carlomerola/TransformerSeq2Seq

Transformer_Seq2Seq

Model Description: An implementation of the Transformer model architecture in Keras.

✏️ Deep Learning Problem: The goal of this project is to take as input a sequence of words corresponding to a random permutation of an English sentence and reconstruct the original sentence. The Transformer was chosen as the architecture chiefly for its ability to find correlations between tokens independently of their position in the sequence, unlike LSTM-based networks.
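The position-independence mentioned above can be made concrete: without positional encodings, scaled dot-product attention is permutation-equivariant, i.e. permuting the input tokens simply permutes the output rows. A minimal NumPy sketch (not the repository's code, just an illustration of the property):

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    # q, k, v: (seq_len, d) arrays; standard scaled dot-product attention
    scores = q @ k.T / np.sqrt(q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 8))   # 5 tokens, 8-dim embeddings
perm = rng.permutation(5)     # a random word-order permutation

out = scaled_dot_product_attention(x, x, x)
out_perm = scaled_dot_product_attention(x[perm], x[perm], x[perm])

# Permuting the input permutes the output rows in the same way:
assert np.allclose(out[perm], out_perm)
```

This is why attention can relate a shuffled word to its neighbours in the original sentence regardless of where the shuffle placed it, whereas an LSTM processes tokens strictly in sequence order.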

🔴 Problem Category: sequence-to-sequence (seq2seq).
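For this task a training pair is presumably built by shuffling the words of a sentence (input) and keeping the original order as the target. A hedged sketch of that pair construction (the function name and seeding are illustrative, not taken from the repository):

```python
import random

def make_example(sentence, seed=None):
    # Build one (input, target) pair: the input is a random permutation
    # of the sentence's words, the target is the original sentence.
    words = sentence.split()
    shuffled = words[:]
    random.Random(seed).shuffle(shuffled)
    return " ".join(shuffled), sentence

src, tgt = make_example("the cat sat on the mat", seed=42)
# src and tgt contain the same words; only the order differs.
```

The model then learns the conditional distribution over output tokens given the shuffled input, exactly as in translation-style seq2seq setups.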
