Project Brief
A Transformer architecture built from scratch, with reference to the "Attention Is All You Need" paper, and trained on the opus_dataset language-translation dataset to translate English to Italian.
Accomplishments
- Implemented the entire Transformer architecture from scratch, including the attention module, with reference to the "Attention Is All You Need" paper
- Prepared the opus_dataset English-to-Italian translation dataset for training
- Trained the Transformer model and obtained coherent translations after 20 epochs
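The attention module mentioned above is built around the paper's scaled dot-product attention. A minimal, framework-free sketch of that operation (this is an illustration, not the project's actual code; multi-head projections and masking are omitted):

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def scaled_dot_product_attention(Q, K, V):
    # Q: seq_len_q x d_k, K: seq_len_k x d_k, V: seq_len_k x d_v,
    # each given as a list of row vectors.
    d_k = len(K[0])
    out = []
    for q in Q:
        # Score each key against the query, scaled by sqrt(d_k)
        # as in the paper to keep dot products in a stable range.
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        weights = softmax(scores)
        # Output is the attention-weighted sum of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# Tiny example: 2 queries, 3 key/value pairs, d_k = d_v = 2.
Q = [[1.0, 0.0], [0.0, 1.0]]
K = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
attn = scaled_dot_product_attention(Q, K, V)
```

Each output row is a convex combination of the value vectors, so every component lies between the smallest and largest value in the corresponding column of `V`.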
Technologies Used
Training
