https://github.com/jtakahashi0604/tiny-experiment-nn-transformer
📖 Text:
"Let's build GPT" by @karpathy
✏️ Note:
Finally got the Transformer algorithm running, it's alive 🤖
Huge thanks to @karpathy for the awesome tutorial.

📖 Text:
"Let's build GPT" by @karpathy
✏️ Note:
Today, I learned about lower triangular matrices.
I'm amazed they can replace a for loop.
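The trick can be sketched in NumPy (the lecture itself works in PyTorch, but the math is identical): a row-normalized lower triangular matrix turns "average everything up to position t" from a Python loop into a single matrix multiply.

```python
import numpy as np

# The for-loop version: avg[t] = mean of x[0..t] (a causal running average).
T, C = 4, 2
x = np.arange(T * C, dtype=float).reshape(T, C)

loop_avg = np.zeros_like(x)
for t in range(T):
    loop_avg[t] = x[: t + 1].mean(axis=0)

# The lower-triangular trick: one matrix multiply does the same thing.
tril = np.tril(np.ones((T, T)))               # 1s on and below the diagonal
wei = tril / tril.sum(axis=1, keepdims=True)  # each row averages its prefix
mat_avg = wei @ x

assert np.allclose(loop_avg, mat_avg)
```

The triangular structure is what enforces causality: row t has zeros after column t, so position t can never "see" the future.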

📖 Text:
"Let's build GPT" by @karpathy
✏️ Note:
Just started diving into the Transformer architecture.
Today, I implemented a BigramLanguageModel, a simple model that predicts the next character from the current character alone.
It's very simple, but seeing it output somewhat "plausible" text is a great start.
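The bigram idea can be sketched with plain character counts (the tutorial's BigramLanguageModel learns the same table with a PyTorch nn.Embedding instead of counting; the toy text here is just for illustration):

```python
import numpy as np

# Count-based sketch of a bigram model: predict the next character
# from the current one alone.
text = "hello world"
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}
itos = {i: c for c, i in stoi.items()}
V = len(chars)

# Count how often each character follows each other character.
counts = np.zeros((V, V))
for a, b in zip(text, text[1:]):
    counts[stoi[a], stoi[b]] += 1

# Normalize rows into next-character probabilities (tiny smoothing
# keeps rows with no observations valid).
probs = (counts + 1e-9) / (counts + 1e-9).sum(axis=1, keepdims=True)

# Greedy prediction: the most likely character after 'h' in this text.
print(itos[int(np.argmax(probs[stoi["h"]]))])  # prints 'e'
```

Sampling from `probs` row by row is exactly how the model generates "plausible" text: no context beyond one character, which is why a real Transformer needs attention.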

