these days I've been taking a look at how transformers work (I still don't understand them), and learned a lot about word embeddings

it is so interesting; learned a lot about word2vec
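The core idea word2vec taught me: each word gets a dense vector, and words that appear in similar contexts end up with similar vectors. A tiny sketch of that (the vectors below are made up for illustration, not trained embeddings):

```python
import numpy as np

# Hypothetical toy embeddings -- real word2vec vectors are learned
# from co-occurrence statistics and typically have 100-300 dimensions.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.2]),
    "apple": np.array([0.1, 0.9, 0.8]),
}

def cosine(a, b):
    # Cosine similarity: 1.0 means same direction, 0 means unrelated.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(embeddings["king"], embeddings["queen"]))  # high
print(cosine(embeddings["king"], embeddings["apple"]))  # lower
```

With real trained vectors you'd see the famous analogies too (king − man + woman ≈ queen).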


Discussion

Samah’s been trying to explain transformers to me. I think I grok about 20%. Maybe.

Transformers are 🔥. Multi-head attention, RL from human feedback, and self-supervised learning in general. IYKYK, and if you know, let's talk.