Replying to liminal 🦠

Now that's a statement I couldn't relate to more 😅

I first encountered LDA on a previous project -> "find k topics in n documents" sounds great until you realize you need to be pretty confident up front about how many topics there actually are. Then, not being sure I could justify that assumption, I moved to hierarchical LDA, but in both cases I just didn't know how to interpret the clusters.
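Roughly what that constraint looks like in practice, as a minimal sketch with scikit-learn's LatentDirichletAllocation (the toy corpus and k=5 here are just placeholders, not from my actual project): you have to commit to n_components before you've seen any structure.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Stand-in corpus; in a real project this would be n documents.
docs = [
    "the cell membrane regulates transport",
    "stocks fell as markets reacted to rates",
    "the striker scored in the final minute",
]

vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(docs)

# The number of topics k is fixed in advance -- this is the assumption
# you have to be confident about before fitting anything.
lda = LatentDirichletAllocation(n_components=5, random_state=0)
doc_topics = lda.fit_transform(X)  # shape: (n_docs, k)

# Each topic is just a distribution over words; interpreting what a
# topic "means" is left entirely to you.
words = vectorizer.get_feature_names_out()
for i, topic in enumerate(lda.components_):
    top = [words[j] for j in topic.argsort()[-3:]]
    print(f"topic {i}: {top}")
```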

Embedding models (BERT, GPT), being trained primarily on natural language, just need to associate co-occurring words with each other. You get a high-dimensional vector per document, and the idea is that if concepts are similar, they'll be close to each other in that vector space. So you can cluster spatially without needing to assume how many clusters there are 🙂
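A minimal sketch of that idea, assuming the sentence-transformers library for embeddings and HDBSCAN for density-based clustering (the model name and parameters are illustrative, not a specific recommendation): the number of clusters falls out of the data instead of going in as an assumption.

```python
from sentence_transformers import SentenceTransformer
import hdbscan

docs = [
    "the cell membrane regulates transport",
    "mitochondria produce ATP for the cell",
    "stocks fell as markets reacted to rates",
    "the central bank raised interest rates",
]

# Embed each document into a high-dimensional vector space where
# semantically similar texts land near each other.
model = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = model.encode(docs)  # shape: (n_docs, dim)

# Density-based clustering: no k specified anywhere; points that
# don't belong to a dense region get the noise label -1.
clusterer = hdbscan.HDBSCAN(min_cluster_size=2, metric="euclidean")
labels = clusterer.fit_predict(embeddings)
print(labels)
```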

ben 1y ago

makes sense! thanks for elaborating
