~350 GB, partitioned into paragraphs.
Turn those partitioned paragraphs into individual notes (replacing the citation/LaTeX unique identifiers with their actual text).
This is one use case for the modular article NIP.
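The identifier-replacement step could look something like this. A minimal sketch: the placeholder format (`CIT_0001`, `EQ_0001`) and the lookup table are assumptions for illustration, not the actual scheme used in the corpus.

```python
import re

# Hypothetical lookup built while parsing the source: placeholder id -> resolved text.
CITATIONS = {
    "CIT_0001": "(Smith et al., 2019)",
    "EQ_0001": "E = mc^2",
}

# Assumed placeholder format; the real corpus may use different ids.
ID_PATTERN = re.compile(r"\b(?:CIT|EQ)_\d{4}\b")

def resolve_identifiers(paragraph: str, lookup: dict[str, str]) -> str:
    """Replace citation/LaTeX placeholder ids with their actual text,
    leaving unknown ids untouched."""
    return ID_PATTERN.sub(lambda m: lookup.get(m.group(0), m.group(0)), paragraph)

note = resolve_identifiers(
    "Mass-energy equivalence EQ_0001 was shown in CIT_0001.", CITATIONS
)
```

Each resolved paragraph then becomes the content of one note.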
In addition to the citation network built by the authors, I'm taking the word embeddings of a subset to create a more organic semantic network -> higher-resolution knowledge navigation at the paragraph level.
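One way to build that semantic layer: link any two paragraphs whose embeddings sit above a cosine-similarity threshold. A sketch with toy 2-d vectors standing in for real paragraph embeddings; the threshold value and the paragraph ids are illustrative assumptions.

```python
from math import sqrt

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def semantic_edges(embeddings: dict[str, list[float]], threshold: float = 0.9):
    """Return pairs of paragraph ids whose embeddings are close enough
    to count as an edge in the semantic network."""
    ids = sorted(embeddings)
    return [
        (a, b)
        for i, a in enumerate(ids)
        for b in ids[i + 1:]
        if cosine(embeddings[a], embeddings[b]) >= threshold
    ]

# Toy vectors; real paragraph embeddings would come from a sentence-embedding model.
emb = {"p1": [1.0, 0.0], "p2": [0.9, 0.1], "p3": [0.0, 1.0]}
edges = semantic_edges(emb, threshold=0.9)
```

These edges sit alongside the author-built citation edges, so navigation can follow either network.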
Have a script to mass-publish modular articles (kind 30040 + 30041); rate limiting do be a bitch though.
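The rate-limiting pain can be softened with pacing plus backoff around whatever publish call the script uses. A sketch, not the actual script: `publish` here is a hypothetical callable that returns True when the relay accepts the event, and the interval/retry values are guesses.

```python
import time

def publish_with_throttle(events, publish, min_interval=1.0, max_retries=3):
    """Send 30040/30041 events one at a time, pausing between sends and
    backing off exponentially when the relay rejects (e.g. rate-limits) one.

    `publish` is a hypothetical callable: publish(event) -> bool (accepted?).
    """
    for event in events:
        for attempt in range(max_retries):
            if publish(event):
                break
            # Rejected: wait min_interval, 2*min_interval, 4*min_interval, ...
            time.sleep(min_interval * 2 ** attempt)
        # Steady pacing between events to stay under the relay's limit.
        time.sleep(min_interval)

# Usage sketch: events would be the signed 30040 index + its 30041 sections.
```

Publishing the 30041 section events before the 30040 index also avoids the index briefly pointing at sections the relay hasn't accepted yet.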