It seems to fix all the problems I had with Markdown (the unnecessary complexity and ambiguity) and a ton of other problems I had no idea about. It looks pretty good.
I think I read somewhere that his Djot parser is 8x faster than his Markdown parser.
Is it geographically neutral like Markdown?
Is it automatically compatible with a large percentage of Wikipedia articles, like wikitext?
Is there some other advantage you claim would make up for missing both of the points above?
Yes, it is "geographically neutral"; I don't know why you're so attached to this point.
No, nothing is compatible with Wikipedia articles, not even Wikipedia itself. Wikitext is not a standard or a file format; it's a language, and it's hell. The only way to parse Wikipedia articles "correctly" is to use the same parser Wikipedia uses, which is a cursed mix of JavaScript and PHP.
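To illustrate the point: in practice, people who need Wikipedia articles rendered "correctly" don't reimplement wikitext at all, they ask MediaWiki to do it. Here is a minimal sketch that delegates parsing to Wikipedia's own server-side parser through the public `action=parse` API; the page title "Markdown", the User-Agent string, and the use of Python's `requests` library are just my assumptions for the example.

```python
# Minimal sketch: instead of parsing wikitext yourself, delegate to
# MediaWiki's own parser via the public action=parse API.
import requests

API = "https://en.wikipedia.org/w/api.php"

def render_article(title: str) -> str:
    """Return the article body as HTML, rendered by Wikipedia's own parser."""
    resp = requests.get(
        API,
        params={
            "action": "parse",        # run the server-side wikitext parser
            "page": title,
            "prop": "text",           # only the rendered HTML body
            "format": "json",
            "formatversion": "2",     # flat JSON instead of the legacy {"*": ...} shape
        },
        headers={"User-Agent": "wikitext-example/0.1 (demo)"},  # placeholder UA
        timeout=30,
    )
    resp.raise_for_status()
    data = resp.json()
    if "error" in data:
        raise RuntimeError(data["error"])
    return data["parse"]["text"]

if __name__ == "__main__":
    html = render_article("Markdown")  # "Markdown" is just an example title
    print(html[:500])
```

This sidesteps the problem the comment describes: you never interpret wikitext locally, so template expansion, parser functions, and extension tags all behave exactly as they do on Wikipedia itself.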