Those are the absolute worst examples you could provide. Every single one of those companies has been in the headlines for how demonstrably worse they have gotten over the years, some even being pulled before Congress.
Go by number of downloads? skellock's extension hasn't been updated in 5 years, is marked archived on GitHub, and doesn't accept fixes (and there's a handful of reported problems that could use fixing). So while its README suggests that a lot of effort once went into it, it's not a choice that will grow with you.
kokakiwi's has the next-most downloads, but its website and git repository are self-hosted on a site that has been gone since December 2020, so that's also 5 years of staleness. I suppose that's another way to archive your extension.
nefrob's has fewer downloads still, but it has a single 5-star review (Open-VSX doesn't have a lot of active users), its GitHub repo was updated 3 months ago, and it seems alive. That said, the parser itself seems simplistic and admits to not getting everything right.
wolfmah's has been inactive for more than a year and contains a single commit.
It was even less obvious with Typst: there are 12 results for the keyword "typst", and the leading extension didn't have many downloads at all -- I can see now that it's #2 for downloads.
Altogether, downloads, stars, and reviews are a decent way to get an honest answer when there are no commercial interests at play, since there's little incentive to game the metrics.
No one has the time to read the code of every alternative before deciding which one best fits their use case.
GitHub stars are a filtering mechanism.
Most engineers, when given five projects with star counts of 5K, 2K, 500, 200, and 100, will only evaluate the code of the first two.
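The filtering heuristic above can be sketched in a few lines: rank candidates by star count and only shortlist the top few for a closer code review. The project names and star counts here are hypothetical, matching the example numbers.

```python
def shortlist(projects: dict[str, int], top_n: int = 2) -> list[str]:
    """Return the names of the top_n projects, ranked by star count."""
    ranked = sorted(projects.items(), key=lambda item: item[1], reverse=True)
    return [name for name, _stars in ranked[:top_n]]

# Hypothetical candidates with the star counts from the example.
candidates = {
    "proj-a": 5000,
    "proj-b": 2000,
    "proj-c": 500,
    "proj-d": 200,
    "proj-e": 100,
}

print(shortlist(candidates))  # → ['proj-a', 'proj-b']
```

The point isn't that stars pick the winner, just that they cheaply narrow five candidates down to the two whose code is actually worth reading.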