- cross-posted to:
- technews@radiation.party
- hackernews@derp.foo
Oddly enough, by posting this data publicly, those least viewed articles will end up getting a lot more views now.
I want to see a website that links to whatever is the least viewed Wikipedia article at any given time until all Wikipedia articles basically have the same number of views.
There is a site that randomly shows YouTube videos with 0 views.
I do remember that. I suppose not enough people would ever use it for things to balance out.
https://en.wikipedia.org/wiki/Wikipedia:New_pages_patrol already does something like that, ensuring every new page gets at least a minimum number of views so its quality can be checked.
Really enjoyed the read. Thanks for sharing. I’m surprised by the random page implementation.
Usually in a database each record has an integer primary key. The keys would be assigned sequentially as pages are created. Then the “random page” function could select a random integer between zero and the largest page index. If that index isn’t used (because the page was deleted), you could either try again with a new random number or march up to the next non-empty index.
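A minimal sketch of the retry variant, assuming a SQLite table named `page` with an integer primary key `page_id` and a `title` column (the table and column names are just placeholders):

```python
import random
import sqlite3

def random_page(conn: sqlite3.Connection):
    """Pick a page uniformly at random, retrying when a deleted id is hit."""
    (max_id,) = conn.execute("SELECT MAX(page_id) FROM page").fetchone()
    while True:
        candidate = random.randint(1, max_id)
        row = conn.execute(
            "SELECT page_id, title FROM page WHERE page_id = ?", (candidate,)
        ).fetchone()
        if row is not None:  # gaps left by deleted pages are simply retried
            return row
```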
Marching up to the next non-empty key would skew the distribution—pages preceded by more empty keys would show up more often under “random”.
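A quick simulation of that skew, using a hypothetical set of surviving ids (1, 2 and 5, so page 5 sits behind two deleted slots):

```python
import random
from collections import Counter

existing = {1, 2, 5}   # ids 3 and 4 were deleted
max_id = max(existing)

def march_up():
    """Pick a random id, then walk upward to the next existing id."""
    i = random.randint(1, max_id)
    while i not in existing:
        i += 1            # terminates because max_id itself exists
    return i

counts = Counter(march_up() for _ in range(100_000))
print(counts)  # id 5 comes up roughly three times as often as ids 1 and 2
```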
Fun fact, that concept is used in computer security exploits: https://en.wikipedia.org/wiki/NOP_slide
For choosing an article, it would be better to just pick a new random number.
There are probably more efficient ways to pick a random record out of a database, though. For example, periodically reindexing, or sorting extant records by a random value (if the database supports it).
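For the sort-by-random approach, SQLite and PostgreSQL can push the choice into the query itself (a sketch using the same hypothetical `page` table; `RANDOM()` is the SQLite/PostgreSQL spelling, MySQL calls it `RAND()`):

```python
import sqlite3

conn = sqlite3.connect("wiki.db")  # hypothetical database file
# Let the database pick one row at random. Simple, but it scans the whole
# table, so very large tables may want a smarter scheme.
row = conn.execute(
    "SELECT page_id, title FROM page ORDER BY RANDOM() LIMIT 1"
).fetchone()
print(row)
```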
Did you know one of the most translated articles on Wikipedia is none other than the one on American actor Corbin Bleu?
https://www.insider.com/why-corbin-bleu-wikipedia-pages-2019-1
Very cool! I love stuff like this.