Andrew Lih, 03/09/2015 19:12:
four randomized field experiments
At least, for once, this is not a euphemism for "random acts of vandalism": they say «one true positive and one true negative fact from a reputable news source on each senator that was not currently mentioned on that senator’s Wikipedia page».
What they find would probably not even be called bias by normal people: «We find no systematic evidence that this effect is moderated by covariates (state population, Senate class, character count of page, geographic region, length of incumbency, political party [...]». Besides, one of the experiments had the opposite outcome: «the positive facts were removed more quickly than the negative ones».
Their conclusion may be disputable, given their own findings. It seems they ignore the experiment with the opposite outcome. They also don't consider that a bias in assessing individual edits doesn't necessarily imply a bias in the final article, as they blindly assume.
A possible explanation of the findings is that the reversions are "anti-cyclical": if articles on inactive senators usually receive more bad edits on the positive side than on the negative, and articles on active senators the opposite (which is likely), then it makes sense that editors are more critical of the more frequent error and revert it more often; the bias in removing would cancel the bias in adding. But their experiment doesn't consider the whole article and can't say anything about its overall quality or bias.
Nemo