Individuals and corporations spend millions of dollars every year on software that sniffs out potentially dangerous bugs in computer programs. But whether that software finds 10 bugs or 100, there is no way to determine how many went unnoticed, nor to measure the efficacy of bug-finding tools.
Researchers at New York University Tandon School of Engineering, in collaboration with the MIT Lincoln Laboratory and Northeastern University, are taking an unorthodox approach to tackling this problem: Instead of finding and remediating bugs, they’re adding them by the hundreds of thousands.
Meanwhile, the world's most popular social network has run an experiment of its own, aimed at testing users' dedication to Facebook.
It is believed that, for several weeks, Facebook secretly crashed its Android app by introducing artificial errors into the code. The experiment was designed to determine at what point users would give up on the service.
According to the results of the experiment, users rarely gave up: instead, they found other ways to reach the social network, such as its mobile website.
The test was likely a step Facebook took to prepare for the possibility that Google might one day remove its apps from the Play Store for competitive reasons. Such a clash of interests could be triggered by Facebook features like its video service and the hosting of articles directly on Facebook's servers. If Google ever removed the Facebook app from the Play Store, users would have to find other ways to download it.
This is not the first time Facebook has probed its users' emotional responses. In October 2014, the company was criticized for subjecting users to psychological testing that manipulated their emotions through the Facebook News Feed.