Amazon is learning the hard way that computer algorithms are not perfect and can create problems. Consider two recent examples, here and here. In the first, its recommendation algorithms suggested bomb-making paraphernalia to shoppers who searched for one of the ingredients. In the second, it assumed thousands of women were pregnant and in the market for baby products. Amazon has apologized in both cases and adjusted its formulas, but there will be other incidents, because no algorithm is right all of the time without human intervention. Facebook is learning the same lesson with the discovery that its ad-targeting tools let advertisers reach a “Jew Haters” audience. Facebook is now putting people in charge of catching future attempts by neo-Nazis and others who rail against Jews. The challenge both companies face is that the bigger they get, the harder it is to monitor their systems. We haven’t seen the last of this.