John Sarapata’s (BS ’87) road from Caltech to Google was admittedly winding, including stints everywhere from defense contractor TRW* to an illustration software start-up to banking giant JP Morgan. It seems random on paper, he says, but there’s a common thread: “Pretty much every job basically entailed, ‘Here is a PDE (partial differential equation), solve it with computers,’” says Sarapata with a laugh. He’s still solving problems at Google, but they aren’t financial or technical—they’re social and political. Sarapata is head of engineering at Jigsaw—an incubator within Google’s parent company, Alphabet—which he describes as “focused on helping people facing organized abuse or oppression.” “We’re looking for ways that emerging technology can be applied in creative ways to address some of these problems,” he says. It’s a broad charge—more than any one organization can address, he notes—filled with seemingly intractable problems. But Sarapata is optimistic. “I’m starting to feel like, little by little, we’re getting our footing—and we’re starting to make some progress,” he says. And he’s motivated by that progress: “The fact that we have all the resources of one of the most successful companies that the world has ever known and can be focused on doing good for the world is very, very inspiring.” Below are three examples of Jigsaw’s impact.
Jigsaw’s Project Shield began in response to feedback from political opposition groups in Eastern Europe, whose websites were being shut down by government-directed DDoS (distributed denial of service) attacks, which overload sites with requests until they become inoperative. “That’s the entire reason we came up with Project Shield—to say that the opposition should have a voice,” says Sarapata. Jigsaw recently offered its free DDoS protection services to opposition groups in advance of elections in Ecuador, where the government has previously used DDoS attacks to silence dissent. The project’s most high-profile test came in September 2016, when it successfully defended the website of an investigative journalist against the Mirai botnet—a massive network of infected Internet of Things devices, including baby monitors. “It was the largest DDoS attack ever seen,” says Sarapata. “So we had a chance to harden our system against a very dedicated and very capable attacker.”
If you’ve ever waded into the comments section of any online article, you know that the discussion often devolves into crude insults. “We were really worried about the lack of online conversation,” says Sarapata, noting that users were often bullied into silence. “A lot of newspapers were shutting off their comment sections entirely.” Human moderation works, but it’s expensive and time-consuming. “Every comment that goes up on The New York Times is human-moderated before it goes live, but they only have enough moderators to handle 20 articles and videos a day, and they publish 200 articles and videos a day,” says Sarapata. “So they just disabled comments on the other 90 percent.” Jigsaw has been working with the Times to deploy Conversation AI, software that can automatically detect “toxic” speech like insults and abusive comments. But how much quality do you give up when you switch from human moderators to a set of algorithms? Of the 12,000 article comments that the Times received on a recent day, Conversation AI and the human moderators disagreed 30 times. “And then we went to the manager of the moderators, who actually agreed with the model in 28 of those cases and with the human moderators in two,” says Sarapata. Three of the world’s five largest commenting platforms are now employing Conversation AI’s code, he notes, which gives him some optimism. “We still think we’ve got a long way to go, but I think right now I feel like it’s a problem that we can solve. And I did not feel the same way two years ago.”
Seeking to disrupt ISIS’s sophisticated online recruiting efforts, Jigsaw decided to offer would-be members an alternative perspective. “We are not going to take things out of search results—that’s totally against our corporate philosophy,” says Sarapata. “But one of the things that we can do is to try and surface other ideas.” The target was very specific: potential recruits who weren’t just interested in learning more about ISIS but were ready to join, busy searching for things like how to travel to Aleppo. Users searching those keywords were shown ads—text, image, and video—that countered ISIS’s recruiting messages, with links to curated content like speeches from a moderate cleric or videos showing the grim reality of life inside the caliphate. “To actually see what it’s like on the ground there is a very strong antidote,” he says. The results: Jigsaw’s curated ads saw three to four times the click-through rates of regular ads and an average viewing time of about eight minutes. Sarapata says it’s impossible to know for sure whether the project changed any minds, but he finds it powerful that they found a way to get people who were potentially on a path to radicalization to stop and take eight minutes to listen to the other side.
*Simon Ramo (PhD ’36) and Dean Wooldridge (PhD ’36) are the R and W in TRW.