In 2016, cybersecurity pro Patrick Wardle heard a story that deeply disturbed him: cybercriminals were using malware to surreptitiously spy on people through their macOS webcams and microphones. In one particularly unsettling case, a hacker had used a piece of malware called “Fruitfly” to hijack the webcams of laptops with the goal of spying on children.
Wardle had experience spotting these kinds of programs. Prior to moving into the private sector, he had worked as a malware analyst at the National Security Agency, where he analyzed code used to target Defense Department computer systems. Experienced in playing digital defense, Wardle decided to do something about the spyware threat: he created OverSight, a macOS tool that lets you monitor your webcam and mic for signs of malware manipulation. “It was really popular, everyone loved it,” he said of the tool, which he released for free via his IT non-profit Objective-See.
However, a couple of years later, Wardle was analyzing some suspicious code for a client and came across something weird within a tool that had been downloaded onto the client’s own device. The tool was created by a major company but offered similar functionality to OverSight, including the ability to monitor a macOS webcam and mic. Sifting through the program, Wardle found familiar code. Too familiar. His entire OverSight algorithm—including bugs that he had failed to remove—was contained within the other program. A developer had reverse-engineered his tool, stolen his work, and repurposed it for a different but nearly identical product.
“The analogy I like to use is plagiarism: someone has copied what you have written and they copied your spelling and grammar mistakes,” said Wardle. “I always say there are many ways to skin the proverbial cat but this was like blatant copyright [infringement].”
Wardle was taken aback. He contacted the company immediately and attempted to alert it to the fact that one of its developers had hijacked his code. Unfortunately, Wardle said, it wasn’t the last time he would find that a company had co-opted his work. Over the next couple of years, he would find evidence that two other major companies had employed his algorithm in their own products.
This week, Wardle gave a presentation on his experiences at Black Hat, the annual cybersecurity conference in Las Vegas. Alongside Johns Hopkins University professor Tom McGuire, Wardle demonstrated how reverse engineering—the process by which a program is taken apart and reconstructed—can reveal evidence of such theft.
Wardle has declined to identify the companies that stole his code. This isn’t about revenge, he said; it’s about identifying a “systemic issue” affecting “the cybersecurity community.” To that end, Wardle used this week’s talk to outline some lessons he learned while attempting to notify companies about the theft.
“You reach out to these companies and say, ‘Hey, you guys, you basically stole from me. You reverse engineered my tool and reimplemented the algorithm—that’s legally very… uh, gray.’ In the EU, there is a directive that if you…[do that] that’s illegal. But also just the optics are bad. I run a non-profit. You’re essentially stealing from a non-profit and putting this in your commercial code and then profiting from it. Bad look,” he says, chuckling.
The responses Wardle got were mixed. “It depends on the company,” he said. “Some are great: I get an email from the CEO admitting it and asking, ‘What can we fix?’ Awesome…[With] others, it’s a three-week internal investigation, and then they come back and tell you to take a hike because they don’t see any internal [in]consistencies.” In those cases, Wardle has had to provide more evidence of what happened.
Why does this sort of thing even happen in the first place? Wardle says his views have shifted over time. “I went in thinking these were evil corporations out to squash the independent developer. But in every case, it was essentially a misguided or naive developer who had been tasked with [finding a way to] monitor the mic and the webcam…and then he or she would reverse engineer my tool and steal the algorithm…and then nobody in the corporation would ask, ‘Hey, where did you get this from?’”
In all three cases, after Wardle presented his findings to a company, executives eventually admitted wrongdoing and offered to rectify the situation. To make his case effectively, however, Wardle often had to show them the evidence. He said he had to take their own closed-source software and reverse engineer it to understand how their code worked and demonstrate its similarity to his own. To bolster his case, Wardle also teamed up with the non-profit Electronic Frontier Foundation (EFF), which offers pro-bono legal services to independent security researchers. “Having them on my side gave me a lot of credibility,” he said, suggesting that other developers employ a similar strategy.
“I’m in a good position because I collaborated with EFF, I have a large audience in the community because I’ve been doing this for a long time,” said Wardle. “But, if this is happening to me, this is happening to other developers who might not have quite [the same standing]…and in those cases the companies might just tell them to take a hike. So what I’m really trying to do is talk about this and show that, ‘Hey this is not okay.’”
As to how widespread the practice of algorithm theft is, Wardle believes it’s quite prevalent. “I believe it’s a systemic issue because as soon as I started looking I didn’t just find one, I found several. And they [the companies] were all completely unrelated.”
“One of the takeaways I’m trying to push is, if you’re a corporation, you really need to educate your employees or developers [not to steal]. If you do this, it puts your entire organization at legal risk. And, again, the optics look really bad,” he said.