Whenever a major website has significant downtime, people start to wonder: is it intentional? Is Anonymous behind it? Or a secretive group of enemy government hackers?
It’s a reasonable suspicion, because DDoS (distributed denial of service) attacks turn out to be relatively easy to pull off these days. A ne’er-do-well need only harness thousands of “zombie” computers, point them at the intended target, and flood the web servers with so much traffic that they are overwhelmed. The effect is temporary, but it can cause severe economic damage.
It used to be that coordinating such an attack required a great deal of skill. A criminal first needed to infiltrate those thousands of machines with a trojan horse or other malware, then stitch them together into a “botnet” by building a way to issue commands to all of them remotely, and finally bend them to whatever nefarious purpose they had in mind. (Besides DDoS attacks, botnets also send a lot of spam.) Today, however, pre-configured botnets can be rented for a pittance: one source claims to rent a 10,000-strong network of zombie machines for $200.
This got me wondering: why not rent a botnet, and use it for good?
Botnets are a scourge. But what exactly makes botnets bad? Three things:
- BAD ETHICS: They do things without the CPU owner’s consent.
- BAD RESULTS: They are used to do bad things (like clogging inboxes with spam or shutting down other websites).
- SIDE EFFECTS: Doing those bad things often comes with negative side-effects for the CPU owner herself (slowing the computer, increasing power consumption, etc.).
Folding@home is a program hosted at Stanford with a scientific mission: to understand how human proteins fold (or “misfold”). Misfolding proteins are known to have a hand in several high-priority human diseases, notably Alzheimer’s. But simulating protein folding on a computer takes an enormous amount of computing power: something like 30-60 CPU-years per fold on an average home computer.
Why not just use a supercomputer? Because it’s enormously expensive. A supercomputer capable of these calculations would cost somewhere around $100-250 million to build, plus around $6-7 million per year in power to run the behemoth. But there’s another way: harness the “unused” cycles of thousands of average home computers, each working on a piece of the puzzle, without any of the up-front costs. It’s a voluntary botnet.
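The volunteer model described above is, at heart, a work queue: a coordinator splits a huge computation into independent work units, hands them out to whichever volunteer machines check in, and merges the results as they trickle back. Here is a minimal sketch of that pattern; the `WorkUnit` and `Coordinator` names are illustrative, not Folding@home’s actual protocol, and the “simulation” is a stand-in sum.

```python
import queue
from dataclasses import dataclass

@dataclass
class WorkUnit:
    unit_id: int
    payload: range  # the slice of the overall problem this unit covers

class Coordinator:
    """Hands out independent work units and merges returned results."""
    def __init__(self, total, unit_size):
        self.pending = queue.Queue()
        for i, start in enumerate(range(0, total, unit_size)):
            end = min(start + unit_size, total)
            self.pending.put(WorkUnit(i, range(start, end)))
        self.results = {}

    def checkout(self):
        """A volunteer client calls this to fetch the next unit (None when done)."""
        try:
            return self.pending.get_nowait()
        except queue.Empty:
            return None

    def submit(self, unit_id, result):
        self.results[unit_id] = result

def volunteer(coord):
    """One 'volunteer' machine: grab units whenever it has spare cycles."""
    while (unit := coord.checkout()) is not None:
        # Stand-in for an expensive simulation step: sum the slice.
        coord.submit(unit.unit_id, sum(unit.payload))

coord = Coordinator(total=1000, unit_size=100)
volunteer(coord)
total = sum(coord.results.values())
print(total)  # 499500 == sum(range(1000))
```

The key property is that units are independent, so the coordinator never cares which machine does which piece, how fast, or in what order; that is what lets a million unreliable home PCs substitute for one supercomputer.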
So here we have roughly 1-2 million good souls volunteering CPU time from their home computers to various “good” botnets, and probably hundreds of millions of poor saps having their CPU time used in “bad” botnets for nefarious purposes without their knowledge.
Can we make bad botnets good?
What if the scientists at Folding@home simply rented a “bad” botnet in order to do more potentially life-saving calculations? Is it defensible?
On the one hand, no, it’s not. Doing anything at all with someone’s computer without their consent is wrong (not to mention illegal, though let’s set legality aside for the purposes of the thought experiment). And it still causes the negative side effects for the machine’s owner.
But suppose the Folding@home folks could design code that advanced their cause while using the botnet without the negative side effects for the CPU owners. For example, the program could use a very small amount of computing power (say, 1% of CPU), and only when the computer is known to be idle. In that case, the botnet rental has two direct benefits: an incremental benefit to society via the research, and an incremental benefit to the CPU owner, whose machine is then NOT running “bad” code with its negative side effects.
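The “1% of CPU, only when idle” idea amounts to duty-cycling: do a tiny slice of work, then sleep long enough that the work is a fixed fraction of wall-clock time, and back off entirely when the machine looks busy. A rough sketch follows; the idle check here uses load average as a crude cross-platform-ish proxy (`os.getloadavg` is Unix-only), where real clients like BOINC use OS-specific idle and screensaver APIs. All names and thresholds are illustrative.

```python
import os
import time

TARGET_CPU_FRACTION = 0.01   # the "1% of CPU" budget from the text
SLICE = 0.01                 # seconds of work per duty cycle

def machine_looks_idle(threshold=0.5):
    """Crude idle check: 1-minute load average per core below a threshold.
    (Real clients use OS-specific idle/screensaver APIs instead.)"""
    load1, _, _ = os.getloadavg()
    return load1 / (os.cpu_count() or 1) < threshold

def throttled_loop(step, duration):
    """Run step() repeatedly, but only ~TARGET_CPU_FRACTION of wall time,
    and only while the machine appears idle."""
    deadline = time.monotonic() + duration
    while time.monotonic() < deadline:
        if not machine_looks_idle():
            time.sleep(1.0)          # back off entirely while the owner is busy
            continue
        slice_end = time.monotonic() + SLICE
        while time.monotonic() < slice_end:
            step()                   # burn a tiny slice of CPU on real work
        # Sleep long enough that work is ~1% of elapsed wall time.
        time.sleep(SLICE * (1 - TARGET_CPU_FRACTION) / TARGET_CPU_FRACTION)
```

With `SLICE = 0.01` seconds, each 10 ms burst of work is followed by roughly 0.99 s of sleep, so the background task averages about 1% of one core.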
And there’s a third, indirect benefit: Folding@home would be crowding out criminal access to botnets, reducing the total botnet resource pool available for “bad” purposes.
There are plenty of issues with this line of reasoning. For instance, renting the botnet puts money in criminals’ pockets, increasing their incentive to recruit more zombie machines; machines will be drafted into botnets that otherwise would not have been, perhaps through increased R&D on the part of the botnet creators. But this feels like a small price to pay at the margin. Perhaps Folding@home could mitigate the issue by donating an equivalent amount of money to security firms battling botnets, like a carbon offset. That would double the cost, but at pennies per machine it may be worth it.
Furthermore, there is precedent for paying (potential) criminals in order to crowd out bad behavior: cash-for-guns programs. These initiatives offer money for guns, often with “no questions asked” policies (the guns can be legal or illegal). And it works: guns come off the street, and gun-related deaths fall.
Or maybe this is just how computers should work
Botnets exist. They are a resource, a tool. In the short run, in a world where botnets will be put to good or evil by whoever bids highest, renting them to do life-saving research may be a net social improvement.
But what about the long run? It’s hard to imagine a socially acceptable equilibrium where spare CPU cycles are harvested without consent. And in the long run, the goal should be to prevent unauthorized access entirely (a pipe dream perhaps, but we’re talking about the long run here). So what can a policy-maker, or better yet a technology manufacturer, do to capitalize on this model?
For the moment, Folding@home and its ilk are left to their own devices to convince CPU owners to donate resources. Illicit botnet creators have the luxury of forcing the issue, doing an end-run around all kinds of end-user friction: opting in, installing software, granting access to a command-and-control system, and so on. These are high hurdles for the good-botnet crowd, and they help explain why its participants are outnumbered by bad-botnet participants 100 to 1.
What if this were just the way computers worked in the first place? Imagine an operating system where all of that complexity is baked in, where users can opt in and choose the services or causes they want to support. Or even, like a utility offering to buy excess solar power from its customers, sell access to their spare cycles to commercial cloud computing providers (who could then avoid building expensive data centers). After the “Welcome to [OS X/Windows 15/Android/iOS 10]” screen, people would be offered a set of choices for donating or selling their spare cycles, and that’s just what people would expect from the get-go.
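An OS-level opt-in like this boils down to a small policy object the scheduler consults before handing spare cycles to anyone. A hypothetical sketch, with entirely invented names, of what such a first-boot preference might capture:

```python
from dataclasses import dataclass, field

@dataclass
class CyclePolicy:
    """A hypothetical first-boot preference: what may spare cycles be used for?"""
    donate_to: list = field(default_factory=list)   # causes the owner supports
    sell_to_cloud: bool = False                     # sell cycles like excess solar
    max_cpu_fraction: float = 0.01                  # cap on background CPU usage
    only_when_idle: bool = True
    only_on_ac_power: bool = True                   # don't drain laptop batteries

    def permits(self, requester: str) -> bool:
        """Would this machine accept background work from `requester`?"""
        if requester == "cloud":
            return self.sell_to_cloud
        return requester in self.donate_to

policy = CyclePolicy(donate_to=["folding@home"], max_cpu_fraction=0.02)
print(policy.permits("folding@home"))  # True
print(policy.permits("spam-botnet"))   # False
```

The point of making this explicit is consent: the same spare-cycle harvesting a botnet does covertly becomes legitimate the moment it passes through a policy the owner actually set.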
There’s a vast infrastructure out there—spare cycles—currently being co-opted by criminals. Let’s take it back and put it to good use.