This will be published in the July ISSA Journal. Just putting it down here already, to be able to link to it. ;-]
And, first, a picture:
[Toronno, ON]
After which (Dutch version linked here):
You have the Bystander Bug
One of the major pluses of open source software is that anyone, even you, can check the source code. So, logically, the chance that a somewhat hidden bug will be found in a heartbeat rises to about one when enough people look at the source, and now they can.
But we were recently surprised by just such a bug, with global implications. Sure, it turned out that actually no one keeps tabs on what open source software is used where, and by whom.
So all the global software behemoths turned out to rely on pieces of open source software, and that software, maintained by literally a handful of volunteers on a budget of less than a couple of seconds' worth of the major software vendors' CEOs' pay, had not actually been scrutinized to the level one would require even on a risk basis. Certainly not given the widespread use, which makes the risk to society grow high. Did we tacitly expect all the software vendors of the world to have inspected the open source code more carefully before deploying it in the global infrastructure ..? How would one propose to organize that within those big, huge for-profit companies? And what when (not if) the global infrastructure isn't 'compiled' into one piece but built from so many somewhat-black boxes? Virtualization and 'cloud' abstract this picture even further, increasing the risks…
But more worryingly, this also means that 'we all' suffer from the Bystander Effect. Someone's in the water, unable to get out, and we all stand by without helping because our psychology suggests we follow the herd. Yes, there are the stories of the Ones that beat this and jump in to the rescue, but there are also stories where such heroes don't turn up. And, apparently, in the open source software world there are too few volunteers, on budgets far below a shoestring, who jump in and do the hard work of detailed code inspections. Which means there's also a great number, potentially about all, of us who look the other way, have made ourselves unaware, and just want to do our 9-to-5 job anywhere in the industry. In that way, we're suffering from the bystander effect, aren't we ..?
And, even worse, so far we seem to have escaped the worst results of this in, e.g., voting machines. Here, how close was the call, where everyone just accepted the machine programming and expected that, because of its open source nature (if …), "someone will have looked at it, right …!?". Though of course, on this subject, some zealots (?) did indeed do some code checking in some cases, and the troubles with secrecy of votes overshadowed the potential (!) tallying biases programmed in, knowingly or not. But still… when 'machines' are relied upon ever more, moving from simple things like voting machines to a combination of human-independent big data analysis/action automata with software-defined-everything and the Internet of Things, where's the scrutiny?
Will a new drive for increased security turn out to be too little, too narrowly focused, and sustained for too short a time, as many if not all after-the-fact corrections have been? If we leave it to 'others' again, with their own long-term interests probably closer to their hearts than our even longer-term societal interests, we still suffer from the bystander effect and may be guilty by inaction.
But then again, how do we get all this stuff organized …? Your suggestions, please.
[Edited to add I: The above is the original text; improved and toned down a bit when published officially in the ISSA Journal]
[Edited to add II: A link to an article on the infra itself.]
[Edited to add III: The final version in PDF, here.]