Four leading cyber security researchers sat down for a panel, moderated by Duo Security's Wendy Nather, at the IP Expo in London yesterday to air their catastrophic predictions about the biggest infosec threats of the coming years.
Nightmare scenarios regarding autonomous killer robots, terrorists taking control of fleets of autonomous vehicles, AI writing its own malware, and the threat posed by IoT devices all quickly followed.
Note: The first published version of this article did not mention by name or acknowledge the contribution to this panel of moderator Wendy Nather. This was in line with our usual editorial policy of not naming panel moderators, but was not appropriate in this case given the expert commentary and analysis Ms Nather provided. Given the gender profile of the panel, this oversight was deeply unfortunate and we apologise for any offence caused. IDG and ComputerworldUK are committed to promoting diversity within IT, and have a policy of never having all-male panels at our own events. Again, we acknowledge our error, and apologise for any offence caused.
Rik Ferguson, global VP of security research at Trend Micro, started off by stating that an area "ripe for innovation in the security and criminal landscape" is artificial intelligence and machine learning.
"One thing I find scary is the fact that we have a petition to the UN from 120 leading academics to outlaw autonomous weaponry," he said. "We are already in Skynet, that is the world we live in. So I have no doubt attackers will start using AI to build autonomous attack machinery online, as well as physical autonomous weaponry."
Ferguson pointed out that although much good work can be done on open machine learning platforms from the likes of Amazon Web Services and Google's TensorFlow, these could also be exploited for criminal ends.
James Lyne, global security advisor at Sophos, agreed, pointing to the specific threat of "metamorphic malicious code with a broad set of instructions".
"This will try to leverage the infrastructure to achieve its goal to reach a scary level of collateral damage to what we have today," Lyne said. "We are on a path to do that in the current world of legitimate tech for legitimate reasons, while still fighting over freedom and encryption. So we get to that technology level, and the opportunities it creates for attackers, long before we have got to protections."
Mikko Hypponen, chief research officer at F-Secure, went on to discuss the implications this will have for autonomous weapons - killer robots to you and me.
"[It's] clear that these autonomous robots will become a reality when you think it through," Hypponen said. "Look at drones, they may not be autonomous, they are probably operated by someone in Nevada to shoot someone in Syria, but the obvious weakness is the link from drone to human, as that link can be disrupted or cut or spied on.
"So removing that weakness is simple: make the drone smart enough to work without the human. It's going to happen and it is scary as hell."
Ferguson didn't just come armed with warnings: he urged the security and IT industry to "see that coming" and "make sure we build our own toolsets to harness that capability of AI and ML and set it loose as an ethical, autonomous hacker. This will operate at speed and scale unlike we have seen before, so we need to see this coming."
Ferguson said that it is the responsibility of the security industry to deploy these AI and ML tools as "a new iteration of chaos monkey" - essentially as a way of staying ahead of the criminals.
Moving on to the threat posed by IoT devices, Ferguson started out by stating that the manufacturers and the people "who are good at security" aren't in the same companies.
Both Ferguson and Hypponen suggested that the solution is a regulated standard for connected devices. Ferguson said the industry must work harder to establish minimum security standards covering encryption, standardised testing and an update strategy, all "from the ground up", and to be able to enforce them "across the board".
Hypponen added that manufacturers "don't need to be specific" about how they secure their devices.
"We just need to make them responsible for the damages from hacks," he said. "That will incentivise them to make the devices secure."
Autonomous vehicles and terrorists
Ferguson asked the audience: "What are terrorists' favourite weapons right now? Vehicles."
He then posed a scenario where a "whole fleet of cars on a common subsystem, with common vulnerabilities" provides a "distributed arsenal" to terrorists.
"If we don't concentrate and think of real world ramifications that's where we are going," he said. "So we have to have this conversation, as it is no longer about credit card details, more is at risk."