Artificial intelligence (AI) and machine learning play an important role in helping law enforcement deal with increasing threats, but the resulting need for access to data is likely to further fuel concerns about privacy.
Closer collaboration between the private and public sectors, as well as with citizens, would also be essential, according to delegates at Interpol World 2017 in Singapore this week.
The rise of urbanisation, globalisation, and online connectivity had unleashed a tremendous amount of data that was never before available, said Anselm Lopez, director of the strategic relations directorate in the international cooperation and partnerships division at Singapore's Ministry of Home Affairs. He also is a member of Interpol's Asia executive committee.
Lopez noted that data had become a critical element in decision making for law enforcement, as it had for enterprises, and these agencies would have to adapt or be rendered irrelevant.
He said the data could be used and analysed to combat crime and threats such as terrorism and cybercrime, and to support incident response. Failure to do so efficiently, especially amid the deluge of data available, could lead to law enforcement missing critical details and making decisions that were not supported by sound analysis, possibly resulting in the loss of lives or other serious losses.
Law enforcement agencies then would need to determine how they could integrate data acquisition and analytics into daily operations to sharpen risk management, and deploy these capabilities in and around affected locations during large-scale crises so their internal systems could support decision making.
Adding that there was no cookie-cutter approach, Lopez said systems and methodologies would need to extract data and be able to distinguish innocuous events from real and serious threats. "We must acquire the ability to distill the noise and sharpen our focus," he said.
This further emphasised the importance of partnership between the private sector and law enforcement, which would ensure the necessary capabilities were developed "to fight the new order of threats".
Speed and efficiency, for one, would be crucial. Within hours of the Boston bomb attacks in April 2013, for instance, law enforcement had to process more than 2,300 videos, 9,600 calls, and 5,500 tips from the public. In the more recent Manchester bombing incident, some 30,000 man-hours were spent scrutinising videos for information.
The large volumes of data generated today placed an unacceptable burden on humans, including law enforcement agencies with finite resources, said Michael Hershman, group CEO of the International Centre for Sport Security (ICSS), who called for tools that could help ease the pressure.
A common thread in security incidents was the difficulties faced by law enforcement and security agencies in identifying acts ahead of time and preventing them, Hershman said. Investigations conducted after such incidents sometimes revealed that information available prior to the event would have at least prompted a closer look at the instigators, but failed to trigger alerts due to the inability to process data in a timely fashion.
While it would be impossible to prevent all acts of violence, he said technology could make a profound impact in helping to prevent a significant number.
ICSS was developing a "data fusion system (DFS)" that aimed to provide a predictive analysis platform to collect, integrate, and analyse data. This would be used to help emergency services and law enforcement agencies predict potential threats and facilitate higher security at events, such as the 2022 Fifa World Cup to be held in Qatar.
The platform, for instance, would be able to use behavioural analytics to assess social media narratives and identify individuals who were being radicalised. Such data could then be fed to operational command centres at event sites and matched against persons identified on-site.
"Technology that is used in the right way can play a pivotal role in protecting and securing large-scale major sporting events," he said. "There is now a clear need to help strengthen the smart data and security capabilities of major events and law enforcement agencies, as well as proactively enhance their situational awareness in the high-demanding security situations."
Darktrace's global CEO Nicole Eagan said there were early signs that threat actors, too, were starting to use AI and machine learning as part of their attacks. They presumably did so to stay undetected within a network so they could extract as much as possible from the organisation, such as intellectual property and product development information, said Eagan, in an interview with ZDNet on the sidelines of Interpol World.
She said it was crucial that companies, in turn, used AI to slow down attacks and provide time for security teams to rectify the breach before systems could be encrypted, for instance, in the case of ransomware attacks.
AI also could recommend appropriate actions to take so that inexperienced security professionals, such as fresh graduates, could be better supported in preventing breaches, she said, adding that this could help alleviate the global shortage of cybersecurity skills.
Increased levels of automation and autonomous systems would be necessary with the growing use of Internet of Things (IoT) devices and sensors. Pointing to Singapore's Smart Nation initiative, for instance, she said data sensors used in the country could be managed more efficiently and securely with automation.
Businesses, though, would need to think about the context and determine where and how to implement autonomous responses to prevent business interruptions alongside considerations for safety, she said.
Pointing to Darktrace's core principles, Eagan said the vendor likened security to the human immune system, which was able to combat various infections whilst having an innate sense of "what's self and not-self". Machine learning capabilities needed to emulate that concept, building behavioural patterns for each user and device and identifying what was not behaving normally, she noted.
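The "self and not-self" idea Eagan describes can be illustrated with a minimal anomaly-detection sketch: learn a statistical baseline of each device's normal behaviour, then flag observations that deviate sharply from that baseline. The feature (bytes sent per hour) and the z-score threshold below are illustrative assumptions for this sketch, not Darktrace's actual method.

```python
# Minimal sketch of baseline-and-deviation anomaly detection:
# a device's historical measurements define its "self"; anything
# far outside that learned range is treated as "not-self".
from statistics import mean, stdev

class DeviceBaseline:
    def __init__(self, history):
        # history: past observations for one device, e.g. MB sent per hour
        self.mu = mean(history)
        self.sigma = stdev(history)

    def is_anomalous(self, value, threshold=3.0):
        # Flag the observation if it sits more than `threshold`
        # standard deviations away from this device's own baseline.
        if self.sigma == 0:
            return value != self.mu
        z = abs(value - self.mu) / self.sigma
        return z > threshold

# Usage: a workstation normally sending ~100 MB/hour suddenly moves 5 GB.
baseline = DeviceBaseline([95, 102, 98, 110, 100, 97, 105])
print(baseline.is_anomalous(103))   # within the device's normal range
print(baseline.is_anomalous(5000))  # far outside it, flagged as "not-self"
```

The key design point matching Eagan's analogy is that the baseline is per-device: there is no global signature of "bad" traffic, only each entity's own learned notion of normal.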
Potential tensions between privacy and the need for data access
Hershman noted, though, that analytics required access to pertinent data. The more comprehensive and closer to real-time that data was, the more effective the insights, he said, adding that many had faced frustration over incomplete or unavailable data. This was due to several reasons, including protocols related to classified information, privacy laws, as well as inefficiencies and uncoordinated efforts.
A delegate then asked how this could affect tensions between privacy and security, especially as governments, in particular the US, were sometimes seen to overstep boundaries, while the European Union was especially sensitive about the need to protect data privacy.
Jamie Wylly, Microsoft's general manager of public safety and national security for the worldwide public sector, acknowledged it was a difficult balance to strike. He said the software vendor managed this by requiring court orders for access to its data.
"What the industry will push back on is broad sweeping [requests] for customer information," Wylly said, adding that vendors would want to comply with accurate and proper requests for data. "Customers want their privacy and we need to continue to not allow a broad sweep [for access]."
"Data sharing isn't a technical challenge. It's a policy challenge of organisations that want to work with each other," he said. "There's always a debate [between] cloud companies that hold customer data and the need to protect [that data], versus the government's need for data. The answer is a court order."
Hershman said this did not necessarily mean privacy and security were "incompatible", noting that losing either would not be in the best interests of anyone involved.
He also stressed the need to establish trust to drive collaboration between government, private companies, and civil society. This meant ensuring clear communication and transparency from law enforcement, which needed to better communicate its values and ethics.