
Response to Danielle Cave’s “Data Driven”

Correspondence
Friends, Allies and Enemies

This correspondence is featured in Australian Foreign Affairs 10: Friends, Allies and Enemies.

“Data Driven” by Danielle Cave

Lesley Seebeck

Danielle Cave’s article “Data Driven” (AFA9: Spy vs Spy) does an excellent job of setting out a number of the challenges facing the collectors of intelligence in a technology-imbued, interconnected world. But technology has had perverse and insidious effects on the intelligence process even beyond the issues Cave describes.

Intelligence has faced many challenges over the past thirty years. In 1994, Joseph Nye wrote that, with the end of the Cold War, intelligence shifted from secrets to mysteries. It’s one thing to pursue missile counts and sensor ranges – secrets – but entirely another to ascertain whether Boris Yeltsin can control inflation.

The 9/11 attacks on the United States in 2001 altered the frame again. Western intelligence agencies reeled from apparent failures in collection, analysis and coordination, exacerbated by the subsequent revelation that no weapons of mass destruction were found in Iraq. Their focus had to shift from established nation-states to an amorphous group of low-tech fighters in broken states. And they had to provide evidence, not simply reasoning born of partial information and intuition.

Technology came to the fore in intelligence agencies. Mass information collection – sourcing reams of data – was less risky than the messy, difficult business of recruiting agents in unfamiliar environments. Data was traceable and verifiable – useful in legal prosecution – and it could be endlessly reanalysed. Combined with immense computational capability, it rendered possible the tedious task of finding tiny needles in huge haystacks, allowing America to exploit its position at the centre of the global internet and communications network.

As the threat from within – lone wolves, terrorists and hackers – came to dominate government attention, US internal security and law-enforcement agencies started to develop and borrow technologies and capabilities that were once the preserve of outward-looking foreign and defence intelligence agencies, turning them on their own populations.

The dangers of this technology-driven approach are manifold.

First, technology has encouraged notions of absolutism. As Cave notes, in the modern digital society little is exempt from mass collection. Both decision-makers and the public have come to expect immediacy and certainty. A lack of awareness of threats is considered unacceptable and, after an incident or an attack, unforgivable, driving agencies to seek out ever more intrusive surveillance in pursuit of exceptionally low signals in high levels of noise.

Second, mass collection reinforces the idea that more data is better. That’s a trap: beyond a certain point, more data is simply more noise, with less clarity and more opportunities for misinterpretation, as American mathematician and cryptographer Claude E. Shannon noted in 1948. The answers to secrets and mysteries do not leap, fully formed, from the raw data; typically, a framing question or concept is required. It’s not more data we need, but more seasoned analysts armed with sharper questions to draw out key judgements to inform policy. As US intelligence analyst Zachery Tyson Brown observed in February 2020, “consumers of intelligence are drowning in data, but thirsting for insight”.

Third, digital technology, especially its reach and speed, disrupts the traditional roles in the intelligence process. The precarious stability of the Cold War refined the intelligence cycle: establish requirements in response to policy questions, undertake collection, analyse the take and disseminate assessments to policymakers. Now, policymakers reach straight to collectors and form their own judgements, often without the context provided by separate intelligence analysis.

Fourth, digital technologies create unhelpful intelligence rivalries. Not only do assessment agencies find themselves bit players, but the community as a whole is competing with wealthier actors with greater capabilities and reach. Large tech platforms such as Google, Facebook and Twitter engage in the same process of mass collection, analysis and sharing, crowdsourcing insights and tailoring responses to consumer preferences. Decision-makers are inundated with social media opinion and online commentary, which are often more immediately accessible than intelligence. Algorithms, fake news and disinformation, which also shape the operating environment, if not the actual decisions, of policymakers, can make it harder for assessment agencies and their careful, objective reasoning to be heard.

Last, Cave’s article raises questions about the purpose of intelligence. After all, if all queries can be googled or crowdsourced – or information corrupted – then what is the value of intelligence? And if statecraft rests on knowledgeable, objective insight and judgement about the external environment based on national interests, how can this be attained in a technologically driven world?

There are no easy solutions here. But informed intelligence is needed more than ever in a disruptive and increasingly contested global environment. Asking the right questions and examining the influence that technology is having on our perceptions of information-gathering and on our intelligence agencies, as Cave has begun to do, is a promising start.

Professor Lesley Seebeck is the CEO of the Cyber Institute at the Australian National University.
