
Raja Mandala: Catching up on Information Statecraft

New Delhi needs to turn its attention in 2018 to creating significant domestic capabilities for information operations against threats at home and abroad.

Published by the Indian Express on December 25, 2017


Among the critical features of 2017 have been the rapid commercial diffusion of advanced technologies like artificial intelligence (AI) and the fear that these technologies might pose an unprecedented threat to the future of humanity as a species. Scientists and entrepreneurs such as Stephen Hawking and Elon Musk have demanded that the United Nations ban killer robots.

Others like Microsoft’s Satya Nadella have called for a new set of international norms — a cyber code of conduct — that will better protect individuals, companies and nations as technological transformation upends our world. While collective agreements among nations are far away, the technological advance is likely to be relentless in 2018 and beyond. The gap between the pace of technological change and the capacity of states to regulate it effectively within nations and between them has been part of our evolution. The digital revolution is no exception.

Even as calls for preventing the militarisation of AI get louder, governments have never stopped trying to find and exploit the strategic possibilities of new technologies. Russian President Vladimir Putin said earlier this year that “artificial intelligence is the future, not only for Russia, but for all humankind”. Underlining the opportunities and threats that AI presents, Putin added, “whoever becomes the leader in this sphere will become the ruler of the world”. Meanwhile, the inescapable fact is that the digital domain has already turned into a major political battlefield. While the problems associated with cyber threats to critical infrastructure have been debated in recent years, 2017 might be remembered for a new awareness of the possibilities for significant interference in the political affairs of other nations.

Consider, for example, the arguments on alleged Russian meddling in the US presidential elections at the end of 2016. Hillary Clinton, the losing candidate, and the Democratic Party that nominated her have insisted that Russia tilted the election in favour of Trump. That, in turn, has led to an investigation of the personal and business connections between Russia and President Donald Trump and his family.

Trump and Putin both dismissed these allegations as baseless. But the idea that you can use chatbots and algorithms to shape the politics of another country has certainly gained ground this year. The National Security Strategy (NSS) presented by the Trump Administration last week does not accuse Russia of interfering in the US elections. No surprise there. But it certainly underlines the potential threats from Russia’s information warfare. “Russia uses information operations as part of its offensive cyber efforts to influence public opinion across the globe. Its influence campaigns blend covert intelligence operations and false online personas with state-funded media, third-party intermediaries, and paid social media users,” the NSS says.

More broadly, the NSS argues that “America’s competitors weaponise information to attack the values and institutions that underpin free societies, while shielding themselves from outside information. They exploit marketing techniques to target individuals based upon their activities, interests, opinions, and values. They disseminate misinformation and propaganda”. Non-state actors too weaponise information. As the NSS puts it, “Jihadist terrorist groups continue to wage ideological information campaigns to establish and legitimise their narrative of hate, using sophisticated communications tools to attract recruits” and mount attacks. Even as the offensive use of the web has grown, some states are devising ways to limit and control their domestic audiences’ access to the internet.

China’s great internet wall is one such example. According to the NSS, China “combines data and the use of AI to rate the loyalty of its citizens to the state and uses these ratings to determine jobs and more.” Russia has been talking about building an alternative to today’s internet, which is centred in America.

Just a few years ago, it was widely assumed that the internet would favour Western democracies and undermine authoritarian regimes. Recall Hillary Clinton’s Internet Freedom Project announced in 2010, when she was the US Secretary of State. Today, Russia, China and some other countries continue to fear that the US will foment trouble within their societies. Moscow and Beijing have developed strong defensive capabilities against such intervention and demonstrated the capability for offensive operations in Western societies.

The NSS recognises the new dynamic and calls it “information statecraft”. Disinformation and deception to undermine the adversary’s court and society have long been part of statecraft. What has lent information statecraft a new edge today are the expansive reach of social media and the awesome capability to analyse big data.

As a diverse society and chaotic democracy, India is indeed very vulnerable to hostile information operations. The NDA government has devoted much energy to turning the Indian economy into a digital one. It has sought access to massive data on citizens for the declared purposes of better mobilising tax revenues and delivering services.

But there is no public evidence of a coherent strategy for the strategic use of information for internal and external security. Delhi needs to turn its attention in 2018 to creating significant domestic capabilities for information operations against threats at home and abroad. Unlike in many other countries, democratic India’s information statecraft must, however, be in full consonance with the rights of its citizens and subject to political oversight.

This article was originally published in the Indian Express.

Carnegie does not take institutional positions on public policy issues; the views represented herein are those of the author(s) and do not necessarily reflect the views of Carnegie, its staff, or its trustees.