By Yasar Jarrar
Innovation has swept through government over the past decade. Many government entities followed tried-and-tested private sector frameworks, and many good outcomes were delivered in terms of better public policies and improved government services. However, the bulk of these innovations were driven from the inside out, designed and delivered from within. In the meantime, a more interesting form of innovation was coming from outside government: from the private sector, civil society and individual citizens. Empowered with ever more data, they started thinking truly out of the box and offering various “government hacks”.
Today, fast-moving and evolving trends in digital technologies are leading to a radical change in citizen expectations. Citizens are changing their approach to interacting with, and relating to, governmental organisations and services. The nature of these evolving interactions is horizontal, empowering and spontaneous – in many ways, the exact opposite of the hierarchical, bureaucratic and rules-based systems that governments have developed over the decades. Central to this new form of interaction is data: up-to-date, reliable, user-friendly and open data.
This need for data is quickly becoming a central theme that applies to all aspects of our evolving digital society. A case in point is the field of artificial intelligence, which promises to revolutionise society (governments included). Companies such as Google, Facebook and Microsoft are using AI-related techniques to train computers to recognise objects in photos and understand human language. Training computers to perform these difficult feats is possible only because the enormous quantities of data required are now available. The same applies to all forms of machine learning, smart manufacturing and every other tech-driven trend shaping the future. They are all reliant on data and are only as good as the data they crunch. In this context, data has been described as the “new oil”.
Governing in the age of data
The rapid pace of technological evolution over the past decades gave us new business models (at the centre of which is e-commerce) and an unprecedented level of global connectivity (accelerated by the smartphone phenomenon). These developments created enormous volumes of data, which led to the rapid rise of the “data field”. What was once the domain of intelligence agencies, market research professionals and some technical statisticians is now going mainstream.
The new connected world of today is producing data at a pace that is unprecedented in human history. It is estimated that today more than 3 billion people are connected to the internet (compared to only 2.3 million people in 1990). These 3 billion people are producing data every second of their digital lives. This has led to the rise of big data, commonly defined using the four Vs: volume, variety (of sources), velocity (effectively around the clock) and veracity (given abundance, quality assurance becomes key).
If used effectively, big data can be a powerful tool. Various researchers have found a strong link between an effective data management strategy and the financial performance of companies, as it helps them get to market faster with products and services that are better aligned with customer needs. It has the same performance-enhancing potential for the public sector, in terms of better policies, more tailored government services, and more effective and efficient distribution of resources. It can also lead to negative outcomes if used incorrectly, in addition to raising the much-discussed issue of privacy.
Effectively managing big data is now possible thanks to hardware and software developments, at the centre of which is the exponential growth in storage capacity. Today, a hard disk with one terabyte of storage capacity costs about $50 (roughly the entire global storage capacity of only four decades ago). It is because of this storage power that many entities are following a “collect now, sort out later” approach to data. The low cost of storage and better methods of analysis mean that you generally do not need to have a specific purpose for the data in mind before you collect it. This means big data will only get bigger, and – per IBM’s Watson data-crunching team – the value of this data will rise with every advance in AI.
Operating models in the data age
Today, a large majority of the world’s data is in the hands of the private sector (such as IT, telco and retail firms). Some, like Google and Facebook, have managed to monetise this data and made it central to their business models. Others, including Uber and Airbnb, used data to develop platform models that disrupted their industries. So far, people have been willingly offering their data for free in exchange for access to technology services (e.g. email). But this will not remain the case for long. Business models are being developed to find ways and means of paying people for the data they generate in their daily lives. An exciting, and largely unregulated, sector is emerging.
The remainder of the world’s data sits in government hands, mostly stored on paper or in legacy systems. To maximise the societal benefits of the data age, a movement has emerged promoting open data. Government data is all the data and information that government entities produce or collect; making it open means publishing and sharing it so that anyone with internet access can readily and easily consult and re-use it, with no fees or technological barriers.
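What “readily re-usable” means in practice is data published in simple, machine-readable formats. As a minimal illustrative sketch (the dataset and field names here are hypothetical, not from any real open data portal), a CSV extract of the kind governments publish can be consumed with nothing more than a standard library:

```python
import csv
import io

# A hypothetical open-data extract: plain machine-readable CSV,
# accessible with no fees, accounts or proprietary software.
OPEN_DATA_CSV = """district,population,clinics
North,120000,4
South,95000,3
East,143000,5
"""

def load_open_dataset(text: str) -> list:
    """Parse a CSV export into records any citizen or developer can re-use."""
    return [dict(row) for row in csv.DictReader(io.StringIO(text))]

records = load_open_dataset(OPEN_DATA_CSV)

# Anyone can now build on the data, e.g. derive clinics per capita by district.
clinics_per_capita = {
    r["district"]: int(r["clinics"]) / int(r["population"]) for r in records
}
print(records[0])  # {'district': 'North', 'population': '120000', 'clinics': '4'}
```

The point of the sketch is the low barrier to entry: once data is published in such a format, re-use requires no special tools or permissions.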
Data is increasingly becoming a source of wealth and public value creation. In that context, one can argue it is more valuable than just being “the new oil”. It is the lifeline of the digital society. A business running without accurate data is running blind, and this is even more relevant in the public sector (especially given the growing scarcity of public funds).
However, there are big questions that are yet to be answered in the data age. Who owns the data, and who should own it (given its centrality in our digital society of the future)? Should there be a basic data charter for citizens so they understand their rights and responsibilities? Who is responsible for our data quality and security? How do we manage and ensure privacy? And will people continue to accept generating data without being compensated for sharing it?
From e-government to government as data curator
Commercial decisions, innovations, public policies and all choices based on big and/or open data are only as good as the quality of the data they use. The data needs to be vetted, kept up to date and usable, and protected. This cannot always be done at the source, given the variety of data sources and the challenge of assuring veracity. Societies will look more and more to their governments to play that crucial role.
Over the decades, governments have always had a technology arm. We moved from the first generation (web 1.0) e-government to web 2.0, which gave us richer, immersive web-based services with online applications. Now, we are looking at government 3.0. But rather than being represented by a technology or toolset, it is a shift in culture that views government as a platform for enabling the creation of public value. Data is at the heart of this platform.
Data is indeed the new oil, and it has the same economic and social transformative potential. If “crude” data can be extracted, refined and piped to where it can impact decisions in real time, its value will soar. If data can be properly shared across countries and societies and made accessible in the places where analytics are most useful, then it will become a true game changer, altering the way we live and work. For that to happen, governments need to design, refine and master a new set of capabilities, regulations and shape a new culture. Nothing less than a new ecosystem will do in this case.
Most of this data currently remains locked up and proprietary (the private property of companies, governments and other organisations). This severely limits its public value. Data is now a new social good, and governments will need to think of some form of data responsibility legislation that guides the private sector and other data owners on their duties in the data age: the duty to collect, manage and share in a timely manner, as well as the duty to protect. This legislation is needed over and above a government’s own open and big data management systems, and will need to cover all data stakeholders (irrespective of ownership or other governing rules).
Once a clear legal framework is in place, governments need to develop, and quickly master, a new core capability: data curation. The challenge for governments today is that the core skills and systems needed in the data age are far removed from current government systems and regulations. Despite years of political attention, and billions invested, most governments around the world still struggle with legacy databases that are incompatible with each other and work against any kind of data-sharing or data-driven design. Laws and regulations are still in their infancy and struggling to cope with the pace of change. More importantly, the talent needed to manage this new capability is not typically attracted to public service and is in high demand in the private sector.
Government organisations need to design advanced processes for data management. They should be able to capture and process overwhelming amounts of data and store it in a way that preserves its context (contextual factors are critical, as big data stripped of its context can distort the decisions based on it). Governments also need robust processes to assure data quality: the value of data for decision-making is jeopardised if the data is not accurate or timely.
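The quality-assurance processes described above can be sketched in code. As a minimal illustration (the record schema, field names and thresholds here are all hypothetical), a curation pipeline might check each incoming record for completeness and timeliness before accepting it:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical curation rules: which fields a record must carry,
# and how old a record may be before it is flagged as stale.
REQUIRED_FIELDS = {"id", "value", "collected_at", "source"}
MAX_AGE = timedelta(days=30)

def validate_record(record: dict, now: datetime) -> list:
    """Return a list of data-quality problems; an empty list means the record passes."""
    problems = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        problems.append("missing fields: " + ", ".join(sorted(missing)))
    collected = record.get("collected_at")
    if isinstance(collected, datetime) and now - collected > MAX_AGE:
        problems.append("stale: older than the timeliness threshold")
    return problems

now = datetime(2024, 1, 31, tzinfo=timezone.utc)
good = {"id": 1, "value": 3.2, "source": "census",
        "collected_at": datetime(2024, 1, 20, tzinfo=timezone.utc)}
stale = {"id": 2, "value": 1.1, "source": "survey",
         "collected_at": datetime(2023, 1, 1, tzinfo=timezone.utc)}

print(validate_record(good, now))   # []
print(validate_record(stale, now))  # ['stale: older than the timeliness threshold']
```

Real curation pipelines would add many more checks (provenance, consistency across sources, schema versioning), but the pattern is the same: validate at ingestion, and record why a record was rejected rather than silently dropping it.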
To enable such processes, governments must review a vast number of laws and regulations, from harmonising and enforcing privacy rules and protecting against data breaches to ensuring net neutrality and the free flow of data. Today’s debates over the future of big data are based on the assumption that the internet will remain a series of open networks through which data easily flows. Some countries have begun to harden their internet systems, and the future of net neutrality is uncertain. If the internet becomes a network of closed networks, the full potential of big data may not be realised.
Governments must also improve their citizen engagement capabilities so they can effectively and actively engage with both providers and users of data. This requires governments to create a culture of open data – something governments are starting to do with varying degrees of success. This level of citizen engagement goes beyond the typical government communications function; it requires a more open, horizontal and fast-paced government-to-citizen (G2C) platform.
Finally, and probably most critically, is the need to attract and retain the talent needed for the data age. Two decades ago, a statistician did not have many job prospects around the world. Today, the same skill set (rebranded as “data scientist”) is probably the hottest on the job market. IT firms (from start-ups to global leaders), financial services, retailers, defence companies and governments are all competing to recruit such talent. Those who survive and thrive in the age of the Fourth Industrial Revolution will be the organisations that can attract, retain and continually develop those skills and capabilities.
Yasar Jarrar is a Professor of Business and Global Society at Hult International Business School.