Banking on Data: the World’s First-Ever Common Currency


By Hamilton Mann

There is no shortage of descriptors when it comes to conveying the considerable importance of data in our societies. Some refer to it as the new black gold, a comparison that is partly apt but not entirely accurate. Just as oil is vital for energy, data has become indispensable to the functioning of our digital and artificially intelligent economy. But unlike oil, which diminishes as it is used, data can be utilized and shared infinitely.

As odd as it may seem, at the dawn of the 21st century the world is undergoing one of its greatest societal transformations since the invention of currency, yet the shift is not regarded with the same level of significance.
Data is the world’s first-ever common currency. And like money, it plays and will play a fundamental role in the economy and society.

Data is a unit of measurement

As money serves as a standard of value, data serves as a unit of measurement for insights and business performance. As soon as companies began using databases to track their operations in the 60s and 70s, data became a unit of measurement. With the development of analytical tools in the 80s and 90s, companies began measuring their performance through data in far more sophisticated ways.

The ‘total quality management’ movement of the 80s required intensive use of data. Simultaneously, the development of systems such as enterprise resource planning (ERP) software enabled companies to track and measure aspects of their operations in unprecedented ways. Data now also unlocks new opportunities in capital funding, underscoring its transformative role as a pivotal asset in modern finance.
The most striking modern example is probably the rise of Silicon Valley big-tech companies.

These companies built empires by measuring and analyzing user behavior on a scale never seen before, making data not only a unit of measurement but also the very foundation of their business model.

Data is a medium of exchange

As currency facilitates transactions, data allows businesses to better understand their customers and tailor their offerings. It is exchanged between entities for various services, such as personalizing advertising. The concept of data as a medium of exchange dates back to the advent of the first computer systems, but its widespread adoption and recognition truly took off with the emergence of the Internet and, more recently, the rise of e-commerce and online services in the 90s and 2000s. As more and more businesses began offering online services, they realized that the data generated by users was valuable for improving their services, creating new products, or selling it to third parties.

A prime example of this transformation is the ascent of the online search economy. Each search a user performs reveals information about their interests, behaviors, and desires, and search providers derive massive revenue from targeted advertising built on this data. Data has thus become a form of currency with which users “pay” for services.

Data is a store of value

As money retains its value, relevant and well-preserved data can offer long-term strategic benefits to a company, even years after its collection. Companies quickly understood that the information they collected about their users was valuable in and of itself, not only for improving their services but also as a source of revenue.

Customer data aids in understanding buying patterns, preferences, and habits to recommend products, leading to increased sales. Moreover, just as money acts as a reserve of value, safeguarding wealth for future investments, data too holds intrinsic worth, anchoring the potential for innovation.

Without this reservoir of data, pioneering breakthroughs in AI technologies—enabling the development of systems from autonomous vehicles to smart healthcare diagnostics and real-time language translation—would remain beyond our grasp.
The recognition that data can be used as a store of value was a turning point, leading to the era of the so-called “Big Data” where companies of all sizes and from all sectors seek to capture, store, and analyze data in hopes of deriving future value from it.

Data is a representation of sovereignty

Owning and controlling one’s own data has become a vital component of digital sovereignty, just as having one’s own currency is a symbol of national sovereignty.
As nations have become aware of the strategic implications of data concerning its storage, cross-border transfer, and access by foreign governments, it has become integral to national sovereignty.

China is perhaps the most emblematic example of data as a representation of sovereignty. With the adoption of its cybersecurity law in 2017, China implemented strict data localization rules, demanding that “personal information and critical data” collected by critical information infrastructure operators be stored within its borders.
Many other countries, from Russia to India, have since adopted similar rules, underscoring how possession, control, and access to data have become central in contemporary notions of national sovereignty.

Data is an economic policy instrument

As currency is regulated to influence the economy, data is used by governments and businesses to inform their decisions and strategies. Particularly with the rise of tech giants, governments quickly grasped the strategic importance of data for economic development, competition, and regulation.

With the introduction of the General Data Protection Regulation (GDPR) in 2018, the EU established strict rules on data collection, storage, and sharing, thereby recognizing not only its economic value but also its importance in terms of human rights and individual freedoms.

Discussions about competition, data monopolization, and the impact of tech giants on the digital economy are now at the heart of political and economic debates.
The use of data as an economic policy tool is also evident in the regulation of artificial intelligence, digital privacy standards, and antitrust measures against data monopolies.

Data is an element of credit facilitation

Currency allows for the granting of credit. Similarly, quality data can open opportunities for partnerships and funding for businesses. Data became a credit facilitation tool with the rise of financial technologies, or “fintech”, in the 2010s. The surge of peer-to-peer lending platforms and fintech companies that use advanced algorithms to assess creditworthiness based on a variety of data – from financial histories to online shopping habits – was the harbinger of this transformation.

China’s Ant Financial, the owner of Alipay, stands as an iconic example of this shift. With its “Zhima Credit” product (also known as “Sesame Credit”), Ant Financial offers a credit scoring system based on data analysis sourced from user activities on Alibaba Group’s platforms and other sources. This score can then be used to secure loans, rent apartments, and even for certain government services.

The use of data in this manner has revolutionized access to credit, particularly for individuals and small businesses who previously struggled to obtain loans due to a lack of traditional credit history.

Data is a foundation of the tax system

While currency is essential for tax collection, data is increasingly used to monitor tax compliance and prevent fraud. Data became foundational to the tax system as governments began using digital technology to collect, process, and analyze tax information. This shift also gained momentum in the early 2000s, with the increasing digitalization of public services.

The adoption of online platforms by tax administrations for tax declaration and payment was a turning point. The Internal Revenue Service in the United States serves as an example. Another example is India’s introduction of the Goods and Services Tax in 2017. In France, the implementation of at-source income tax withholding in 2019 also stands as a symbolic representation of the use of data in the French tax system.
These developments signify how data has become crucial to modernize and streamline tax systems globally.

Data is foundational to trust and stability

Proper data management strengthens the trust of customers, partners, and investors, just as a stable currency bolsters confidence in the economy. Data became a key element of trust and stability with the advent of the digital revolution, especially with the development of blockchain technologies in the 2010s.

Bitcoin, created in 2009, is arguably the most prominent example as a decentralized currency where trust is established not by a central financial institution, but by network consensus. The value and stability of Bitcoin rest on the transparency and immutability of transaction data recorded in the blockchain. Thus, data, when processed and stored in a transparent and secure manner, can serve as the foundation for trust and stability in a decentralized system. More broadly, data holds the potential to create trust in various fields, from smart contracts to online voting systems and many other applications.

Data is a facilitator of international trade

Much like currency facilitates international trade, data plays a growing role in global commerce, with the transfer of data between countries becoming a key element of trade agreements. Integrated supply chain management systems, e-commerce platforms, and online payment solutions are among the major innovations that have helped facilitate international trade.

The rise of dominant global e-commerce marketplaces is another prime example of how data has propelled international trade. Spanning multiple continents, they leverage user data to recommend products, predict demand, set pricing strategies, and optimize logistics. Sellers from different corners of the globe utilize these data-driven insights to forecast product demand, manage inventory, and target customers. Through comprehensive logistics and fulfillment services, these marketplaces use data analytics to streamline international shipping, customs, and storage processes, making it easier than ever for sellers to reach global audiences.

This underscores the indispensable role of data in simplifying cross-border transactions, predicting global market trends, and democratizing access to international markets for businesses of all scales.

Data is a vector for regulating liquidity

As monetary policy regulates the amount of currency, regulations on data determine how it can be stored, shared, and utilized. The rapid expansion of digital financial markets has enabled the use of real-time data to analyze and predict market movements, as well as to automatically regulate liquidity.

Investment banks and hedge funds were among the first to adopt high-frequency trading, using algorithms to execute orders at a speed and frequency beyond any human trader. The “Flash Crash” of May 6, 2010, is a notable example of the consequences of intensive data use in regulating liquidity.
While this event highlighted the risks associated with an excessive reliance on algorithms and data for liquidity regulation, it also underscored the critical importance of data in the modern functioning of financial markets.

Overall, data has emerged as a pivotal factor driving global economic structures, paralleling the influence once held exclusively by traditional currency.

This underscores data’s central role in a multitude of sectors, from economic policymaking to international trade. Drawing on its historical trajectory and expansive influence, it becomes evident that our current understanding of data’s value is only scratching the surface.

As we acknowledge the transformative power of data, it’s crucial to offer recommendations to harness its potential responsibly, ensuring a sustainable and equitable global data economy.

Let’s delve into strategic insights to bank on this newfound currency of the digital age:

Building central data backbones for a modern data economy

Central banks, such as the European Central Bank or the US Federal Reserve, play a major role in regulating and stabilizing currency. There is no equivalent entity to regulate data on such a scale. Today, just as there are central banks for currency, central data banks are necessary.

Currently, vast amounts of data are held by a few tech giants. A central data bank could help decentralize the ownership of these data, thus reducing the power and control concentrated in a few hands. A central data bank could ensure equitable access to information, preventing certain businesses or entities from monopolizing data for profit.

The central data bank would be responsible for overseeing institutions that hold, process, and exchange data, just as central banks supervise financial institutions. It would establish standards for data protection, their ethical use, and would ensure compliance with these standards through audits and inspections.

Determining the rate at which data should be universally accessible

Inspired by the interest rate benchmarks used by central banks in the financial world, the benchmark access rate to data (BARD) would serve as a regulatory mechanism to control access to data stored in a central data bank. This rate would represent a measure of the ease (or difficulty) with which external entities can access this data. The lower the BARD, the more affordable it would be for entities to view or use the data stored in the central bank. Conversely, a high BARD would mean that access to the data is more restricted and costly.

It would be a strategic tool for promoting Research and Innovation: when the bank wishes to stimulate research, innovation, or competitiveness, it could lower the BARD. This would allow researchers, startups, and companies to take advantage of the available data, thereby fostering technological and economic development.
The establishment of the BARD would be the responsibility of a regulatory authority, likely a governmental entity or an independent body mandated for this function.
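To make the mechanism concrete, here is a minimal sketch of how a BARD could translate into an access fee, assuming a simple linear pricing rule; the function name, parameters, and formula are illustrative assumptions rather than any existing system.

```python
# Hypothetical sketch: how a benchmark access rate to data (BARD) might
# translate into an access fee. All names and the pricing formula are
# illustrative assumptions, not part of any existing system.

def access_fee(base_value: float, bard: float, volume_gb: float) -> float:
    """Fee for accessing a dataset: its base value, scaled by the
    prevailing BARD and the volume requested."""
    return base_value * bard * volume_gb

# Lowering the BARD (e.g. to stimulate research) makes the same
# dataset cheaper to access.
print(access_fee(base_value=10.0, bard=0.05, volume_gb=100))  # 50.0
print(access_fee(base_value=10.0, bard=0.02, volume_gb=100))
```

Under such a rule, the regulatory authority would steer demand simply by publishing a new rate, exactly as a central bank signals policy through its benchmark interest rate.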

Balancing concerns about data privacy with contingency planning for data security

Drawing inspiration from the mandatory reserves imposed on banks by monetary authorities, Mandatory Data Reserves (MDR) would refer to a minimum portion of data that businesses and institutions would be required to store within a central data bank. This mechanism would aim to ensure the security, transparency, and regulation of data flow.

Just as banks are required to hold a fraction of their deposits in reserve, entities that collect, process, and store data would be obliged to deposit a certain proportion of these data in the central data bank.

The amount of data to be kept could be defined in terms of a percentage of the entity’s total storage capacity or the total volume of data processed.
These deposited data would remain the property of the originating entity but would be stored securely and centrally for various reasons, including regulation, oversight, and security. Storing data in a central reserve would promote greater transparency and enhanced accountability for entities.
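A minimal sketch of the reserve calculation, assuming the MDR is defined as a percentage of the total volume of data an entity processes; the 10% ratio and function names are illustrative assumptions.

```python
# Hypothetical sketch of a Mandatory Data Reserve (MDR) check, assuming
# the reserve is a fixed fraction of an entity's total data volume.
# The 10% default ratio and all names are illustrative assumptions.

def required_reserve_gb(total_volume_gb: float, reserve_ratio: float = 0.10) -> float:
    """Volume of data an entity must deposit with the central data bank."""
    return total_volume_gb * reserve_ratio

def is_compliant(deposited_gb: float, total_volume_gb: float,
                 reserve_ratio: float = 0.10) -> bool:
    """True if the entity has deposited at least its required reserve."""
    return deposited_gb >= required_reserve_gb(total_volume_gb, reserve_ratio)

print(required_reserve_gb(5_000))  # required reserve for 5,000 GB processed
print(is_compliant(450, 5_000))    # False: below the 500 GB requirement
```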

Navigating the fine line between data accessibility and data exploitation

Similar to the open market operations used by central banks to regulate the money supply, Open Data Market Operations (ODMO) would refer to the transactions initiated by the central data bank on an open data market. The goal would be to regulate the quantity, quality, and availability of data in the digital economy.

ODMO would allow the central data bank to actively intervene in a data market, where datasets are exchanged. This intervention could take the form of purchases to inject data into the market or sales to withdraw data from the market or generate revenue. The price of these datasets would be determined by demand and supply in the market, just like securities in financial markets.

By purchasing high-value or rare datasets, the central data bank could make them available to researchers, innovators, and decision-makers, thereby promoting innovation and informed decision-making.

Ensuring individuals are fairly valued and compensated for their data

Every citizen could have a personal data account with the central data bank where they can voluntarily deposit some of their data. These accounts would be protected and secure, offering citizens complete control over who can access their data and under what conditions. Access to certain data could be subject to a remuneration system for the data owners. Companies, researchers, or other entities wishing to access specific data might pay fees. A portion of these fees could be redistributed to the citizens whose data are used. This remuneration would be proportional to the use and value of the data in question.

The central data bank could establish a mechanism to assess the value of different types of data based on their rarity, utility, etc. Citizens could then have an idea of the monetary value of their data, encouraging them to knowingly share more valuable or rare information. At the end of each period (month, quarter, year), the central data bank could redistribute a portion of its profits to citizens in the form of a “data dividend”. This dividend would be a recognition of the collective value of the data provided by the citizens and would be distributed based on each individual’s contribution.
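The proportional redistribution described above can be sketched in a few lines, assuming each citizen's contribution has already been assessed a monetary value; the figures and names are illustrative assumptions.

```python
# Hypothetical sketch of a "data dividend": a central data bank splits a
# share of its profits across citizens in proportion to the assessed
# value of the data each contributed. All names and figures are
# illustrative assumptions.

def data_dividends(profit_pool: float,
                   contributions: dict[str, float]) -> dict[str, float]:
    """Split profit_pool proportionally to each citizen's assessed
    data-contribution value."""
    total = sum(contributions.values())
    if total == 0:
        return {citizen: 0.0 for citizen in contributions}
    return {citizen: profit_pool * value / total
            for citizen, value in contributions.items()}

payouts = data_dividends(1_000.0, {"alice": 30.0, "bob": 70.0})
print(payouts)  # {'alice': 300.0, 'bob': 700.0}
```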

Lending data responsibly

The concept of “Data Lending Facilities”, inspired by the lending facilities that central banks provide to financial institutions, would enable the provision of data for specific uses over a defined period, grounded in the idea that data can be treated as an asset, akin to money.

In the modern data-driven economy, not all institutions necessarily have the resources to collect, process, and store vast data sets. However, they might need this data for specific projects, studies, or innovations. Rather than forcing them to purchase or access these data on a permanent basis, a lending facility would allow them to borrow this data for a limited duration.

This access would often be limited to a specific platform to ensure security and monitor usage. This could be useful for institutions that need specific data for a temporary research project but don’t necessarily require permanent access to such data.
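A time-limited loan of this kind can be sketched as a record with a fixed term after which access lapses; the class, fields, and dataset identifier are illustrative assumptions.

```python
# Hypothetical sketch of a "data lending facility": a dataset is lent to
# an institution for a fixed term, after which access lapses. The class,
# field names, and example identifiers are illustrative assumptions.

from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class DataLoan:
    dataset_id: str
    borrower: str
    start: date
    term_days: int

    def is_active(self, on: date) -> bool:
        """Access is granted only within the agreed lending window."""
        return self.start <= on < self.start + timedelta(days=self.term_days)

loan = DataLoan("example-dataset", "research-lab", date(2024, 1, 1), term_days=90)
print(loan.is_active(date(2024, 2, 15)))  # True: within the 90-day term
print(loan.is_active(date(2024, 6, 1)))   # False: the loan has expired
```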

Standardizing the relative value of different data sets

Just as currencies have relative values to each other in the foreign exchange market, data could also be valued and traded based on certain criteria. This would introduce a form of standardization and regulation in data trading. Several factors could determine the value of data, such as its relevance, timeliness, rarity, specificity, quality, etc.

Specialized institutions or departments within the central data bank might be responsible for the regular evaluation of data sets. A centralized platform could be established where entities can offer their data sets for exchange, similar to a stock exchange.

Just like with currencies, the value of data would fluctuate based on supply and demand. Rare but highly demanded data sets could have a high exchange rate.
Such a system could introduce a form of standardization in how data is valued and traded.
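One simple way to standardize such a valuation is a weighted score over the criteria the text mentions; the weights and scale below are illustrative assumptions that a real evaluator would calibrate empirically.

```python
# Hypothetical sketch of a standardized dataset valuation: a weighted
# score over the criteria named in the text (relevance, timeliness,
# rarity, specificity, quality). Weights and the [0, 1] scale are
# illustrative assumptions.

WEIGHTS = {"relevance": 0.30, "timeliness": 0.20, "rarity": 0.25,
           "specificity": 0.10, "quality": 0.15}

def dataset_value(scores: dict[str, float]) -> float:
    """Weighted average of per-criterion scores, each in [0, 1];
    missing criteria count as zero."""
    return sum(WEIGHTS[c] * scores.get(c, 0.0) for c in WEIGHTS)

print(dataset_value({"relevance": 0.9, "timeliness": 0.5,
                     "rarity": 0.8, "specificity": 0.6, "quality": 0.7}))
```

On a centralized exchange, such scores could serve as a reference price around which supply and demand fluctuate, much like a security's book value versus its market price.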

Covering the intangible risks of data breaches

In many countries, citizens’ bank deposits are insured up to a certain amount, yet there is no equivalent to “insure” personal data in the event of a breach or loss. A model of data deposit insurance has thus become crucial, and Data Deposit Guarantee Funds (DDGF) could be considered. Just as banks contribute to a deposit guarantee fund to protect customers’ money in case of a bank failure, companies that store and process data could be required to contribute to a similar fund for data. In case of a data breach or loss, this fund could be used to compensate the affected individuals, whether through financial compensation or services.

Moreover, similar to bank deposit insurance that covers up to a certain amount per depositor, data deposit insurance could guarantee the security of the data up to a certain “quantity” or “value”. If someone loses data due to a breach, a predefined set of this data (for example, the most sensitive data) would be guaranteed or compensated.
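The capped-coverage idea mirrors bank deposit insurance directly and can be sketched in one function; the cap amount and names are illustrative assumptions.

```python
# Hypothetical sketch of a Data Deposit Guarantee Fund (DDGF) payout:
# compensation for a breach is covered only up to a per-depositor cap,
# mirroring bank deposit insurance. The cap and names are illustrative
# assumptions.

def ddgf_payout(assessed_loss: float, coverage_cap: float = 10_000.0) -> float:
    """Compensation owed to an affected individual, capped at coverage_cap."""
    return min(assessed_loss, coverage_cap)

print(ddgf_payout(4_200.0))   # 4200.0: fully covered, below the cap
print(ddgf_payout(25_000.0))  # 10000.0: capped at the coverage limit
```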

Guaranteeing that human rights take precedence over the surge in data collection

For many, the current rules and related sanction mechanisms for personal data protection violations don’t seem to fully reflect the significance and sensitivity of personal data. In the financial sector, sanctions are designed to be as preventive as they are punitive. They are calculated to have a major financial impact on offenders, while also deterring them from repeating their wrongdoings. Financial institutions can lose their ability to operate, which is a grave consequence.
A similar measure in the tech world could involve the suspension of certain activities, or even the shutdown of parts of a service.

Furthermore, citizens should be better informed about their data rights and how their data are used. Strengthening individuals’ rights to request the deletion of their data could limit companies’ abilities to indefinitely store information without valid reason. This should involve providing clear information to every data owner about all users of their data.

And just as with international financial standards, there could be a benefit to having global standards for data protection and sanctions, thus avoiding “data havens” where companies might try to relocate to escape regulation. Close collaboration between countries would be essential to ensure the effectiveness of sanctions and prevent companies from merely shifting their operations.

Curbing the negative impact of data speculation in the market

Speculation is a well-known concept in the financial world, where players buy and sell assets hoping to realize future profits. While “data speculation” isn’t a commonly used term, the idea captures the essence of a growing phenomenon where data is collected, stored, and traded with the aim of profiting from its future use.

Companies might collect data without an immediate or specific use in mind, hoping that it might be useful or profitable in the future. This is particularly true for tech companies that have the capabilities to store vast amounts of data. Furthermore, just as excessive speculation can create financial bubbles, a “data bubble” might emerge, where the perceived value of the data far surpasses its actual utility.

In the same way that certain financial mechanisms impose limits on speculation, caps could be implemented to restrict the amount of data a company can collect without justification. Just as financial transactions can be taxed to discourage speculation, a tax on the collection, storage, or trade of data could be considered. Companies might be required to disclose the nature, quantity, and usage of the data they collect, thus allowing regulators and the public to monitor speculation.

Ensuring transparent reporting without hindering data-driven industries

The reporting obligation for financial institutions regarding suspicious activities aims to combat money laundering, terrorist financing, and other illicit activities. In the world of data, the notion of “suspicious data” is different, but the underlying principle – accountability and transparency – remains. This might include unauthorized access to databases, accidental exposures or data theft, unusual data access patterns, unexpected requests for large amounts of data, or data transfers to unknown destinations that might be deemed suspicious.

Regulations concerning reporting obligations vary considerably from one country to another. This can create confusion for international companies and allow some to avoid reporting by exploiting these inconsistencies. Moreover, in some places, fines or penalties for non-reporting or late reporting are minimal, offering little incentive for compliance. Promoting international guidelines or treaties on data breach reporting could help establish a minimum compliance baseline.

The emergence of data as a form of currency redefines traditional paradigms of value and exchange. This transformation unfolds with unmatched opportunities and risks, intertwined with pressing ethical concerns.
While financial regulatory mechanisms have been refined over centuries in response to crises and innovations, data, in its newfound monetary stature, is in its infancy.

Concepts such as transparency, fairness, security, and accountability, fundamental in the financial sector, can serve as cornerstones in designing regulatory frameworks for data. In essence, while acknowledging data’s uniqueness as a currency, the financial regulatory system provides an opportunity to learn from its effectiveness and its limits. 

By marrying these lessons with a nuanced understanding of data’s specifics, we can hope to establish a balance that maximizes the benefits of this new currency while minimizing its potential risks to individuals and society at large.

About the Author

Hamilton Mann is the Group VP of Digital Marketing and Digital Transformation at Thales. He is also the President of the Digital Transformation Club of INSEAD Alumni Association France (IAAF), a mentor at the MIT Priscilla King Gray (PKG) Center, and Senior Lecturer at INSEAD, HEC and EDHEC Business School.
