The data economy or digital economy refers to the production, distribution and consumption of digital data. It’s a global system spanning the creation of vast amounts of behavioral, social, locational, and biometric data; the analysis that turns this information into insights; and the markets that profit from these processes.
In the data economy, companies and institutions collect and analyze the data we all generate online, such as our clicks, searches, purchases, and location, and use it to make business decisions, target ads, and develop new products. The more information companies can gather, the better they can turn everyday online activity into economic value.
Personalized advertising is a key part of the data economy.
The processes behind the data economy are often described as surveillance capitalism, a term made popular by Harvard scholar Shoshana Zuboff. Surveillance capitalism refers to the way companies make money by watching what people do online, collecting their data, and using it to sell ads or shape their choices. It’s called “surveillance” because it relies on constant tracking of what people do online (and increasingly offline), and “capitalism” because this information is then used to generate revenue, mainly through targeted advertising and data-driven products.
While surveillance capitalism focuses on the business model of behavioural extraction and prediction, the data economy is broader, covering the entire chain of data creation, analysis, and markets.
For most people, the data economy and surveillance capitalism can make services more convenient, relevant, and accessible, but they also carry significant risks, including loss of privacy, less autonomy, exposure to security breaches, discrimination, and power imbalances between individuals and large technology platforms.
The roots of the data economy lie in the explosive growth of the internet, social media, and mobile computing. As digital platforms became part of daily life, companies began collecting not only the data users knowingly provided, such as search terms, purchases, and posts, but also the invisible traces of their online behavior.
This background data, sometimes called digital exhaust, includes every click, scroll, pause, and GPS movement. Zuboff refers to this as behavioral surplus – the extra information collected beyond what’s necessary to provide a service. This surplus data allows companies to build detailed user profiles, predict future actions, and monetize those predictions through targeted advertising and other behavioral markets.
The rapid development of machine learning, artificial intelligence, and cloud storage has quickly scaled and expanded the data economy.
Once collected, data is transformed by algorithms and analytics into predictions about people’s future behavior: what they might buy, where they might go, or how they might vote. These predictions are valuable to businesses that want to influence decisions and maximize engagement or sales.
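The prediction step can be sketched in a few lines of code. Below is a toy, logistic-regression-style scorer that estimates a user’s likelihood to buy from a handful of behavioral signals; the signal names and weights are invented for illustration and are not taken from any real ad-tech system.

```python
import math

# Hypothetical behavioral signals for one user: product views, time on
# page, and whether they clicked a related ad. Weights are illustrative
# stand-ins for what a trained model might learn from past behavior.
def purchase_likelihood(views: int, seconds_on_page: float, clicked_ad: bool) -> float:
    """Score a user's likelihood to buy, logistic-regression style."""
    score = 0.4 * views + 0.02 * seconds_on_page + 1.5 * (1 if clicked_ad else 0) - 3.0
    return 1 / (1 + math.exp(-score))  # squash the score to a 0-1 probability

casual = purchase_likelihood(views=1, seconds_on_page=10, clicked_ad=False)
engaged = purchase_likelihood(views=5, seconds_on_page=120, clicked_ad=True)
# The richer the behavioral trail, the sharper the prediction: the
# engaged user scores far higher than the casual browser.
```

Real systems train millions of such weights over billions of events, but the principle is the same: behavioral traces in, probability of a future action out.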
The global market for predictive analytics is expanding rapidly and is expected to grow from about USD 19 billion in 2024 to more than USD 80 billion by 2030, highlighting the huge commercial value placed on understanding and shaping human behavior.
Datafication is the process of turning everyday human activities into measurable, analyzable data. Every digital action, from tapping a fitness tracker to liking a post, generates a data point that can be recorded and studied.
This transformation extends beyond the internet to the Internet of Things (IoT): smart appliances, wearables, vehicles, and even cities that continuously collect data about their environment and users. Datafication enables innovation and efficiency, but it also raises concerns about continuous monitoring, the erosion of private space, and the issue of opting out of constant data capture.
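Datafication is easy to picture as code: each real-world action becomes a small, timestamped record that can be stored and analyzed. The record shape and field names below are hypothetical, chosen only to illustrate the idea.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# A hypothetical event record: one everyday action turned into a
# structured, analyzable data point.
@dataclass
class DataPoint:
    user_id: str
    action: str      # e.g. "step_count", "like", "door_unlock"
    value: float
    source: str      # which device or app captured it
    timestamp: str

def datafy(user_id: str, action: str, value: float, source: str) -> DataPoint:
    """Wrap one real-world action as a timestamped record."""
    return DataPoint(user_id, action, value, source,
                     datetime.now(timezone.utc).isoformat())

# A single jog with a fitness tracker becomes three analyzable records.
events = [
    datafy("u-42", "step_count", 8450, "wrist_tracker"),
    datafy("u-42", "heart_rate", 131, "wrist_tracker"),
    datafy("u-42", "gps_km", 6.2, "phone"),
]
```

Multiply this by every app, wearable, and smart device a person uses, and the scale of continuous capture becomes clear.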
Behind the scenes, a vast network of platforms, data brokers, ad-tech systems, analytics companies, and cloud providers powers the data economy.
Platforms such as search engines, social media networks, and e-commerce sites act as major data collectors.
Data brokers gather information from public records, purchases, and digital traces to build detailed consumer profiles that are sold or shared with third parties.
These operations are often opaque, and users rarely know which organisations hold their data or how it’s being used.
In surveillance capitalism, the traditional relationship between user and service provider changes. The service itself is no longer the main product; the user’s behavior is. As the saying goes, “If you’re not paying for the product, you are the product.”
Platforms profit by turning human experience into data that can be analyzed, packaged, and sold. The more detailed and predictive the data, the more valuable it becomes.
Zuboff uses the term instrumentarian power to describe the new form of control that arises when companies can monitor, predict, and shape human behavior through data-driven systems. Unlike totalitarian power, which demands obedience, instrumentarian power aims to guide behavior quietly and automatically, through nudges, notifications, and algorithmic suggestions.
This power operates invisibly, embedded in digital tools that appear neutral or helpful but are designed to influence choices. Over time, it can shift control away from individuals and towards the systems that anticipate and steer their actions.
The overlap between the data economy and surveillance capitalism creates a range of risks for individuals and society:
Constant and large-scale data collection erodes privacy. Most people don’t know how much information is being gathered about them or how it is combined to reveal intimate details such as health status, emotions, or political leanings. Even small, seemingly harmless data points can be aggregated to produce detailed personal insights. Making matters worse, privacy policies are often long and complex, and opting out is often difficult and unclear.
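This aggregation effect can be shown with a tiny sketch. Each data point below is individually mundane, yet a simple rule combining them can infer something sensitive; the signals and the inference rule are invented for illustration, not drawn from any real profiling system.

```python
# A hypothetical slice of one person's data trail. No single field
# states a health condition.
profile = {
    "pharmacy_visits_per_month": 4,
    "searches": ["insulin pump reviews", "low-sugar recipes"],
    "fitness_app_installed": True,
}

def infer_health_interest(p: dict) -> bool:
    """Naive aggregation rule: flag likely interest in diabetes care."""
    frequent_pharmacy = p["pharmacy_visits_per_month"] >= 3
    related_searches = any("insulin" in s or "sugar" in s for s in p["searches"])
    return frequent_pharmacy and related_searches

# Combined, the mundane signals yield a sensitive inference.
likely = infer_health_interest(profile)
```

Commercial profiling uses far more signals and statistical models rather than hand-written rules, but the privacy problem is the same: the whole reveals much more than any of its parts.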
When algorithms predict and influence behavior, users may feel they are making free choices when in fact their experiences have been carefully shaped. Zuboff warns that this “architecture of behavioral modification” can undermine independent thought and even democratic processes, as people’s decisions become increasingly guided by invisible feedback loops and targeted messages.
The more data that is collected and stored, the larger the target for hackers and cybercriminals. Massive data breaches have exposed hundreds of millions of individuals to identity theft and fraud.
Data is also frequently reused in new contexts; for example, information from a health app could be analysed for insurance or employment purposes. These secondary uses can occur without consent and increase the risk of re-identification or misuse.
Algorithms that profile users can unintentionally reinforce biases or discrimination. People may receive different prices, job ads, or opportunities based on hidden factors like age, race, or location.
This creates power asymmetries: large corporations and governments hold vast amounts of knowledge about individuals, while individuals have little visibility or recourse. The balance of control shifts away from the user.
Beyond personal harm, widespread data monitoring can have chilling effects on free expression. When people feel watched, they may censor themselves or avoid controversial topics.
In the political arena, data-driven targeting has been used to influence public opinion and voting behaviour. Microtargeting, propaganda, and algorithmic governance challenge democratic transparency and trust.
Most digital systems operate on “consent by default” with users agreeing to complex terms they rarely read. Once data enters corporate systems or is sold to brokers, its future use is unknown.
A simple app or smart device may connect to multiple third-party systems, enabling combined profiling that extends far beyond the user’s expectations.
The European Union’s General Data Protection Regulation (GDPR), introduced in 2018, is still one of the world’s most comprehensive privacy frameworks. It gives individuals rights to access, correct, and delete their data, and requires companies to demonstrate a lawful basis for processing.
But some argue that, while GDPR is a landmark achievement, it doesn’t fully address the underlying business model of behavioural prediction, and enforcement has been inconsistent.
The United States has a more disjointed approach. Privacy laws are largely state-based, such as California’s Consumer Privacy Act (CCPA) and California Privacy Rights Act (CPRA), and rely heavily on consumer complaints or agency enforcement.
Other regions, including Australia, Canada, and parts of Asia, have introduced or updated data protection laws, but there’s wide variation in the level of protection and enforcement.
Some large technology companies have responded to regulatory and consumer pressure by adding, or considering adding, privacy controls; for example, Apple’s “opt-in” tracking permissions and Google’s repeatedly revised plan to phase out third-party cookies, which has not materialized. But some argue these steps often serve as PR measures rather than substantive changes to surveillance-based business models.
It’s fair to say that not all data-driven innovation is exploitative; it enables important services such as medical research, navigation, and communication, and fuels advances in AI, healthcare, and logistics. But the exchange of data for access to services must be fair and transparent, and genuine choice and competition must exist.
Reforming business models so that they respect human dignity and autonomy could help to protect consumers. New models could focus on data minimisation, value sharing, or subscription-based services that rely less on behavioural surveillance.
Decentralized identity and digital identity technologies give individuals greater control over their personal information by allowing them to share only what’s necessary, without relying on centralized databases that can be breached or misused. These privacy-enhancing tools help shift power away from corporations and back to consumers, creating a more secure and transparent foundation for participation in the data economy. Explore Anonyome Labs’ solutions for individuals and businesses.