Monitoring your internet behavior has consequences – you may just not be seeing them yet.

The consequences of mass surveillance: How the collected data is used

Commercial and state mass surveillance collects absurd quantities of data about people all over the world. But what is all the data used for?

Quite often, we encounter people who say something like: “Yeah, yeah, so they’re collecting loads of data, but why should I worry?” There are several answers to that question, but one of them is quite simply that data can leak. The ‘normal internet user’ may not care that personal data is stored by one of the world’s biggest companies or by a state authority, but they may have more of a problem with personal information ending up in what’s usually called ‘the wrong hands’. You may not be worried about a pharmacy storing the medicines you buy, but find it creepy when the headlines scream about data breaches. Because it’s exactly that simple: collected data equals data that can leak.

If the state, a company or an organization holds sensitive data, it is responsible for keeping that data secure in an unpredictable future. That’s a difficult task, particularly when technology is developing quickly and companies and authorities (normal authorities, not the ones carrying out mass surveillance) are struggling to keep up. Over and over again, history has shown how databases are used in the worst possible way when new leaders come to power. Far too often, hackers and hostile powers have gained access to data they absolutely shouldn’t have, and carelessness, poor structures and human error have led to leaks.

Our attitude to this is extremely simple, and our message to anyone storing data is clear: minimize your data storage. Data you don’t have can’t leak.

But unfortunately right now the recurring scandal headlines about data leaks aren’t the big problem. The big problem is that there’s essentially a constant leak, when commercial and state mass surveillance deliberately collects data, shares it and uses it to manipulate and control people for financial or political gain.

We’ve previously described the enormous quantities of data that state mass surveillance systems collect about their own populations – and those of other countries. And we’ve looked at how commercial mass surveillance companies have made it their business model to map your life online. But what happens next? Apart from the fact that you get annoying ads targeting you, how is your data actually used?

The short answer when it comes to state mass surveillance is that several countries in the world have the capacity to look at your collected internet behavior, whenever they like. Depending on where you live, this can have disastrous consequences for you. But mass surveillance also has an effect on the whole of society, which you can read more about here.

You may think: who cares what websites I click on? But if you live in the USA, insurance companies care. They use purchase histories to raise your premiums.

When it comes to the commercial mass surveillance companies, there’s also a very short, simple answer to what they do with your data: they sell it. In 2021, it was revealed that data brokers had purchased location data from Life360, an app in which 33 million parents keep track of where their children are by tracking the child’s phone. The following year, a lawsuit was brought against Kochava, another data broker, for having tracked hundreds of millions of people and sold sensitive data about their location. Here you can read more about data brokers and their business model, which is based on selling your internet behavior.

Depending on the country you live in, your internet provider may also log your traffic and share it through a variety of business agreements. A report from the American Federal Trade Commission (FTC) described how at least six large American internet providers were sharing their customers’ location data with third-party companies. The report noted that even though several of the ISPs promised not to sell consumers’ personal data, they allowed it to be used, transferred, and monetized by others, and hid disclosures about such practices in the fine print of their privacy policies.

And this is a tactic even the biggest tech companies employ. Meta and Google may not sell their (your) data, but they exchange it freely. But above all, the tech giants use data collection to optimize their advertising tools. Meta and Google have become two of the highest valued companies in world history through revenues from their advertising networks, and their business concept is clear – it’s all about mapping your behavior and predicting what you’re going to want in the future to tailor ads as accurately as possible. You can read more about the consequences of this here.

Data on medical histories and sexual orientations is sold and exploited

You may be asking the question ‘Who cares if Facebook keeps track of what sites I click on?’ You may also like seeing ads tailored to you. But it may not feel quite as innocent when the data is bought by an insurance company, for example.

The FTC has reported how data is sold to insurance companies, which in turn use purchase histories to raise the premiums of couples paying for couples therapy.

Another example is health apps sharing data with hundreds of different partners about users’ herpes, HIV and diabetes, and data brokers that can easily construct profiles under categories such as ‘depressed’. The question is what happens to people who are cataloged like this: do their insurance premiums go up, do they become the target of information and ads that can lead to them becoming addicted to medication, does the interest rate on their mortgage go up?

Another example is the Catholic priest who was exposed as homosexual through location data sold by a data broker.

You can read here about how easy it is to buy data from data brokers and how ‘anonymous data’ really isn’t anonymous. The consequences of this include vulnerable women having their real-time location data revealed to stalkers. And as early as 2013, it was possible to purchase information about people who had been raped, as well as lists of people with drug and alcohol dependencies. Once again: who are the buyers and how is the information being used? It’s hard to imagine any positive outcome from this type of data list.

It’s a fact in today’s world that socially vulnerable people suffer as a result of the collection and sale of data. But if you want to contemplate the ultimate outcome of this development, you can look at China and the country’s social credit score system.

There is no single, nationally coordinated system. There are several. But if [the Chinese system] does come together as envisioned, it would still be something very unique. It's both unique and part of a global trend.

Mareike Ohlberg

China’s social credit score system gives you minus points if you play too many computer games.

There are many misconceptions about the Chinese social credit score system. The most common one can be seen in the sentence above: because there isn’t only one Chinese social credit score system. There are several, as researcher Mareike Ohlberg from the Mercator Institute for China Studies put it in an article in Wired.

She says that the idea itself isn’t a Chinese phenomenon, and neither is the use and misuse of collected data and behavioral analyses. Nor is there a single, nationally coordinated system, but instead several different pilot projects that don’t work in exactly the same way. But if they manage to put them together, as they intend, it will create something truly unique. In this way, says Ohlberg, the Chinese social credit score programs are unique but also part of a global trend.

In other words, the Chinese social credit score programs record slightly different things, but overall cover everything from late payment of your bills and running a red light to playing your music too loud on a train or making a scene in a taxi. You probably recognize this type of scoring system from the western world’s credit checks and the ratings in services such as Uber. What makes China stand out is perhaps the ambition to collect everything into one system. For example, Mareike Ohlberg describes the Chinese city of Rongcheng, which gave every inhabitant 1000 points to start with, and where deductions take place, for example when residents commit a traffic violation, but where they can earn more points by giving money to charity.

Several of the pilot projects are run by giants such as Alibaba. One of them, Sesame Credit, has become famous for collecting data about its 400 million customers and allocating scores based on how much time they spend on video games and whether or not they are parents. The social credit score is included as a parameter in the company’s dating app.

Another well-known example is how investigative journalist Liu Hu was refused the right to buy an airline ticket because he had been allocated the status ‘not qualified’.

Parallels with the fictional series Black Mirror are only too evident. Of course, you can joke about the irony in your social score falling because you went to the wrong parties or lost your temper in the grocery store. The problem is that this is happening in reality, here and now, and that the ultimate goal of this type of mass surveillance is total control over people. And of course it will be worst for those who are already the most vulnerable in society. Here you can read about how China already uses mass surveillance against minorities in the country. But you don’t need to look as far as China to discover truly frightening contemporary examples.

Perhaps you’ll say that you ‘have nothing to hide’. But what happens when the laws change?

When people justify mass surveillance with ‘I have nothing to hide’, there are several arguments that disprove their reasoning and you can read more about that here. But nothing has put as many holes in this argument as contemporary events in the USA. A big problem with ‘I have nothing to hide’ is that it isn’t unchanging. You may change your political view, become an activist and suddenly find yourself, through your online searches, getting extra attention from the authorities. You may become depressed, buy tons of junk food and see your insurance premiums rocket. Perhaps you’re homosexual and find a partner in a country where it’s prohibited by law.

Perhaps you live under the delusion that you ‘have nothing to hide’ but then the law changes and you’re a criminal. In 2022, life suddenly changed for millions of American women when they could no longer google for abortion doctors, buy abortion pills online or visit abortion clinics (with their phone in their pocket) without risking it becoming proof in a potential indictment against them. Suddenly they did have something to hide, and the USA’s digital infrastructure means the odds are stacked against them. If, as a society, you’ve long permitted the internet to become a place where both state and commercial actors can map human lives, it becomes tough for those humans when the law suddenly takes a new turn.

Immediately after Roe v. Wade was overturned in June 2022, we saw one story after another about women deleting their pregnancy apps (at least the women who used them as aids to avoid becoming pregnant). And that was a sensible decision, given that researchers have reported that the majority of pregnancy apps share large quantities of personal data with other companies.

The tone in the discussions about location data also changed. In 2019, the New York Times released its Privacy Project. The newspaper had obtained a dataset containing location data for more than 12 million Americans, with more than 50 billion location pings that were claimed to be anonymous. And yet it took only a few minutes for the newspaper to work out which of the movement patterns belonged to Donald Trump. When it comes to location data, de-anonymizing it is child’s play: very few people sleep in the same place as you and then travel to the same workplace as you.

Now take that type of database and pull out all the location pings linked to an abortion clinic and then follow their journeys home. This isn’t a hypothetical exercise. Vice reported that for a measly 160 USD it’s possible to buy a full week’s list of the people who visited a specific clinic linked to pregnancy – and that it’s even possible to see where the visitors came from and where they went afterwards. This is data that absolutely anyone can buy.
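To see why the ‘anonymous’ label means so little, the re-identification logic described above can be sketched in a few lines. This is a toy illustration with entirely made-up device IDs, places and a hypothetical `clinic_visitors` helper, not any broker’s real data format: it filters a ping dataset for devices seen at a given clinic, then follows each one to its most common nighttime location, which in practice is its owner’s home.

```python
# Illustrative sketch with made-up data: how a purchased location dataset
# lets anyone list a clinic's visitors and follow them home. Randomized
# device IDs don't protect anyone, because a device's most frequent
# nighttime location reveals where its owner sleeps.
from collections import Counter

# Each ping: (device_id, hour_of_day, place) -- hypothetical records.
pings = [
    ("dev_7", 11, "Elm St Clinic"),
    ("dev_7", 23, "4 Oak Ave"), ("dev_7", 1, "4 Oak Ave"),
    ("dev_9", 12, "Coffee Shop"), ("dev_9", 2, "9 Pine Rd"),
]

def clinic_visitors(pings, clinic):
    """Map every device that pinged at the clinic to its likely home:
    the place where it is most often seen at night (22:00-06:00)."""
    visitors = {d for d, _, p in pings if p == clinic}
    homes = {}
    for dev in visitors:
        nights = Counter(p for d, h, p in pings
                         if d == dev and (h >= 22 or h < 6))
        homes[dev] = nights.most_common(1)[0][0]
    return homes

print(clinic_visitors(pings, "Elm St Clinic"))  # {'dev_7': '4 Oak Ave'}
```

Once the home address is known, matching it to a name via public records is trivial, which is exactly why ‘we only sell anonymized data’ offers no real protection.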

We’ve already seen the perfect storm caused by a combination of data brokers and their dubious records, the willingness of US states to imprison women who have abortions and greedy bounty hunters. In Texas and Oklahoma, an inhabitant – absolutely any inhabitant whatsoever – can get up to ten thousand dollars’ reward by reporting women who have broken the abortion laws.

The harsh reality is that while we're now worried about women who seek abortions being targeted, the same apparatus could be used to target any group [...] at any moment, for any reason that it chooses.

Shoshana Zuboff

A digital infrastructure has been constructed that makes it possible to map people’s lives and work out what they will do next. And in a country like the USA, the authorities have access not only to their own tools, but also to the commercial companies that follow every step we take. Once such a system is in place, it’s very easy to shine the spotlight wherever you want. As an article in the New York Times puts it: “A woman who regularly eats sushi and suddenly stops, or stops taking Pepto-Bismol, or starts taking vitamin B6 may be easily identified as someone following guidelines for pregnancy. If that woman doesn’t give birth she might find herself being questioned by the police, who may think she had an abortion.”

AI systems have even been developed to calculate the probability that young girls will become pregnant. In a 2018 collaboration between Microsoft and an Argentinian organization, algorithms were developed that were claimed to be 86% accurate at predicting which girls would become pregnant within a six-year period. Behind the Argentinian organization was a well-known anti-abortionist.

The abortion issue is a clear example of how ‘I have nothing to hide’ can change. But that’s ‘only’ one example of a much more widespread phenomenon. As Shoshana Zuboff said in an interview in the Washington Post:

“The harsh reality is that while we’re now worried about women who seek abortions being targeted, the same apparatus could be used to target any group or any subset of our population – or our entire population – at any moment, for any reason that it chooses. No one is safe from this.”

The digital infrastructure of today can map the lives of people worldwide. But it could get worse. With the development of AI, we risk ending up in a society where everyone’s online behavior is analyzed at a much faster pace. There is much to be said about AI, but if artificial intelligence is good at anything, it’s sorting through large amounts of data. As Edward Snowden put it during a talk in 2024: “Metadata could be collected automatically by ingesting the world’s internet communication through these machines but somebody still had to put their cup in the big bucket, pull it out and lay it on their desk and make sense of it. Machine learning models are going to change this. In my opinion, this is already being done. We simply don’t have the evidence yet, it’s going to be testing, it’s going to be development, we don’t know how it’s being applied, when it’s going to be truly operational – but it’s fantasy to imagine that they’re not doing this, and there is zero regulation that I’m aware of in the United States to prevent this. Nobody is thinking about what these agencies are doing, and we’re not just talking about the NSA here, we’re not just talking about the FBI, we’re talking about the IRS right, and it’s not just the United States, it’s every country. You may go ‘oh I love the United States government doing this, I hope they spy on me as much as possible’, well what about China, what about Russia, what about North Korea, what about Iran, what about every little government that you don’t like, that you don’t agree with. Suddenly they can have every life of every person every day at every moment on a live feed being interpreted at machine speed. And then you start feeding those inferences into a decision-making process. 
This is the reason that I bring up the protesters at Columbia University, the protesters in Canada, it doesn’t matter whether you’re a liberal, it doesn’t matter whether you’re a conservative, if you stick out, if you stand out, you are going to become non-normative, or rather, anomalous.”

What can you do to reduce data collection? Read more about how a credible VPN and a privacy-focused web browser can contribute to freeing the internet from mass surveillance.

But the absurd quantities of data being collected don’t merely come with direct consequences. Here we look at the incredibly bad effects mass surveillance has on the whole of society.

Still think you have nothing to hide? Here we demolish your argument and explain how mass surveillance affects us all.

Want to learn more about the business model behind data collection? Read more about the financial motivation behind today’s mass surveillance.