Issues of Privacy in the New Big Data Economy
Companies are collecting vast volumes of consumer data to flourish in the new data economy. However, these data collection initiatives rarely include transparent explanations of data usage, which is a real cause for concern among consumers and privacy activists.
I first became aware of the issues raised by massive data collection in 2012, when Target determined that a teenage girl in Minneapolis was pregnant by analysing her past purchases (for example, unscented lotion, nutritional supplements, and cocoa butter). The company sent her a mailer with nursery and maternity clothing options. Her father intercepted the advertisement, marched into his local Target, and confronted the store manager about sending this type of advertising to his daughter while she was still in high school.
The manager apologised profusely and followed up a few days later with a phone call to apologise again. During that call, the father explained that after talking with his daughter, he had discovered that she was indeed pregnant but had not told him. Family drama aside, Target’s ability to study the girl’s seemingly innocuous purchasing habits and make such a bold prediction was both fascinating and alarming.
That was in 2012, and given how quickly technology advances, imagine how much more sophisticated big data collection and use have become since.
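The kind of inference in the Target story can be caricatured as a weighted score over purchase signals. The sketch below is purely illustrative: the products, weights, and threshold are all invented, and real retail models are far more sophisticated.

```python
# Toy sketch of purchase-based prediction, loosely inspired by the
# Target anecdote above. Weights and threshold are invented.
SIGNAL_WEIGHTS = {
    "unscented lotion": 0.4,
    "nutritional supplements": 0.3,
    "cocoa butter": 0.3,
}

THRESHOLD = 0.5  # arbitrary cut-off for flagging a shopper

def likelihood_score(purchases):
    """Sum the weight of every purchased item that carries a signal."""
    return sum(SIGNAL_WEIGHTS.get(item, 0.0) for item in purchases)

basket = ["unscented lotion", "cocoa butter", "bread"]
flagged = likelihood_score(basket) > THRESHOLD  # score is about 0.7
```

The unsettling part is not the arithmetic, which is trivial, but that the inputs are ordinary receipts no shopper would think of as sensitive.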
The question isn’t so much what firms know as it is what they don’t know.
The Collection of Big Data
Consumer data collection has become a top priority for businesses across all industries. Many digital business models rely on the collection and exchange of personal information, and this has been the foundation for the rise of large technology brands such as Google and Facebook, says Dr John Bates, CEO of Eggplant, a software testing and monitoring firm. “In exchange for free services, consumers share data, and this data is bought and sold.”
People recognise — and accept — that many organisations collect customer data on their websites, according to Tony Anscombe, Chief Security Evangelist at ESET, which provides multilayered internet security solutions.
However, when a firm has access to your browsing history and knows every page you’ve visited, “people may have a different perspective, now that the collector has a more complete data picture,” he explains. “When this is combined with private details that could define the data subject, the collector may be able to develop a full profile, especially when combined with other data sources.”
Acxiom is a large data broker that helps businesses purchase, sell and trade data. It works with sectors including financial services, automotive, insurance, retail and consumer goods, media, technology, and telecommunications. Identity resolution is one of the company’s services, which its site describes as “the ability to recognise an entity, whether it is a human, place, or object, as well as affiliated interactions, considering both physical and digital qualities, consistently and accurately, regardless of channel, location, or device, with contextually appropriate degrees of precision.”
That’s a bold proposition, and the ability to deliver on it necessitates huge volumes of data.
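At its simplest, identity resolution amounts to linking records from different sources on a shared, normalised identifier. The sketch below is a minimal illustration of that idea, not Acxiom’s actual method; every field name and record is invented.

```python
# Minimal sketch of identity resolution: merging records from two
# sources that share a normalised identifier (here, an email address).
def normalise(email):
    """Lowercase and strip an email so 'Jo@X.com ' matches 'jo@x.com'."""
    return email.strip().lower()

def resolve(source_a, source_b):
    """Fold records from both sources into one profile per email key."""
    profiles = {}
    for record in source_a + source_b:
        key = normalise(record["email"])
        profiles.setdefault(key, {}).update(record)
    return profiles

web_visits = [{"email": "Jo@Example.com", "pages": 14}]
loyalty_card = [{"email": "jo@example.com ", "store": "Minneapolis"}]
merged = resolve(web_visits, loyalty_card)
# merged["jo@example.com"] now holds both the browsing and purchase fields
```

Real brokers match on far messier signals than email (names, addresses, device fingerprints), which is exactly why delivering on the promise requires so much data.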
Other Types of Collection
In addition, Bates notes that if you use Apple CarPlay or Google’s Android Auto, you may be sharing data from your car with third-party app providers. “Mobile applications collect data, which is then mined and sold,” he explains.
Of course, Alexa and other intelligent assistants are listening in on your chats. “Corporations like Amazon and Google have a history of misusing our digital data,” says Raullen Chai, CEO of IoTeX, a Silicon Valley startup that creates smart gadgets that preserve users’ privacy.
And, according to him, these titans are now infiltrating physical homes and neighbourhoods. “They’re on a spending frenzy, acquiring startups like Ring, Nest, and Fitbit to gain access to our homes and bodies,” Chai argues. “With millions of computers in our homes paying close attention to us around the clock, protecting our data and privacy is more important than ever.” Connected devices are thus a double-edged sword: the Internet of Things can make your home safer and more secure, but it can also jeopardise that very safety and security.
The Risks of Centralized Profiles
This is especially troubling for several reasons. “The most concerning parts of the problem are not what our data directly says about us—what we buy, our socioeconomic traits, who we know, and so on,” says Dr Jen Golbeck, associate professor and director of the Human-Computer Interaction Lab at the University of Maryland in College Park.
What concerns her most is the collection and analysis of that data. Almost everything about us, including our devices, our software, and the tasks we perform, can be linked to a centralised profile. “All of them reveal distinct hints that can be matched to other information containing true identifiers such as email address, name, street address, and so on,” Golbeck explains.
“This means that our grocery purchases, computer browsing habits, television viewing habits, and what we say to smart devices can all be combined.” Once the data is combined, she says, companies have so much of it that they can use computers to generate fresh insights. Some of these insights could be products or services that we would be interested in.
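Golbeck’s point about matching “distinct hints” to true identifiers can be illustrated with a classic re-identification pattern: an ostensibly anonymous record is linked to a named one because both share quasi-identifiers. Everything below, names, records, and field names alike, is invented for illustration.

```python
# Toy re-identification through quasi-identifiers: an "anonymous"
# viewing record is matched to the one directory entry that shares
# its zip code and birth year.
anonymous_viewing = [
    {"zip": "20740", "birth_year": 1994, "shows": ["news", "cooking"]},
]
public_directory = [
    {"name": "J. Doe", "zip": "20740", "birth_year": 1994},
    {"name": "A. Smith", "zip": "20015", "birth_year": 1980},
]

def reidentify(anon_records, directory):
    """Attach a name to each anonymous record whose quasi-identifiers
    match exactly one directory entry."""
    linked = []
    for record in anon_records:
        matches = [p for p in directory
                   if (p["zip"], p["birth_year"])
                   == (record["zip"], record["birth_year"])]
        if len(matches) == 1:  # a unique match is enough to re-identify
            linked.append({**record, "name": matches[0]["name"]})
    return linked

linked = reidentify(anonymous_viewing, public_directory)
# linked[0] now pairs the viewing habits with the name "J. Doe"
```

The lesson is that stripping names from a dataset does little on its own once a second dataset with overlapping fields is available.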
However, Golbeck adds, “this information may also be used to assess whether or not we should receive a job, a mortgage, or be admitted to a school.” In China, these types of characteristics are utilised in a social credit score, which can influence the speed of your at-home internet connection, what apartments you can rent, which aircraft or trains you can travel on, and whether you have access to loans.
While this is not being used in the United States — at least not yet — she believes it demonstrates how big data can be used to make judgments. “And the fact is that these procedures are occasionally incorrect, with no clear avenue for correcting them, raising concerns about their use.” According to the Federal Trade Commission, one in four consumers has discovered inaccuracies on their credit reports that could harm their credit standing. This might result in them paying more for auto loans and insurance, or even being denied credit.
Privacy in Healthcare
Bates sees an issue in healthcare, where privacy takes a back seat to innovation. “The amount of data generated by connected medical devices, EHRs, wearables, and health apps is enormous. If stored health care data is made public, it can be used to identify people, and insurance providers can change healthcare premiums.” And, given the cybersecurity struggles in the healthcare business, this is a very real issue.
Another example is the contact tracing apps created to track COVID-19’s spread. “These apps have raised concerns about data privacy and people’s willingness to accept their data being tracked,” he says. Telehealth has been praised for its effectiveness in treating patients, managing costs, and providing accessible medical care. Still, the expansion of telehealth is raising privacy issues regarding patients’ medical data, as well as the need to ensure that healthcare providers follow privacy regulations.
Another source of concern is the level of security provided by these apps and services. If they have inadequate security, data may be exposed in the event of a breach.
Transparency and Regulation
The secrecy around data use and the lack of substantial guidelines worsen the problem. “In the U.S., there are currently few legal restrictions on the use of big data analysis,” Priebe explains. However, several policymakers and organisations in the United States, including the Federal Trade Commission and the National Conference of Commissioners on Uniform State Laws, are trying to address the problems.
Getting businesses to be transparent is much easier said than done.
Priebe says that big data analysis companies, such as large search engines, social media platforms, mobile phone service and app providers, and digital advertising companies, take the position that the computer algorithms they use to analyse personal information relating to consumers are intellectual property and proprietary trade secrets.
As a result, the big data industry does not want to give up what it sees as a complex and hard-won technological advantage over other companies by being open about “what automated systems do, how they work, what data they rely on, and how credible they are,” she explains.
Issues of Bias and Discrimination
And, unless and until further regulation is enforced, Bates expects that concerns over bias and discrimination will worsen. “More data legislation is needed to protect individuals, businesses, governments and, eventually, society as a whole,” he contends. “We must be able to rely on data to help people, rather than producing toxic, misleading, or harmful results.” He does recognise, however, that overregulation may impede progress, and there are numerous positive advances that big data may produce.