

Privacy (collected references)


IainB:
Could you please LIKE me?
There is a series of short spoofs from "Black Mirror", offering a glimpse of where we might potentially be heading.

wraith808:
^ Black Mirror is, in general, a very twisted take on several small concepts we take for granted in everyday life. I recommend the series. That particular episode is very insightful about the current state of the world.

IainB:
The amusing Black Mirror video above, though artificial, is arguably a prescient comment on the risks inherent in "oversharing" and "data leakage" across the IT-enabled SNM (Social Network Marketplace) and other personal-data-related databases (e.g., health, insurance, banking, education). Private data that one might previously have perceived as peculiar and useful/relevant in one context only is subsequently found to be useful/relevant in another context, or in many others. These are typically the data connections and interconnections that the SNM operators and data miners tend to seek out and exploit for financial gain.

When I was contracted in 2003 to get a failed data analysis project back on the rails, I learned quite a lot. It was a complex IT project to implement a BI (Business Intelligence) system, and we had to train the users in the application of BI (it's actually quite powerful and hairy stuff) to meet the growing and complex business needs of the power (energy) company that had contracted me into the project recovery role. I learned that what the Defence sector had always taught was true: all data/information can be interconnected at some stage. For BI, the world could be envisaged as one or more universes of dynamic data, each having its own peculiar descriptive and dynamic data model. As in the popular SF concept of parallel universes, there was the potential to interlink these data universes (mass aggregations of dynamic data sets), constantly combining/recombining and drawing data from one to the other, enabling the BI analyst to discover data relationships on a mass scale that would probably not previously have been feasible using the then prevailing technologies.
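The cross-context linkage described above can be illustrated with a toy sketch in plain Python (hypothetical data and field names, not any particular BI product): two datasets that each seem harmless on their own reveal something sensitive once joined on a shared key.

```python
# Two "data universes", each innocuous in its own context.
store_loyalty = [
    {"card_id": "C1", "name": "Alice", "postcode": "6011"},
    {"card_id": "C2", "name": "Bob",   "postcode": "6012"},
]
pharmacy_sales = [  # "anonymised": no names, only loyalty-card IDs
    {"card_id": "C1", "purchase": "insulin"},
    {"card_id": "C2", "purchase": "sunscreen"},
]

def link(universe_a, universe_b, key):
    """Join two datasets on a shared key, as a BI join would."""
    index = {rec[key]: rec for rec in universe_a}
    return [{**index[rec[key]], **rec}
            for rec in universe_b if rec[key] in index]

linked = link(store_loyalty, pharmacy_sales, "card_id")
# The join re-identifies the "anonymous" purchases, e.g.:
# {'card_id': 'C1', 'name': 'Alice', 'postcode': '6011', 'purchase': 'insulin'}
```

Neither dataset alone connects a person to a medical condition; the linkage across contexts does, which is exactly the kind of relationship a BI analyst (or a data miner) goes looking for.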

Fast forward to 2018, and we can perhaps now better understand why we have the apparent privacy shambles we see around us. It was a gold rush: opportunistic, every man for himself. Presumably the Google/Facebook founders (and others) would have seen it coming. There was little or no regulation to limit or constrain the progress of BI and its application in the field of mass demographics. Now that some regulations have belatedly been, or are being, implemented, it arguably may be too late anyway; it is locking the stable door after the horse has bolted, and Pandora's box has already been opened.

IainB:
Following up on the BI comment above: I came across a link to what seemed an interesting viewpoint from The Economist, though from experience I'd suggest taking anything they publish nowadays with a pinch of salt - just in case, like, and especially when it is prefixed with the ominous religio-politically ideological cliché "Open":
(Copied below sans embedded hyperlinks/images.)
Open Future
Toward defining privacy expectations in an age of oversharing
Our digital data deserves protection, writes Margot Kaminski, a law professor

Aug 16th 2018 by MARGOT KAMINSKI
What assurances of privacy do we have in this digital age? Until this year, the quick answer was: effectively none. We share personal data with companies, who in turn share it with other companies and the government, with few if any legal means of individual redress or control. But if 2018 will be remembered as the year of Cambridge Analytica—a British data-mining company that allegedly influenced both British and American elections by targeting voters using personal data—it will also be remembered as the year that privacy law finally started catching up to the Internet.

In America, the “third-party doctrine” has long governed privacy law. This view holds that one can have no privacy expectations in information shared with a third party. The government could obtain a list of phone numbers you called without a warrant, for instance, because you shared that information with a phone company.

This runs counter to most people’s expectations, especially today. Privacy is many things to many people, but one thing it is not is absolute secrecy. We share sensitive information with our doctors; we hold whispered conversations in public places; we rely on practical obscurity even in big cities; and we disclose our most intimate information by text and by email.

Helen Nissenbaum, an ethicist at Cornell University, refers to the foundation of our digital-privacy expectations as “contextual integrity”. When we reveal information in one context, we trust that it won’t pop up to surprise us in another.

Another way to think of it is that we regularly use perceived features of our environments, both physical and social, to manage degrees of disclosure. An email service that uses a mailbox as its logo signals that email will be kept between sender and recipient—just like a regular letter—even if it is in fact stored on a company’s servers.

In June 2018, however, the Supreme Court struck a serious and welcome blow to the third-party doctrine in its Carpenter v. United States ruling. That case asked whether police needed a warrant to access someone’s mobile-phone location data. The Court held that historic mobile-phone location data deserved privacy protections, even if it is shared (often unknowingly) with a mobile-phone service provider.

The Court recognised that what used to be non-sensitive data—a record of your travels through public places—has, in the age of big data, effectively been converted into sensitive information. When gathered en masse and analysed, where someone travels can reveal her religion, health problems, sexual preferences and political affiliations. The Court thus recognised that privacy harms can trigger a wealth of related harms, chilling freedom of speech and freedom of association.

While 2018 brought paradigm shifts to American privacy law, in Europe it brought the General Data Protection Regulation (GDPR). The significance of the GDPR goes beyond the annoying barrage of privacy notices that popped up in May. It establishes enforceable individual transparency and control rights.

But the GDPR’s real impact will be within companies, behind the scenes. Backed by significant penalties for breaches (up to 4% of worldwide annual revenue), the GDPR imposes a series of duties on companies, regardless of whether individuals invoke their privacy rights. It requires companies to articulate legitimate reasons for collecting data; to collect only the data that they need; to design new products in ways that protect individual rights; and sometimes to appoint privacy officers and conduct privacy impact-assessments.

At first glance, the gap between Europe and America still appears enormous. The EU now has the GDPR; America continues to lack comprehensive federal data privacy law, relying instead on a patchwork of consumer protection, state laws, and sector-specific laws like the Health Insurance Portability and Accountability Act (HIPAA).

But two recent events have resulted in a surprising array of commonalities: the above-mentioned Carpenter case, and California’s Consumer Privacy Act (CPA), which California passed less than a month after the Carpenter ruling, and which creates an American version of data-protection law.

The California CPA governs not just information that people share directly with companies, but also personal data held by commercial data-brokers. Just as Carpenter suggests that legal protections follow even shared personal data, the CPA imposes transparency and control requirements even on companies that have no direct relationship with consumers. In this way, the CPA represents a shift towards the data protection model embraced in the GDPR. Legal protection travels with the data, regardless of whether or not there is a consumer relationship.

This is not to say that Europe and America are converging. For one, the CPA applies only to California residents (although because California is such a big market the law may influence policies for all Americans—referred to in the context of automobile regulations as the “California effect”). America also has a robust, and in some situations increasingly deregulatory, free speech right in the First Amendment that will likely come into conflict with deletion and disclosure rights.

But there is a growing transatlantic consensus emerging on privacy in the digital age. Sharing data no longer obviates privacy. Privacy protections now increasingly travel with personal information, even if that information is something a company has inferred rather than collected. Both legal systems also increasingly recognise that privacy is, perhaps counterintuitively, deeply linked to transparency: people cannot exert control or request remedies if they do not know where their information is going.

Perhaps most significantly, both legal regimes now exhibit a growing awareness of how linked privacy is to other well-recognised legal harms such as chilling effects on free expression or discrimination against individuals. Even if America does not enact a federal privacy law, the age of free data and diminishing data privacy looks to be rapidly ending.

Margot Kaminski is a professor at Colorado Law. She teaches, researches and writes on the intersection of law and technology.

--- End quote ---

wraith808:
Fast forward to 2018, and we can perhaps now better understand why we have the apparent privacy shambles we see around us. It was a gold rush: opportunistic, every man for himself. Presumably the Google/Facebook founders (and others) would have seen it coming. There was little or no regulation to limit or constrain the progress of BI and its application in the field of mass demographics. Now that some regulations have belatedly been, or are being, implemented, it arguably may be too late anyway; it is locking the stable door after the horse has bolted, and Pandora's box has already been opened.
-IainB (August 22, 2018, 04:45 AM)
--- End quote ---

It's arguably too late for any of us who are already born and have any social media links. You'd have to be aware not only of what you share, but also of what others share about you. You can make the links harder to make, but I'd posit that it's impossible to restrict data after it has already been ingested into the system of data that surrounds us.
