Topic: Privacy (collected references)  (Read 27202 times)

IainB

  • Supporting Member
  • Joined in 2008
  • Posts: 7,540
  • @Slartibartfarst
Re: Privacy (collected references)
« Reply #25 on: August 08, 2018, 04:05 AM »
Some valid points from theregister.co.uk:
Facebook insists it has 'no plans' to exploit your personal banking info for ads – just as we have 'no plans' to trust it
After all, never say never!
By Kieren McCarthy in San Francisco 7 Aug 2018 at 20:44


Analysis: Facebook has denied it is seeking to suck up netizens' bank account details, claiming it just wants to connect bank customers to their bank's chat accounts and give useful financial updates. ...

Copied from: Facebook insists it has 'no plans' to exploit your personal banking info for ads – just as we have 'no plans' to trust it • The Register - <https://www.theregister.co.uk/2018/08/07/facebook_banking_data/>

Yeah, right.

4wd

  • Supporting Member
  • Joined in 2006
  • Posts: 5,641
Re: Privacy (collected references)
« Reply #26 on: August 09, 2018, 08:04 PM »
Looks like the Ugandan government could be in the vanguard when it comes to, uh, privacy...
 ...Uganda orders ISPs to block Ugandans from accessing Pornographic Websites. Nice one!    :Thmbsup:

Except they're 2 or 3 years behind Russia.

IainB

  • Supporting Member
  • Joined in 2008
  • Posts: 7,540
  • @Slartibartfarst
Re: Privacy - discrimination against vulnerable minorities.
« Reply #27 on: August 19, 2018, 12:12 AM »
The other side of the loss of privacy is that it can leave the loser potentially vulnerable, or more vulnerable than they were before, exposing them to risk - e.g., demographic stratification and targeting, with subsequent stigmatisation, discrimination and harm/loss, even to the extent of circumventing laws that were expressly established to avoid such risks to these vulnerable groups/minorities.
So, right on cue, here's a classic recent example - and yes, Facebook are behind it, because they can make money out of it (of course) - and are doing so. What a surprise (NOT).    :o
From the U.S. Department of Housing and Urban Development
HUD No. 18-085
HUD Public Affairs
(202) 708-0685
FOR RELEASE
Friday
August 17, 2018
HUD FILES HOUSING DISCRIMINATION COMPLAINT AGAINST FACEBOOK
Secretary-initiated complaint alleges platform allows advertisers to discriminate

WASHINGTON – The U.S. Department of Housing and Urban Development (HUD) announced today a formal complaint against Facebook for violating the Fair Housing Act by allowing landlords and home sellers to use its advertising platform to engage in housing discrimination.

HUD claims Facebook enables advertisers to control which users receive housing-related ads based upon the recipient's race, color, religion, sex, familial status, national origin, disability, and/or zip code. Facebook then invites advertisers to express unlawful preferences by offering discriminatory options, allowing them to effectively limit housing options for these protected classes under the guise of 'targeted advertising.' Read HUD's complaint against Facebook.

"The Fair Housing Act prohibits housing discrimination including those who might limit or deny housing options with a click of a mouse," said Anna María Farías, HUD's Assistant Secretary for Fair Housing and Equal Opportunity. "When Facebook uses the vast amount of personal data it collects to help advertisers to discriminate, it's the same as slamming the door in someone's face."

The Fair Housing Act prohibits discrimination in housing transactions including print and online advertisement on the basis of race, color, national origin, religion, sex, disability, or familial status. HUD's Secretary-initiated complaint follows the Department's investigation into Facebook's advertising platform which includes targeting tools that enable advertisers to filter prospective tenants or homebuyers based on these protected classes.

For example, HUD's complaint alleges that Facebook's platform violates the Fair Housing Act by enabling advertisers to, among other things:
  • display housing ads either only to men or women;
  • not show ads to Facebook users interested in an "assistance dog," "mobility scooter," "accessibility" or "deaf culture";
  • not show ads to users whom Facebook categorizes as interested in "child care" or "parenting," or show ads only to users with children above a specified age;
  • display or not display ads to users whom Facebook categorizes as interested in a particular place of worship, religion or tenet, such as the "Christian Church," "Sikhism," "Hinduism," or the "Bible";
  • not show ads to users whom Facebook categorizes as interested in "Latin America," "Canada," "Southeast Asia," "China," "Honduras," or "Somalia";
  • draw a red line around zip codes and then not display ads to Facebook users who live in specific zip codes.
Additionally, Facebook promotes its advertising targeting platform for housing purposes with "success stories" for finding "the perfect homeowners," "reaching home buyers," "attracting renters" and "personalizing property ads."

In addition, today the U.S. Attorney for the Southern District of New York (SDNY) filed a statement of interest, joined in by HUD, in U.S. District Court on behalf of a number of private litigants challenging Facebook's advertising platform.

HUD Secretary-Initiated Complaints

The Secretary of HUD may file a fair housing complaint directly against those whom the Department believes may be in violation of the Fair Housing Act. Secretary-Initiated Complaints are appropriate in cases, among others, involving significant issues that are national in scope or when the Department is made aware of potential violations of the Act and broad public interest relief is warranted or where HUD does not know of a specific aggrieved person or injured party that is willing or able to come forward. A Fair Housing Act complaint, including a Secretary initiated complaint, is not a determination of liability.

A Secretary-Initiated Complaint will result in a formal fact-finding investigation. The party against whom the complaint is filed will be provided notice and an opportunity to respond. If HUD's investigation results in a determination that reasonable cause exists that there has been a violation of the Fair Housing Act, a charge of discrimination may be filed. Throughout the process, HUD will seek conciliation and voluntary resolution. Charges may be resolved through settlement, through referral to the Department of Justice, or through an administrative determination.

This year marks the 50th anniversary of the Fair Housing Act. In commemoration, HUD, local communities, and fair housing organizations across the country have coordinated a variety of activities to enhance fair housing awareness, highlight HUD's fair housing enforcement efforts, and end housing discrimination in the nation. For a list of activities, log onto www.hud.gov/fairhousingis50.

Persons who believe they have experienced discrimination may file a complaint by contacting HUD's Office of Fair Housing and Equal Opportunity at (800) 669-9777 (voice) or (800) 927-9275 (TTY).

###

HUD's mission is to create strong, sustainable, inclusive communities and quality affordable homes for all.
More information about HUD and its programs is available on the Internet
at www.hud.gov and https://espanol.hud.gov.

You can also connect with HUD on social media and follow Secretary Carson on Twitter and Facebook or sign up for news alerts on HUD's Email List.

IainB

  • Supporting Member
  • Joined in 2008
  • Posts: 7,540
  • @Slartibartfarst
Re: Privacy - the unmitigated gall of LivreVisage.
« Reply #28 on: August 19, 2018, 01:53 PM »
Ah! Found it! There's gold in them thar DCF datamines. I knew it was here somewhere; it just took me a while to find it - a past comment on the DC Forum about LifeLock, which happens to be an apposite quote apropos of the recent HUD v. Facebook item above: (my emphasis)
LifeLock has been rushing to capitalize on the Equifax breach, with ads and press statements saying how the breach shows how important its own services - which cost up to $29.99 a month - can be to protect you from identity theft.
...
Here’s what LifeLock isn’t spreading so widely: When you buy its security, you’re signing up for credit reporting and monitoring services provided by, yes, Equifax.
...You just can't make this stuff up: Full Article here

Maybe it's just my rather dark sense of humor, but I just couldn't stop laughing after I read that. The unmitigated gall of corporations these days is just flat-out mind blowing.

IainB

  • Supporting Member
  • Joined in 2008
  • Posts: 7,540
  • @Slartibartfarst
Re: Privacy - RUMPEL - it's YOUR data, after all!
« Reply #29 on: August 19, 2018, 01:59 PM »
RUMPEL web browser's aim: take back control of your personal data:
Interesting open source project led by the University of Warwick. Its aim is to help users keep track of where information about them is stored online so that they can actually -- personally -- benefit from it. An important issue; whether this is a viable solution or not is another one...   :)
[...] a marketing professor at the University of Warwick who led RUMPEL's development, said: "It's time for people to claim their data from the internet."
"The aim of RUMPEL is to empower users and enable them to be served by the ocean of data about them that's stored in all kinds of places online, so that it benefits them and not just the businesses and organisations that harvest it," she added.
"The strapline 'Your Data, Your Way' reflects our determination to let people lead smarter lives by bringing their digital lives back under their own control."
TechRadar article: New web browser lets you take back control of your personal data
And the GitHub RUMPEL project.

IainB

  • Supporting Member
  • Joined in 2008
  • Posts: 7,540
  • @Slartibartfarst
Re: Privacy - Could you please LIKE me?
« Reply #30 on: August 20, 2018, 06:55 AM »
Could you please LIKE me?
There is a series of short spoofs from "Black Mirror", offering a glimpse of where we might potentially be heading.


wraith808

  • Supporting Member
  • Joined in 2006
  • Posts: 11,186
Re: Privacy (collected references)
« Reply #31 on: August 20, 2018, 10:05 AM »
^ Black Mirror is, in general, a very twisted view of several small concepts we take for granted in everyday life. I recommend the series. That particular episode offers real insight into our current state of the world.

IainB

  • Supporting Member
  • Joined in 2008
  • Posts: 7,540
  • @Slartibartfarst
Re: Privacy - BI Rules?
« Reply #32 on: August 22, 2018, 04:45 AM »
The amusing Black Mirror video above, though artificial, is arguably a prescient comment on the implicit risks inherent in a tendency for "oversharing" and/or "data leakage" in the IT-enabled SNM (Social Network Marketplace) and other personal-data-related databases (e.g., health, insurance, banking, education), where private data that one might have previously perceived as being peculiar and useful/relevant in one context only is subsequently seen to be useful/relevant in another, or maybe many other contexts. These are typically the data connections and interconnections that the SNM operators and data miners would tend to seek/exploit, for financial gain.

When I was contracted in 2003 to get a failed data analysis project back on the rails, I learned quite a lot. It was a complex IT project to implement a BI (Business Intelligence) system and we had to train the users in the application of BI (it's actually quite powerful and hairy stuff) to meet the growing and complex business needs of the power (energy) company who had contracted me into the project recovery role. I learned that what the Defence sector had always taught was true - that all data/information can be interconnected at some stage - and that, for BI, the world could be simply envisaged as one or more universes of dynamic data - each having its own peculiar descriptive and dynamic data model and that, as in the popular SF concept of parallel universes, there was the potential to interlink these data universes (mass aggregations of dynamic data sets), constantly combining/recombining and drawing data from one to the other, enabling the BI analyst to discover data relationships in a way that would probably not have previously been feasible on such a mass scale, using the then prevailing technologies.
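
To make the "interlinked data universes" point concrete, here is a minimal, purely hypothetical sketch (Python with pandas; the data sets, columns and values are invented for illustration and are not from that project): two separately innocuous data sets, once joined on a shared identifier, support an inference that neither data holder could have made alone - which is essentially what BI tooling does, only at mass scale and across many more sources.
Code:
# Hypothetical illustration only: linking two "data universes" on a shared key.
import pandas as pd

# Data universe A: a retailer's loyalty-card purchase records (invented).
purchases = pd.DataFrame({
    "email":   ["ann@example.org", "bob@example.org", "cat@example.org"],
    "product": ["prenatal vitamins", "beer", "glucose test strips"],
})

# Data universe B: an insurer's marketing-lead list (invented).
insurance_leads = pd.DataFrame({
    "email":           ["ann@example.org", "cat@example.org"],
    "policy_interest": ["life cover", "health cover"],
})

# Joining the two universes on the shared identifier exposes inferred
# health/lifestyle attributes to a party that previously held neither fact.
linked = purchases.merge(insurance_leads, on="email", how="inner")
print(linked)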

Fast forward to 2018, where we can perhaps now better understand why we might have the apparent privacy shambles that we see around us. It was a gold rush - opportunistic, every man for himself. Presumably the Google/Facebook founders (and others) would have seen it coming. There was little or no regulation to limit or constrain the progress of BI and its application in the field of mass demographics. Now that some regulations have belatedly been or are being implemented, it arguably may be too late anyway - locking the stable door after the horse has bolted; Pandora's box has already been opened.
« Last Edit: September 23, 2018, 08:04 PM by IainB, Reason: Minor correction. »

IainB

  • Supporting Member
  • Joined in 2008
  • Posts: 7,540
  • @Slartibartfarst
Re: Privacy (collected references)
« Reply #33 on: August 22, 2018, 05:39 AM »
Following up on the BI comment above: I came across a link to what seemed an interesting viewpoint from the Economist, though from experience I'd suggest taking anything they publish nowadays with a pinch of salt - just in case, like, and especially when it's prefixed with the ominous religio-politically ideological cliché "Open":
(Copied below sans embedded hyperlinks/images.)
Open Future
Toward defining privacy expectations in an age of oversharing

Our digital data deserves protection, writes Margot Kaminski, a law professor

Aug 16th 2018, by MARGOT KAMINSKI
What assurances of privacy do we have in this digital age? Until this year, the quick answer was: effectively none. We share personal data with companies, who in turn share it with other companies and the government, with few if any legal means of individual redress or control. But if 2018 will be remembered as the year of Cambridge Analytica—a British data-mining company that allegedly influenced both British and American elections by targeting voters using personal data—it will also be remembered as the year that privacy law finally started catching up to the Internet.

In America, the “third-party doctrine” has long governed privacy law. This view holds that one can have no privacy expectations in information shared with a third party. The government could obtain a list of phone numbers you called without a warrant, for instance, because you shared that information with a phone company.

This runs counter to most people’s expectations, especially today. Privacy is many things to many people, but one thing it is not is absolute secrecy. We share sensitive information with our doctors; we hold whispered conversations in public places; we rely on practical obscurity even in big cities; and we disclose our most intimate information by text and by email.

Helen Nissenbaum, an ethicist at Cornell University, refers to the foundation of our digital-privacy expectations as “contextual integrity”. When we reveal information in one context, we trust that it won’t pop up to surprise us in another.

Another way to think of it is that we regularly use perceived features of our environments, both physical and social, to manage degrees of disclosure. An email service that uses a mailbox as its logo signals that email will be kept between sender and recipient—just like a regular letter—even if it is in fact stored on a company’s servers.

In June 2018, however, the Supreme Court struck a serious and welcome blow to the third-party doctrine in its Carpenter v. United States ruling. That case asked whether police needed a warrant to access someone’s mobile-phone location data. The Court held that historic mobile-phone location data deserved privacy protections, even if it is shared (often unknowingly) with a mobile-phone service provider.

The Court recognised that what used to be non-sensitive data—a record of your travels through public places—has, in the age of big data, effectively been converted into sensitive information. When gathered en masse and analysed, where someone travels can reveal her religion, health problems, sexual preferences and political affiliations. The Court thus recognised that privacy harms can trigger a wealth of related harms, chilling freedom of speech and freedom of association.

While 2018 brought paradigm shifts to American privacy law, in Europe it brought the General Data Protection Regulation (GDPR). The significance of the GDPR goes beyond the annoying barrage of privacy notices that popped up in May. It establishes enforceable individual transparency and control rights.

But the GDPR’s real impact will be within companies, behind the scenes. Backed by significant penalties for breaches (up to 4% of worldwide annual revenue), the GDPR imposes a series of duties on companies, regardless of whether individuals invoke their privacy rights. It requires companies to articulate legitimate reasons for collecting data; to collect only the data that they need; to design new products in ways that protect individual rights; and sometimes to appoint privacy officers and conduct privacy impact-assessments.

At first glance, the gap between Europe and America still appears enormous. The EU now has the GDPR; America continues to lack comprehensive federal data privacy law, relying instead on a patchwork of consumer protection, state laws, and sector-specific laws like the Health Insurance Portability and Accountability Act (HIPAA).

But two recent events have resulted in a surprising array of commonalities: the above-mentioned Carpenter case, and California’s Consumer Privacy Act (CPA), which California passed less than a month after the Carpenter ruling, and which creates an American version of data-protection law.

The California CPA governs not just information that people share directly with companies, but also personal data held by commercial data-brokers. Just as Carpenter suggests that legal protections follow even shared personal data, the CPA imposes transparency and control requirements even on companies that have no direct relationship with consumers. In this way, the CPA represents a shift towards the data protection model embraced in the GDPR. Legal protection travels with the data, regardless of whether or not there is a consumer relationship.

This is not to say that Europe and America are converging. For one, the CPA applies only to California residents (although because California is such a big market the law may influence policies for all Americans—referred to in the context of automobile regulations as the “California effect”). America also has a robust, and in some situations increasingly deregulatory, free speech right in the First Amendment that will likely come into conflict with deletion and disclosure rights.

But there is a growing transatlantic consensus emerging on privacy in the digital age. Sharing data no longer obviates privacy. Privacy protections now increasingly travel with personal information, even if that information is something a company has inferred rather than collected. Both legal systems also increasingly recognise that privacy is, perhaps counterintuitively, deeply linked to transparency: people cannot exert control or request remedies if they do not know where their information is going.

Perhaps most significantly, both legal regimes now exhibit a growing awareness of how linked privacy is to other well-recognised legal harms such as chilling effects on free expression or discrimination against individuals. Even if America does not enact a federal privacy law, the age of free data and diminishing data privacy looks to be rapidly ending.

Margot Kaminski is a professor at Colorado Law. She teaches, researches and writes on the intersection of law and technology.
« Last Edit: September 23, 2018, 08:00 PM by IainB, Reason: Minor correction. »

wraith808

  • Supporting Member
  • Joined in 2006
  • Posts: 11,186
Re: Privacy (collected references)
« Reply #34 on: August 22, 2018, 08:51 AM »
The amusing Black Mirror video above, though artificial, is arguably a prescient comment on the implicit risks inherent in a tendency for "oversharing" and/or "data leakage" in the IT-enabled SNM (Social Network Marketplace) and other personal-data-related databases (e.g., health, insurance, banking, education), where private data that one might have previously perceived as being peculiar and useful/relevant in one context only is subsequently seen to be useful/relevant in another, or maybe many other contexts. These are typically the data connections and interconnections that the SNM operators and data miners would tend to seek/exploit, for financial gain.

When I was contracted in 2003 to get a failed data analysis project back on the rails, I learned quite a lot. It was a complex IT project to implement a BI (Business Intelligence) system and we had to train the users in the application of BI (it's actually quite powerful and hairy stuff) to meet the growing and complex business needs of the power (energy) company who had contracted me into the project recovery role. I learned that what the Defence sector had always taught was true - that all data/information can be interconnected at some stage - and that, for BI, the world could be simply envisaged as one or more universes of dynamic data - each having its own peculiar descriptive and dynamic data model and that, as in the popular SF concept of parallel universes, there was the potential to interlink these data universes (mass aggregations of dynamic data sets), constantly combining/recombining and drawing data from one to the other, enabling the BI analyst to discover data relationships in a way that would probably not have previously been feasible on such a mass scale, using the then prevailing technologies.

Fast forward to 2018, where we can perhaps now better understand why we might have the apparent privacy shambles that we see around us. It was a gold rush - opportunistic, every man for himself. Presumably the Google/Facebook founders (and others) would have seen it coming. There was little or no regulation to limit or constrain the progress of BI and its application in the field of mass demographics. Now that some regulations have belatedly been or are being implemented, it arguably may be too late anyway - locking the stable door after the horse has bolted; Pandora's box has already been opened.

It's arguably too late for any of us that are already born and have any social media links.  You'd have to have someone aware of not only what they share, but what others share about them.  You can make it harder to make the links, but I'd posit that it's impossible to restrict data after it has already been ingested into the system of data that surrounds us.

IainB

  • Supporting Member
  • Joined in 2008
  • Posts: 7,540
  • @Slartibartfarst
Re: Privacy - Pandora's box.
« Reply #35 on: August 22, 2018, 10:13 AM »
Mythological basis of the Pandora's box idiom:
According to Hesiod, when Prometheus stole fire from heaven, Zeus, the king of the gods, took vengeance by presenting Pandora to Prometheus' brother Epimetheus. Pandora opened a jar left in his care containing sickness, death and many other unspecified evils which were then released into the world.[4] Though she hastened to close the container, only one thing was left behind – usually translated as Hope, though it could also have the pessimistic meaning of "deceptive expectation".[5]

From this story has grown the idiom "to open (a) Pandora's box", meaning to do or start something that will cause many unforeseen problems.[6] Its modern, more colloquial equivalent is "to open a can of worms".[7]
Source: https://en.wikipedia...wiki/Pandora%27s_box
The thing about "Pandora's Box" was that the contents (the troubles), when once released into the world as a result of Pandora's curiosity, were enduring and timeless and apparently could never be put back into the box, thus adversely affecting all humankind over time, from that point onwards.

IainB

  • Supporting Member
  • Joined in 2008
  • Posts: 7,540
  • @Slartibartfarst
Re: Privacy - 10 tips recommended for teachers' privacy sanitisation.
« Reply #36 on: September 23, 2018, 09:49 PM »
A salutary tale with a recommended privacy sanitisation list, from Samizdata.net:
(Copied below.)
The Shadow Education Secretary wants to make teachers more vulnerable
tags: Civil liberty & Regulation, Education & Academia, Internet, Privacy & Panopticon, UK affairs
Natalie Solent (Essex) - September 23rd, 2018

The Shadow Education Secretary, Angela Rayner MP (Lab), has called for a ban on anonymous online accounts.

The education spokesperson also called for social media companies to ban anonymous accounts, complaining at a fringe event organised by the Guardian in Liverpool that most of the people that abused her online did so without using their real names.

Rayner said that social media firms should take greater responsibility for their users and complained in particular that Facebook seemed to have indicated that politicians should accept a higher level of abuse.

When asked what she thought about social media, Rayner said: “One of the first things they should do is stop anonymous accounts. Most people who send me abuse me do so from anonymous accounts and wouldn’t dream of doing it in their own name.”

Rayner conceded that using real names would not stop abuse but “it would certainly help a little bit. I think they should do more, they do have a responsibility for online”.
___________________________________

As I mentioned earlier, Angela Rayner is the Shadow Education Secretary. That ought to mean that she is aware that teachers, like MPs, are often subject to harassment. The Times Educational Supplement had an article on that very subject just a few days ago: “Why your social account is not as private as you think”. It began:

The teacher’s Facebook account was set to private. She was certain of that. Yet, in the past week, she had received four friend requests from former pupils. She could not work out how they had found her.

So, as I am a researcher at the Greater Manchester Police – and her friend – she asked me to take a look. Within 10 minutes, I had not just found her, but I also had her full name, her partner’s name, the school she worked at, the name of one of her children and multiple images of the street she lives on.
___________________________________

The writer, Chris Glover, proceeded to give ten tips that teachers should employ to protect themselves:
  • 1. Keep accounts separate.
  • 2. Vary usernames.
  • 3. Check posts about you.
  • 4. Beware of public posts.
  • 5. Review privacy settings.
  • 6. Don’t follow your school account.
  • 7. Avoid using your real name.
  • 8. Change the friends-list setting.
  • 9. Switch off location.
  • 10. Delete dormant accounts.

Following the above advice should help ensure that teachers can enjoy participating in life online while minimising the very real risk of being tracked down by former or current pupils bearing a grudge, or simply by people whom it is best to keep at arms length for professional or safeguarding reasons.

Until a Labour government gets in and makes Nos. (2) and (7) illegal outright, and demands that all of your personal details are held in one place by a social media company so as to be conveniently available for hackers and identity thieves.

The context here (for the benefit of non-British readers) is that the UK currently has a Conservative-led government, so the Labour party is the party "in opposition", as it were, and has "shadow ministers" for each of the main ministerial departments, of which Education is one.
There are 2 rather depressing things about this:
  • 1. Rayner - who holds the important role of Shadow Education Secretary - would need to know about current issues in Education and would be expected to have her finger on the Education pulse, as it were; yet she was apparently recommending a ban on anonymous online accounts, and she would presumably have been stating that as a Labour policy approach.

  • 2. However, despite being Shadow Education Secretary, Rayner seemed to have been unaware of the article in The Times Educational Supplement on this rather important matter, published a few days prior. This looks to be a classic clueless and foot-in-mouth response by the Shadow Education Secretary and it could have adverse consequences - e.g., tend to make floating voters (and possibly others) think twice before voting Labour in the next general election.

Though it is rather telling - and Labour voters could be forgiven for weeping or doing a face-palm over this, just as other voters could be forgiven for having a LOL moment - if we look on the bright side, the article in The Times Educational Supplement gave us 10 very good points for improving privacy. These are points that all of us - not just teachers - could extract and follow to our advantage, and if Rayner had not made the gaffe that she did, we might never have heard of them; they would have remained buried in the article in The Times Educational Supplement.
« Last Edit: September 27, 2018, 12:05 PM by IainB »

IainB

  • Supporting Member
  • Joined in 2008
  • Posts: 7,540
  • @Slartibartfarst
Re: Privacy (collected references) - more privacy sanitisation.
« Reply #37 on: September 24, 2018, 03:47 AM »
Another potentially helpful privacy sanitisation list from abine.com (too long to post here, so just the link): 8 Steps to Secure Your Facebook Privacy Settings

IainB

  • Supporting Member
  • Joined in 2008
  • Posts: 7,540
  • @Slartibartfarst
Having worked in Defence and marketing and having managed the design, development and delivery of smart nationwide credit-card driven EFT-POS systems which collect, curate, manipulate and use user data for marketing advantage, I have learned some very good reasons why the individual needs to understand:
  • (a) the value and fragility of personal data-privacy and
  • (b) its relevance to freedom/liberty.

I generally try to think for myself and prefer to take a healthily skeptical and politically agnostic outlook on life. I am personally fed up to the back teeth with the incessant incitement to outrage, the bombardment of absurd political bias, and being told how to behave or encouraged to moronically right-think all the time, as pushed by a majority media cohort apparently funded by vested interests (i.e., propaganda, AKA "fake news") and seemingly hell-bent on manipulating us (e.g., the Facebook - Cambridge Analytica fiasco and SnowdenGate).

Though it inevitably seems/tends to push its own peculiar political bias a lot of the time (like many websites), the website innov8tiv.com occasionally publishes what seem to be relatively well-balanced posts on topics that could be of interest. I therefore keep it in my BazQux feed-reader and periodically check it out.
IMHO, the item copied below from innov8tiv.com is potentially informative and thus worth a read:
(Copied below sans embedded hyperlinks/images, with my emphasis.)
People browsing using Chrome were quietly logged into their Google accounts without their consents | So much for users’ Privacy
 Felix Omondi  September 24, 2018  Apps and Software

A professor at Johns Hopkins and a cryptography expert, Matthew Green, called out Google for making changes to Chrome, making the browser log in users into their Google account without the consent or even notifying them. A move security experts say puts the users’ privacy into jeopardy.

Historically, Chrome users have had the option of using the browser without logging in to their Google accounts. Although logging in does come with some obvious benefits such as having your bookmarks, passwords, and browsing history synced in the cloud and available across any device you are browsing on using the Chrome browser.

However, security-conscious users may not want Google – the most prominent advertising entity in the world – to have their browsing data for the purpose of sending them targeted ads. Now that Google has changed Chrome so that the browser quietly logs users into their Google accounts, Google will get the data of users who would otherwise not have logged into their accounts.

Google has come out to address the concerns raised by security experts, stressing that users must have consented to the sync feature before the browser transfers their data. Buried in the sync feature, however, is the revelation that, as things work out, it will also automatically log you into your Google account.

So when a user logs in to their Gmail account on the browser, Chrome also automatically logs into their Google account. All that happens without the consent of the user or the user getting notifications.

“Now that I’m forced to log into Chrome,” wrote Green, “I’m faced with a brand new (sync consent) menu I’ve never seen before.”

Copied from: People browsing using Chrome were quietly logged into their Google accounts without their consents | So much for users’ Privacy | Innov8tiv - <http://innov8tiv.com/people-browsing-using-chrome-were-quietly-logged-into-their-google-accounts-without-their-consents-so-much-for-users-privacy/>
Interestingly enough, this would seem to be exactly the sort of thing that HAT (Hub of All Things) - referred to above per Armando (2016-07-29, 14:49:38) - is apparently designed to protect us from, whilst at the same time increasing our privacy and freedom of choice:

What is the Hub of all Things?


The Hub of All Things


Happy days.

IainB

  • Supporting Member
  • Joined in 2008
  • Posts: 7,540
  • @Slartibartfarst
Re: Privacy - MEGA - General Data Protection Regulation Disclosure.
« Reply #39 on: September 27, 2018, 11:38 AM »
MEGA - MEGAsync - General Data Protection Regulation Disclosure:
(Copied below sans embedded hyperlinks/images.)
General Data Protection Regulation Disclosure

Introduction
In 2013 MEGA pioneered user-controlled end-to-end encryption through a web browser. It provides the same zero-knowledge security for its cloud storage and chat, whether through a web browser, mobile app, sync app or command line tool. MEGA, The Privacy Company, provides Privacy by Design.

As all files uploaded to MEGA are fully encrypted, their contents can’t be read or accessed in any manner by MEGA. Files can only be decrypted by the original uploader through a logged-in account, or by other parties who have been provided with file/folder keys generated by the account user.
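
As a rough illustration of the client-side ("zero-knowledge") model described above - emphatically not MEGA's actual key hierarchy or file format, just the general principle - here is a minimal Python sketch using the cryptography library's Fernet recipe, in which the key never leaves the user's device and the service only ever stores ciphertext it cannot read:
Code:
# Illustrative sketch of client-side encryption before upload.
# NOT MEGA's implementation - only the general zero-knowledge principle:
# the service stores ciphertext it has no key to decrypt.
from cryptography.fernet import Fernet

# The key is generated and kept on the user's device only.
key = Fernet.generate_key()
client_cipher = Fernet(key)

plaintext = b"contents of my private file"
ciphertext = client_cipher.encrypt(plaintext)   # this is what would be uploaded

# The "server" side holds ciphertext only; without the key it cannot read it.
stored_on_server = ciphertext

# Only a holder of the key (the user, or someone the user shares it with)
# can decrypt what was stored.
assert Fernet(key).decrypt(stored_on_server) == plaintext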

Personal data is information relating to an identifiable natural person who can be directly or indirectly identified in particular by reference to an identifier.

MEGA stores the following categories of Personal Data
Contact Details
  • Email addresses
  • User’s name (if provided)

Transaction Details
  • IP address and Source Port for account creation and file uploads
  • Country location (inferred by matching IP to MaxMind IP database)
  • File size and date uploaded
  • Date that file/folder links are created
  • MEGA contacts
  • Chat destination contact(s) and time sent
  • Call destination contact(s), call start time and call duration
  • Subscriptions and payment attempts
  • Information provided to a payment processor when processing a subscription payment, such as Tax ID number, but not the credit/debit card number.

MEGA does not receive or store special categories of personal data or data relating to criminal convictions and offences, as any files that are uploaded to MEGA are fully encrypted at the user’s device so the encrypted data is not able to be decrypted by MEGA.

MEGA doesn’t share the data with any other party other than with competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences and as specified in the Privacy Policy clause 11.

Purpose
The purpose of storing the data is to manage account login and activity and to respond to information demands from authorities.

Processing
MEGA stores personal data but does not carry out any other processing activities on such data. This storage of personal data is necessary in order to provide the secure login to MEGA’s systems and to satisfy compliance obligations.

Lawful Basis of Processing: Contract
The processing of data is necessary for performance of the contract that MEGA has with each user, which they accepted through the Terms of Service when creating their account.

The Terms of Service clause 2 requires the user to agree to the Terms or otherwise to not use the service. Acknowledging and accepting the Terms of Service is a mandatory step in the signup process in all clients - web and mobile.

Clauses 50-51 of the Terms of Service incorporate the Privacy Policy by reference. The Privacy Policy specifies the personal information that is stored.

Retention of Personal Data
Personal data is retained indefinitely while the user’s account is open. After account closure, MEGA will retain all account information as long as there is any law enforcement request pending but otherwise for 12 months after account closure as users sometimes request that an account be re-activated. After 12 months, identifying information such as email and IP addresses will be anonymised (except that email address records will be retained for reference by the user’s contacts or where the user has participated in chats with other MEGA users) but other related database records may be retained.

After user deletion of a file all deleted files will be made inaccessible, marked for deletion and deleted fully when the next appropriate file deletion purging process is run.

After account closure all stored files will be marked for deletion and deleted fully when the next appropriate file deletion purging process is run.

Data Subject’s Rights
Each user has the rights specified in this disclosure notice.

Withdrawal of Consent
Users can only withdraw consent to MEGA collecting the specified personal information if they close their account.

Statutory and Contractual Obligations
Personal Information collected by MEGA is not collected because of any contractual or statutory obligation to third parties.

Automated Decision Making and Profiling
MEGA does not undertake any automated decision making or profiling.

The Right of Access
Individuals have the right to obtain:
  • confirmation that their data is being processed;
  • access to their personal data.
Any requests should be submitted to [email protected]. The information will be provided promptly, and at least within one month, without charge unless the request is manifestly unfounded or excessive.

Rectification
Individuals are entitled to have personal data rectified if it is inaccurate or incomplete. If MEGA has disclosed the personal data in question to any third party (such as a compliance authority), it will inform them of the rectification where possible and will also inform the individuals about the third parties to whom the data has been disclosed where appropriate. The only third parties that might have had disclosure are compliance authorities.

Erasure
The right to erasure does not provide an absolute ‘right to be forgotten’. Individuals have a right to have personal data erased and to prevent processing in specific circumstances:
  • The personal data is no longer necessary in relation to the purpose for which it was originally collected/processed.
  • The individual withdraws consent.
  • The individual objects to the processing and there is no overriding legitimate interest for continuing the processing.
  • The personal data was unlawfully processed (i.e. otherwise in breach of the GDPR).
  • The personal data has to be erased in order to comply with a legal obligation.
  • The personal data is processed in relation to the offer of information society services to a child.
Any requests for erasure will be considered in detail and would probably result in closure of the user’s account.

After account closure, MEGA will retain all account information as long as there is any law enforcement request pending but otherwise for 12 months after account closure as users sometimes request that an account be re-activated. After 12 months, identifying information such as email and IP address will be anonymised (except that email address records will be retained for reference by the user’s contacts or where the user has participated in chats with other MEGA users) but other related records may be retained.

After user deletion of a file all deleted files will be made inaccessible, marked for deletion and deleted fully when the next appropriate file deletion purging process is run.

After account closure all stored files will be marked for deletion and deleted fully when the next appropriate file deletion purging process is run.

In some cases a person may receive an email from MEGA asking the person to confirm their new account email address, but in fact they haven’t tried to open an account - someone else has started the process and used their email address either maliciously or by mistake. In these cases, MEGA has an ephemeral/incomplete account that might be used to upload files. On request, and after proving ownership of the email address, MEGA will arrange for the account to be deleted.

MEGA can refuse a request for erasure:
  • to comply with a legal obligation for the performance of a public interest task or exercise of official authority.
  • for public health purposes in the public interest;
  • for the exercise or defence of legal claims.
The Right to Restrict Processing
Individuals have a right to ‘block’ or suppress processing of personal data. When processing is restricted, MEGA is permitted to store the personal data, but not further process it. As MEGA only stores, and doesn’t further process the stored personal data, no action will be taken in response to a request to restrict processing.

Data Portability
The right to data portability only applies:
  • to personal data an individual has provided to a controller;
  • where the processing is based on the individual’s consent or for the performance of a contract; and
  • when processing is carried out by automated means.
On request by email to [email protected], MEGA will provide a user’s personal data in a structured, commonly used and machine readable form such as JSON files.

Note that all files in a user’s account can be downloaded and decrypted through any of the usual clients.

Lead Data Protection Supervisory Authority
The Lead Data Protection Supervisory Authority is the Luxembourg National Commission for Data Protection. This is the appropriate authority for accepting GDPR complaints about MEGA.

NATIONAL COMMISSION FOR DATA PROTECTION
1, avenue du Rock'n'Roll
L-4361 Esch-sur-Alzette
https://cnpd.public.lu

Controller
MEGA Limited
Level 21, Huawei Centre
120 Albert St
Auckland
New Zealand
Company number 4136598

Controller’s Representative
Mega Europe sarl
4 Rue Graham Bell
L-3235 Bettembourg
Luxembourg
Company number B182395
[email protected]


« Last Edit: September 27, 2018, 12:11 PM by IainB »

IainB

  • Supporting Member
  • Joined in 2008
  • Posts: 7,540
  • @Slartibartfarst
Re: Privacy - HAT (Hub of All Things) = Sovrin ?
« Reply #40 on: September 27, 2018, 11:55 AM »
EDIT: Oops! Forgot to post this initially:
Identity For All - Permanent Digital Identities that Don’t Require a Central Authority
The Sovrin Solution
Sovrin is a decentralized, global public utility for self-sovereign identity. Self-sovereign means a lifetime portable identity for any person, organization, or thing. It’s a smart identity that everyone can use and feel good about. Having a self-sovereign identity allows the holder to present verifiable credentials in a privacy-safe way. These credentials can represent things as diverse as an airline ticket or a driver's license.

Sovrin identities will transform the current broken online identity system. Sovrin identities will lower transaction costs, protect people’s personal information, limit opportunity for cybercrime, and simplify identity challenges in fields from healthcare to banking to IoT to voter fraud.
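
For readers wondering what "presenting verifiable credentials" might look like in practice, here is a deliberately over-simplified Python sketch (using the cryptography library; this is not Sovrin's ledger-based protocol and has no selective disclosure - the names and claim are invented) of the core idea: an issuer signs a claim about the holder, and any verifier can check that signature against the issuer's public key without querying a central identity database.
Code:
# Over-simplified illustration of a signed, verifiable credential.
# NOT the Sovrin protocol (no distributed ledger, no selective disclosure);
# it only shows that verification needs the issuer's public key,
# not a central authority.
import json
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The issuer (e.g. a licensing authority) holds a signing key pair.
issuer_key = Ed25519PrivateKey.generate()
issuer_public = issuer_key.public_key()

# A credential is just a statement about the holder, signed by the issuer.
credential = json.dumps({"holder": "alice", "claim": "licensed driver"}).encode()
signature = issuer_key.sign(credential)

# The holder later presents (credential, signature) to any verifier, who
# checks it against the issuer's published public key.
try:
    issuer_public.verify(signature, credential)
    print("credential verified")
except InvalidSignature:
    print("credential rejected")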

...Interestingly enough, this would seem to be exactly the sort of thing that HAT (Hub of All Things) - referred to above per Armando (2016-07-29, 14:49:38) - is apparently designed to protect us from, whilst at the same time increasing our privacy and freedom of choice:

What is the Hub of all Things?


The Hub of All Things
...
« Last Edit: November 26, 2018, 11:12 PM by IainB »

IainB

  • Supporting Member
  • Joined in 2008
  • Posts: 7,540
  • @Slartibartfarst
Re: Privacy: IT companies intend to profit from your loss of privacy?
« Reply #41 on: November 15, 2018, 03:42 PM »
Well, whilst this news might not be too surprising to some, to me it comes as a complete surprise:
Twitter, Facebook, and Google are Fighting Internet Privacy Laws
WRITTEN BY: JULIANNE SUBIA - NOVEMBER 14, 2018

Recently, the Information Technology Industry Council, which represents companies like Amazon, Visa, Microsoft, Google, Facebook, and Apple, released its “Framework to Advance Interoperable Rules (FAIR) on Privacy”. On the surface, it looks like tech companies are trying to protect user privacy. In reality, they want to make sure that they can continue to profit off of our data. Using simple privacy tools like DeleteMe and Blur will help you stay in control of your privacy.
(Read the rest at the link.)

Copied from: Twitter, Facebook, and Google are Fighting Internet Privacy Laws - <https://www.abine.com/blog/2018/twitter-facebook-google-fighting-privacy-laws/>

Oh noes! Looks like we're gonna have to pay money to third parties like abine.com to protect our personal privacy...Oh wait, how did that happen?
Who would'a thunk it, eh?    :tellme:
« Last Edit: November 16, 2018, 12:30 AM by IainB »

IainB

  • Supporting Member
  • Joined in 2008
  • Posts: 7,540
  • @Slartibartfarst
Re: Privacy - Are your thoughts really your "own" thoughts?
« Reply #42 on: December 05, 2018, 03:06 PM »
I had always considered that the privacy of my mind was unassailable and that my thoughts were my own, and nobody could take them away from me - even if I were in the Stalags. Now I am not so sure. I commented the other day to my now 17 y/o daughter that, as an experiment, I had for the first time deliberately allowed Google permission to use "my" data - data about me that it already captures and holds and has access to, by default - to aim targeted ads at me. I told her that I found the result interesting, but somewhat disquieting.

Here is a very interesting - if not alarming - review on what happens, apparently almost immediately, when we succumb to allowing this kind of access through our privacy walls, reported on by spreadprivacy.com. Such a loss/reduction in privacy effectively enables third parties to engineer algorithms that could manipulate/modify our paradigms, sometimes without our even being aware of it, and it is happening now, even as I write this. It goes far beyond subliminal advertising, since it can clearly be used - and is being used - to subtly control/manipulate our perception of the reality of the world about us:
Measuring the "Filter Bubble": How Google is influencing what you click

This goes far beyond merely allowing access to private data; it is, in effect, more like giving permission to be brainwashed by third parties. And we seem to be highly susceptible to it. It's very clever, and insidious, though I suppose it could be argued that it's not harmful, but merely a conditioning of one's thinking.

Again, Pandora's box has been well and truly opened:
Fast forward to 2018, where we can perhaps now better understand why we might have the apparent privacy shambles that we see around us. It was a gold-rush, opportunistic, every man for himself. Presumably the Google/Facebook founders (and others) would have seen it coming. There were little/no regulations to limit or constrain the progress of BI and its application in the field of mass demographics. Now that some regulations have belatedly been or are being implemented, it arguably may be too late anyway - locking the stable door after the horse has bolted; Pandora's box has already been opened.

4wd

  • Supporting Member
  • Joined in 2006
  • Posts: 5,641
Re: Privacy (collected references)
« Reply #43 on: December 18, 2018, 04:59 PM »
The death of the technology industry in Australia happened last week.

EDIT: A possibly better explanation of what the new laws involve: Australia’s horrific new encryption law likely to obliterate its tech scene

The new law gives Australian law enforcement agencies the power to issue cooperation notices to technology entities with the purpose of gaining access to specific users’ encrypted messages and data. These entities may include companies, websites, or anything else that transmits data to an end-user in Australia.

Australia: Controversial Australian Encryption Act Denounced By Privacy And Cryptography Advocates

Last week, Australia's parliament passed a controversial act that will enable law enforcement and intelligence agencies to compel access to encrypted communications. In an explanatory memorandum, the Australian Parliament stated that the new act, the Telecommunications and Other Legislation Amendment (Assistance and Access) Act 2018, is intended to combat "the challenges posed by ubiquitous encryption." Under the act, certain law enforcement and intelligence agencies will be able to approach "designated communication providers," using one of the mechanisms below, for the purpose of gaining access to specific users' encrypted messages and data.

  • Technical Assistance Requests (TARs) – These are voluntary requests that allow law enforcement and intelligence agencies to request access to communications and data while bypassing the oversight rules surrounding mandatory notices. TARs may be issued by the directors-general of the Australian Security and Intelligence Organization (ASIO), the Australian Secret Intelligence Service (ASIS), or the Australian Signals Directorate (ASD), or by the chief officer of an "interception agency," which includes the Australian Federal Police (AFP), the Australian Crime Commission (ACC), and the state and territory police forces, provided that they obtain the approval of the AFP commissioner.
  • Technical Assistance Notices (TANs) – These are compulsory notices requiring a "designated communication provider" to use existing interception or decryption capabilities to provide access to communications or user logs. TANs can be obtained only by the director-general of the ASIO or the chief officer of an interception agency.
  • Technical Capability Notices (TCNs) – These are compulsory notices requiring designated communication providers to build infrastructure to meet subsequent TANs. TCNs may be issued only by the attorney general, with the approval of the minister for communications, following a request from the ASIO or the chief officer of an interception agency, and require written notice to the communication provider, allowing them the opportunity to respond within 28 days.

The new act allows these agencies to directly approach specific individuals, such as engineers or IT administrators at an organization, rather than the organization itself. Companies that resist the demands could face a fine of up to $7.3 million, while individuals who refuse could face jail time.

Australia’s encryption law threatens NZ cloud data
Tech Companies Line Up To Pan Encryption Bill
Encrypted Messaging App Signal Won’t Comply With Australia’s New Backdoor Bill

... and more ...

Welcome to Australia, a country run by idiots elected by idiots.
« Last Edit: December 18, 2018, 06:05 PM by 4wd »

IainB

  • Supporting Member
  • Joined in 2008
  • Posts: 7,540
  • @Slartibartfarst
Re: Privacy (collected references)
« Reply #44 on: December 20, 2018, 11:06 AM »
@4wd: Yes. Some people (not me, you understand) might say that we should have expected to see this sort of messing-about with the privacy rights/rules from the Aussies and that they can't even win a game of cricket without bowling under-arm, or something - but I couldn't possibly comment.    :o

IainB

  • Supporting Member
  • Joined in 2008
  • Posts: 7,540
  • @Slartibartfarst
Very interesting: DuckDuckGo Testimony (on Privacy) Before the US Senate.
(Copied below sans embedded hyperlinks/images.)
Below is the prepared testimony of Gabriel Weinberg, CEO & Founder of DuckDuckGo, before the United States Senate Judiciary Committee Hearing on GDPR & CCPA: Opt-ins, Consumer Control, and the Impact on Competition and Innovation.

March 12, 2019

Chairman Graham, Ranking Member Feinstein and Members of the Committee, thank you for holding this important hearing and inviting me to testify. I am here to explain that privacy legislation, like the GDPR and CCPA, is not only pro-consumer, but also pro-business, and pro-advertising.

DuckDuckGo's primary service is a search engine alternative to Google that allows you to search the web without being tracked. We are the fourth largest search engine in the US, serving over one billion searches a month globally. We also offer a mobile privacy browser that serves as an alternative to Google Chrome.

We regularly conduct rigorous consumer research on privacy issues, which we post at SpreadPrivacy.com. We also help educate consumers about online privacy from our Twitter account, @duckduckgo.

I founded DuckDuckGo in 2008, far outside of Silicon Valley, in Valley Forge, Pennsylvania. We now have a distributed workforce spread across the nation in twelve states, the District of Columbia, and in ten other countries.

As you know, people are tired of being watched everywhere they go online. They are fed up with all the intended and unintended consequences this online tracking creates, including invasive ads, identity theft, discrimination, and manipulation. Have you ever searched for something only to see an ad for that very thing pop up in a mobile app or on a different website? DuckDuckGo helps you avoid these types of scenarios by seamlessly reducing your online digital footprint.

Every time you search on DuckDuckGo, it's like you are searching on our site for the first time. We do not even have the concept of a search history.

And we also offer privacy protection beyond the search box. Many companies run sophisticated tracker networks that lurk on the websites you visit. DuckDuckGo’s browser technology blocks such hidden trackers.
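
As a hypothetical sketch of how blocklist-based tracker blocking of this general kind can work (Python; the domain list and function are invented for illustration and are not DuckDuckGo's actual blocklist or code), the idea is simply to check each outgoing third-party request against a list of known tracker hosts before it is made:
Code:
# Hypothetical blocklist-based tracker blocking - illustration only.
from urllib.parse import urlparse

TRACKER_DOMAINS = {"tracker.example", "ads.example", "pixel.example"}  # invented list

def should_block(request_url: str) -> bool:
    """Return True if the request targets a known tracker domain (or a subdomain)."""
    host = urlparse(request_url).hostname or ""
    return any(host == d or host.endswith("." + d) for d in TRACKER_DOMAINS)

print(should_block("https://pixel.example/collect?id=123"))  # True - dropped
print(should_block("https://news.example/article"))          # False - allowed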

In many ways I come to you from the future: I run a business that is already GDPR and CCPA-compliant. Our privacy policy is straightforward and doesn’t require a law degree to decipher: We simply do not collect or share any personal information at all. That’s it — no confusing settings to fiddle with, no jargon-filled pages to read. Yet, even with this simple privacy policy, we nonetheless are able to make money through advertising.

This brings me to my first point: Privacy legislation is not anti-advertising. Take our business for example: When you type in a search on DuckDuckGo, we simply show you ads related to that search. If you search for ‘car’, we show you car ads. But those ads won’t follow you around, because we don’t know who you are, where you’ve been, or where you go. It's contextual advertising versus behavioral advertising.
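
To make the contextual-versus-behavioural distinction concrete, here is a tiny hypothetical sketch (Python; the ad inventory and function names are invented and are not DuckDuckGo's ad-serving code): contextual selection needs only the words of the current query, whereas a behavioural system would also require a stored profile of the person searching.
Code:
# Hypothetical illustration of contextual ad selection: the only input is the
# current query, so no user profile or search history is required.
AD_INVENTORY = {
    "car":    "Ad: new-car dealership",
    "flight": "Ad: airline booking site",
    "shoes":  "Ad: running-shoe shop",
}

def contextual_ads(query: str) -> list[str]:
    """Return ads whose keyword appears in the query - nothing else is used."""
    words = query.lower().split()
    return [ad for keyword, ad in AD_INVENTORY.items() if keyword in words]

# A behavioural system would instead need something like
# select_ads(user_profile, browsing_history, query) - i.e. stored personal data.
print(contextual_ads("cheap car insurance"))   # -> ['Ad: new-car dealership']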

As a privately held company, our finances are private, though I’m proud to say we’ve been profitable using contextual advertising since 2014, and last year we earned substantially more than the $25 million revenue floor that subjects a company to CCPA.

And we are not alone. For example, in response to GDPR, when the New York Times in Europe switched from behavioral advertising to contextual advertising, it reported an increase in revenue. And just last week, Business Insider reported the Washington Post was looking into making a similar change. If Congress forced the digital advertising industry to return to its roots in contextual advertising, that would allow companies to remain profitable, or even become more profitable — all without the unintended consequences of behavioral advertising.

My second point is that privacy is becoming increasingly good for business. Consumers flock to brands they trust and respect, and according to Harris Poll, data privacy is the most pressing issue on Americans' minds, now for two years in a row. And again, we serve as a great case study, having grown exponentially during this period.

[Chart showing the increase in DuckDuckGo traffic from 2008 to 2019.]

My third point is that well-drafted privacy legislation can spur more competition and innovation in one of the most foundational markets of the Internet: digital advertising. This market is currently a duopoly, and this reality is hurting everyone from small businesses to venture-backed startups to media companies. To restore competition and innovation in this market, the data monopolies at its core need to be addressed.

Fixing this digital-ad-market duopoly can take any number of forms. Here are three suggestions. First, consumers could be given a robust mechanism to opt-out of online tracking. Second, monopoly platforms could be prohibited from combining data across their different business lines. Third, acquisitions that strengthen existing data monopolies could be blocked.

Our mission at DuckDuckGo is to raise the standard of trust online. We support strong privacy legislation that does exactly that. We believe the Internet shouldn’t feel so creepy, and getting the privacy you deserve online should be as easy as closing the blinds.

I am pleased to answer your questions today and make myself available to Members in the future for more in-depth discussions. Thank you.

You can download the PDF version of this testimony here.