DonationCoder.com Forum

Main Area and Open Discussion => Living Room => Topic started by: IainB on June 28, 2018, 01:02 AM

Title: Privacy (collected references)
Post by: IainB on June 28, 2018, 01:02 AM
Original post:2018-06-28
Last updated:2018-11-16
Privacy - especially in the "Internet Age" - is something that sometimes (often?) gets overlooked, ignored or abused.

So I thought it might be useful to create a "Privacy thread" to collect/collate some salient privacy-related points that we come across and provide some kind of index to same.

About Privacy:
____________________________________________

DNS-related:    :Thmbsup:

DonationCoder forum (DCF) and user privacy:

GDPR (EU General Data Protection Regulation, 2018):    :Thmbsup:

Government-authorised privacy breaches:    :down:

Search engines and websites that are apparently committed to preserving the user's full right to privacy:

Search engines and websites that apparently rely on tracking/utilising the user's personal data/metadata to maintain their marketing and/or revenue streams:
  - including Microsoft's Bing search engine and most of its other assets - i.e., "free" and paid services like LinkedIn.com.

Software:

Tips & Tricks:

Vested interests antithetical with Privacy regulation:
Title: Three Reasons Why the "Nothing to Hide" Argument is Flawed.
Post by: IainB on June 28, 2018, 01:02 AM
This was the post I read today (2018-06-28) on the DuckDuckGo (https://spreadprivacy.com/) blog that caused me to start this thread. The whole ethos of DuckDuckGo is based on privacy, so it does not have an axe to grind, though it does differentiate its services on that basis. I thought the post raised some valid, cogent and thought-provoking points. I've copied the post below in its entirety, together with embedded hyperlinks, rather than just providing the link, because it seems worthy of discussion in and of itself.
(Copied below sans embedded images.)
Three Reasons Why the "Nothing to Hide" Argument is Flawed (https://spreadprivacy.com/three-reasons-why-the-nothing-to-hide-argument-is-flawed/)
27 JUNE 2018/PRIVACY

Over the years, we at DuckDuckGo have often heard a flawed counter-argument to online privacy: “Why should I care? I have nothing to hide.”

As Internet privacy has become more mainstream (https://spreadprivacy.com/mainstream/), this argument is rightfully fading away. However, it’s still floating around and so we wanted to take a moment to explain three key reasons why it's flawed.

1) Privacy isn’t about hiding information; privacy is about protecting information, and surely you have information that you’d like to protect.
  • Do you close the door when you go to the bathroom? Would you give your bank account information to anyone? Do you want all your search and browsing history made public? Of course not.

  • Simply put, everyone wants to keep certain things private and you can easily illustrate that by asking people to let you make all their emails, texts, searches, financial information, medical information, etc. public. Very few people will say yes.

2) Privacy is a fundamental right and you don't need to prove the necessity of fundamental rights to anyone.
  • You should have the right to free speech even if you feel you have nothing important to say right now. You should have the right to assemble even if you feel you have nothing to protest right now. These should be fundamental rights just like the right to privacy.

  • And for good reason. Think of commonplace scenarios in which privacy is crucial and desirable like intimate conversations, medical procedures, and voting. We change our behavior when we're being watched, which is made obvious when voting; hence, an argument can be made (https://www.ted.com/talks/glenn_greenwald_why_privacy_matters) that privacy in voting underpins democracy.

3) Lack of privacy creates significant harms that everyone wants to avoid.
  • You need privacy to avoid unfortunately common threats like identity theft, manipulation through ads, discrimination based on your personal information, harassment, the filter bubble (https://spreadprivacy.com/filter-bubble/), and many other real harms that arise from invasions of privacy.

  • In addition, what many people don’t realize is that several small pieces of your personal data can be put together to reveal much more about you than you would think is possible. For example, an analysis (http://news.mit.edu/2015/identify-from-credit-card-metadata-0129) conducted by MIT researchers found that “just four fairly vague pieces of information — the dates and locations of four purchases — are enough to identify 90 percent of the people in a data set recording three months of credit-card transactions by 1.1 million users.”
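As an aside on the scale of that MIT finding: the following is a minimal, hypothetical Python sketch (it is not the MIT study's method, and all attributes and figures are invented for illustration) showing how a handful of individually vague attributes combine into a near-unique fingerprint once a large population is partitioned by all of them at once.

Code:
# Toy illustration only (invented attributes/figures, not the MIT study):
# a few coarse, individually "anonymous" attributes become close to unique
# once a large population is partitioned by all of them together.
import random
from collections import Counter

random.seed(42)
POPULATION = 1_000_000

def coarse_profile():
    """A handful of vague attributes - none identifying on its own."""
    return (
        random.randrange(50),    # home city
        random.randrange(60),    # birth year
        random.randrange(2),     # gender
        random.randrange(200),   # most-visited shop
        random.randrange(31),    # day of month of one purchase
    )

profiles = Counter(coarse_profile() for _ in range(POPULATION))
unique = sum(1 for count in profiles.values() if count == 1)

print(f"people: {POPULATION:,}")
print(f"distinct attribute combinations: {len(profiles):,}")
print(f"people matched by a unique combination: {unique:,} "
      f"({100 * unique / POPULATION:.1f}% of the population)")

Real-world attributes (purchase dates and locations) are far less uniformly distributed than in this toy model, which generally makes re-identification easier, not harder.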

It’s critical to remember that privacy isn't just about protecting a single and seemingly insignificant piece of personal data, which is often what people think about when they say, “I have nothing to hide.” For example, some may say they don't mind if a company knows their email address while others might say they don't care if a company knows where they shop online.

However, these small pieces of personal data are increasingly aggregated by advertising platforms like Google and Facebook to form a more complete picture of who you are, what you do, where you go, and with whom you spend time. And those large data profiles can then lead much more easily to significant privacy harms. If that feels creepy, it’s because it is.

We can't stress enough that your privacy shouldn’t be taken for granted. The ‘I have nothing to hide’ response does just that, implying that government and corporate surveillance should be acceptable as the default.

Privacy should be the default. We are setting a new standard of trust online and believe getting the privacy you want online should be as easy as closing the blinds.

For more privacy advice, follow us on Twitter (https://twitter.com/duckduckgo) & get our privacy crash course (http://duckduckgo.com/newsletter).

Dax the duck
We are the Internet privacy company that lets you take control of your information, without any tradeoffs. Welcome to the Duck Side!
(Read more. (https://spreadprivacy.com/author/dax/))
Title: Re: Privacy (collected references)
Post by: Deozaan on June 28, 2018, 03:38 AM
Three Reasons Why the "Nothing to Hide" Argument is Flawed (https://spreadprivacy.com/three-reasons-why-the-nothing-to-hide-argument-is-flawed/)

Here's a more in depth paper on the subject:

"I've Got Nothing to Hide" and Other Misunderstandings of Privacy by Daniel J. Solove (https://www.donationcoder.com/forum/index.php?topic=20287.msg420924#msg420924)

Disclaimer: I haven't taken the time to read it yet, so I can't speak to its contents.
Title: Re: Privacy (collected references)
Post by: anandcoral on June 28, 2018, 07:51 AM
A long time ago, I upgraded to Win10 and it insists on an online login and updates my OS at its own whim. I started using an Android mobile that is always connected to my email and to Big Brother, and it knows what I type on the keyboard. My laptop is always connected to the internet, and many programs just update themselves, throw ads and make merry.

Now I do not have much energy or time to even think about privacy. Obviously I do keep a watch behind my back when I am doing online bank transactions.

Regards,

Anand
Title: Re: Privacy - California passes its own GDPR (2018-06-29)
Post by: IainB on June 29, 2018, 05:08 PM
This post at  TheRegister (https://www.theregister.co.uk) signals extremely good news for the privacy of the general public user of the Internet. The post is also rather enlightening: (my emphasis)
(Copied below sans embedded hyperlinks/images.)
Google weeps as its home state of California passes its own GDPR (https://www.theregister.co.uk/2018/06/29/california_data_privacy_law/)
The right to view and delete personal info is here – and you'll be amazed to hear why the law passed so fast
By Kieren McCarthy in San Francisco, 29 Jun 2018 at 20:02 · 13 Reg comments

Uh oh, someone just got some bad news
California has become the first state in the US to pass a data privacy law – with governor Jerry Brown signing the California Consumer Privacy Act of 2018 into law on Thursday.

The legislation will give new rights to the state's 40 million inhabitants, including the ability to view the data that companies hold on them and, critically, request that it be deleted and not sold to third parties. It's not too far off Europe's GDPR.

Any company that holds data on more than 50,000 people is subject to the law, and each violation carries a hefty $7,500 fine. Needless to say, the corporations that make a big chunk of their profits from selling their users' information are not overly excited about the new law.

"We think there's a set of ramifications that's really difficult to understand," said a Google spokesperson, adding: "User privacy needs to be thoughtfully balanced against legitimate business needs."

Likewise, tech industry association the Internet Association complained that "policymakers work to correct the inevitable, negative policy and compliance ramifications this last-minute deal will create."

So far no word from Facebook, which put 1.5 billion users on a boat to California back in April in order to avoid Europe's similar data privacy regulations.

Don't worry if you are surprised by the sudden news that California, the home of Silicon Valley, has passed a new information privacy law – because everyone else is too. And this being the US political system there is, of course, an entirely depressing reason for that.

Another part of the statement by the Internet Association put some light on the issue: "Data regulation policy is complex and impacts every sector of the economy, including the internet industry," it argues. "That makes the lack of public discussion and process surrounding this far-reaching bill even more concerning. The circumstances of this bill are specific to California."

I see...
So this bill was rushed through?

Yes, it was. And what's more, it was signed into law on Thursday by Governor Brown just hours after it was passed, unanimously, by both houses in Sacramento. What led lawmakers to push through privacy legislation at almost unheard-of speed? A ballot measure.

That’s right, since early 2016, a number of dedicated individuals with the funds and legislative know-how to make data privacy a reality worked together on a ballot initiative in order to give Californians the opportunity to give themselves their own privacy rights after every other effort in Sacramento and Washington DC has been shot down by the extremely well-funded lobbyists of Big Tech and Big Cable.

Real estate developer Alastair Mactaggart put about $2m of his own money into the initiative following a chance conversation with a Google engineer in his home town of Oakland in which the engineer told him: "If people just understood how much we knew about them, they’d be really worried."

Mactaggart then spoke with a fellow dad at his kid's school, a finance guy called Rick Arney who had previously worked in the California State Senate, about it. And Arney walked him through California's unusual ballot measure system, where anyone in the state can put forward an initiative and, if it gets sufficient support, it will be put on the ballot paper at the next election.

If a ballot initiative gets enough votes, it becomes law. There have been some good and some bad outcomes from this exercise in direct democracy over the years but given the fact that both Mactaggart and Arney felt that there was no way a data privacy law would make its way through the corridors of power in Sacramento in the normal way, given the enormous influence of Silicon Valley, they decided a ballot measure was the way to go.

Beware the policy wonk
One other individual is worth mentioning: Mary Stone Ross was a former CIA employee and had been legal counsel for the House of Representatives Intelligence Committee and she also lives in Oakland. Mactaggart persuaded her to join the team to craft the actual policy and make sure it could make it through the system.

Together the three of them then spent the next year talking to relevant people, from lawyers to tech experts to academics to ordinary citizens, to arrive at their overall approach and draft the initiative.

And it is at that point that, to put it bluntly, the shit hit the fan. Because the truth is that consumers – and especially Californians, who tend to be more tech-savvy than the rest of the country given the concentration of tech companies in the state – understand the issues around data privacy rules and they want more rights over it.

With the initiative well structured and the policy process run professionally, the ballot measure gained the required number of supporters to get it on the ballot. And thanks to the focus groups and polls the group carried out, they were confident that come November it would pass and data privacy become law through direct democracy.

At which point, it is fair to say, Big Internet freaked out and made lots of visits to lawmakers in Sacramento who also freaked out.

The following months have seen a scurry of activity, but if you want to know why the bill became law in almost record time and was signed by Governor Brown on Thursday, all you need to know is this single fact: the deadline for pulling the initiative from November's ballot was last night – Thursday evening – and Mactaggart said publicly that if the bill was signed, he would do exactly that and pull his ballot measure.

Privy see
You may be wondering why Sacramento was able to get it through unanimously without dozens of Google and Facebook-funded lawmakers continually derailing the effort, especially since it was still a ballot measure. After all, the tech giants could have spent millions campaigning against the measure in a bid to make sure people didn’t vote for it.

And the truth is that they had already lined up millions of dollars to do exactly that. Except they were going to lose because, thanks to massively increased public awareness of data privacy given the recent Facebook Russian election fake news scandal and the European GDPR legislation, it was going to be very hard to push back against the issue. And it has been structured extremely well – it was, frankly, good law.

There is another critical component: laws passed through the ballot initiative are much, much harder for lawmakers to change, especially if they are well structured.

So suddenly Big Tech and Sacramento were faced with a choice: pass data privacy legislation at record speed and persuade Mactaggart to pull his ballot initiative with the chance to change it later through normal legislative procedures; or play politics as usual and be faced with the same law but one that would be much harder to change in future.

And, of course, they went with the law. And Mactaggart, to his eternal credit, agreed to pull his ballot measure in order to allow the "normal" legislative approach to achieve the same goal.

And so the California Consumer Privacy Act of 2018 is now law and today is the first day that most Californians will have heard of it. Sausage making at its finest.


Of course, Google, Facebook et al are going to spend the next decade doing everything they can trying to unravel it. And as we saw just last week, lawmakers are only too willing to do the bidding of large corporate donors. But it is much harder to put a genie back in the bottle than it is to stop it getting out. ®

Copied from: Google weeps as its home state of California passes its own GDPR • The Register - <https://www.theregister.co.uk/2018/06/29/california_data_privacy_law/>
Title: “I’ve Got Nothing to Hide” and Other Misunderstandings of Privacy.
Post by: IainB on June 29, 2018, 06:38 PM
@Deozaan: Where you write:
...Here's a more in depth paper on the subject:
"I've Got Nothing to Hide" and Other Misunderstandings of Privacy by Daniel J. Solove (https://www.donationcoder.com/forum/index.php?topic=20287.msg420924#msg420924)
Disclaimer: I haven't taken the time to read it yet, so I can't speak to its contents.
- thank you!   :Thmbsup:

The post you link to is:
This is a tangentially related bit of irony:
I went to download a paper on privacy called "I've Got Nothing to Hide" and Other Misunderstandings of Privacy by Daniel J. Solove (https://papers.ssrn.com/sol3/papers.cfm?abstract_id=998565), but since the website detected that I was using an anonymous proxy, they tried to get me to register for an account so they could track me, and made me complete the reCAPTCHA three times when I insisted on clicking the (almost hidden) link to continue downloading anonymously.

I downloaded the paper (.PDF file) via the link you gave to ssrn.com. It seems to be a very informative paper by Daniel J Solove:
* © Daniel J. Solove 2007. Associate Professor, George Washington University Law School; J.D., Yale Law School. Thanks to Chris Hoofnagle, Adam Moore, and Michael Sullivan for helpful comments, and to my research assistant Sheerin Shahinpoor. I develop some of the ideas in this essay in significantly more depth in my forthcoming book, Understanding Privacy, to be published by Harvard University Press in May 2008.

(From the footnote to the cover page of: “I’ve Got Nothing to Hide” and Other Misunderstandings of Privacy.)


Note: The .PDF file is attached to this post, for convenience, as per link below. It can also easily be viewed/downloaded direct from ssrn.com - here. (https://poseidon01.ssrn.com/delivery.php?ID=199020086001024024117001112120121099104015006077091033071074030106070085067003066103097114000125006036111064120077077127068116062015046052031093101115021004065096048075035099091089121068093070006101084108079005083086001103096118100119100120064093096&EXT=pdf)
Title: UK.gov is not being advised by Google. Repeat. It is not...
Post by: IainB on June 29, 2018, 08:29 PM
Again from TheRegister, this time some possibly privacy-related news:
UK.gov is not being advised by Google. Repeat. It is not being advised by Google (https://www.theregister.co.uk/2018/06/29/ministry_of_fun_is_not_being_advised_by_google/)
DeepMind's 'Demis Hassabis is an individual' – Ministry of Fun
By Andrew Orlowski 29 Jun 2018 at 09:5517 Reg comments
DeepMind co-founder Demis Hassabis (Pic: Debby Wong / Shutterstock.com)
Google is not advising the British government on AI, the Ministry of Fun assured this week, following the appointment of Google's Demis Hassabis as an advisor on AI.

The US ad, search and cloud biz acquired Hassabis' company DeepMind four years ago and he has since been a Google employee. In the words of The Guardian, Hassabis is "leading Google's project to build software more powerful than the human brain".

Earlier this week, the Department for Digital, Culture, Media and Sport – aka the Ministry of Fun – announced the creation of a new "AI Council" and appointed Hassabis as its advisor. The department seemed pleased with landing such a trophy, explaining that Hassabis "will provide expert industry guidance to help the country build the skills and capability it needs to capitalise on the huge social and economic potential of AI – a key part of the Government's modern Industrial Strategy."

But just because a Google employee is giving the government advice, that doesn't necessarily mean a Google employee is giving the government advice. You would be quite wrong to think that.
(Read the rest at the link.)

Copied from: UK.gov is not being advised by Google. Repeat. It is not being advised by Google • The Register - <https://www.theregister.co.uk/2018/06/29/ministry_of_fun_is_not_being_advised_by_google/>

Similarly, we can presumably be assured that the US government is not being advised/influenced by Google...     :o
Title: Re: Privacy - Why can't the government do my taxes for me?
Post by: IainB on July 01, 2018, 01:54 AM
In 2008/9 I was contracted as a project manager to establish and commence a project that was going to transform the gathering of revenue/tax data by doing it online. This was for individuals and accounting agents of SMBs (Small to Medium-sized Businesses). It was to automate and dramatically improve the efficiency and speed of the processes involved, which, up until then, had been prone to massive manual processing holdups.

Fast forward 9 years. I was doing my personal online tax return the other day and was impressed with how easy it was, as the Inland Revenue already knew an awful lot of the private details about my income. What would previously have been likely to take me hours by the old methods was now taking minutes. This was for my individual tax return. (I had read in the press that the SMB side of things was still having hiccups though.)

Then my train of thought reminded me of this silly humour post I made in 2014:
Scott Adams Blog: Message to My Government 03/06/2014 (http://dilbert.com/blog/entry/message_to_my_government/)
Mar 6, 2014

I never felt too violated by the news that my government can snoop on every digital communication and financial transaction I make. Maybe I should have been more bothered, but the snooping wasn't affecting my daily life, and it seemed like it might be useful for fighting terrorism, so I worried about other things instead.

This week, as I was pulling together all of my records to do taxes, I didn't get too upset that the process of taxpaying is unnecessarily frustrating and burdensome. As a citizen, I do what I need to do. I'm a team player.

I have also come to peace with the fact that my government now takes about half of my income. I figure most of it goes to good causes. I'm here to help.

I take pride in the fact that I don't let the little things get to me.

But the other day, as I was crawling my way through mountains of statements and receipts, trying to organize my records for my accountant, with several more days of this drudgery ahead, I had a disturbing thought. I must warn you in advance that this disturbing thought can only be expressed in all capital letters and it must include profanity. It goes like this.

Message to my government:

DO MY FUCKING TAXES FOR ME, YOU ASSHOLES!!! YOU ALREADY KNOW EVERY FUCKING THING I DID THIS YEAR!!!

Seriously.
Title: Re: Privacy (collected references)
Post by: tomos on July 01, 2018, 11:11 AM
You're on a roll Iain :up:

California post especially interesting.
Will have to catch up with earlier posts.
Title: Re: Privacy (collected references)
Post by: YannickDa on July 02, 2018, 04:14 AM
That sounds great.
They've been caught with their hands in the honeypot - "Cambridge Analytica", "Facebook", "Google" (with some of their employees refusing to work along with government agencies), etc.
So, some laws were written and voted on.
You can even access all that private data that was collected and click on a big "Erase" button.
This was just a little mistake.
But that won't happen again.
Everything is under control now.

Or perhaps they still have all this information.
Maybe the big "Erase" button didn't work as expected.
IMHO, they will go on with their data gathering.
But they will take extra care not to be caught again...

"Want To Freak Yourself Out?" Here Is All The Personal Data That Facebook/Google Collect (https://www.zerohedge.com/news/2018-03-27/twitter-user-breaks-down-all-personal-data-facebook-and-google-collect)
Title: Re: Privacy (collected references)
Post by: IainB on July 02, 2018, 05:43 PM
@YannickDa: Yes, I suspect you're probably pretty much spot-on in what you write above. It would seem prudent for any individual to regard any and all protestations by whomever that "Oh no! Don't worry! Your 'right to privacy' and the security and confidentiality of all your personal data are our primary objective!" as being likely to be just so much cynical hokum - especially if/when voiced by, for example (say):
(Have I missed any out?)

Some people (not me, you understand) might put it in the New Zealand vernacular thus: "They couldn't give a rat's #rse about your stinking rights to privacy.", but I couldn't possibly comment.
Title: Re: Privacy (collected references)
Post by: IainB on July 02, 2018, 06:34 PM
There's a very good summary post of the Facebook fiasco in the bleepingcomputer.com website, by Catalin Cimpanu:
(Copied below sans embedded images; my emphasis.)
Facebook Acknowledges It Shared User Data With 61 Companies (https://www.bleepingcomputer.com/news/technology/facebook-acknowledges-it-shared-user-data-with-61-companies/)
tags: Technology
Catalin Cimpanu - 2018-07-02

Image: Facebook app login

In a 747-page document (https://docs.house.gov/meetings/IF/IF00/20180411/108090/HHRG-115-IF00-Wstate-ZuckerbergM-20180411-SD003.pdf) provided to the US House of Representatives' Energy and Commerce Committee on Friday, Facebook admitted that it granted special access to users' data to 61 tech companies.

According to the document (https://docs.house.gov/meetings/IF/IF00/20180411/108090/HHRG-115-IF00-Wstate-ZuckerbergM-20180411-SD003.pdf), these 61 companies received a "one-time" extension so they could update their apps in order to comply with a Terms of Service change the company applied in May 2015.

61 companies received API exemptions in 2015
The six-month extension was applied from May 2015 onward, when Facebook restricted its API so apps could not access too much data on its users, and especially the data of users' friends.

The API change came in a period when apps like the one developed by Cambridge Analytica were using the Facebook API to mass-harvest the data of Facebook users.

In May 2015, Facebook realized that apps were abusing this loophole in its permission system to trick one user into granting permission to the personal data of hundreds of his friends, and restricted the Facebook API to prevent indirect data harvesting.

But these 61 tech companies, because they ran popular apps, received an exemption to this API change, during which, theoretically, they could have abused the Facebook API to collect data on Facebook users and their friends. Data that could have been collected included name, gender, birthdate, location, photos, and page likes.
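To make the mechanics of that loophole concrete, here is a purely schematic Python sketch (this is not Facebook's actual Graph API or data; the names and fields are invented) of how a single consenting user could expose data about friends who never installed or authorised the app:

Code:
# Schematic illustration only - invented data, not Facebook's actual API.
# It shows why one user's consent could fan out to their whole friend list
# under the pre-2015 permission model described above.
users = {
    "alice": {"friends": ["bob", "carol"], "likes": ["hiking"], "birthday": "1980-01-01"},
    "bob":   {"friends": ["alice"],        "likes": ["chess"],  "birthday": "1978-05-05"},
    "carol": {"friends": ["alice"],        "likes": ["jazz"],   "birthday": "1990-09-09"},
}

def quiz_app_harvest(consenting_user):
    """One user clicks 'allow'; the app reads that user AND every friend."""
    harvested = {consenting_user: users[consenting_user]}
    for friend in users[consenting_user]["friends"]:
        harvested[friend] = users[friend]   # the friends never consented
    return harvested

print(sorted(quiz_app_harvest("alice")))   # ['alice', 'bob', 'carol']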

Facebook did not say if any of these companies abused this extension period to harvest data on users and their friends. The list of 61 companies who received an API extension includes:
1. ABCSocial, ABC Television Network
2. Actiance
3. Adium
4. Anschutz Entertainment Group
5. AOL
6. Arktan / Janrain
7. Audi
8. biNu
9. Cerulean Studios
10. Coffee Meets Bagel
11. DataSift
12. Dingtone
13. Double Down Interactive
14. Endomondo
15. Flowics, Zauber Labs
16. Garena
17. Global Relay Communications
18. Hearsay Systems
19. Hinge
20. HiQ International AB
21. Hootsuite
22. Krush Technologies
23. LiveFyre / Adobe Systems
24. Mail.ru
25. MiggoChat
26. Monterosa Productions Limited
27. never.no AS
28. NIKE
29. Nimbuzz
30. NISSAN MOTOR CO / Airbiquity Inc.
31. Oracle
32. Panasonic
33. Playtika
34. Postano, TigerLogic Corporation
35. Raidcall
36. RealNetworks, Inc.
37. RegED / Stoneriver RegED
38. Reliance/Saavn
39. Rovi
40. Salesforce/Radian6
41. SeaChange International
42. Serotek Corp. 
43. Shape Services
44. Smarsh
45. Snap
46. Social SafeGuard
47. Socialeyes LLC
48. SocialNewsdesk
49. Socialware / Proofpoint
50. SoundayMusic 
51. Spotify
52. Spredfast
53. Sprinklr / Sprinklr Japan
54. Storyful Limited / News Corp
55. Tagboard
56. Telescope
57. Tradable Bits, TradableBits Media Inc.
58. UPS
59. Vidpresso
60. Vizrt Group AS
61. Wayin

Of the list above, Serotek received an eight-month extension.

Facebook points the finger at five other companies
Facebook also said it identified five other companies that tested beta versions of their apps that had the "theoretical" capability of harvesting a user's friends' data. The list includes:
  1. Activision / Bizarre Creations
  2. Fun2Shoot 
  3. Golden Union Co.
  4. IQ Zone / PicDial
  5. PeekSocial


"We are not aware that any of this handful of companies used this access, and we have now revoked any technical capability they may have had to access any friends' data", Facebook said.

Facebook slowly closing all loopholes
In addition, Facebook also announced it was discontinuing 38 partnerships with companies that it authorized to build versions of Facebook or Facebook features for custom devices and products, and which may have also gained extensive access to user data.

Last week, a security researcher discovered (https://medium.com/@intideceukelaire/this-popular-facebook-app-publicly-exposed-your-data-for-years-12483418eff8) another quiz app, similar to the one developed by Cambridge Analytica, which also gained access and later exposed the details of over 120 million Facebook users.

The app was named Nametests.com, associated with the eponymous website. Current evidence doesn't suggest the data collected by this second quiz app might have been used for political ads and influence campaigns such as the one collected by Cambridge Analytica.
_________________
CATALIN CIMPANU 
Catalin Cimpanu is the Security News Editor for Bleeping Computer, where he covers topics such as malware, breaches, vulnerabilities, exploits, hacking news, the Dark Web, and a few more. Catalin previously covered Web & Security news for Softpedia between May 2015 and October 2016. The easiest way to reach Catalin is via his XMPP/Jabber address at [email protected]. For other contact methods, please visit Catalin's author page.

Copied from: Facebook Acknowledges It Shared User Data With 61 Companies - <https://www.bleepingcomputer.com/news/technology/facebook-acknowledges-it-shared-user-data-with-61-companies/>
Title: Re: Privacy - GDPR + VPNs.
Post by: IainB on July 08, 2018, 02:02 AM
The AddictiveTips (https://www.addictivetips.com/) website is usually worth keeping an eye on because they often have some very useful tips in all sorts of categories of interest. One of these categories is Privacy+VPNs (Virtual Private Network providers), which they frequently plug - probably because they get some financial benefit from it, such as (say) advertising revenue or commission on sales. However, where they do talk about VPN services, AddictiveTips usually seem to be pretty thorough and relatively objective.

A recent example is the post: Best VPNs for GDPR: Unblock Online Services in Europe (https://www.addictivetips.com/vpn/gdpr/), which covers various useful points, some of which I summarise below and with my own comments/perspective added (but please do read the whole thing at the link):
Title: Re: Privacy (collected references)
Post by: wraith808 on July 08, 2018, 09:32 AM
What a VPN can do for digital privacy: One of the best tools that users can deploy to improve their privacy online is arguably by using a VPN. The post provides a good overview of what a VPN is, its benefits and how it can be used in conjunction with the GDPR legislation to protect your privacy. There are recommendations for the "best" VPNs for GDPR.



Always remember, a VPN is only as good as your VPN provider.  If they roll over and play dead, or are a "false flag" provider, you might as well not be using a VPN at all.
Title: Re: Privacy (collected references)
Post by: Deozaan on July 08, 2018, 05:46 PM
I'm surprised they didn't mention ProtonVPN (https://protonvpn.com/).
Title: Re: Privacy (collected references)
Post by: 4wd on July 08, 2018, 10:25 PM
I'm surprised they didn't mention ProtonVPN (https://protonvpn.com/).

Regarding their comment in Features (https://protonvpn.com/secure-vpn):
In addition to strong technical security, ProtonVPN also benefits from strong legal protection. Because we are based in Switzerland, ProtonVPN is protected by some of the world's strongest privacy laws and remains outside of US and EU jurisdiction. This means that unlike VPN providers based in a fourteen eyes country, we cannot be coerced into spying on our users.

A possible view from the other side of the coin: It doesn’t matter how many eyes you have (https://blog.windscribe.com/i-doesnt-matter-how-many-eyes-you-have-66f59fc1e777?source=collection_home---4------4----------------)

If you believe Wikipedia:
Further intelligence sharing collaborations
As spelled out by Privacy International, there are a number of issue-specific intelligence agreements that include some or all the above nations and numerous others, such as:
  • An area specific sharing amongst the 41 nations that formed the allied coalition in Afghanistan;
  • A shared effort of the Five Eyes nations in "focused cooperation" on computer network exploitation with Austria, Belgium, Czech Republic, Denmark, Germany, Greece, Hungary, Iceland, Italy, Japan, Luxembourg, the Netherlands, Norway, Poland, Portugal, South Korea, Spain, Sweden, Switzerland and Turkey;
  • Club of Berne: 17 members including primarily European States; the US is not a member;
  • The Counterterrorist Group: a wider membership than the 17 European States that make up the Club of Berne, and includes the US;
  • NATO Special Committee: made up of the heads of the security services of NATO's 28 member countries;

If they want you bad enough I doubt whether a VPN provider anywhere is going to stop them.
Title: Re: Privacy (collected references)
Post by: IainB on July 09, 2018, 12:55 AM
Sorry, I hadn't been intending to suggest that this thread topic could usefully provide:
(a) details of/for a fully comprehensive coverage of VPNs (though directions to same could be useful), or
(b) comprehensive reviews of VPN pros/cons or "Which are the best/most trustworthy/etc. VPNs, and why?" (though directions to same could be useful).

Methinks those would probably be pretty extensive subject/topic areas or discussion threads in their own right!    :o

What could perhaps be more useful/relevant for inclusion in this thread are our experiences/knowledge of those DNS/VPN methods/tools that are variously able to meet three criteria (and please say if you have other suggestions, or if there are other important criteria that I may have missed):

There are four such tools that immediately come to mind (and I feel sure there could be more listed or pointed to by other DCF members):

Though I have reviewed DNSCrypt and SoftEther VPNClient elsewhere on the DC Forum, my knowledge/understanding of the area of Privacy and alternative Privacy/Security tools (e.g., Tor) is necessarily limited to my personal experience and exposure to such tools. In regard to this discussion thread, I suspect that the collective experience of DCF members could comprise a "Brainstrust" which could contribute a great deal more than I might be able to on my own. Therefore any assistance in developing this thread would be most welcome.
Title: Re: Privacy (collected references)
Post by: IainB on July 09, 2018, 01:27 AM
For clarification, I have added this to the post I made above regarding the AddictiveTips article:
EDIT 2018-07-09:
NB: TRUST is a key issue here. There is a caveat that many organisations in the business of providing $PAID-for VPN services tend to conceal - not all VPN providers actually operate a trustworthy service from the user's perspective, and your logged VPN activity data could be made available to government or other authorities through legal or other compulsion (or even corruption/informal agreement).

Also, please note that this is probably a True statement:
If they want you bad enough I doubt whether a VPN provider anywhere is going to stop them.
Title: Re: Privacy (collected references)
Post by: YannickDa on July 12, 2018, 09:07 PM
There's a solution implementing VPN, independent DNS, Proxy, WebMail, VoIP, Cloud and your own surveillance cams.

It's called "eniKma", it's french and seems to be very reliable.


Try running this page (https://www.enikma.fr/presentation/) through Google Translate to learn more about it...
Title: Re: Privacy (collected references) - Enikma=VPN Lock-in??
Post by: IainB on July 23, 2018, 11:53 PM
@YannickDa: I'm not absolutely sure, but it seems from the Enikma website and introductory video that the Enikma box is a proprietary "black box" (hardware) approach to the encryption of 2-way traffic between the User PC (Client) and the proprietary designated Enikma VPN DNS node, where the Enikma box provides a WiFi Access Point for devices in range of that Enikma box.

Thus the user's ISP is just acting as a passthrough node to the encrypted traffic, so there can be no "man-in-the-middle" attacks.
The communication path would seem to be:
Client<-->Enikma box<-->modem/router<-->ISP DNS<-->designated Enikma VPN DNS node

 - and where the traffic between the two points Enikma box<-->designated Enikma VPN DNS node is encrypted.
This is actually quite simple, but seems to have been obfuscated on the website and in the details.

It would also seem to be a deliberate lock-in and rather kludgy/"overheady" alternative to the use of the public domain DNSCrypt software, which does the same thing (but more efficiently) except that:
(a) there is no obligation with DNSCrypt to have a given and/or proprietary VPN, because DNSCrypt is $FREE and works with any OpenDNS node, so the user is free to choose (not locked-in to) any VPN, and
(b) DNSCrypt encrypts traffic all the way from/to the Client (whereas Client Xmit/Receive is in clear with the Enikma box, potentially leaving some room for man-in-the-middle attacks).

If I have it correctly then, I am surprised that Enikma are apparently allowed under local consumer protection laws to get away with such misleading/obfuscated and lock-in practices, and the fact that they are misleading would be no accident - which would seem to be unethical - so I personally wouldn't touch them with a bargepole.
...Never trust it when they use smoke and mirrors.
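For what it's worth, the client-side half of what DNSCrypt provides - encrypting the query leg so that the ISP (or anything else on the path) sees only opaque traffic rather than plain-text DNS lookups - can be illustrated with an analogous encrypted-DNS protocol, DNS-over-HTTPS. The sketch below is a hedged example only: it uses DoH via the dnspython library (assuming dnspython 2.x with its DoH dependencies installed), not the DNSCrypt wire protocol itself, and the resolver URL is just an example of a DoH-capable resolver.

Code:
# Minimal sketch: resolve a name over an encrypted channel (DoH) instead of
# plain UDP port 53, so the ISP cannot read or tamper with the lookup.
# Assumes dnspython >= 2.x with its DoH dependencies installed; the resolver
# URL is an example and should be whichever DoH resolver the user trusts.
import dns.message
import dns.query
import dns.rdatatype

query = dns.message.make_query("donationcoder.com", dns.rdatatype.A)
response = dns.query.https(query, "https://1.1.1.1/dns-query")

for rrset in response.answer:
    print(rrset)

DNSCrypt proper does the same job with its own protocol, typically via a small local proxy (e.g., dnscrypt-proxy listening on 127.0.0.1) - with no proprietary hardware box in the path.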
Title: Re: Privacy (collected references) - Privacy per the Telegram FAQ.
Post by: IainB on July 24, 2018, 12:27 AM
Extracted notes from the Telegram FAQ:
(Copied from: Telegram F.A.Q. - <https://telegram.org/faq#q-how-are-you-going-to-make-money-out-of-this>)
Q: What are your thoughts on internet privacy?
Big internet companies like Facebook or Google have effectively hijacked the privacy discourse in the recent years. Their marketers managed to convince the public that the most important things about privacy are superficial tools that allow hiding your public posts or your profile pictures from the people around you. Adding these superficial tools enables companies to calm down the public and change nothing in how they are turning over private data to marketers and other third parties.

At Telegram we think that the two most important components of Internet privacy should be instead:

Protecting your private conversations from snooping third parties, such as officials, employers, etc.
Protecting your personal data from third parties, such as marketers, advertisers, etc.
This is what everybody should care about, and these are some of our top priorities. Telegram's aim is to create a truly free messenger, without the usual caveats (https://telegram.org/privacy). This means that instead of diverting public attention with low-impact settings, we can afford to focus on the real privacy issues that exist in the modern world.

Q: What about GDPR?
New regulations regarding data privacy called the General Data Protection Regulation (GDPR) came into force in Europe on May 25, 2018. Since taking back our right to privacy was the reason we made Telegram, there wasn‘t much we had to change. We don’t use your data for ad targeting, we don’t sell it to others, and we’re not part of any mafia family “family of companies.”

Telegram only keeps the information it needs to function as a feature-rich cloud service — for example, your cloud chats so that you can access them from any devices without using third-party backups, or your contacts so that you can rely on your existing social graph when messaging people on Telegram.

We're still working with our lawyers on an update to the Telegram Privacy Policy (https://telegram.org/privacy) that will lay this out in even more detail (don‘t expect any dramatic changes there though). We’ll notify you when it's ready.

For now, please feel free to use our new @GDPRbot (https://t.me/gdprbot) to:
  • Request a copy of all your data that Telegram stores.
  • Contact Telegram's Data Protection Officer.
Android (https://play.google.com/store/apps/details?id=org.telegram.messenger) users got a GDPR update with version 4.8.9 which allows more control over synced contacts and adds other privacy settings. On June 1, Apple approved (https://t.me/durov/88) Telegram v.4.8.2 for iOS with these features.

Q: There's illegal content on Telegram. How do I take it down?
All Telegram chats and group chats are private amongst their participants. We do not process any requests related to them. ...

Q: A bot or channel is infringing on my copyright. What do I do?
All Telegram chats and group chats are private amongst their participants. We do not process any requests related to them. ...
Title: Re: Privacy (collected references)
Post by: f0dder on July 24, 2018, 11:04 AM
Please don't think a VPN is going to give you any form of privacy.

A VPN lets you access a remote network securely across an insecure line - this is the only thing it's guaranteed to do. It's the only thing you should be using it for. Stop spreading the damn misconception that it's useful for privacy.

If you want to watch Netflix content from a different region, fine, a VPN will let you do that, but morally you might as well then be torrenting the content.

If you're doing something shady and want to hide your tracks, a VPN is not what you want. Not even one of the paid ones. Not even one of the "WE DON'T LOG ANYTHING AND WE VALUE YOUR PRIVACY". Stop it. There's a few threat models where a VPN can be a viable solution, but for those you should be running it yourself on a cloud instance somewhere. If you don't know how to do that, or think it's too much bother, you shouldn't be doing something shady in the first place - or you're not doing something that warrants that use of VPN, and should just not be doing it.

And stay entirely away from the ones that don't require payment, the market is shady as fuck and they've been doing all sorts of nasty stuff.
Title: Re: Privacy (collected references)
Post by: Deozaan on July 24, 2018, 02:10 PM
Please don't think a VPN is going to give you any form of privacy.

A VPN lets you access a remote network securely across an insecure line - this is the only thing it's guaranteed to do. It's the only thing you should be using it for. Stop spreading the damn misconception that it's useful for privacy.

Doesn't it help prevent tracking? Or has that become so invasive these days that it doesn't matter what your IP is, they can still identify you by some unique ID in your browser or OS or something?
Title: Re: Privacy (collected references)
Post by: f0dder on July 24, 2018, 04:45 PM
Doesn't it help prevent tracking?
Not really, no. You have to consider that most people aren't on static global IPs, but will either have dynamic IPs or even (a very large number) be behind CGNAT (carrier-grade NAT). The tracking folks obviously want to be able to uniquely identify you in spite of that, and across devices as well.

Trying to use VPN against that is absolutely useless.

You can avoid some of it if you use a combination of uMatrix (in whitelisting mode), conservative use of noscript, a decent adblocker like uBlock Origin, adding in HTTP Referer header control and Firefox Multi-Account Containers. But it's still not a 100% guarantee and it's a fair amount of work getting some sites to work the first time you visit them.
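As a rough illustration of why the IP address is almost beside the point, here is a hypothetical, much-simplified Python sketch of fingerprint-style tracking - the attribute set and the hashing are invented for illustration; real fingerprinting scripts collect far more signals (canvas/WebGL rendering, fonts, audio stack, etc.):

Code:
# Hypothetical, simplified browser-fingerprint sketch. None of these
# attributes is the IP address, yet their combination is often stable and
# close to unique per browser install, so a VPN exit IP changes nothing.
import hashlib
import json

def fingerprint(attributes):
    """Hash a canonical serialisation of browser-visible attributes."""
    canonical = json.dumps(attributes, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

visitor = {
    "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) ...",
    "screen": "2560x1440x24",
    "timezone": "Pacific/Auckland",
    "languages": ["en-NZ", "en"],
    "platform": "Win32",
    "plugins": ["PDF Viewer", "Widevine"],
}

# The same browser produces the same ID on every visit, whichever VPN exit
# or dynamic IP it happens to be connecting from.
print(fingerprint(visitor))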
Title: Re: Privacy (collected references)
Post by: IainB on July 29, 2018, 08:14 PM
Looks like the Ugandan government  could be in the vanguard when it comes to, uh, privacy...
 ...Uganda orders ISPs to block Ugandans from accessing Pornographic Websites (http://innov8tiv.com/uganda-orders-isps-to-block-ugandans-from-accessing-pornographic-websites/)   Nice one!    :Thmbsup:

Title: Re: Privacy (collected references)
Post by: IainB on August 08, 2018, 04:05 AM
Some valid points from theregister.co.uk:
Facebook insists it has 'no plans' to exploit your personal banking info for ads – just as we have 'no plans' to trust it (https://www.theregister.co.uk/2018/08/07/facebook_banking_data/)
After all, never say never!
By Kieren McCarthy in San Francisco, 7 Aug 2018 at 20:44 · 32 Reg comments

Image [Denial]

Analysis: Facebook has denied it is seeking to suck up netizens' bank account details, claiming it just wants to connect bank customers to their bank's chat accounts and give useful financial updates. ...

Copied from: Facebook insists it has 'no plans' to exploit your personal banking info for ads – just as we have 'no plans' to trust it • The Register - <https://www.theregister.co.uk/2018/08/07/facebook_banking_data/>

Yeah, right.
Title: Re: Privacy (collected references)
Post by: 4wd on August 09, 2018, 08:04 PM
Looks like the Ugandan government  could be in the vanguard when it comes to, uh, privacy...
 ...Uganda orders ISPs to block Ugandans from accessing Pornographic Websites (http://innov8tiv.com/uganda-orders-isps-to-block-ugandans-from-accessing-pornographic-websites/)   Nice one!    :Thmbsup:

Except they're 2 or 3 years behind Russia (https://www.independent.co.uk/life-style/gadgets-and-tech/news/pornhub-youporn-russia-blocked-premium-vladimir-putin-meet-people-in-real-life-a7308391.html).
Title: Re: Privacy - discrimination against vulnerable minorities.
Post by: IainB on August 19, 2018, 12:12 AM
The other end of the loss of Privacy is where that loss makes the loser potentially vulnerable, or more vulnerable than they were before, leading to potential risk - e.g., exposure to demographic stratification and targeting, with subsequent stigmatisation, discrimination and harm/loss, even to the extent of circumventing laws that were expressly established to protect such vulnerable groups/minorities from those risks.
So, right on cue, here's a classic recent example - and yes, Facebook are behind it, because they can make money out of it (of course) - and are doing so. What a surprise (NOT).    :o
From the U.S. Department of Housing and Urban Development (https://www.hud.gov/)
HUD No. 18-085
HUD Public Affairs
(202) 708-0685
FOR RELEASE
Friday
August 17, 2018
HUD FILES HOUSING DISCRIMINATION COMPLAINT AGAINST FACEBOOK (https://www.hud.gov/press/press_releases_media_advisories/HUD_No_18_085)
Secretary-initiated complaint alleges platform allows advertisers to discriminate

WASHINGTON – The U.S. Department of Housing and Urban Development (HUD) announced today a formal complaint against Facebook for violating the Fair Housing Act by allowing landlords and home sellers to use its advertising platform to engage in housing discrimination.

HUD claims Facebook enables advertisers to control which users receive housing-related ads based upon the recipient's race, color, religion, sex, familial status, national origin, disability, and/or zip code. Facebook then invites advertisers to express unlawful preferences by offering discriminatory options, allowing them to effectively limit housing options for these protected classes under the guise of 'targeted advertising.' Read HUD's complaint against Facebook (https://www.hud.gov/sites/dfiles/PIH/documents/HUD_01-18-0323_Complaint.pdf).

"The Fair Housing Act prohibits housing discrimination including those who might limit or deny housing options with a click of a mouse," said Anna María Farías, HUD's Assistant Secretary for Fair Housing and Equal Opportunity. "When Facebook uses the vast amount of personal data it collects to help advertisers to discriminate, it's the same as slamming the door in someone's face."

The Fair Housing Act prohibits discrimination in housing transactions including print and online advertisement on the basis of race, color, national origin, religion, sex, disability, or familial status. HUD's Secretary-initiated complaint follows the Department's investigation into Facebook's advertising platform which includes targeting tools that enable advertisers to filter prospective tenants or homebuyers based on these protected classes.

For example, HUD's complaint alleges Facebook's platform violates the Fair Housing Act. It enables advertisers to, among other things:
  • display housing ads either only to men or women;
  • not show ads to Facebook users interested in an "assistance dog," "mobility scooter," "accessibility" or "deaf culture";   
  • not show ads to users whom Facebook categorizes as interested in "child care" or "parenting," or show ads only to users with children above a specified age;
  • to display/not display ads to users whom Facebook categorizes as interested in a particular place of worship, religion or tenet, such as the "Christian Church," "Sikhism," "Hinduism," or the "Bible."
  • not show ads to users whom Facebook categorizes as interested in "Latin America," "Canada," "Southeast Asia," "China," "Honduras," or "Somalia."
  • draw a red line around zip codes and then not display ads to Facebook users who live in specific zip codes.
Additionally, Facebook promotes its advertising targeting platform for housing purposes with "success stories" (https://www.facebook.com/business/success/quadrant-homes) for finding "the perfect homeowners," "reaching home buyers," "attracting renters" and "personalizing property ads."

In addition, today the U.S. Attorney for the Southern District of New York (SDNY) filed a statement of interest, joined in by HUD, in U.S. District Court on behalf of a number of private litigants challenging Facebook's advertising platform.

HUD Secretary-Initiated Complaints

The Secretary of HUD may file a fair housing complaint directly against those whom the Department believes may be in violation of the Fair Housing Act. Secretary-Initiated Complaints are appropriate in cases, among others, involving significant issues that are national in scope or when the Department is made aware of potential violations of the Act and broad public interest relief is warranted or where HUD does not know of a specific aggrieved person or injured party that is willing or able to come forward. A Fair Housing Act complaint, including a Secretary initiated complaint, is not a determination of liability.

A Secretary-Initiated Complaint will result in a formal fact-finding investigation. The party against whom the complaint is filed will be provided notice and an opportunity to respond. If HUD's investigation results in a determination that reasonable cause exists that there has been a violation of the Fair Housing Act, a charge of discrimination may be filed. Throughout the process, HUD will seek conciliation and voluntary resolution. Charges may be resolved through settlement, through referral to the Department of Justice, or through an administrative determination.

This year marks the 50th anniversary of the Fair Housing Act. In commemoration, HUD, local communities, and fair housing organizations across the country have coordinated a variety of activities to enhance fair housing awareness, highlight HUD's fair housing enforcement efforts, and end housing discrimination in the nation. For a list of activities, log onto www.hud.gov/fairhousingis50.

Persons who believe they have experienced discrimination may file a complaint by contacting HUD's Office of Fair Housing and Equal Opportunity at (800) 669-9777 (voice) or (800) 927-9275 (TTY).

###

HUD's mission is to create strong, sustainable, inclusive communities and quality affordable homes for all.
More information about HUD and its programs is available on the Internet
at www.hud.gov and https://espanol.hud.gov.

You can also connect with HUD on social media and follow Secretary Carson on Twitter and Facebook or sign up for news alerts on HUD's Email List.
Title: Re: Privacy - the unmitigated gall of LivreVisage.
Post by: IainB on August 19, 2018, 01:53 PM
Ah! Found it! There's gold in them thar DCF datamines. I knew it was here somewhere. Just took me a while to find it though - a past comment on the DC Forum about LifeLock and which happens to be an apposite quote apropos of the recent HUD v. Facebook item, above: (my emphasis)
LifeLock has been quick to capitalise on the Equifax breach, with ads and press statements saying how the breach shows how important its own services - which cost up to $29.99 a month - can be in protecting you from identity theft.
...
Here’s what LifeLock isn’t spreading so widely: When you buy its security, you’re signing up for credit recording and monitoring services provided by, yes, Equifax.
...You just can't make this stuff up: Full Article here (https://latesthackingnews.com/2017/09/20/lifelock-gaining-lot-customer-base-taking-advantage-equifaxs-breach/)

Maybe it's just my rather dark sense of humor, but I just couldn't stop laughing after I read that. The unmitigated gall of corporations these days is just flat-out mind blowing.
Title: Re: Privacy - RUMPEL - it's YOUR data, after all!
Post by: IainB on August 19, 2018, 01:59 PM
RUMPEL web browser's aim : take back control of your personal data:
Interesting open source project led by the University of Warwick. Its aim is to help users keep track of where information about them is stored online so that they can actually -- personally -- benefit from it. An important issue; whether this is a viable solution or not is another one...   :)
[...] a marketing professor at the University of Warwick who led RUMPEL's development, said: "It's time for people to claim their data from the internet."
"The aim of RUMPEL is to empower users and enable them to be served by the ocean of data about them that's stored in all kinds of places online, so that it benefits them and not just the businesses and organisations that harvest it," she added.
"The strapline 'Your Data, Your Way' reflects our determination to let people lead smarter lives by bringing their digital lives back under their own control."
TechRadar article : New web browser lets you take back control of your personal data (http://www.techradar.com/us/news/world-of-tech/new-web-browser-lets-you-take-back-control-of-your-personal-data-1325545)
And the GitHub RUMPEL project (https://github.com/Hub-of-all-Things/rumpel).
Title: Re: Privacy - Could you please LIKE me?
Post by: IainB on August 20, 2018, 06:55 AM
Could you please LIKE me?
There is a series of short spoofs from "Black Mirror", offering a glimpse of where we might potentially be heading.

Title: Re: Privacy (collected references)
Post by: wraith808 on August 20, 2018, 10:05 AM
^ Black Mirror is, in general, a very twisted view of several small concepts we take for granted in everyday life.  I recommend the series.  That particular one is very insightful into our current state of the world.
Title: Re: Privacy - BI Rules?
Post by: IainB on August 22, 2018, 04:45 AM
The amusing Black Mirror video above, though artificial, is arguably a prescient comment on the implicit risks inherent in a tendency for "oversharing" and/or "data leakage" in the IT-enabled SNM (Social Network Marketplace) and other personal-data-related databases (e.g., health, insurance, banking, education), where private data that one might have previously perceived as being peculiar and useful/relevant in one context only is subsequently seen to be useful/relevant in another, or maybe many other contexts. These are typically the data connections and interconnections that the SNM operators and data miners would tend to seek/exploit, for financial gain.

When I was contracted in 2003 to get a failed data analysis project back on the rails, I learned quite a lot. It was a complex IT project to implement a BI (Business Intelligence) system, and we had to train the users in the application of BI (it's actually quite powerful and hairy stuff) to meet the growing and complex business needs of the power (energy) company who had contracted me into the project recovery role. I learned that what the Defence sector had always taught was true - that all data/information can be interconnected at some stage. For BI, the world could simply be envisaged as one or more universes of dynamic data, each having its own peculiar descriptive and dynamic data model. As in the popular SF concept of parallel universes, there was the potential to interlink these data universes (mass aggregations of dynamic data sets), constantly combining/recombining and drawing data from one to the other, enabling the BI analyst to discover data relationships in a way that would probably not previously have been feasible on such a mass scale using the then-prevailing technologies.

Fast forward to 2018, where we can perhaps now better understand why we might have the apparent privacy shambles that we see around us. It was a gold-rush, opportunistic, every man for himself. Presumably the Google/Facebook founders (and others) would have seen it coming. There were little/no regulations to limit or constrain the progress of BI and its application in the field of mass demographics. Now that some regulations have belatedly been or are being implemented, it arguably may be too late anyway - locking the stable door after the horse has bolted; Pandora's box has already been opened.
Title: Re: Privacy (collected references)
Post by: IainB on August 22, 2018, 05:39 AM
Following up on the BI comment above: I came across a link to what seemed an interesting viewpoint from the Economist, though from experience I'd suggest taking a pinch of salt with anything they publish nowadays - just in case, and especially when it's prefixed with the ominous religio-politically ideological cliché "Open":
(Copied below sans embedded hyperlinks/images.)
Open Future
Toward defining privacy expectations in an age of oversharing (https://www.economist.com/open-future/2018/08/16/toward-defining-privacy-expectations-in-an-age-of-oversharing)

Our digital data deserves protection, writes Margot Kaminski, a law professor

Aug 16th 2018 | by MARGOT KAMINSKI
What assurances of privacy do we have in this digital age? Until this year, the quick answer was: effectively none. We share personal data with companies, who in turn share it with other companies and the government, with few if any legal means of individual redress or control. But if 2018 will be remembered as the year of Cambridge Analytica—a British data-mining company that allegedly influenced both British and American elections by targeting voters using personal data—it will also be remembered as the year that privacy law finally started catching up to the Internet.

In America, the “third-party doctrine” has long governed privacy law. This view holds that one can have no privacy expectations in information shared with a third party. The government could obtain a list of phone numbers you called without a warrant, for instance, because you shared that information with a phone company.

This runs counter to most people’s expectations, especially today. Privacy is many things to many people, but one thing it is not is absolute secrecy. We share sensitive information with our doctors; we hold whispered conversations in public places; we rely on practical obscurity even in big cities; and we disclose our most intimate information by text and by email.

Helen Nissenbaum, an ethicist at Cornell University, refers to the foundation of our digital-privacy expectations as “contextual integrity”. When we reveal information in one context, we trust that it won’t pop up to surprise us in another.

Another way to think of it is that we regularly use perceived features of our environments, both physical and social, to manage degrees of disclosure. An email service that uses a mailbox as its logo signals that email will be kept between sender and recipient—just like a regular letter—even if it is in fact stored on a company’s servers.

In June 2018, however, the Supreme Court struck a serious and welcome blow to the third-party doctrine in its Carpenter v. United States ruling. That case asked whether police needed a warrant to access someone’s mobile-phone location data. The Court held that historic mobile-phone location data deserved privacy protections, even if it is shared (often unknowingly) with a mobile-phone service provider.

The Court recognised that what used to be non-sensitive data—a record of your travels through public places—has, in the age of big data, effectively been converted into sensitive information. When gathered en masse and analysed, where someone travels can reveal her religion, health problems, sexual preferences and political affiliations. The Court thus recognised that privacy harms can trigger a wealth of related harms, chilling freedom of speech and freedom of association.

While 2018 brought paradigm shifts to American privacy law, in Europe it brought the General Data Protection Regulation (GDPR). The significance of the GDPR goes beyond the annoying barrage of privacy notices that popped up in May. It establishes enforceable individual transparency and control rights.

But the GDPR’s real impact will be within companies, behind the scenes. Backed by significant penalties for breaches (up to 4% of worldwide annual revenue), the GDPR imposes a series of duties on companies, regardless of whether individuals invoke their privacy rights. It requires companies to articulate legitimate reasons for collecting data; to collect only the data that they need; to design new products in ways that protect individual rights; and sometimes to appoint privacy officers and conduct privacy impact-assessments.

At first glance, the gap between Europe and America still appears enormous. The EU now has the GDPR; America continues to lack comprehensive federal data privacy law, relying instead on a patchwork of consumer protection, state laws, and sector-specific laws like the Health Insurance Portability and Accountability Act (HIPAA).

But two recent events have resulted in a surprising array of commonalities: the above-mentioned Carpenter case, and California’s Consumer Privacy Act (CPA), which California passed less than a month after the Carpenter ruling, and which creates an American version of data-protection law.

The California CPA governs not just information that people share directly with companies, but also personal data held by commercial data-brokers. Just as Carpenter suggests that legal protections follow even shared personal data, the CPA imposes transparency and control requirements even on companies that have no direct relationship with consumers. In this way, the CPA represents a shift towards the data protection model embraced in the GDPR. Legal protection travels with the data, regardless of whether or not there is a consumer relationship.

This is not to say that Europe and America are converging. For one, the CPA applies only to California residents (although because California is such a big market the law may influence policies for all Americans—referred to in the context of automobile regulations as the “California effect”). America also has a robust, and in some situations increasingly deregulatory, free speech right in the First Amendment that will likely come into conflict with deletion and disclosure rights.

But there is a growing transatlantic consensus emerging on privacy in the digital age. Sharing data no longer obviates privacy. Privacy protections now increasingly travel with personal information, even if that information is something a company has inferred rather than collected. Both legal systems also increasingly recognise that privacy is, perhaps counterintuitively, deeply linked to transparency: people cannot exert control or request remedies if they do not know where their information is going.

Perhaps most significantly, both legal regimes now exhibit a growing awareness of how linked privacy is to other well-recognised legal harms such as chilling effects on free expression or discrimination against individuals. Even if America does not enact a federal privacy law, the age of free data and diminishing data privacy looks to be rapidly ending.

Margot Kaminski is a professor at Colorado Law. She teaches, researches and writes on the intersection of law and technology.
Title: Re: Privacy (collected references)
Post by: wraith808 on August 22, 2018, 08:51 AM
The amusing Black Mirror video above, though artificial, is arguably a prescient comment on the implicit risks inherent in a tendency for "oversharing" and/or "data leakage" in the IT-enabled SNM (Social Network Marketplace) and other personal-data-related databases (e.g., health, insurance, banking, education), where private data that one might have previously perceived as being peculiar and useful/relevant in one context only is subsequently seen to be useful/relevant in another, or maybe many other contexts. These are typically the data connections and interconnections that the SNM operators and data miners would tend to seek/exploit, for financial gain.

When I was contracted in 2003 to get a failed data analysis project back on the rails, I learned quite a lot. It was a complex IT project to implement a BI (Business Intelligence) system, and we had to train the users in the application of BI (it's actually quite powerful and hairy stuff) to meet the growing and complex business needs of the power (energy) company that had contracted me into the project recovery role. I learned that what the Defence sector had always taught was true - that all data/information can be interconnected at some stage. For BI, the world can simply be envisaged as one or more universes of dynamic data, each having its own peculiar descriptive and dynamic data model. As in the popular SF concept of parallel universes, these data universes (mass aggregations of dynamic data sets) can be interlinked, constantly combining/recombining and drawing data from one to the other, enabling the BI analyst to discover data relationships on a mass scale that would probably not have been feasible with the previously prevailing technologies.

Fast forward to 2018, where we can perhaps now better understand why we might have the apparent privacy shambles that we see around us. It was a gold-rush, opportunistic, every man for himself. Presumably the Google/Facebook founders (and others) would have seen it coming. There were little/no regulations to limit or constrain the progress of BI and its application in the field of mass demographics. Now that some regulations have belatedly been or are being implemented, it arguably may be too late anyway - locking the stable door after the horse has bolted; Pandora's box has already been opened.

It's arguably too late for any of us that are already born and have any social media links.  You'd have to have someone aware of not only what they share, but what others share about them.  You can make it harder to make the links, but I'd posit that it's impossible to restrict data after it has already been ingested into the system of data that surrounds us.
Title: Re: Privacy - Pandora's box.
Post by: IainB on August 22, 2018, 10:13 AM
Mythological basis of the Pandora's box idiom:
According to Hesiod, when Prometheus stole fire from heaven, Zeus, the king of the gods, took vengeance by presenting Pandora to Prometheus' brother Epimetheus. Pandora opened a jar left in his care containing sickness, death and many other unspecified evils which were then released into the world.[4] Though she hastened to close the container, only one thing was left behind – usually translated as Hope, though it could also have the pessimistic meaning of "deceptive expectation".[5]

From this story has grown the idiom "to open (a) Pandora's box", meaning to do or start something that will cause many unforeseen problems.[6] Its modern, more colloquial equivalent is "to open a can of worms".[7]
Source: https://en.wikipedia.org/wiki/Pandora%27s_box
The thing about "Pandora's Box" was that the contents (the troubles), when once released into the world as a result of Pandora's curiosity, were enduring and timeless and apparently could never be put back into the box, thus adversely affecting all humankind over time, from that point onwards.
Title: Re: Privacy - 10 tips recommended for teachers' privacy sanitisation.
Post by: IainB on September 23, 2018, 09:49 PM
A salutary tale with a recommended privacy sanitisation list, from Samizdata.net:
(Copied below.)
The Shadow Education Secretary wants to make teachers more vulnerable (https://www.samizdata.net/2018/09/the-shadow-education-secretary-wants-to-make-teachers-more-vulnerable/)
tags: Civil liberty & Regulation, Education & Academia, Internet, Privacy & Panopticon, UK affairs
Natalie Solent (Essex) - September 23rd, 2018

The Shadow Education Secretary, Angela Rayner MP (Lab), has called for a ban on anonymous online accounts (https://www.theguardian.com/politics/2018/sep/23/ban-anonymous-accounts-angela-rayner-tells-social-media-firms).

The education spokesperson also called for social media companies to ban anonymous accounts, complaining at a fringe event organised by the Guardian in Liverpool that most of the people that abused her online did so without using their real names.

Rayner said that social media firms should take greater responsibility for their users and complained in particular that Facebook seemed to have indicated that politicians should accept a higher level of abuse.

When asked what she thought about social media, Rayner said: “One of the first things they should do is stop anonymous accounts. Most people who send me abuse do so from anonymous accounts and wouldn’t dream of doing it in their own name.”

Rayner conceded that using real names would not stop abuse but “it would certainly help a little bit. I think they should do more, they do have a responsibility for online”.
___________________________________

As I mentioned earlier, Angela Rayner is the Shadow Education Secretary. That ought to mean that she is aware that teachers, like MPs, are often subject to harassment. The Times Educational Supplement had an article on that very subject just a few days ago: “Why your social account is not as private as you think” (https://www.tes.com/news/why-your-social-account-not-private-you-think). It began:

The teacher’s Facebook account was set to private. She was certain of that. Yet, in the past week, she had received four friend requests from former pupils. She could not work out how they had found her.

So, as I am a researcher at the Greater Manchester Police – and her friend – she asked me to take a look. Within 10 minutes, I had not just found her, but I also had her full name, her partner’s name, the school she worked at, the name of one of her children and multiple images of the street she lives on.
___________________________________

The writer, Chris Glover, proceeded to give ten tips that teachers should employ to protect themselves:
  • 1. Keep accounts separate.
  • 2. Vary usernames.
  • 3. Check posts about you.
  • 4. Beware of public posts.
  • 5. Review privacy settings.
  • 6. Don’t follow your school account.
  • 7. Avoid using your real name.
  • 8. Change the friends-list setting.
  • 9. Switch off location.
  • 10. Delete dormant accounts.

Following the above advice should help ensure that teachers can enjoy participating in life online while minimising the very real risk of being tracked down by former or current pupils bearing a grudge, or simply by people whom it is best to keep at arm's length for professional or safeguarding reasons.

Until a Labour government gets in and makes Nos. (2) and (7) illegal outright, and demands that all of your personal details are held in one place by a social media company so as to be conveniently available for hackers and identity thieves.

The context here (for the benefit of non-British readers) is that the UK currently has a Conservative-led government, so the Labour party is the party "in opposition", as it were, and has "shadow ministers" for each of the main ministerial departments, of which Education is one.
There are 2 rather depressing things about this:

Though it is rather telling - and Labour voters could be forgiven for weeping or doing a face-palm over this, just as other voters could be forgiven for having a LOL moment - if we look on the bright side, the article in The Times Educational Supplement gave us 10 very good points for improving privacy. These are points that we could all extract and follow to our advantage - i.e., not just teachers - and if Rayner had not made the gaffe that she did, then we might never have heard of the 10 points; they would have remained buried in that article.
Title: Re: Privacy (collected references) - more privacy sanitisation.
Post by: IainB on September 24, 2018, 03:47 AM
Another potentially helpful privacy sanitisation list from abine.com (too long to post here, so just the link): 8 Steps to Secure Your Facebook Privacy Settings (https://www.abine.com/blog/2018/8-steps-to-secure-your-facebook-privacy-settings/)
Title: Re: Privacy (collected references) - Google enforces its ownership of user data.
Post by: IainB on September 26, 2018, 02:07 PM
Having worked in Defence and marketing and having managed the design, development and delivery of smart nationwide credit-card driven EFT-POS systems which collect, curate, manipulate and use user data for marketing advantage, I have learned some very good reasons why the individual needs to understand:

I generally try to think for myself and prefer to take a healthily skeptical and politically agnostic outlook on life. I am personally fed up to the back teeth with the incessant incitement to outrage, the bombardment of absurd political bias, and being told how to behave or encouraged to moronically right-think all the time, as pushed by a majority media cohort apparently funded by vested interests (i.e., propaganda, AKA "fake news") and seemingly hell-bent on manipulating us (e.g., the Facebook - Cambridge Analytica fiasco and SnowdenGate).

Though it inevitably seems/tends to push its own peculiar political bias a lot of the time (like many websites), the website innov8tiv.com occasionally publishes what seem to be relatively well-balanced posts on topics that could be of interest. I therefore keep it in my BazQux feed-reader and periodically check it out.
IMHO, the item copied below from innov8tiv.com is potentially informative and thus worth a read:
(Copied below sans embedded hyperlinks/images, with my emphasis.)
People browsing using Chrome were quietly logged into their Google accounts without their consents | So much for users’ Privacy (http://innov8tiv.com/people-browsing-using-chrome-were-quietly-logged-into-their-google-accounts-without-their-consents-so-much-for-users-privacy/)
 Felix Omondi  September 24, 2018  Apps and Software

A professor at Johns Hopkins and a cryptography expert, Matthew Green, called out Google for making changes to Chrome, making the browser log in users into their Google account without the consent or even notifying them. A move security experts say puts the users’ privacy into jeopardy.

Historically, Chrome users have had the option of using the browser without logging in to their Google accounts. Although logging in does come with some obvious benefits such as having your bookmarks, passwords, and browsing history synced in the cloud and available across any device you are browsing on using the Chrome browser.

However, security-conscious users may not want Google – the most prominent advertising entity in the world – to have their browsing data for the purposes of sending them targeted ads. Google's change to make the new Chrome log users secretly into their Google Accounts means Google will get the data of users who would otherwise not have logged into their accounts.

Google has come out to address the concerns raised by security experts, stressing that users must have consented to the sync feature before the browser transfers their data. Buried in that response, however, is the revelation that, as it works out, the browser will also automatically log you into your Google account.

So when a user logs in to their Gmail account on the browser, Chrome also automatically logs into their Google account. All that happens without the consent of the user or the user getting notifications.

“Now that I’m forced to log into Chrome,” wrote Green, “I’m faced with a brand new (sync consent) menu I’ve never seen before.”

Copied from: People browsing using Chrome were quietly logged into their Google accounts without their consents | So much for users’ Privacy | Innov8tiv - <http://innov8tiv.com/people-browsing-using-chrome-were-quietly-logged-into-their-google-accounts-without-their-consents-so-much-for-users-privacy/>
Interestingly enough, this would seem to be exactly the sort of thing that HAT (Hub of All Things) (https://www.hubofallthings.com/) - referred to above per Armando (2016-07-29, 14:49:38) (https://www.donationcoder.com/forum/index.php?topic=42877.msg401099#msg401099) - is apparently designed to protect us from, whilst at the same time increasing our privacy and freedom of choice:

What is the Hub of all Things? (https://www.youtube.com/watch?v=kgxKl_OCOaQ)


The Hub of All Things (https://www.youtube.com/watch?v=DAn2HB7FfmM)


Happy days.
Title: Re: Privacy- MEGA - General Data Protection Regulation Disclosure.
Post by: IainB on September 27, 2018, 11:38 AM
MEGA - MEGAsync - General Data Protection Regulation Disclosure: (https://mega.nz/gdpr)
(Copied below sans embedded hyperlinks/images.)
Spoiler
General Data Protection Regulation Disclosure (https://mega.nz/gdpr)

Introduction
In 2013 MEGA pioneered user-controlled end-to-end encryption through a web browser. It provides the same zero-knowledge security for its cloud storage and chat, whether through a web browser, mobile app, sync app or command line tool. MEGA, The Privacy Company, provides Privacy by Design.

As all files uploaded to MEGA are fully encrypted, their contents can’t be read or accessed in any manner by MEGA. Files can only be decrypted by the original uploader through a logged-in account, or by other parties who have been provided with file/folder keys generated by the account user.
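
(As a minimal illustration of the general client-side encryption idea described above - this is not MEGA's actual key scheme, just a sketch using Python's cryptography package - the key is generated and kept on the user's device, so the service only ever stores ciphertext it cannot read:)

    from cryptography.fernet import Fernet

    # The key is generated and kept on the user's device; the service never sees it.
    key = Fernet.generate_key()
    f = Fernet(key)

    # Encryption happens locally, before upload; only the ciphertext is sent.
    ciphertext = f.encrypt(b"contents of my_file.txt")

    # Decryption is only possible for whoever holds the key (or is given it).
    plaintext = f.decrypt(ciphertext)
    assert plaintext == b"contents of my_file.txt"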

Personal data is information relating to an identifiable natural person who can be directly or indirectly identified in particular by reference to an identifier.

MEGA stores the following categories of Personal Data
Contact Details
  • Email addresses
  • User’s name (if provided)

Transaction Details
  • IP address and Source Port for account creation and file uploads
  • Country location (inferred by matching IP to MaxMind IP database)
  • File size and date uploaded
  • Date that file/folder links are created
  • MEGA contacts
  • Chat destination contact(s) and time sent
  • Call destination contact(s), call start time and call duration
  • Subscriptions and payment attempts
  • Information provided to a payment processor when processing a subscription payment, such as Tax ID number, but not the credit/debit card number.

MEGA does not receive or store special categories of personal data or data relating to criminal convictions and offences, as any files that are uploaded to MEGA are fully encrypted at the user’s device so the encrypted data is not able to be decrypted by MEGA.

MEGA doesn’t share the data with any other party other than with competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences and as specified in the Privacy Policy clause 11.

Purpose
The purpose of storing the data is to manage account login and activity and to respond to information demands from authorities.

Processing
MEGA stores personal data but does not carry out any other processing activities on such data. This storage of personal data is necessary in order to provide the secure login to MEGA’s systems and to satisfy compliance obligations.

Lawful Basis of Processing: Contract
The processing of data is necessary for performance of the contract that MEGA has with each user, which they accepted through the Terms of Service when creating their account.

The Terms of Service clause 2 requires the user to agree to the Terms or otherwise to not use the service. Acknowledging and accepting the Terms of Service is a mandatory step in the signup process in all clients - web and mobile.

Clauses 50-51 of the Terms of Service incorporate the Privacy Policy by reference. The Privacy Policy specifies the personal information that is stored.

Retention of Personal Data
Personal data is retained indefinitely while the user’s account is open. After account closure, MEGA will retain all account information as long as there is any law enforcement request pending but otherwise for 12 months after account closure as users sometimes request that an account be re-activated. After 12 months, identifying information such as email and IP addresses will be anonymised (except that email address records will be retained for reference by the user’s contacts or where the user has participated in chats with other MEGA users) but other related database records may be retained.
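
(The disclosure does not say how that anonymisation is performed. As a hypothetical sketch only, one common approach is pseudonymisation by salted one-way hashing, so that related records can still be grouped but the original email address or IP can no longer be read back:)

    import hashlib
    import os

    SALT = os.urandom(16)  # kept secret, or discarded entirely for stronger anonymisation

    def pseudonymise(identifier: str) -> str:
        """Replace an email address or IP address with an irreversible token."""
        return hashlib.sha256(SALT + identifier.encode("utf-8")).hexdigest()

    print(pseudonymise("ann@example.com"))  # hex digest; no practical way back to the address
    print(pseudonymise("203.0.113.7"))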

After user deletion of a file all deleted files will be made inaccessible, marked for deletion and deleted fully when the next appropriate file deletion purging process is run.

After account closure all stored files will be marked for deletion and deleted fully when the next appropriate file deletion purging process is run.

Data Subject’s Rights
Each user has the rights specified in this disclosure notice.

Withdrawal of Consent
Users can only withdraw consent to MEGA collecting the specified personal information if they close their account.

Statutory and Contractual Obligations
Personal Information collected by MEGA is not collected because of any contractual or statutory obligation to third parties.

Automated Decision Making and Profiling
MEGA does not undertake any automated decision making or profiling.

The Right of Access
Individuals have the right to obtain:
  • confirmation that their data is being processed;
  • access to their personal data.

Any requests should be submitted to [email protected]. The information will be provided promptly, and at the latest within one month, without charge unless the request is manifestly unfounded or excessive.

Rectification
Individuals are entitled to have personal data rectified if it is inaccurate or incomplete. If MEGA has disclosed the personal data in question to any third party (such as a compliance authority), it will inform them of the rectification where possible and will also inform the individuals about the third parties to whom the data has been disclosed where appropriate. The only third parties that might have had disclosure are compliance authorities.

Erasure
The right to erasure does not provide an absolute ‘right to be forgotten’. Individuals have a right to have personal data erased and to prevent processing in specific circumstances:
  • The personal data is no longer necessary in relation to the purpose for which it was originally collected/processed.
  • The individual withdraws consent.
  • The individual objects to the processing and there is no overriding legitimate interest for continuing the processing.
  • The personal data was unlawfully processed (i.e. otherwise in breach of the GDPR).
  • The personal data has to be erased in order to comply with a legal obligation.
  • The personal data is processed in relation to the offer of information society services to a child.
Any requests for erasure will be considered in detail and would probably result in closure of the user’s account.

After account closure, MEGA will retain all account information as long as there is any law enforcement request pending but otherwise for 12 months after account closure as users sometimes request that an account be re-activated. After 12 months, identifying information such as email and IP address will be anonymised (except that email address records will be retained for reference by the user’s contacts or where the user has participated in chats with other MEGA users) but other related records may be retained.

After user deletion of a file all deleted files will be made inaccessible, marked for deletion and deleted fully when the next appropriate file deletion purging process is run.

After account closure all stored files will be marked for deletion and deleted fully when the next appropriate file deletion purging process is run.

In some cases a person may receive an email from MEGA asking the person to confirm their new account email address, but in fact they haven’t tried to open an account - someone else has started the process and used their email address either maliciously or by mistake. In these cases, MEGA has an ephemeral/incomplete account that might be used to upload files. On request, and after proving ownership of the email address, MEGA will arrange for the account to be deleted.

MEGA can refuse a request for erasure:
  • to comply with a legal obligation for the performance of a public interest task or exercise of official authority.
  • for public health purposes in the public interest;
  • for the exercise or defence of legal claims.
The Right to Restrict Processing
Individuals have a right to ‘block’ or suppress processing of personal data. When processing is restricted, MEGA is permitted to store the personal data, but not further process it. As MEGA only stores, and doesn’t further process the stored personal data, no action will be taken in response to a request to restrict processing.

Data Portability
The right to data portability only applies:
  • to personal data an individual has provided to a controller;
  • where the processing is based on the individual’s consent or for the performance of a contract; and
  • when processing is carried out by automated means.
On request by email to [email protected], MEGA will provide a user’s personal data in a structured, commonly used and machine readable form such as JSON files.

Note that all files in a user’s account can be downloaded and decrypted through any of the usual clients.

Lead Data Protection Supervisory Authority
The Lead Data Protection Supervisory Authority is the Luxembourg National Commission for Data Protection. This is the appropriate authority for accepting GDPR complaints about MEGA.

NATIONAL COMMISSION FOR DATA PROTECTION
1, avenue du Rock'n'Roll
L-4361 Esch-sur-Alzette
https://cnpd.public.lu

Controller
MEGA Limited
Level 21, Huawei Centre
120 Albert St
Auckland
New Zealand
Company number 4136598

Controller’s Representative
Mega Europe sarl
4 Rue Graham Bell
L-3235 Bettembourg
Luxembourg
Company number B182395
[email protected]

Title: Re: Privacy - HAT (Hub of All Things) = Sovrin ?
Post by: IainB on September 27, 2018, 11:55 AM
EDIT: Oops! Forgot to post this initially:
Identity For All - Permanent Digital Identities that Don’t Require a Central Authority (https://sovrin.org/)
The Sovrin Solution
Sovrin is a decentralized, global public utility for self-sovereign identity. Self-sovereign means a lifetime portable identity for any person, organization, or thing. It’s a smart identity that everyone can use and feel good about. Having a self-sovereign identity allows the holder to present verifiable credentials in a privacy-safe way. These credentials can represent things as diverse as an airline ticket or a driver's license.

Sovrin identities will transform the current broken online identity system. Sovrin identities will lower transaction costs, protect people’s personal information, limit opportunity for cybercrime, and simplify identity challenges in fields from healthcare to banking to IoT to voter fraud.
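
Sovrin's actual stack involves DIDs, a public permissioned ledger and zero-knowledge proofs, none of which is shown here; but purely as a toy sketch of the underlying idea - the holder presents a credential that any verifier can check without consulting a central authority - something like this (Python, cryptography package, invented claim data):

    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # An issuer (say, an airline) signs a claim about the credential holder.
    issuer_key = Ed25519PrivateKey.generate()
    claim = b'{"holder": "ann", "credential": "boarding pass NZ103"}'
    signature = issuer_key.sign(claim)

    # The holder later presents (claim, signature). Any verifier holding the
    # issuer's public key can check it offline - no central identity provider
    # needs to be consulted. verify() raises InvalidSignature if tampered with.
    issuer_key.public_key().verify(signature, claim)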

...Interestingly enough, this would seem to be exactly the sort of thing that HAT (Hub of All Things) (https://www.hubofallthings.com/) - referred to above per Armando (2016-07-29, 14:49:38) (https://www.donationcoder.com/forum/index.php?topic=42877.msg401099#msg401099) - is apparently designed to protect us from, whilst at the same time increasing our privacy and freedom of choice:

What is the Hub of all Things? (https://www.youtube.com/watch?v=kgxKl_OCOaQ)


The Hub of All Things (https://www.youtube.com/watch?v=DAn2HB7FfmM)
...
Title: Re: Privacy: IT companies intend to profit from your loss of privacy?
Post by: IainB on November 15, 2018, 03:42 PM
Well, whilst this news might not be too surprising to some, to me it comes as a complete surprise:
Twitter, Facebook, and Google are Fighting Internet Privacy Laws (https://www.abine.com/blog/2018/twitter-facebook-google-fighting-privacy-laws/)
WRITTEN BY: JULIANNE SUBIA - NOVEMBER 14, 2018

Recently, the Information Technology Industry Council, which represents companies like Amazon, Visa, Microsoft, Google, Facebook, and Apple, released its “Framework to Advance Interoperable Rules (FAIR) on Privacy”. On the surface, it looks like tech companies are trying to protect user privacy. In reality, they want to make sure that they can continue to profit off of our data. Using simple privacy tools like DeleteMe and Blur will help you stay in control of your privacy.
(Read the rest at the link.)

Copied from: Twitter, Facebook, and Google are Fighting Internet Privacy Laws - <https://www.abine.com/blog/2018/twitter-facebook-google-fighting-privacy-laws/>

Oh noes! Looks like we're gonna have to pay money to third parties like abine.com to protect our personal privacy...Oh wait, how did that happen?
Who would'a thunk it, eh?    :tellme:
Title: Re: Privacy - Are your thoughts really your "own" thoughts?
Post by: IainB on December 05, 2018, 03:06 PM
I had always considered that the privacy of my mind was unassailable and that my thoughts were my own, and nobody could take them away from me - even if I were in the Stalags. Now I am not so sure. I commented the other day to my now 17 y/o daughter that, as an experiment, I had for the first time deliberately allowed Google permission to use "my" data - data about me that it already captures and holds and has access to, by default - to aim targeted ads at me. I told her that I found the result interesting, but somewhat disquieting.

Here is a very interesting - if not alarming - review on what happens, apparently almost immediately, when we succumb to allowing this kind of access through our privacy walls, reported on by spreadprivacy.com. Such a loss/reduction in privacy effectively enables third parties to engineer algorithms that could manipulate/modify our paradigms, sometimes without our even being aware of it, and it is happening now, even as I write this. It goes far beyond subliminal advertising, since it can clearly be used - and is being used - to subtly control/manipulate our perception of the reality of the world about us:
Measuring the "Filter Bubble": How Google is influencing what you click (https://spreadprivacy.com/google-filter-bubble-study/)

This goes far beyond merely allowing access to private data, being, in effect, more like giving permission to be brainwashed by a third party(ies). And we seem to be highly susceptible to it. It's very clever, and insidious, though I suppose it could be argued that it's not harmful, but merely a conditioning of one's thinking.
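
To make the mechanism concrete - this is obviously not Google's (or anyone's) actual ranking code, just a toy sketch of the kind of personalised re-ranking the DuckDuckGo study is probing - the same query can return the same results in a different order for each user, driven by a stored profile:

    # Toy personalised re-ranking: same results, different order per user profile.
    results = [
        "site-a.example/topic-overview",
        "site-b.example/topic-debate",
        "site-c.example/topic-research",
    ]

    profiles = {
        "user1": {"site-b.example": 5.0},  # has engaged heavily with site-b before
        "user2": {"site-c.example": 5.0},  # has engaged heavily with site-c before
    }

    def personalised(results, profile):
        # Boost results from domains the user has previously engaged with.
        def boost(url):
            return max((w for domain, w in profile.items() if url.startswith(domain)), default=0.0)
        return sorted(results, key=boost, reverse=True)

    for user, profile in profiles.items():
        print(user, personalised(results, profile)[0])  # each user sees a different top result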

Again, Pandora's box has been well and truly opened:
Fast forward to 2018, where we can perhaps now better understand why we might have the apparent privacy shambles that we see around us. It was a gold-rush, opportunistic, every man for himself. Presumably the Google/Facebook founders (and others) would have seen it coming. There were little/no regulations to limit or constrain the progress of BI and its application in the field of mass demographics. Now that some regulations have belatedly been or are being implemented, it arguably may be too late anyway - locking the stable door after the horse has bolted; Pandora's box has already been opened.
Title: Re: Privacy (collected references)
Post by: 4wd on December 18, 2018, 04:59 PM
The death of the technology industry in Australia happened last week.

EDIT: A possibly better explanation of what the new laws involve: Australia’s horrific new encryption law likely to obliterate its tech scene (https://thenextweb.com/politics/2018/12/10/australias-horrific-new-encryption-law-likely-to-obliterate-its-tech-scene/)

The new law gives Australian law enforcement agencies the power to issue cooperation notices to technology entities with the purpose of gaining access to specific users’ encrypted messages and data. These entities may include companies, websites, or anything else that transmits data to an end-user in Australia.

Australia: Controversial Australian Encryption Act Denounced By Privacy And Cryptography Advocates (http://www.mondaq.com/australia/x/765064/Data+Protection+Privacy/Controversial+Australian+Encryption+Act+Denounced+by+Privacy+and+Cryptography+Advocates)

Last week, Australia's parliament passed a controversial act that will enable law enforcement and intelligence agencies to compel access to encrypted communications. In an explanatory memorandum, the Australian Parliament stated that the new act, the Telecommunications and Other Legislation Amendment (Assistance and Access) Act 2018, is intended to combat "the challenges posed by ubiquitous encryption." Under the act, certain law enforcement and intelligence agencies will be able to approach "designated communication providers," using one of the mechanisms below, for the purpose of gaining access to specific users' encrypted messages and data.

  • Technical Assistance Requests (TARs) – These are voluntary requests that allow law enforcement and intelligence agencies to request access to communications and data while bypassing the oversight rules surrounding mandatory notices. TARs may be issued by the directors-general of the Australian Security and Intelligence Organization (ASIO), the Australian Secret Intelligence Service (ASIS), or the Australian Signals Directorate (ASD), or by the chief officer of an "interception agency," which includes the Australian Federal Police (AFP), the Australian Crime Commission (ACC), and the state and territory police forces, provided that they obtain the approval of the AFP commissioner.
  • Technical Assistance Notices (TANs) – These are compulsory notices requiring a "designated communication provider" to use existing interception or decryption capabilities to provide access to communications or user logs. TANs can be obtained only by the director-general of the ASIO or the chief officer of an interception agency.
  • Technical Capability Notices (TCNs) – These are compulsory notices requiring designated communication providers to build infrastructure to meet subsequent TANs. TCNs may be issued only by the attorney general, with the approval of the minister for communications, following a request from the ASIO or the chief officer of an interception agency, and require written notice to the communication provider, allowing them the opportunity to respond within 28 days.

The new act allows these agencies to directly approach specific individuals, such as engineers or IT administrators at an organization, rather than the organization itself. Companies that resist the demands could face a fine of up to $7.3 million, while individuals who refuse could face jail time.

Australia’s encryption law threatens NZ cloud data (https://www.newsroom.co.nz/2018/12/17/363412/australias-new-encryption-law-threatens-nz-cloud-data)
Tech Companies Line Up To Pan Encryption Bill (https://www.lifehacker.com.au/2018/12/tech-companies-line-up-to-pan-encryption-bill/)
Encrypted Messaging App Signal Won’t Comply With Australia’s New Backdoor Bill (https://www.vice.com/en_au/article/nep5vb/signal-app-australia-encryption-backdoor-bill)

... and more (http://www.google.com/search?q=australia+encryption+bill&source=lnt&tbs=qdr:w) ...

Welcome to Australia, a country run by idiots elected by idiots.
Title: Re: Privacy (collected references)
Post by: IainB on December 20, 2018, 11:06 AM
@4wd: Yes. Some people (not me, you understand) might say that we should have expected to see this sort of messing-about with the privacy rights/rules from the Aussies and that they can't even win a game of cricket without bowling under-arm, or something - but I couldn't possibly comment.    :o
Title: Re: Privacy - DuckDuckGo Testimony Before the US Senate (March 12, 2019)
Post by: IainB on March 24, 2019, 01:23 AM
Very interesting: DuckDuckGo Testimony (on Privacy) Before the US Senate. (https://spreadprivacy.com/us-senate-testimony/)
(Copied below sans embedded hyperlinks/images.)
Below is the prepared testimony of Gabriel Weinberg, CEO & Founder of DuckDuckGo, before the United States Senate Judiciary Committee Hearing on GDPR & CCPA: Opt-ins, Consumer Control, and the Impact on Competition and Innovation.

March 12, 2019

Chairman Graham, Ranking Member Feinstein and Members of the Committee, thank you for holding this important hearing and inviting me to testify. I am here to explain that privacy legislation, like the GDPR and CCPA, is not only pro-consumer, but also pro-business, and pro-advertising.

DuckDuckGo's primary service is a search engine alternative to Google that allows you to search the web without being tracked. We are the fourth largest search engine in the US, serving over one billion searches a month globally. We also offer a mobile privacy browser that serves as an alternative to Google Chrome.

We regularly conduct rigorous consumer research on privacy issues, which we post at SpreadPrivacy.com. We also help educate consumers about online privacy from our Twitter account, @duckduckgo.

I founded DuckDuckGo in 2008, far outside of Silicon Valley, in Valley Forge, Pennsylvania. We now have a distributed workforce spread across the nation in twelve states, the District of Columbia, and in ten other countries.

As you know, people are tired of being watched everywhere they go online. They are fed up with all the intended and unintended consequences this online tracking creates, including invasive ads, identity theft, discrimination, and manipulation. Have you ever searched for something only to see an ad for that very thing pop up in a mobile app or on a different website? DuckDuckGo helps you avoid these types of scenarios by seamlessly reducing your online digital footprint.

Every time you search on DuckDuckGo, it's like you are searching on our site for the first time. We do not even have the concept of a search history.

And we also offer privacy protection beyond the search box. Many companies run sophisticated tracker networks that lurk on the websites you visit. DuckDuckGo’s browser technology blocks such hidden trackers.

In many ways I come to you from the future: I run a business that is already GDPR and CCPA-compliant. Our privacy policy is straightforward and doesn’t require a law degree to decipher: We simply do not collect or share any personal information at all. That’s it — no confusing settings to fiddle with, no jargon-filled pages to read. Yet, even with this simple privacy policy, we nonetheless are able to make money through advertising.

This brings me to my first point: Privacy legislation is not anti-advertising. Take our business for example: When you type in a search on DuckDuckGo, we simply show you ads related to that search. If you search for ‘car’, we show you car ads. But those ads won’t follow you around, because we don’t know who you are, where you’ve been, or where you go. It's contextual advertising versus behavioral advertising.

As a privately held company, our finances are private, though I’m proud to say we’ve been profitable using contextual advertising since 2014, and last year we earned substantially more than the $25 million revenue floor that subjects a company to CCPA.

And we are not alone. For example, in response to GDPR, when the New York Times in Europe switched from behavioral advertising to contextual advertising, it reported an increase in revenue. And just last week, Business Insider reported the Washington Post was looking into making a similar change. If Congress forced the digital advertising industry to return to its roots in contextual advertising, that would allow companies to remain profitable, or even become more profitable — all without the unintended consequences of behavioral advertising.

My second point is that privacy is becoming increasingly good for business. Consumers flock to brands they trust and respect, and according to Harris Poll, data privacy is the most pressing issue on Americans' minds, now for two years in a row. And again, we serve as a great case study, having grown exponentially during this period.

[Chart showing the increase in DuckDuckGo traffic from 2008 to 2019.]

My third point is that well-drafted privacy legislation can spur more competition and innovation in one of the most foundational markets of the Internet: digital advertising. This market is currently a duopoly, and this reality is hurting everyone from small businesses to venture-backed startups to media companies. To restore competition and innovation in this market, the data monopolies at its core need to be addressed.

Fixing this digital-ad-market duopoly can take any number of forms. Here are three suggestions. First, consumers could be given a robust mechanism to opt-out of online tracking. Second, monopoly platforms could be prohibited from combining data across their different business lines. Third, acquisitions that strengthen existing data monopolies could be blocked.

Our mission at DuckDuckGo is to raise the standard of trust online. We support strong privacy legislation that does exactly that. We believe the Internet shouldn’t feel so creepy, and getting the privacy you deserve online should be as easy as closing the blinds.

I am pleased to answer your questions today and make myself available to Members in the future for more in-depth discussions. Thank you.

You can download the PDF version of this testimony here (https://duckduckgo.com/download/GDRP-CCPA-Hearing-Testimony_2019-03-12.pdf).
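
A final note on the contextual-versus-behavioural distinction Weinberg draws above: the difference is essentially which inputs the ad selection is allowed to see. A deliberately over-simplified sketch (hypothetical inventory and data, not anyone's real ad stack):

    # Contextual advertising: the only input is the current query.
    def contextual_ad(query: str) -> str:
        inventory = {"car": "Local car dealership ad", "shoes": "Running-shoe sale ad"}
        return inventory.get(query, "Generic ad")

    # Behavioural advertising: the input is an accumulated profile tied to the user.
    def behavioural_ad(user_history: list) -> str:
        if "car" in user_history:
            return "Car ad that follows you onto unrelated sites"
        return "Retargeted ad based on everything else in the profile"

    print(contextual_ad("car"))             # needs no idea who the searcher is
    print(behavioural_ad(["news", "car"]))  # needs a per-user dossier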