Original post: | 2018-06-28 |
Last updated: | 2018-11-16 |
Three Reasons Why the "Nothing to Hide" Argument is Flawed (https://spreadprivacy.com/three-reasons-why-the-nothing-to-hide-argument-is-flawed/)
27 JUNE 2018/PRIVACY
Over the years, we at DuckDuckGo have often heard a flawed counter-argument to online privacy: “Why should I care? I have nothing to hide.”
As Internet privacy has become more mainstream (https://spreadprivacy.com/mainstream/), this argument is rightfully fading away. However, it’s still floating around and so we wanted to take a moment to explain three key reasons why it's flawed.
1) Privacy isn’t about hiding information; privacy is about protecting information, and surely you have information that you’d like to protect.
- Do you close the door when you go to the bathroom? Would you give your bank account information to anyone? Do you want all your search and browsing history made public? Of course not.
- Simply put, everyone wants to keep certain things private and you can easily illustrate that by asking people to let you make all their emails, texts, searches, financial information, medical information, etc. public. Very few people will say yes.
2) Privacy is a fundamental right and you don't need to prove the necessity of fundamental rights to anyone.
- You should have the right to free speech even if you feel you have nothing important to say right now. You should have the right to assemble even if you feel you have nothing to protest right now. These should be fundamental rights just like the right to privacy.
- And for good reason. Think of commonplace scenarios in which privacy is crucial and desirable like intimate conversations, medical procedures, and voting. We change our behavior when we're being watched, which is made obvious when voting; hence, an argument can be made (https://www.ted.com/talks/glenn_greenwald_why_privacy_matters) that privacy in voting underpins democracy.
3) Lack of privacy creates significant harms that everyone wants to avoid.
- You need privacy to avoid unfortunately common threats like identity theft, manipulation through ads, discrimination based on your personal information, harassment, the filter bubble (https://spreadprivacy.com/filter-bubble/), and many other real harms that arise from invasions of privacy.
- In addition, what many people don’t realize is that several small pieces of your personal data can be put together to reveal much more about you than you would think is possible. For example, an analysis (http://news.mit.edu/2015/identify-from-credit-card-metadata-0129) conducted by MIT researchers found that “just four fairly vague pieces of information — the dates and locations of four purchases — are enough to identify 90 percent of the people in a data set recording three months of credit-card transactions by 1.1 million users.”
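The MIT result rests on a simple counting argument: each additional (date, location) clue shrinks the set of people in the dataset who match all of them. Here is a minimal sketch of that intersection logic, using an invented toy transaction log (all names, dates, and shops are made up for illustration):

```python
from collections import defaultdict

# Toy transaction log: (user, date, shop) triples. In the MIT study, four
# (date, location) pairs sufficed to single out 90% of 1.1 million
# cardholders; this tiny dataset just illustrates the mechanism.
transactions = [
    ("alice", "2015-01-03", "cafe_A"),
    ("alice", "2015-01-07", "grocer_B"),
    ("bob",   "2015-01-03", "cafe_A"),
    ("bob",   "2015-01-09", "cinema_C"),
    ("carol", "2015-01-07", "grocer_B"),
    ("carol", "2015-01-09", "cinema_C"),
]

def users_matching(clues):
    """Return the set of users whose history contains every (date, shop) clue."""
    history = defaultdict(set)
    for user, date, shop in transactions:
        history[user].add((date, shop))
    return {u for u, seen in history.items() if set(clues) <= seen}

# One purchase is still ambiguous; two purchases already pin down one person.
print(users_matching([("2015-01-03", "cafe_A")]))                              # {'alice', 'bob'}
print(users_matching([("2015-01-03", "cafe_A"), ("2015-01-07", "grocer_B")]))  # {'alice'}
```

Each clue acts as a filter, and the intersection of a few filters is very often a single individual — which is why "vague" pieces of data combine into an identifier.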
It’s critical to remember that privacy isn't just about protecting a single and seemingly insignificant piece of personal data, which is often what people think about when they say, “I have nothing to hide.” For example, some may say they don't mind if a company knows their email address while others might say they don't care if a company knows where they shop online.
However, these small pieces of personal data are increasingly aggregated by advertising platforms like Google and Facebook to form a more complete picture of who you are, what you do, where you go, and with whom you spend time. And those large data profiles can then lead much more easily to significant privacy harms. If that feels creepy, it’s because it is.
We can't stress enough that your privacy shouldn’t be taken for granted. The ‘I have nothing to hide’ response does just that, implying that government and corporate surveillance should be acceptable as the default.
Privacy should be the default. We are setting a new standard of trust online and believe getting the privacy you want online should be as easy as closing the blinds.
For more privacy advice, follow us on Twitter (https://twitter.com/duckduckgo) & get our privacy crash course (http://duckduckgo.com/newsletter).
Dax the duck
We are the Internet privacy company that lets you take control of your information, without any tradeoffs. Welcome to the Duck Side!
(Read more. (https://spreadprivacy.com/author/dax/))
Three Reasons Why the "Nothing to Hide" Argument is Flawed (https://spreadprivacy.com/three-reasons-why-the-nothing-to-hide-argument-is-flawed/)-IainB (June 28, 2018, 01:02 AM)
Google weeps as its home state of California passes its own GDPR (https://www.theregister.co.uk/2018/06/29/california_data_privacy_law/)
The right to view and delete personal info is here – and you'll be amazed to hear why the law passed so fast
By Kieren McCarthy in San Francisco, 29 Jun 2018 at 20:02
California has become the first state in the US to pass a data privacy law – with governor Jerry Brown signing the California Consumer Privacy Act of 2018 into law on Thursday.
The legislation will give new rights to the state's 40 million inhabitants, including the ability to view the data that companies hold on them and, critically, request that it be deleted and not sold to third parties. It's not too far off Europe's GDPR.
Any company that holds data on more than 50,000 people is subject to the law, and each violation carries a hefty $7,500 fine. Needless to say, the corporations that make a big chunk of their profits from selling their users' information are not overly excited about the new law.
"We think there's a set of ramifications that's really difficult to understand," said a Google spokesperson, adding: "User privacy needs to be thoughtfully balanced against legitimate business needs."
Likewise, tech industry association the Internet Association complained that "policymakers work to correct the inevitable, negative policy and compliance ramifications this last-minute deal will create."
So far no word from Facebook, which put 1.5 billion users on a boat to California back in April in order to avoid Europe's similar data privacy regulations.
Don't worry if you are surprised by the sudden news that California, the home of Silicon Valley, has passed a new information privacy law – because everyone else is too. And this being the US political system there is, of course, an entirely depressing reason for that.
Another part of the statement by the Internet Association put some light on the issue: "Data regulation policy is complex and impacts every sector of the economy, including the internet industry," it argues. "That makes the lack of public discussion and process surrounding this far-reaching bill even more concerning. The circumstances of this bill are specific to California."
I see...
So this bill was rushed through?
Yes, it was. And what's more, it was signed into law on Thursday by Governor Brown just hours after it was passed, unanimously, by both houses in Sacramento. What led lawmakers to push through privacy legislation at almost unheard-of speed? A ballot measure.
That’s right: since early 2016, a number of dedicated individuals with the funds and legislative know-how to make data privacy a reality had been working on a ballot initiative to give Californians the opportunity to grant themselves their own privacy rights, after every other effort in Sacramento and Washington DC had been shot down by the extremely well-funded lobbyists of Big Tech and Big Cable.
Real estate developer Alastair Mactaggart put about $2m of his own money into the initiative following a chance conversation with a Google engineer in his home town of Oakland in which the engineer told him: "If people just understood how much we knew about them, they’d be really worried."
Mactaggart then spoke about it with a fellow dad at his kid's school, a finance guy called Rick Arney who had previously worked in the California State Senate. Arney walked him through California's unusual ballot measure system, where anyone in the state can put forward an initiative and, if it gets sufficient support, have it put on the ballot paper at the next election.
If a ballot initiative gets enough votes, it becomes law. There have been some good and some bad outcomes from this exercise in direct democracy over the years, but both Mactaggart and Arney felt there was no way a data privacy law would make its way through the corridors of power in Sacramento in the normal way, given the enormous influence of Silicon Valley, so they decided a ballot measure was the way to go.
Beware the policy wonk
One other individual is worth mentioning: Mary Stone Ross, a former CIA employee who had been legal counsel for the House of Representatives Intelligence Committee, and who also lives in Oakland. Mactaggart persuaded her to join the team to craft the actual policy and make sure it could make it through the system.
Together the three of them then spent the next year talking to relevant people – from lawyers to tech experts to academics to ordinary citizens – to arrive at their overall approach and draft the initiative.
And it is at that point that, to put it bluntly, the shit hit the fan. Because the truth is that consumers – and especially Californians, who tend to be more tech-savvy than the rest of the country given the concentration of tech companies in the state – understand the issues around data privacy rules, and they want more rights over their data.
With the initiative well structured and the policy process run professionally, the ballot measure gained the required number of supporters to get it on the ballot. And thanks to the focus groups and polls the group carried out, they were confident that come November it would pass and data privacy would become law through direct democracy.
At which point, it is fair to say, Big Internet freaked out and made lots of visits to lawmakers in Sacramento who also freaked out.
The following months saw a scurry of activity, but if you want to know why the bill became law in almost record time and was signed by Governor Brown on Thursday, all you need to know is this single fact: the deadline for pulling the initiative from November's ballot was last night – Thursday evening – and Mactaggart had said publicly that if the bill was signed, he would do exactly that and pull his ballot measure.
Privy see
You may be wondering why Sacramento was able to get it through unanimously without dozens of Google and Facebook-funded lawmakers continually derailing the effort, especially since it was still a ballot measure. After all, the tech giants could have spent millions campaigning against the measure in a bid to make sure people didn’t vote for it.
And the truth is that they had already lined up millions of dollars to do exactly that. Except they were going to lose: thanks to massively increased public awareness of data privacy following the recent Facebook/Russian election fake news scandal and the European GDPR legislation, it was going to be very hard to push back against the issue. And the measure had been structured extremely well – it was, frankly, good law.
There is another critical component: laws passed through the ballot initiative are much, much harder for lawmakers to change, especially if they are well structured.
So suddenly Big Tech and Sacramento were faced with a choice: pass data privacy legislation at record speed and persuade Mactaggart to pull his ballot initiative with the chance to change it later through normal legislative procedures; or play politics as usual and be faced with the same law but one that would be much harder to change in future.
And, of course, they went with the law. And Mactaggart, to his eternal credit, agreed to pull his ballot measure in order to allow the "normal" legislative approach to achieve the same goal.
And so the California Consumer Privacy Act of 2018 is now law and today is the first day that most Californians will have heard of it. Sausage making at its finest.
Of course, Google, Facebook et al are going to spend the next decade doing everything they can trying to unravel it. And as we saw just last week, lawmakers are only too willing to do the bidding of large corporate donors. But it is much harder to put a genie back in the bottle than it is to stop it getting out. ®
Copied from: Google weeps as its home state of California passes its own GDPR • The Register - <https://www.theregister.co.uk/2018/06/29/california_data_privacy_law/>
...Here's a more in-depth paper on the subject - thank you! :Thmbsup:
"I've Got Nothing to Hide" and Other Misunderstandings of Privacy by Daniel J. Solove (https://www.donationcoder.com/forum/index.php?topic=20287.msg420924#msg420924)
Disclaimer: I haven't taken the time to read it yet, so I can't speak to its contents.-Deozaan (June 28, 2018, 03:38 AM)
This is a tangentially related bit of irony:
I went to download a paper on privacy called "I've Got Nothing to Hide" and Other Misunderstandings of Privacy by Daniel J. Solove (https://papers.ssrn.com/sol3/papers.cfm?abstract_id=998565), but since the website detected that I was using an anonymous proxy, they tried to get me to register for an account so they could track me, and made me complete the reCAPTCHA three times when I insisted on clicking the (almost hidden) link to continue downloading anonymously.-Deozaan (June 18, 2018, 07:18 PM)
* © Daniel J. Solove 2007. Associate Professor, George Washington University
Law School; J.D., Yale Law School. Thanks to Chris Hoofnagle, Adam Moore, and Michael
Sullivan for helpful comments, and to my research assistant Sheerin Shahinpoor. I
develop some of the ideas in this essay in significantly more depth in my forthcoming
book, Understanding Privacy, to be published by Harvard University Press in May 2008.
(From the footnote to the cover page of: “I’ve Got Nothing to Hide” and Other
Misunderstandings of Privacy.
DeepMind's 'Demis Hassabis is an individual' – Ministry of Fun
By Andrew Orlowski, 29 Jun 2018 at 09:55
DeepMind co-founder Demis Hassabis (Pic: Debby Wong / Shutterstock.com)
Google is not advising the British government on AI, the Ministry of Fun assured this week, following the appointment of Google's Demis Hassabis as an advisor on AI.
The US ad, search and cloud biz acquired Hassabis' company DeepMind four years ago and he has since been a Google employee. In the words of The Guardian, Hassabis is "leading Google's project to build software more powerful than the human brain".
Earlier this week, the Department for Digital, Culture, Media and Sport – aka the Ministry of Fun – announced the creation of a new "AI Council" and appointed Hassabis as its advisor. The department seemed pleased with landing such a trophy, explaining that Hassabis "will provide expert industry guidance to help the country build the skills and capability it needs to capitalise on the huge social and economic potential of AI – a key part of the Government's modern Industrial Strategy."
But just because a Google employee is giving the government advice, that doesn't necessarily mean a Google employee is giving the government advice. You would be quite wrong to think that.
(Read the rest at the link.)
Copied from: UK.gov is not being advised by Google. Repeat. It is not being advised by Google • The Register - <https://www.theregister.co.uk/2018/06/29/ministry_of_fun_is_not_being_advised_by_google/>
Scott Adams Blog: Message to My Government 03/06/2014 (http://dilbert.com/blog/entry/message_to_my_government/)
Mar 6, 2014
I never felt too violated by the news that my government can snoop on every digital communication and financial transaction I make. Maybe I should have been more bothered, but the snooping wasn't affecting my daily life, and it seemed like it might be useful for fighting terrorism, so I worried about other things instead.
This week, as I was pulling together all of my records to do taxes, I didn't get too upset that the process of taxpaying is unnecessarily frustrating and burdensome. As a citizen, I do what I need to do. I'm a team player.
I have also come to peace with the fact that my government now takes about half of my income. I figure most of it goes to good causes. I'm here to help.
I take pride in the fact that I don't let the little things get to me.
But the other day, as I was crawling my way through mountains of statements and receipts, trying to organize my records for my accountant, with several more days of this drudgery ahead, I had a disturbing thought. I must warn you in advance that this disturbing thought can only be expressed in all capital letters and it must include profanity. It goes like this.
Message to my government:
DO MY FUCKING TAXES FOR ME, YOU ASSHOLES!!! YOU ALREADY KNOW EVERY FUCKING THING I DID THIS YEAR!!!
Seriously.-IainB (March 11, 2014, 05:57 AM)
Facebook Acknowledges It Shared User Data With 61 Companies (https://www.bleepingcomputer.com/news/technology/facebook-acknowledges-it-shared-user-data-with-61-companies/)
tags: Technology
Catalin Cimpanu - 2018-07-02
In a 747-page document (https://docs.house.gov/meetings/IF/IF00/20180411/108090/HHRG-115-IF00-Wstate-ZuckerbergM-20180411-SD003.pdf) provided to the US House of Representatives' Energy and Commerce Committee on Friday, Facebook admitted that it granted special access to users' data to 61 tech companies.
According to the document (https://docs.house.gov/meetings/IF/IF00/20180411/108090/HHRG-115-IF00-Wstate-ZuckerbergM-20180411-SD003.pdf), these 61 companies received a "one-time" extension so they could update their apps in order to comply with a Terms of Service change the company applied in May 2015.
61 companies received API exemptions in 2015
The six-month extension applied from May 2015 onward, when Facebook restricted its API so apps could not access too much data on its users, and especially the data of users' friends.
The API change came in a period when apps like the one developed by Cambridge Analytica were using the Facebook API to mass-harvest the data of Facebook users.
In May 2015, Facebook realized that apps were abusing this loophole in its permission system to trick one user into granting access to the personal data of hundreds of their friends, and restricted the Facebook API to prevent indirect data harvesting.
But these 61 tech companies, because they ran popular apps, received an exemption from this API change; during the extension period they could, theoretically, have abused the Facebook API to collect data on Facebook users and their friends. Data that could have been collected included name, gender, birthdate, location, photos, and page likes.
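The loophole described above is easy to model: one consenting user's friend list becomes an index into data on people who never opted in. A hypothetical sketch follows; all users, fields, and the `harvest` helper are invented for illustration and are not Facebook's actual API:

```python
# Toy model of the pre-May-2015 "friends data" loophole: a single consenting
# user lets an app read profile fields of all their friends, none of whom
# installed the app or agreed to anything.
profiles = {
    "u1": {"name": "U1", "birthdate": "1990-01-01", "likes": ["hiking"]},
    "u2": {"name": "U2", "birthdate": "1985-05-05", "likes": ["jazz"]},
    "u3": {"name": "U3", "birthdate": "1992-09-09", "likes": ["chess"]},
    "u4": {"name": "U4", "birthdate": "1988-03-03", "likes": ["cooking"]},
}
friends = {"u1": ["u2", "u3", "u4"]}  # u1's friend list

def harvest(consenting_users):
    """Collect data on consenting users AND their friends (the loophole)."""
    collected = {}
    for user in consenting_users:
        collected[user] = profiles[user]
        for friend in friends.get(user, []):
            collected[friend] = profiles[friend]  # friend never consented
    return collected

# One user's consent exposes four people's data.
print(len(harvest(["u1"])))  # 4
```

This is why a quiz app installed by a few hundred thousand users could end up with profiles on tens of millions of people.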
Facebook did not say if any of these companies abused this extension period to harvest data on users and their friends. The list of 61 companies that received an API extension includes:
1. ABCSocial, ABC Television Network
2. Actiance
3. Adium
4. Anschutz Entertainment Group
5. AOL
6. Arktan / Janrain
7. Audi
8. biNu
9. Cerulean Studios
10. Coffee Meets Bagel
11. DataSift
12. Dingtone
13. Double Down Interactive
14. Endomondo
15. Flowics, Zauber Labs
16. Garena
17. Global Relay Communications
18. Hearsay Systems
19. Hinge
20. HiQ International AB
21. Hootsuite
22. Krush Technologies
23. LiveFyre / Adobe Systems
24. Mail.ru
25. MiggoChat
26. Monterosa Productions Limited
27. never.no AS
28. NIKE
29. Nimbuzz
30. NISSAN MOTOR CO / Airbiquity Inc.
31. Oracle
32. Panasonic
33. Playtika
34. Postano, TigerLogic Corporation
35. Raidcall
36. RealNetworks, Inc.
37. RegED / Stoneriver RegED
38. Reliance/Saavn
39. Rovi
40. Salesforce/Radian6
41. SeaChange International
42. Serotek Corp.
43. Shape Services
44. Smarsh
45. Snap
46. Social SafeGuard
47. Socialeyes LLC
48. SocialNewsdesk
49. Socialware / Proofpoint
50. SoundayMusic
51. Spotify
52. Spredfast
53. Sprinklr / Sprinklr Japan
54. Storyful Limited / News Corp
55. Tagboard
56. Telescope
57. Tradable Bits, TradableBits Media Inc.
58. UPS
59. Vidpresso
60. Vizrt Group AS
61. Wayin
Of the list above, Serotek received an eight-month extension.
Facebook points the finger at five other companies
Facebook also said it identified five other companies that tested beta versions of their apps with the "theoretical" capability of harvesting a user's friends' data. The list includes:
1. Activision / Bizarre Creations
2. Fun2Shoot
3. Golden Union Co.
4. IQ Zone / PicDial
5. PeekSocial
"We are not aware that any of this handful of companies used this access, and we have now revoked any technical capability they may have had to access any friends' data", Facebook said.
Facebook slowly closing all loopholes
Facebook also announced it was discontinuing 38 partnerships with companies it had authorized to build versions of Facebook or Facebook features for custom devices and products, and which may also have gained extensive access to user data.
Last week, a security researcher discovered (https://medium.com/@intideceukelaire/this-popular-facebook-app-publicly-exposed-your-data-for-years-12483418eff8) another quiz app, similar to the one developed by Cambridge Analytica, which gained access to and later exposed the details of over 120 million Facebook users.
The app was named Nametests.com, associated with the eponymous website. Current evidence doesn't suggest the data collected by this second quiz app was used for political ads and influence campaigns like those run with the data Cambridge Analytica collected.
_________________
CATALIN CIMPANU
Catalin Cimpanu is the Security News Editor for Bleeping Computer, where he covers topics such as malware, breaches, vulnerabilities, exploits, hacking news, the Dark Web, and a few more. Catalin previously covered Web & Security news for Softpedia between May 2015 and October 2016. The easiest way to reach Catalin is via his XMPP/Jabber address at [email protected]. For other contact methods, please visit Catalin's author page.
Copied from: Facebook Acknowledges It Shared User Data With 61 Companies - <https://www.bleepingcomputer.com/news/technology/facebook-acknowledges-it-shared-user-data-with-61-companies/>
What a VPN can do for digital privacy: One of the best tools users can deploy to improve their privacy online is arguably a VPN. The post provides a good overview of what a VPN is, its benefits, and how it can be used in conjunction with the GDPR legislation to protect your privacy. There are recommendations for the "best" VPNs for GDPR.-IainB (July 08, 2018, 02:02 AM)
I'm surprised they didn't mention ProtonVPN (https://protonvpn.com/).-Deozaan (July 08, 2018, 05:46 PM)
In addition to strong technical security, ProtonVPN also benefits from strong legal protection. Because we are based in Switzerland, ProtonVPN is protected by some of the world's strongest privacy laws and remains outside of US and EU jurisdiction. This means that unlike VPN providers based in a fourteen eyes country, we cannot be coerced into spying on our users.
Further intelligence sharing collaborations
As spelled out by Privacy International, there are a number of issue-specific intelligence agreements that include some or all the above nations and numerous others, such as:
- An area specific sharing amongst the 41 nations that formed the allied coalition in Afghanistan;
- A shared effort of the Five Eyes nations in "focused cooperation" on computer network exploitation with Austria, Belgium, Czech Republic, Denmark, Germany, Greece, Hungary, Iceland, Italy, Japan, Luxembourg, the Netherlands, Norway, Poland, Portugal, South Korea, Spain, Sweden, Switzerland and Turkey;
- Club of Berne: 17 members, including primarily European states; the US is not a member;
- The Counterterrorist Group: a wider membership than the 17 European states that make up the Club of Berne, and includes the US;
- NATO Special Committee: made up of the heads of the security services of NATO's 28 member countries;
EDIT 2018-07-09:
NB: TRUST is a key issue here. There is a caveat that many organisations in the business of providing $PAID-for VPN services seem to conceal: not all VPN providers actually operate a trustworthy service from the user's perspective, and your logged VPN activity data could be made available to government or other authorities through legal or other compulsion (or even corruption or informal agreement).
If they want you bad enough I doubt whether a VPN provider anywhere is going to stop them.-4wd (July 08, 2018, 10:25 PM)
Q: What are your thoughts on internet privacy?
Big internet companies like Facebook or Google have effectively hijacked the privacy discourse in recent years. Their marketers managed to convince the public that the most important things about privacy are superficial tools that allow hiding your public posts or your profile pictures from the people around you. Adding these superficial tools lets companies calm down the public while changing nothing in how they turn over private data to marketers and other third parties.
At Telegram we think that the two most important components of Internet privacy should be instead:
Protecting your private conversations from snooping third parties, such as officials, employers, etc.
Protecting your personal data from third parties, such as marketers, advertisers, etc.
This is what everybody should care about, and these are some of our top priorities. Telegram's aim is to create a truly free messenger, without the usual caveats (https://telegram.org/privacy). This means that instead of diverting public attention with low-impact settings, we can afford to focus on the real privacy issues that exist in the modern world.
Q: What about GDPR?
New regulations regarding data privacy called the General Data Protection Regulation (GDPR) came into force in Europe on May 25, 2018. Since taking back our right to privacy was the reason we made Telegram, there wasn‘t much we had to change. We don’t use your data for ad targeting, we don’t sell it to others, and we’re not part of any mafia family “family of companies.”
Telegram only keeps the information it needs to function as a feature-rich cloud service — for example, your cloud chats so that you can access them from any devices without using third-party backups, or your contacts so that you can rely on your existing social graph when messaging people on Telegram.
We're still working with our lawyers on an update to the Telegram Privacy Policy (https://telegram.org/privacy) that will lay this out in even more detail (don‘t expect any dramatic changes there though). We’ll notify you when it's ready.
For now, please feel free to use our new @GDPRbot (https://t.me/gdprbot) to:
- Request a copy of all your data that Telegram stores.
- Contact Telegram's Data Protection Officer.
Android (https://play.google.com/store/apps/details?id=org.telegram.messenger) users got a GDPR update with version 4.8.9, which allows more control over synced contacts and adds other privacy settings. On June 1, Apple approved (https://t.me/durov/88) Telegram v4.8.2 for iOS with these features.
Q: There's illegal content on Telegram. How do I take it down?
All Telegram chats and group chats are private amongst their participants. We do not process any requests related to them. ...
Q: A bot or channel is infringing on my copyright. What do I do?
All Telegram chats and group chats are private amongst their participants. We do not process any requests related to them. ...
Please don't think a VPN is going to give you any form of privacy.
A VPN lets you access a remote network securely across an insecure line - this is the only thing it's guaranteed to do. It's the only thing you should be using it for. Stop spreading the damn misconception that it's useful for privacy.-f0dder (July 24, 2018, 11:04 AM)
Doesn't it help prevent tracking?
Not really, no. You have to consider that most people aren't on static global IPs, but will either have dynamic IPs, or even (a very large number) be behind CGNAT. The tracking folks obviously want to be able to uniquely identify you even in spite of that, and across devices as well.-Deozaan (July 24, 2018, 02:10 PM)
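One reason a VPN's fresh IP address doesn't defeat tracking is browser fingerprinting: trackers can derive a stable identifier from attributes the browser reports regardless of the network path. A toy sketch of why such an identifier survives an IP change (the attribute names and hashing scheme here are invented for illustration, not any real tracker's method):

```python
import hashlib

def fingerprint(attrs):
    """Hash a canonicalized set of client attributes into a stable ID (toy example)."""
    canonical = "|".join(f"{k}={v}" for k, v in sorted(attrs.items()))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# The same browser seen from two different IPs (e.g. home vs. a VPN exit node):
visit_home = {"ua": "Mozilla/5.0 ...", "lang": "en-US", "screen": "1920x1080",
              "tz": "UTC-8", "fonts": "Arial,Helvetica"}
visit_vpn = dict(visit_home)  # identical browser attributes; only the IP changed

# The derived ID is identical, so the two visits link to the same device
# even though the VPN changed the source IP.
print(fingerprint(visit_home) == fingerprint(visit_vpn))  # True
```

The IP address never enters the calculation, which is exactly the point: hiding it hides nothing the fingerprint relies on.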
Facebook insists it has 'no plans' to exploit your personal banking info for ads – just as we have 'no plans' to trust it (https://www.theregister.co.uk/2018/08/07/facebook_banking_data/)
After all, never say never!
By Kieren McCarthy in San Francisco, 7 Aug 2018 at 20:44
Analysis: Facebook has denied it is seeking to suck up netizens' bank account details, claiming it just wants to connect bank customers to their bank's chat accounts and give useful financial updates. ...
Copied from: Facebook insists it has 'no plans' to exploit your personal banking info for ads – just as we have 'no plans' to trust it • The Register - <https://www.theregister.co.uk/2018/08/07/facebook_banking_data/>
Looks like the Ugandan government could be in the vanguard when it comes to, uh, privacy...
...Uganda orders ISPs to block Ugandans from accessing Pornographic Websites (http://innov8tiv.com/uganda-orders-isps-to-block-ugandans-from-accessing-pornographic-websites/) Nice one! :Thmbsup:-IainB (July 29, 2018, 08:14 PM)
From the U.S. Department of Housing and Urban Development (https://www.hud.gov/):
HUD FILES HOUSING DISCRIMINATION COMPLAINT AGAINST FACEBOOK (https://www.hud.gov/press/press_releases_media_advisories/HUD_No_18_085)
HUD No. 18-085
HUD Public Affairs
(202) 708-0685
FOR RELEASE
Friday
August 17, 2018
Secretary-initiated complaint alleges platform allows advertisers to discriminate
WASHINGTON – The U.S. Department of Housing and Urban Development (HUD) announced today a formal complaint against Facebook for violating the Fair Housing Act by allowing landlords and home sellers to use its advertising platform to engage in housing discrimination.
HUD claims Facebook enables advertisers to control which users receive housing-related ads based upon the recipient's race, color, religion, sex, familial status, national origin, disability, and/or zip code. Facebook then invites advertisers to express unlawful preferences by offering discriminatory options, allowing them to effectively limit housing options for these protected classes under the guise of 'targeted advertising.' Read HUD's complaint against Facebook (https://www.hud.gov/sites/dfiles/PIH/documents/HUD_01-18-0323_Complaint.pdf).
"The Fair Housing Act prohibits housing discrimination including those who might limit or deny housing options with a click of a mouse," said Anna María Farías, HUD's Assistant Secretary for Fair Housing and Equal Opportunity. "When Facebook uses the vast amount of personal data it collects to help advertisers to discriminate, it's the same as slamming the door in someone's face."
The Fair Housing Act prohibits discrimination in housing transactions including print and online advertisement on the basis of race, color, national origin, religion, sex, disability, or familial status. HUD's Secretary-initiated complaint follows the Department's investigation into Facebook's advertising platform which includes targeting tools that enable advertisers to filter prospective tenants or homebuyers based on these protected classes.
For example, HUD's complaint alleges Facebook's platform violates the Fair Housing Act by enabling advertisers to, among other things:
- display housing ads only to men or only to women;
- not show ads to Facebook users interested in an "assistance dog," "mobility scooter," "accessibility" or "deaf culture";
- not show ads to users whom Facebook categorizes as interested in "child care" or "parenting," or show ads only to users with children above a specified age;
- display or not display ads to users whom Facebook categorizes as interested in a particular place of worship, religion or tenet, such as the "Christian Church," "Sikhism," "Hinduism," or the "Bible";
- not show ads to users whom Facebook categorizes as interested in "Latin America," "Canada," "Southeast Asia," "China," "Honduras," or "Somalia";
- draw a red line around zip codes and then not display ads to Facebook users who live in those zip codes.
Additionally, Facebook promotes its advertising targeting platform for housing purposes with "success stories" (https://www.facebook.com/business/success/quadrant-homes) for finding "the perfect homeowners," "reaching home buyers," "attracting renters" and "personalizing property ads."
In addition, today the U.S. Attorney for the Southern District of New York (SDNY) filed a statement of interest, joined in by HUD, in U.S. District Court on behalf of a number of private litigants challenging Facebook's advertising platform.
HUD Secretary-Initiated Complaints
The Secretary of HUD may file a fair housing complaint directly against those whom the Department believes may be in violation of the Fair Housing Act. Secretary-Initiated Complaints are appropriate in cases, among others, that involve significant issues national in scope; where the Department is made aware of potential violations of the Act and broad public-interest relief is warranted; or where HUD does not know of a specific aggrieved person or injured party who is willing or able to come forward. A Fair Housing Act complaint, including a Secretary-Initiated Complaint, is not a determination of liability.
A Secretary-Initiated Complaint will result in a formal fact-finding investigation. The party against whom the complaint is filed will be provided notice and an opportunity to respond. If HUD's investigation results in a determination that reasonable cause exists that there has been a violation of the Fair Housing Act, a charge of discrimination may be filed. Throughout the process, HUD will seek conciliation and voluntary resolution. Charges may be resolved through settlement, through referral to the Department of Justice, or through an administrative determination.
This year marks the 50th anniversary of the Fair Housing Act. In commemoration, HUD, local communities, and fair housing organizations across the country have coordinated a variety of activities to enhance fair housing awareness, highlight HUD's fair housing enforcement efforts, and end housing discrimination in the nation. For a list of activities, log onto www.hud.gov/fairhousingis50.
Persons who believe they have experienced discrimination may file a complaint by contacting HUD's Office of Fair Housing and Equal Opportunity at (800) 669-9777 (voice) or (800) 927-9275 (TTY).
###
HUD's mission is to create strong, sustainable, inclusive communities and quality affordable homes for all.
More information about HUD and its programs is available on the Internet
at www.hud.gov and https://espanol.hud.gov.
You can also connect with HUD on social media and follow Secretary Carson on Twitter and Facebook or sign up for news alerts on HUD's Email List.
LifeLock has been running ads centered on the Equifax breach, with press statements saying how the breach shows how important its own services (at up to $29.99 a month) can be to protect you from identity theft....Here's what LifeLock isn't spreading so widely: When you buy its protection, you're signing up for credit report monitoring services provided by, yes, Equifax....You just can't make this stuff up: Full Article here (https://latesthackingnews.com/2017/09/20/lifelock-gaining-lot-customer-base-taking-advantage-equifaxs-breach/)
Maybe it's just my rather dark sense of humor, but I just couldn't stop laughing after I read that. The unmitigated gall of corporations these days is just flat-out mind blowing.-Stoic Joker (September 20, 2017, 03:39 PM)
RUMPEL web browser's aim: take back control of your personal data:
Interesting open source project led by the University of Warwick. Its aim is to help users keep track of where information about them is stored online so that they can actually -- personally -- benefit from it. An important issue; whether this is a viable solution or not is another one... :)
[...] a marketing professor at the University of Warwick who led RUMPEL's development, said: "It's time for people to claim their data from the internet."
TechRadar article: New web browser lets you take back control of your personal data (http://www.techradar.com/us/news/world-of-tech/new-web-browser-lets-you-take-back-control-of-your-personal-data-1325545)
"The aim of RUMPEL is to empower users and enable them to be served by the ocean of data about them that's stored in all kinds of places online, so that it benefits them and not just the businesses and organisations that harvest it," she added.
"The strapline 'Your Data, Your Way' reflects our determination to let people lead smarter lives by bringing their digital lives back under their own control."
And the GitHub RUMPEL project (https://github.com/Hub-of-all-Things/rumpel).-Armando (July 29, 2016, 02:49 PM)
Open Future
Toward defining privacy expectations in an age of oversharing (https://www.economist.com/open-future/2018/08/16/toward-defining-privacy-expectations-in-an-age-of-oversharing)
Our digital data deserves protection, writes Margot Kaminski, a law professor
Aug 16th 2018, by MARGOT KAMINSKI
What assurances of privacy do we have in this digital age? Until this year, the quick answer was: effectively none. We share personal data with companies, who in turn share it with other companies and the government, with few if any legal means of individual redress or control. But if 2018 will be remembered as the year of Cambridge Analytica—a British data-mining company that allegedly influenced both British and American elections by targeting voters using personal data—it will also be remembered as the year that privacy law finally started catching up to the Internet.
In America, the “third-party doctrine” has long governed privacy law. This view holds that one can have no privacy expectations in information shared with a third party. The government could obtain a list of phone numbers you called without a warrant, for instance, because you shared that information with a phone company.
This runs counter to most people’s expectations, especially today. Privacy is many things to many people, but one thing it is not is absolute secrecy. We share sensitive information with our doctors; we hold whispered conversations in public places; we rely on practical obscurity even in big cities; and we disclose our most intimate information by text and by email.
Helen Nissenbaum, an ethicist at Cornell University, refers to the foundation of our digital-privacy expectations as “contextual integrity”. When we reveal information in one context, we trust that it won’t pop up to surprise us in another.
Another way to think of it is that we regularly use perceived features of our environments, both physical and social, to manage degrees of disclosure. An email service that uses a mailbox as its logo signals that email will be kept between sender and recipient—just like a regular letter—even if it is in fact stored on a company’s servers.
In June 2018, however, the Supreme Court struck a serious and welcome blow to the third-party doctrine in its Carpenter v. United States ruling. That case asked whether police needed a warrant to access someone’s mobile-phone location data. The Court held that historic mobile-phone location data deserved privacy protections, even if it is shared (often unknowingly) with a mobile-phone service provider.
The Court recognised that what used to be non-sensitive data—a record of your travels through public places—has, in the age of big data, effectively been converted into sensitive information. When gathered en masse and analysed, where someone travels can reveal her religion, health problems, sexual preferences and political affiliations. The Court thus recognised that privacy harms can trigger a wealth of related harms, chilling freedom of speech and freedom of association.
While 2018 brought paradigm shifts to American privacy law, in Europe it brought the General Data Protection Regulation (GDPR). The significance of the GDPR goes beyond the annoying barrage of privacy notices that popped up in May. It establishes enforceable individual transparency and control rights.
But the GDPR’s real impact will be within companies, behind the scenes. Backed by significant penalties for breaches (up to 4% of worldwide annual revenue), the GDPR imposes a series of duties on companies, regardless of whether individuals invoke their privacy rights. It requires companies to articulate legitimate reasons for collecting data; to collect only the data that they need; to design new products in ways that protect individual rights; and sometimes to appoint privacy officers and conduct privacy impact-assessments.
At first glance, the gap between Europe and America still appears enormous. The EU now has the GDPR; America continues to lack comprehensive federal data privacy law, relying instead on a patchwork of consumer protection, state laws, and sector-specific laws like the Health Insurance Portability and Accountability Act (HIPAA).
But two recent events have resulted in a surprising array of commonalities: the above-mentioned Carpenter case, and California’s Consumer Privacy Act (CPA), which California passed less than a month after the Carpenter ruling, and which creates an American version of data-protection law.
The California CPA governs not just information that people share directly with companies, but also personal data held by commercial data-brokers. Just as Carpenter suggests that legal protections follow even shared personal data, the CPA imposes transparency and control requirements even on companies that have no direct relationship with consumers. In this way, the CPA represents a shift towards the data protection model embraced in the GDPR. Legal protection travels with the data, regardless of whether or not there is a consumer relationship.
This is not to say that Europe and America are converging. For one, the CPA applies only to California residents (although because California is such a big market the law may influence policies for all Americans—referred to in the context of automobile regulations as the “California effect”). America also has a robust, and in some situations increasingly deregulatory, free speech right in the First Amendment that will likely come into conflict with deletion and disclosure rights.
But there is a growing transatlantic consensus emerging on privacy in the digital age. Sharing data no longer obviates privacy. Privacy protections now increasingly travel with personal information, even if that information is something a company has inferred rather than collected. Both legal systems also increasingly recognise that privacy is, perhaps counterintuitively, deeply linked to transparency: people cannot exert control or request remedies if they do not know where their information is going.
Perhaps most significantly, both legal regimes now exhibit a growing awareness of how linked privacy is to other well-recognised legal harms such as chilling effects on free expression or discrimination against individuals. Even if America does not enact a federal privacy law, the age of free data and diminishing data privacy looks to be rapidly ending.
Margot Kaminski is a professor at Colorado Law. She teaches, researches and writes on the intersection of law and technology.
The amusing Black Mirror video above, though artificial, is arguably a prescient comment on the implicit risks inherent in a tendency for "oversharing" and/or "data leakage" in the IT-enabled SNM (Social Network Marketplace) and other personal-data-related databases (e.g., health, insurance, banking, education), where private data that one might have previously perceived as being peculiar and useful/relevant in one context only is subsequently seen to be useful/relevant in another, or maybe many other contexts. These are typically the data connections and interconnections that the SNM operators and data miners would tend to seek/exploit, for financial gain.
When I was contracted in 2003 to get a failed data analysis project back on the rails, I learned quite a lot. It was a complex IT project to implement a BI (Business Intelligence) system, and we had to train the users in the application of BI (it's actually quite powerful and hairy stuff) to meet the growing and complex business needs of the power (energy) company that had contracted me into the project recovery role. I learned that what the Defence sector had always taught was true: all data/information can be interconnected at some stage. For BI, the world could be simply envisaged as one or more universes of dynamic data, each having its own peculiar descriptive and dynamic data model. As in the popular SF concept of parallel universes, there was the potential to interlink these data universes (mass aggregations of dynamic data sets), constantly combining/recombining and drawing data from one to the other, enabling the BI analyst to discover data relationships in a way that would probably not previously have been feasible on such a mass scale, using the then-prevailing technologies.
Fast forward to 2018, where we can perhaps now better understand why we might have the apparent privacy shambles that we see around us. It was a gold rush: opportunistic, every man for himself. Presumably the Google/Facebook founders (and others) would have seen it coming. There were few or no regulations to limit or constrain the progress of BI and its application in the field of mass demographics. Now that some regulations have belatedly been or are being implemented, it arguably may be too late anyway - locking the stable door after the horse has bolted; Pandora's box has already been opened.-IainB (August 22, 2018, 04:45 AM)
According to Hesiod, when Prometheus stole fire from heaven, Zeus, the king of the gods, took vengeance by presenting Pandora to Prometheus' brother Epimetheus. Pandora opened a jar left in his care containing sickness, death and many other unspecified evils which were then released into the world.[4] Though she hastened to close the container, only one thing was left behind – usually translated as Hope, though it could also have the pessimistic meaning of "deceptive expectation".[5]
The thing about "Pandora's Box" was that the contents (the troubles), once released into the world as a result of Pandora's curiosity, were enduring and timeless and apparently could never be put back into the box, thus adversely affecting all humankind over time, from that point onwards.
From this story has grown the idiom "to open (a) Pandora's box", meaning to do or start something that will cause many unforeseen problems.[6] Its modern, more colloquial equivalent is "to open a can of worms".[7]
Source: https://en.wikipedia.org/wiki/Pandora%27s_box
The Shadow Education Secretary wants to make teachers more vulnerable (https://www.samizdata.net/2018/09/the-shadow-education-secretary-wants-to-make-teachers-more-vulnerable/)
tags: Civil liberty & Regulation, Education & Academia, Internet, Privacy & Panopticon, UK affairs
Natalie Solent (Essex) - September 23rd, 2018
The Shadow Education Secretary, Angela Rayner MP (Lab), has called for a ban on anonymous online accounts (https://www.theguardian.com/politics/2018/sep/23/ban-anonymous-accounts-angela-rayner-tells-social-media-firms).
The education spokesperson also called for social media companies to ban anonymous accounts, complaining at a fringe event organised by the Guardian in Liverpool that most of the people that abused her online did so without using their real names.
Rayner said that social media firms should take greater responsibility for their users and complained in particular that Facebook seemed to have indicated that politicians should accept a higher level of abuse.
When asked what she thought about social media, Rayner said: "One of the first things they should do is stop anonymous accounts. Most people who send me abuse do so from anonymous accounts and wouldn't dream of doing it in their own name."
Rayner conceded that using real names would not stop abuse but “it would certainly help a little bit. I think they should do more, they do have a responsibility for online”.
___________________________________
As I mentioned earlier, Angela Rayner is the Shadow Education Secretary. That ought to mean that she is aware that teachers, like MPs, are often subject to harassment. The Times Educational Supplement had an article on that very subject just a few days ago: "Why your social account is not as private as you think" (https://www.tes.com/news/why-your-social-account-not-private-you-think). It began:
The teacher's Facebook account was set to private. She was certain of that. Yet, in the past week, she had received four friend requests from former pupils. She could not work out how they had found her.
So, as I am a researcher at the Greater Manchester Police – and her friend – she asked me to take a look. Within 10 minutes, I had not just found her, but I also had her full name, her partner’s name, the school she worked at, the name of one of her children and multiple images of the street she lives on.
___________________________________
The writer, Chris Glover, proceeded to give ten tips that teachers should employ to protect themselves:
- 1. Keep accounts separate.
- 2. Vary usernames.
- 3. Check posts about you.
- 4. Beware of public posts.
- 5. Review privacy settings.
- 6. Don’t follow your school account.
- 7. Avoid using your real name.
- 8. Change the friends-list setting.
- 9. Switch off location.
- 10. Delete dormant accounts.
Following the above advice should help ensure that teachers can enjoy participating in life online while minimising the very real risk of being tracked down by former or current pupils bearing a grudge, or simply by people whom it is best to keep at arm's length for professional or safeguarding reasons.
Until a Labour government gets in and makes Nos. (2) and (7) illegal outright, and demands that all of your personal details are held in one place by a social media company so as to be conveniently available for hackers and identity thieves.
People browsing using Chrome were quietly logged into their Google accounts without their consents | So much for users' Privacy (http://innov8tiv.com/people-browsing-using-chrome-were-quietly-logged-into-their-google-accounts-without-their-consents-so-much-for-users-privacy/)
Felix Omondi September 24, 2018 Apps and Software
A professor at Johns Hopkins and a cryptography expert, Matthew Green, called out Google for making changes to Chrome that log users into their Google accounts without their consent, or even notifying them. It is a move that security experts say puts users' privacy in jeopardy.
Historically, Chrome users have had the option of using the browser without logging in to their Google accounts. Logging in does come with some obvious benefits, such as having your bookmarks, passwords, and browsing history synced in the cloud and available on any device on which you use Chrome.
However, security-conscious users may not want Google – the most prominent advertising entity in the world – to have their browsing data for the purpose of sending them targeted ads. Now that Google has changed Chrome to log users secretly into their Google accounts, Google will get the data of users who would otherwise not have logged in.
Google has come out addressing these concerns, stressing that users must consent to the sync feature before the browser transfers their data. Buried in the sync feature, however, is the fact that, as it now works, signing in to a Google service will automatically also log you into your Google account in the browser.
So when a user logs in to their Gmail account in the browser, Chrome also automatically logs them into their Google account, all without the user's consent or any notification.
“Now that I’m forced to log into Chrome,” wrote Green, “I’m faced with a brand new (sync consent) menu I’ve never seen before.”
Copied from: People browsing using Chrome were quietly logged into their Google accounts without their consents | So much for users’ Privacy | Innov8tiv - <http://innov8tiv.com/people-browsing-using-chrome-were-quietly-logged-into-their-google-accounts-without-their-consents-so-much-for-users-privacy/>
The Sovrin Solution
Sovrin is a decentralized, global public utility for self-sovereign identity. Self-sovereign means a lifetime portable identity for any person, organization, or thing. It's a smart identity that everyone can use and feel good about. Having a self-sovereign identity allows the holder to present verifiable credentials in a privacy-safe way. These credentials can represent things as diverse as an airline ticket or a driver's license. Sovrin identities will transform the current broken online identity system. Sovrin identities will lower transaction costs, protect people's personal information, limit opportunity for cybercrime, and simplify identity challenges in fields from healthcare to banking to IoT to voter fraud.
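To build some intuition for "presenting verifiable credentials in a privacy-safe way," here is a toy Python sketch of selective disclosure using salted hash commitments. It is emphatically not the Sovrin protocol (Sovrin uses asymmetric signatures and zero-knowledge proofs, and in this toy the verifier shares a symmetric key with the issuer); every name here is invented for illustration.

```python
import hashlib
import hmac
import os

# Toy sketch of selective disclosure, NOT the Sovrin protocol.
# Real systems use asymmetric signatures and zero-knowledge proofs;
# here a shared HMAC key stands in for the issuer's signature.
ISSUER_KEY = b"issuer-demo-key"

def issue_credential(attributes: dict) -> dict:
    """Issuer commits to each attribute with a salted hash,
    then 'signs' the full set of commitments."""
    salts = {k: os.urandom(16) for k in attributes}
    commitments = {
        k: hashlib.sha256(salts[k] + str(v).encode()).hexdigest()
        for k, v in attributes.items()
    }
    tag = hmac.new(ISSUER_KEY, repr(sorted(commitments.items())).encode(),
                   hashlib.sha256).hexdigest()
    return {"commitments": commitments, "salts": salts, "tag": tag}

def present(credential: dict, attribute: str, value) -> dict:
    """Holder reveals one attribute plus its salt; the other
    attributes stay hidden behind their hashes."""
    return {
        "attribute": attribute,
        "value": value,
        "salt": credential["salts"][attribute],
        "commitments": credential["commitments"],
        "tag": credential["tag"],
    }

def verify(presentation: dict) -> bool:
    """Verifier checks the issuer's tag over all commitments,
    then checks the single revealed value against its commitment."""
    expected_tag = hmac.new(
        ISSUER_KEY,
        repr(sorted(presentation["commitments"].items())).encode(),
        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected_tag, presentation["tag"]):
        return False
    digest = hashlib.sha256(
        presentation["salt"] + str(presentation["value"]).encode()
    ).hexdigest()
    return digest == presentation["commitments"][presentation["attribute"]]
```

The point of the design is that the verifier learns only the one revealed attribute (say, "over 21") while still being able to check that it came from a credential the issuer vouched for; the holder never has to hand over the whole identity record.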
...Interestingly enough, this would seem to be exactly the sort of thing that HAT (Hub of All Things) (https://www.hubofallthings.com/) - referred to above per Armando (2016-07-29, 14:49:38) (https://www.donationcoder.com/forum/index.php?topic=42877.msg401099#msg401099) - is apparently designed to protect us from, whilst at the same time increasing our privacy and freedom of choice:
What is the Hub of all Things? (https://www.youtube.com/watch?v=kgxKl_OCOaQ)
The Hub of All Things (https://www.youtube.com/watch?v=DAn2HB7FfmM)...-IainB (September 26, 2018, 02:07 PM)
Twitter, Facebook, and Google are Fighting Internet Privacy Laws (https://www.abine.com/blog/2018/twitter-facebook-google-fighting-privacy-laws/)
WRITTEN BY: JULIANNE SUBIA - NOVEMBER 14, 2018
Recently, the Information Technology Industry Council, which represents companies like Amazon, Visa, Microsoft, Google, Facebook, and Apple, released its “Framework to Advance Interoperable Rules (FAIR) on Privacy”. On the surface, it looks like tech companies are trying to protect user privacy. In reality, they want to make sure that they can continue to profit off of our data. Using simple privacy tools like DeleteMe and Blur will help you stay in control of your privacy.
(Read the rest at the link.)
Copied from: Twitter, Facebook, and Google are Fighting Internet Privacy Laws - <https://www.abine.com/blog/2018/twitter-facebook-google-fighting-privacy-laws/>
The new law gives Australian law enforcement agencies the power to issue cooperation notices to technology entities with the purpose of gaining access to specific users’ encrypted messages and data. These entities may include companies, websites, or anything else that transmits data to an end-user in Australia.
Last week, Australia's parliament passed a controversial act that will enable law enforcement and intelligence agencies to compel access to encrypted communications. In an explanatory memorandum, the Australian Parliament stated that the new act, the Telecommunications and Other Legislation Amendment (Assistance and Access) Act 2018, is intended to combat "the challenges posed by ubiquitous encryption." Under the act, certain law enforcement and intelligence agencies will be able to approach "designated communication providers," using one of the mechanisms below, for the purpose of gaining access to specific users' encrypted messages and data.
- Technical Assistance Requests (TARs) – These are voluntary requests that allow law enforcement and intelligence agencies to request access to communications and data while bypassing the oversight rules surrounding mandatory notices. TARs may be issued by the directors-general of the Australian Security and Intelligence Organization (ASIO), the Australian Secret Intelligence Service (ASIS), or the Australian Signals Directorate (ASD), or by the chief officer of an "interception agency," which includes the Australian Federal Police (AFP), the Australian Crime Commission (ACC), and the state and territory police forces, provided that they obtain the approval of the AFP commissioner.
- Technical Assistance Notices (TANs) – These are compulsory notices requiring a "designated communication provider" to use existing interception or decryption capabilities to provide access to communications or user logs. TANs can be obtained only by the director-general of the ASIO or the chief officer of an interception agency.
- Technical Capability Notices (TCNs) – These are compulsory notices requiring designated communication providers to build infrastructure to meet subsequent TANs. TCNs may be issued only by the attorney general, with the approval of the minister for communications, following a request from the ASIO or the chief officer of an interception agency, and require written notice to the communication provider, allowing them the opportunity to respond within 28 days.
The new act allows these agencies to directly approach specific individuals, such as engineers or IT administrators at an organization, rather than the organization itself. Companies that resist the demands could face a fine of up to $7.3 million, while individuals who refuse could face jail time.
Below is the prepared testimony of Gabriel Weinberg, CEO & Founder of DuckDuckGo, before the United States Senate Judiciary Committee Hearing on GDPR & CCPA: Opt-ins, Consumer Control, and the Impact on Competition and Innovation. March 12, 2019
Chairman Graham, Ranking Member Feinstein and Members of the Committee, thank you for holding this important hearing and inviting me to testify. I am here to explain that privacy legislation, like the GDPR and CCPA, is not only pro-consumer, but also pro-business, and pro-advertising.
DuckDuckGo's primary service is a search engine alternative to Google that allows you to search the web without being tracked. We are the fourth largest search engine in the US, serving over one billion searches a month globally. We also offer a mobile privacy browser that serves as an alternative to Google Chrome. We regularly conduct rigorous consumer research on privacy issues, which we post at SpreadPrivacy.com. We also help educate consumers about online privacy from our Twitter account, @duckduckgo. I founded DuckDuckGo in 2008, far outside of Silicon Valley, in Valley Forge, Pennsylvania. We now have a distributed workforce spread across the nation in twelve states, the District of Columbia, and in ten other countries.
As you know, people are tired of being watched everywhere they go online. They are fed up with all the intended and unintended consequences this online tracking creates, including invasive ads, identity theft, discrimination, and manipulation. Have you ever searched for something only to see an ad for that very thing pop up in a mobile app or on a different website? DuckDuckGo helps you avoid these types of scenarios by seamlessly reducing your online digital footprint. Every time you search on DuckDuckGo, it's like you are searching on our site for the first time. We do not even have the concept of a search history. And we also offer privacy protection beyond the search box.
Many companies run sophisticated tracker networks that lurk on the websites you visit. DuckDuckGo's browser technology blocks such hidden trackers.
In many ways I come to you from the future: I run a business that is already GDPR- and CCPA-compliant. Our privacy policy is straightforward and doesn't require a law degree to decipher: We simply do not collect or share any personal information at all. That's it — no confusing settings to fiddle with, no jargon-filled pages to read. Yet, even with this simple privacy policy, we nonetheless are able to make money through advertising.
This brings me to my first point: Privacy legislation is not anti-advertising. Take our business for example: When you type in a search on DuckDuckGo, we simply show you ads related to that search. If you search for 'car', we show you car ads. But those ads won't follow you around, because we don't know who you are, where you've been, or where you go. It's contextual advertising versus behavioral advertising.
As a privately held company, our finances are private, though I'm proud to say we've been profitable using contextual advertising since 2014, and last year we earned substantially more than the $25 million revenue floor that subjects a company to CCPA. And we are not alone. For example, in response to GDPR, when the New York Times in Europe switched from behavioral advertising to contextual advertising, it reported an increase in revenue. And just last week, Business Insider reported the Washington Post was looking into making a similar change. If Congress forced the digital advertising industry to return to its roots in contextual advertising, that would allow companies to remain profitable, or even become more profitable — all without the unintended consequences of behavioral advertising.
My second point is that privacy is becoming increasingly good for business.
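The contextual-versus-behavioral distinction in the testimony can be sketched in a few lines of Python. This is an illustrative toy, not DuckDuckGo's actual ad-serving code; the inventory, keywords, and function names are all invented for the example.

```python
from typing import Optional

# Toy ad inventory keyed by keyword. Invented for illustration;
# this is NOT DuckDuckGo's actual ad-serving logic.
AD_INVENTORY = {
    "car": "Ad: new sedan deals near you",
    "mortgage": "Ad: compare fixed-rate mortgages",
    "shoes": "Ad: running shoes on sale",
}

def contextual_ad(query: str) -> Optional[str]:
    """Contextual advertising: the ad is chosen from the current
    query text alone. No user identity, no history, nothing is
    retained between calls."""
    for keyword, ad in AD_INVENTORY.items():
        if keyword in query.lower():
            return ad
    return None  # no keyword match: show no ad rather than profile the user

def behavioral_ad(user_profile: dict) -> Optional[str]:
    """Behavioral advertising, by contrast, keys off an accumulated
    per-user profile, so an ad for a past search can follow the
    user onto unrelated pages."""
    for past_query in reversed(user_profile.get("history", [])):
        ad = contextual_ad(past_query)
        if ad:
            return ad
    return None
```

The key difference is visible in the function signatures: the contextual picker sees only the current query, while the behavioral one requires an accumulated profile, which is exactly the personal data that must be collected and retained somewhere.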
Consumers flock to brands they trust and respect, and according to Harris Poll, data privacy is the most pressing issue on Americans' minds, now for two years in a row. And again, we serve as a great case study, having grown exponentially during this period.
[Chart showing the increase in DuckDuckGo traffic from 2008 to 2019.]
My third point is that well-drafted privacy legislation can spur more competition and innovation in one of the most foundational markets of the Internet: digital advertising. This market is currently a duopoly, and this reality is hurting everyone from small businesses to venture-backed startups to media companies. To restore competition and innovation in this market, the data monopolies at its core need to be addressed.
Fixing this digital-ad-market duopoly can take any number of forms. Here are three suggestions. First, consumers could be given a robust mechanism to opt-out of online tracking. Second, monopoly platforms could be prohibited from combining data across their different business lines. Third, acquisitions that strengthen existing data monopolies could be blocked.
Our mission at DuckDuckGo is to raise the standard of trust online. We support strong privacy legislation that does exactly that. We believe the Internet shouldn't feel so creepy, and getting the privacy you deserve online should be as easy as closing the blinds. I am pleased to answer your questions today and make myself available to Members in the future for more in-depth discussions. Thank you.
You can download the PDF version of this testimony here (https://duckduckgo.com/download/GDRP-CCPA-Hearing-Testimony_2019-03-12.pdf).