Congress is trying to strip away your online privacy rights.
Your Internet service provider knows a lot about you: the webpages you visit, the things you purchase, the people you talk to, and more. Last year, the federal government updated its rules to ensure that the companies that act as gatekeepers to the Internet can’t compromise your privacy to make a profit. Those rules are set to go into effect this year.
Now some members of Congress are trying to change that.
Using a little-known tool called a Congressional Review Act (CRA) resolution, some lawmakers want to not only repeal your privacy protections but also effectively prohibit the FCC from creating similar rules in the future.
That could leave consumers without a federal agency to protect online privacy rights.
You need to let your representatives in Congress know that they can’t put ISPs’ demands ahead of their constituents’ privacy.
Data-Driven Democracy: Where Opinions, Policies, or Convictions Don’t Matter, Just the Targeted Message in Facebook Dark Posts.
Cambridge Analytica (Steve Bannon, board member), owned by SCL (Strategic Communication Laboratories) – the self-styled “premier election management agency” – and how they “helped” Trump, Farage and Brexit, Cruz, and campaigns in Ukraine, Nigeria, Nepal & Afghanistan influence outcomes using data modelling and psychographic profiling.
It discusses a series of intersections between the work of Mr. Kosinski; a vaguely sinister guy called Alexander James Ashburner Nix, CEO of Cambridge Analytica (board member: Steve “Ahem” Bannon); and a seemingly innocuous chap called (in 2014) Aleksandr Kogan (now, quite unbelievably, known as Dr. Spectre (seriously)), with associations to a definitely sinister company called SCL, or Strategic Communication Laboratories, who describe themselves as “the premier election management agency”.
The main points are these, but I strongly recommend that you read the original article:
Kosinski and fellow student David Stillwell used data from a Facebook application called MyPersonality, which Stillwell developed in 2007, to create models from “personality profile” data acquired from users who opted in to share their app answers with researchers. Kosinski and Stillwell were both doctoral candidates studying together at Cambridge University’s Psychometrics Centre;
The MyPersonality app was an unexpected hit, with millions of people submitting answers;
They found that remarkably reliable deductions could be drawn from simple online actions. For example, men who “liked” the cosmetics brand MAC were slightly more likely to be gay; one of the best indicators of heterosexuality was “liking” Wu-Tang Clan. Followers of Lady Gaga were most probably extroverts, while those who “liked” philosophy tended to be introverts;
In 2012, Kosinski proved that on the basis of an average of 68 Facebook “likes” by a user, it was possible to predict their skin color (with 95 percent accuracy), their sexual orientation (88 percent accuracy), and their affiliation with the Democratic or Republican party (85 percent);
Kosinski continued to work on the models; before long, he was able to evaluate a person better than the average work colleague could, merely on the basis of ten Facebook “likes.” Seventy “likes” were enough to outdo what a person’s friends knew, 150 what their parents knew, and 300 “likes” what their partner knew. More “likes” could even surpass what a person thought they knew about themselves;
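The kind of likes-to-traits prediction described above can be sketched with a simple logistic scoring model: each page like nudges the log-odds of a trait up or down, and the odds are squashed into a probability. This is a minimal illustration only; the weights below are invented for the example and are not Kosinski’s actual model or data:

```python
import math

# Hypothetical per-like weights: how much each "like" shifts the
# log-odds of a target trait (say, extroversion). The signs echo the
# article's examples (Lady Gaga -> extrovert, philosophy -> introvert),
# but the numbers themselves are made up for illustration.
LIKE_WEIGHTS = {
    "Lady Gaga": +0.9,
    "philosophy": -0.8,
    "MAC cosmetics": +0.1,
    "Wu-Tang Clan": -0.1,
}

def predict_trait(likes, bias=0.0):
    """Return an estimated P(trait) from a user's set of page likes."""
    log_odds = bias + sum(LIKE_WEIGHTS.get(like, 0.0) for like in likes)
    return 1.0 / (1.0 + math.exp(-log_odds))  # logistic (sigmoid) function

# A user with no informative likes scores a neutral 0.5;
# each known like pushes the estimate toward one pole.
p_gaga = predict_trait({"Lady Gaga", "unrelated page"})
p_phil = predict_trait({"philosophy"})
```

With more likes, more weights accumulate, which is why the article’s accuracy figures climb from 10 to 70 to 300 likes: each additional signal sharpens the estimate.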
On the day that Kosinski published these findings, he received two phone calls, both from Facebook: the threat of a lawsuit and a job offer;
Around this time, in early 2014, Kosinski was approached by a young assistant professor in the psychology department called Aleksandr Kogan. He said he was inquiring on behalf of a company that was interested in Kosinski’s method, and wanted to access the MyPersonality database. Kogan wasn’t at liberty to reveal for what purpose; he was bound to secrecy;
Kogan revealed the name of the company he was representing: SCL, or Strategic Communication Laboratories;
Kosinski came to suspect that Kogan and a company that he had formed might have reproduced the Facebook “Likes”-based Big Five measurement tool in order to sell it to this election-influencing firm;
Cambridge Analytica subsequently acted for Farage in the Brexit campaign and for Republican Ted Cruz, and was then hired by Trump;
Cambridge Analytica buys personal data from a range of different sources, like land registries, automotive data, shopping data, bonus cards, club memberships, what magazines you read, what churches you attend. Nix displays the logos of globally active data brokers like Acxiom and Experian—in the US, almost all personal data is for sale. For example, if you want to know where Jewish women live, you can simply buy this information, phone numbers included. Now Cambridge Analytica aggregates this data with the electoral rolls of the Republican party and online data and calculates a Big Five personality profile. Digital footprints suddenly become real people with fears, needs, interests, and residential addresses;
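The aggregation step described above is, mechanically, a join: records bought from different sources are merged on a shared key (a voter identity) into one profile. Here is a minimal sketch under that assumption; every name, key, and record below is invented for illustration:

```python
# Toy stand-ins for purchased data sets, each keyed by a made-up voter ID.
electoral_roll = {"V001": {"name": "Jane Doe", "address": "12 Elm St"}}
shopping_data  = {"V001": {"magazines": ["Outdoor Life"], "bonus_card": True}}
online_data    = {"V001": {"likes": ["hiking", "philosophy"]}}

def aggregate(voter_id, *sources):
    """Merge every source's record for one voter into a single profile dict."""
    profile = {"voter_id": voter_id}
    for source in sources:
        # Sources that know nothing about this voter contribute nothing.
        profile.update(source.get(voter_id, {}))
    return profile

profile = aggregate("V001", electoral_roll, shopping_data, online_data)
# 'profile' now combines registry, shopping, and online data for V001,
# ready to be fed into a personality model like the one above.
```

The point of the sketch is how cheap this is: once personal data is purchasable and keyed to identity, turning “digital footprints” into addressable people is a few dictionary merges.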
Trump’s striking inconsistencies, his much-criticized fickleness, and the resulting array of contradictory messages, suddenly turned out to be his great asset: a different message for every voter. The notion that Trump acted like a perfectly opportunistic algorithm following audience reactions is something the mathematician Cathy O’Neil observed in August 2016;
Why did he behave like this?
“Pretty much every message that Trump put out was data-driven,” Alexander Nix remembers. On the day of the third presidential debate between Trump and Clinton, Trump’s team tested 175,000 different ad variations of his arguments, mostly via Facebook, in order to find the right versions. The messages differed for the most part only in microscopic details, in order to target the recipients in the optimal psychological way: different headings, colors, captions, with a photo or video. This fine-tuning reaches all the way down to the smallest groups, Nix explained in an interview with us. “We can address villages or apartment blocks in a targeted way. Even individuals.”;
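A number like 175,000 variations sounds enormous, but it falls out of simple combinatorics: a handful of options per dimension (heading, color, media type, and so on) multiply together. A tiny sketch, with invented dimensions and values:

```python
import itertools

# Invented example options per ad dimension; real campaigns would have
# many more dimensions and values, which is how counts reach the
# hundreds of thousands.
headings = ["Jobs first", "Secure borders", "Lower taxes"]
colors   = ["red", "blue"]
media    = ["photo", "video", "text"]

# Every combination of one choice per dimension is a distinct ad variant.
variants = [
    {"heading": h, "color": c, "media": m}
    for h, c, m in itertools.product(headings, colors, media)
]
# 3 headings * 2 colors * 3 media types = 18 variants here;
# each extra dimension multiplies the total.
```

Testing which variant performs best for which audience segment is then an ordinary A/B-testing problem run at scale.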
When did having an opinion or a conviction ever matter in a “data-driven” democracy? It certainly did not seem to matter to Trump;
In the Miami district of Little Haiti, for instance, Trump’s campaign provided inhabitants with news about the failure of the Clinton Foundation following the earthquake in Haiti, in order to keep them from voting for Hillary Clinton. This was one of the goals: to keep potential Clinton voters (which include wavering left-wingers, African-Americans, and young women) away from the ballot box, to “suppress” their vote, as one senior campaign official told Bloomberg in the weeks before the election. These “dark posts” – sponsored news-feed-style ads in Facebook timelines that can only be seen by users with specific profiles – seem to have been highly significant in Trump’s election;
In a statement after the German publication of this article, a Cambridge Analytica spokesperson said, “Cambridge Analytica does not use data from Facebook. It has had no dealings with Dr. Michal Kosinski. It does not subcontract research. It does not use the same methodology. Psychographics was hardly used at all. Cambridge Analytica did not engage in efforts to discourage any Americans from casting their vote in the presidential election. Its efforts were solely directed towards increasing the number of voters in the election.”;
Confusingly, the Cambridge Analytica website states: “Powered by smarter data modeling. At Cambridge Analytica we use data modeling and psychographic profiling to grow audiences, identify key influencers, and connect with people in ways that move them to action. Our unique data sets and unparalleled modeling techniques help organizations across America build better relationships with their target audience across all media platforms.”