
The USA, Narcissistic Rage, A Sense of Entitlement & Holding Our Rights Hostage

The US is taking a giant shit on all of us and on our rights, and we are letting it happen. This is a nation currently led by extremists who inherited the job from a crazily compromised administration.

I previously wrote in All The Presidents’ Messes:

“In my lifetime the American people have elected Nixon (Vietnam, Laos, Cambodia), Ford (by accident), Carter (Iranian Revolution & Iran Hostage debacle), Reagan (Funded the Taliban / Iran-Contra Affair / Nicaragua / El Salvador / Guatemala), Bush the First (Gulf War I), Clinton (Somalia, Rwanda, Haiti / Israel-Palestine / Ethnic Wars in Europe – Croats, Serbs and Bosnian Muslims / Kosovo & Albania), Bush the Second (Iraq / Afghanistan), Obama (IRANDEAL, global appeasement, the relatively unopposed rise of ISIS, and the disintegration of Syria and Libya and Egypt as a result of US Foreign Policy failures) and now Trump.”

All US policy decisions and their side-effects, one way or the other, cascade down into our European democracies. In the current climate that should worry you.

Privacy Is An Absolute Right

I am interested in Privacy. The abuse of Privacy (1) has far more fundamental negative effects than might appear at first glance.

I am an advocate for the right of every citizen to a private life, the preservation of civil liberties, and the defence of other hard-won rights. Technology, or rather its unfettered deployment, is the single biggest threat to our personal freedoms and, by extension, to the proper administration of justice.

And so I write about it. Sometimes the writing is a bit technical, but most of the time it references the technical results of other people’s work to support my arguments (which I always acknowledge – that is most important).

Orwell 4.0

Technology-facilitated developments have created new tools for the State, Law Enforcement, and Intelligence Agencies to monitor not just persons of interest but everyone (2). Software industry greed and software developer naivety are also driving an assault on our personal privacy and security (3).

These phenomena have already produced wholesale abuses (4) of habeas corpus, altered the perception of what constitutes a fair trial, worn down a suspect’s right to silence, made the avoidance of self-incrimination almost impossible, made illegal searches and seizures (5) acceptable, and encroached on the ability of defendants to construct a proper defence.

Recently, Graham Cluley (@gcluley) posted a clarification on Twitter: “It’s always bugged me how people say “Innocent until proven guilty”. It’s “Innocent *unless* proven guilty” folks.” That is worth thinking about in an age of trial by media and JTC-as-a-Service (JTC – Jumping to Conclusions, a.k.a. Fake News).

In parallel with this there is an increasing trend of “ordinary” crimes being tried in “extra-ordinary” courts, tribunals, or military courts. The checks and balances that notionally countered the power of the state, and through which the actions of government could be publicly scrutinized, have almost ceased to exist in any effective form.

Surveillance politics, the rise of extremists on the left and the right, religious fanaticism, the re-emergence of censorship, and even actual talk of “blasphemy laws” in the parliaments of Western democracies leave one bewildered. How will we fare when even newer technologies such as VRSNs (virtual reality social networks) and AI, with even greater capacity to embed themselves in our lives, begin to mature from novelty into deployment?

What will be the effect of kinematic fingerprinting, emotion detection (6), psychographic profiling (7), and thought extraction (8) on the right to privacy and basic freedoms? These are questions and concerns that get lost in the rush to innovate. Software companies and developers have a responsibility, but they rarely exercise it.

What are the ethics? What are the acceptable limits? What are the unforeseen by-products?

The US Has Claimed “Absolute Privilege”

The US is the bully on the block, and its “bitch” friends – the UK (9), Canada, New Zealand (10), and Australia (11) – just follow its lead or actively facilitate it.

The opacity of US laws (12) and SIGINT collection methods is an abuse of the rights of every defendant who comes before its courts. Increasingly, that is just about anybody they can lay their hands on, from anywhere (13).

The election of Trump solidified my view that the world has turned upside down. Taking action to reverse the trend of the normalisation of the abnormal (14) is a Sisyphean task and seems only to encourage the buggers (15).

The US position on most of these matters is ephemeral – not just on data protection (16) – and US national interest, national security, or just plain duplicity (17) governs its agenda.

There is so much abuse of power by the US that it is impossible to keep tabs on it all. These things used to matter (18). These things used to enrage us (19). The US has led a race to the bottom on so many fronts that the rest of the world seems to be suffering from bad-news fatigue (20) and has zoned out (21).

It is individuals and NGOs that are now the gatekeepers of our rights and the ones who hold governments to account, and increasingly they are being marginalized.

References

(1) Anonymous Chronic; 21st Nov 2016; NSA, GCHQ, The Five Eyes Handing Ireland Cyber-Security Opportunity; AirGap Anonymity Collective

(2) Anonymous Chronic; 21st Nov 2016; Mass Surveillance & The Oxford Comma Analogy; AirGap Anonymity Collective

(3) Anonymous Chronic; 21st Nov 2016; Software Industry Greed is Driving the Assault on our Privacy & Security; AirGap Anonymity Collective

(4) Kim Zetter; 26th Oct 2017; The Most Controversial Hacking Cases of the Past Decade; Wired

(5) Andy Greenberg; 10th Oct 2014; Judge Rejects Defense That FBI Illegally Hacked Silk Road – On A Technicality; Wired

(6) Anonymous Chronic; 3rd Jan 2017; Orwell 4.0: The Stealth Advance of Kinematic Fingerprinting & Emotion Detection for Mass Manipulation; AirGap Anonymity Collective

(7) Anonymous Chronic; 4th Feb 2017; Is Kosinski “Tesla” to Nix’s “Marconi” for Big Data Psychographic Profiling?; AirGap Anonymity Collective

(8) Ian Johnston; 18th Apr 2017; Device that can literally read your mind invented by scientists; Independent

(9) Anonymous Chronic; 30th Nov 2016; My Privacy Lobotomy or How I Learned to Stop Worrying & Love the IP Act; AirGap Anonymity Collective

(10) Anonymous Chronic; 3rd Nov 2016; Overwatch – The Five Eyes Espionage Alliance; AirGap Anonymity Collective

(11) Anonymous Chronic; 21st Nov 2016; Australia Is A Proxy War for the Five Eyes & Also Hogwarts; AirGap Anonymity Collective

(12) American Civil Liberties Union & Human Rights Watch; 21st Nov 2016; Joint letter to European Commission on EU-US Privacy Shield; Human Rights Watch

(13) Tom O’Connor; 6th Jul 2017; Russia Accuses US of Hunting and Kidnapping Its Citizens After Latest Arrests; Newsweek

(14) Anonymous Chronic; 29th Jan 2017; Take Action To Reverse The Present Trend Of The Normalisation of the Abnormal; AirGap Anonymity Collective

(15) Anonymous Chronic; 2nd Dec 2016; Silencing the Canary & The Key Powers & Reach of The IPA; AirGap Anonymity Collective

(16) Mary Carolan; 10th Mar 2017; Max Schrems claims US data privacy protections ‘ephemeral’; The Irish Times

(17) Shelley Moore Capito – United States Senator for West Virginia; 2nd Jul 2017; Stop Enabling Sex Traffickers Act of 2017; https://www.capito.senate.gov/

(18) Adam Taylor; 23rd Apr 2015; The U.S. keeps killing Americans in drone strikes, mostly by accident; The Washington Post

(19) HRW; 9th Dec 2014; USA and Torture: A History of Hypocrisy; Human Rights Watch

(20) Shannon Sexton; 30th Aug 2016; Five Ways to Avoid ‘Bad-News Fatigue’ and Stay Compassionately Engaged; Kripalu Center for Yoga & Health

(21) Susanne Babbel Ph.D.; 4th Jul 2012; Compassion Fatigue; Psychology Today

AI Voice Cloning & Perceived Reality – Fake News Has A New Friend

A Canadian startup called Lyrebird has announced that it has developed a platform capable of mimicking a human voice with a fraction of the audio samples required by other platforms such as Google DeepMind’s WaveNet and Adobe’s Project VoCo.

The Lyrebird synthesis software requires only 60 seconds of sample audio to produce its synthetic voice. VoCo needs about 20 minutes to do the same.
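To make the mechanics concrete, here is a minimal illustrative sketch, in Python, of the kind of workflow being described: enrol a short voice sample, then synthesise arbitrary text in that voice. The client class, its methods and its thresholds are hypothetical stand-ins rather than Lyrebird’s or VoCo’s actual APIs; the point is simply how little source audio such a system claims to need before it can put words in someone’s mouth.

```python
# Hypothetical voice-cloning workflow, for illustration only.
# The client, method names and thresholds are invented; this is NOT
# Lyrebird's or Project VoCo's real API.

from dataclasses import dataclass


@dataclass
class VoiceProfile:
    """A speaker model learned from a short enrolment sample."""
    speaker_id: str
    sample_seconds: float


class VoiceCloneClient:
    """Stand-in for a cloud voice-cloning service."""

    MIN_SAMPLE_SECONDS = 60  # Lyrebird reportedly needs ~60 s; VoCo roughly 20 minutes.

    def enroll(self, speaker_id: str, wav_path: str, sample_seconds: float) -> VoiceProfile:
        """Build a voice profile from a short recording of the target speaker."""
        if sample_seconds < self.MIN_SAMPLE_SECONDS:
            raise ValueError("Not enough sample audio to build a voice profile.")
        # A real service would upload wav_path here and train/adapt a speaker model.
        return VoiceProfile(speaker_id=speaker_id, sample_seconds=sample_seconds)

    def synthesize(self, profile: VoiceProfile, text: str, out_path: str) -> str:
        """Pretend to render `text` in the cloned voice and write it to out_path."""
        print(f"[{profile.speaker_id}] would say: {text!r} -> {out_path}")
        return out_path


if __name__ == "__main__":
    client = VoiceCloneClient()
    profile = client.enroll("target_speaker", "sample_60s.wav", sample_seconds=61.0)
    client.synthesize(profile, "Words the speaker never actually said.", "fake_statement.wav")
```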

The quality of the voice reproductions that the software can make is mixed. Some are better than others.

The three founders state that they are addressing possible misuse concerns by making the software publicly available. That may be a little optimistic.

“By releasing our technology publicly and making it available to anyone, we want to ensure that there will be no such risks. We hope that everyone will soon be aware that such technology exists and that copying the voice of someone else is possible. More generally, we want to raise attention about the lack of evidence that audio recordings may represent in the near future.”

James Vincent at The Verge neatly summarizes the worrying outcomes of combining voice synthesizers that can trick biometric software with 3D facial mapping.

“There are more troubling uses as well. We already know that synthetic voice generators can trick biometric software used to verify identity. And, given enough source material, AI programs can generate pretty convincing fake pictures and video of anyone you like. For example, this research from 2016 uses 3D mapping to turn videos of famous politicians, including George W. Bush and Vladimir Putin, into real-time “puppets” controlled by engineers. Combine this with a realistic voice synthesizer and you could have a Facebook video of Donald Trump announcing that the US is bombing North Korea going viral before you know it.” 

Fake news has a new friend.

ENDS

Orwell 4.0: The Stealth Advance of Kinematic Fingerprinting & Emotion Detection for Mass Manipulation

I increasingly find myself developing a “Luddite” mindset where unregulated VRSNs are concerned. Digital footprinting is becoming passé. The core toolset of mass surveillance is beginning a fundamental shift whose focus is less about observation than it is about manipulation. I like to call it “Orwell 4.0”.

The “interpretative” and retrospective analysis of fibre-optic intercepts, metadata, and watchwords – data mining for pattern matches in legacy (cubed), “delayed”-time, or real-time data to establish the probability of certain subject behaviours – is being augmented by Kinematic Fingerprinting, Biophysical Activity analysis (and the sub-field of Thought Recognition), Emotion Detection, and Behavioural Biometrics.

[Data collection / mining apps in use by Alphabet Agencies have been well covered on this blog and include XKeyscore; PRISM; ECHELON; Carnivore; DISHFIRE; STONEGHOST; Tempora; Frenchelon; Fairview; MYSTIC; DCSN; Boundless Informant; BULLRUN; PINWALE; Stingray; SORM; DANCINGOASIS; SPINNERET; MOONLIGHTPATH; INCENSER; AZUREPHOENIX]
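For anyone wondering what “kinematic fingerprinting” actually amounts to in practice, the claim is that the way you move – head sway, gait, controller micro-gestures – is distinctive enough to re-identify you. A minimal sketch, assuming pre-extracted and labelled motion features and using entirely synthetic data, might look like this; the feature set and numbers are invented for illustration.

```python
# Minimal sketch of kinematic fingerprinting: re-identifying a person from
# motion features alone. All data here is synthetic and purely illustrative.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Pretend dataset: 3 "interactors", 200 VR sessions each, 6 kinematic features
# per session (e.g. mean head sway, stride period, controller velocity variance).
sessions, n_features = 200, 6
offsets = (0.0, 1.5, 3.0)  # each person has a distinct motion "signature"
X = np.vstack([
    rng.normal(loc=offset, scale=1.0, size=(sessions, n_features))
    for offset in offsets
])
y = np.repeat(np.arange(len(offsets)), sessions)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# With distinctive motion signatures, re-identification accuracy is high,
# which is exactly why this kind of data is so sensitive.
print("Re-identification accuracy:", clf.score(X_test, y_test))
```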

A sort of post-Orwellian “Big Bro” application of subliminal advertising is emerging, but this time around the subliminal message is directed not at the product preferences of a consumer but at the individual’s social, economic and political affiliations, opinions and reactions.

Where does this sit with the Federal Communications Commission’s finding, over forty years ago, that subliminal advertising is “contrary to the public interest” because it involves “intentional deception” of the public?

It seems “intentional deception” is about to go mainstream with the support of the likes of Zuckerberg but now with a far more sinister raison d’être.

Are You In A Virtual Police State?

A fairly loose and dated list of factors that help determine where a nation lies in the Electronic Police State rankings serves to demonstrate the arrival of these new “tools” – precisely by their complete absence from the list:

  1. Daily Documents – Requirement of state-issued identity documents and registration;
  2. Border Issues – Inspections at borders, searching computers, demanding decryption of data;
  3. Financial Tracking – The state’s ability to search and record all financial transactions: checks, credit card use, wires, etc.;
  4. Gag Orders – Criminal penalties if you tell someone the state is searching their records;
  5. Anti-Crypto Laws – Outlawing or restricting cryptography;
  6. Constitutional Protection – A lack of constitutional protections for the individual, or the overriding of such protections;
  7. Data Storage Ability – The ability of the state to store the data it gathers;
  8. Data Search Ability – The ability to search the data it gathers;
  9. ISP Data Retention – States forcing Internet Service Providers to save detailed records of all their customers’ Internet usage;
  10. Telephone Data Retention – States forcing telephone companies to record and save records of all their customers’ telephone usage;
  11. Cell Phone Records – States forcing cellular telephone companies to record and save records of all their customers’ usage;
  12. Medical Records – States demanding records from all medical service providers and retaining the same;
  13. Enforcement Ability – The state’s ability to use overwhelming force (exemplified by SWAT teams) to seize anyone they want, whenever they want;
  14. Habeas Corpus – Lack of habeas corpus, the right not to be held in jail without prompt due process, or the overriding of such protections;
  15. Police-Intel Barrier – The lack of a barrier between police organizations and intelligence organizations, or the overriding of such barriers;
  16. Covert Hacking – State operatives covertly removing, or adding, digital evidence to/from private computers; covert hacking can make anyone appear as any kind of criminal desired;
  17. Loose Warrants – Warrants issued without careful examination of police statements and other justifications by a truly independent judge.

The NextGen Counter Measures Are Proactive Before The “Thought” Emerges

The background to these “new” tools is broadly discussed in Developing Next-Generation Countermeasures for Homeland Security Threat Prevention (Advances in Information Security, Privacy, and Ethics; IGI Global, 1st edition, 30th Aug 2016; ISBN-10: 1522507035; ISBN-13: 978-1522507031) by Maurice Dawson, an Assistant Professor of Information Systems (Cyber Security) at the College of Business Administration, University of Missouri-St. Louis. Read the e-book abstract.

The author examines the concept of IoT in order to design what he calls “novel” security architectures across multiple platforms for surveillance purposes.

The traditional tools of mass surveillance lack one very frightening feature that the emerging tech delivers in abundance: interference, conditioning and “attitude” programming. This blog post was inspired by an article in The Intercept titled “The Dark Side of VR: Virtual Reality Allows the Most Detailed, Intimate Digital Surveillance Yet”.

Traditional mass surveillance will ultimately be relegated to a support role by the emerging tech of augmented and virtual reality, assisted by covert biometric data acquisition and by facial- and gait-recognition data extracted, equally covertly, from “innocuous” social media posts and AR/VR interactions on VRSNs.

[This is not a new field in perception and psychophysics (see Person Identification from Biological Motion – Structural and Kinematic), but the ability to “collect” this data in a more sophisticated and reliable way (in the form of 3D visualization via AR, VR & AI) makes it all the more useful for less progressive purposes.]

And of course there are the “carrot & stick” tools that will look to alter subjects’ attitudes and opinions by harvesting emotional responses (using retina-tracking, for example) and “cleansing” those attitudes and opinions towards the “preferred” [state] response / opinion / attitude / reaction (or, more likely, lack of reaction).

[As one chief data scientist at an unnamed Silicon Valley company told Harvard business professor Shoshana Zuboff: “The goal of everything we do is to change people’s actual behavior at scale. … We can capture their behaviors, identify good and bad behaviors, and develop ways to reward the good and punish the bad.” – The Secrets of Surveillance Capitalism; 05.03.2016; by Shoshana Zuboff.]

A research team* at one of my alma maters, Dublin City University, wrote a paper in 2014 postulating that, with AR, VR and AI in VRSNs, subjects and their world view could be tweaked or changed.

The paper, titled “The Convergence of Virtual Reality and Social Networks: Threats to Privacy and Autonomy”, discusses how the field of VR is rapidly converging with the social media environment. Its abstract, as indexed by the US National Library of Medicine (National Institutes of Health), reads as follows:

[“The rapid evolution of information, communication and entertainment technologies will transform the lives of citizens and ultimately transform society. This paper focuses on ethical issues associated with the likely convergence of virtual realities (VR) and social networks (SNs), hereafter VRSNs. We examine a scenario in which a significant segment of the world’s population has a presence in a VRSN. Given the pace of technological development and the popularity of these new forms of social interaction, this scenario is plausible. However, it brings with it ethical problems. Two central ethical issues are addressed: those of privacy and those of autonomy. VRSNs pose threats to both privacy and autonomy. The threats to privacy can be broadly categorized as threats to informational privacy, threats to physical privacy, and threats to associational privacy. Each of these threats is further subdivided. The threats to autonomy can be broadly categorized as threats to freedom, to knowledge and to authenticity. Again, these three threats are divided into subcategories. Having categorized the main threats posed by VRSNs, a number of recommendations are provided so that policy-makers, developers, and users can make the best possible use of VRSNs.”]

Using VRSN Scenarios for Thought Manipulation & Conditioning

VRSN scenario manipulations are well suited to programming behaviour as well as altering opinion in the “target”, or what we used to call the “user”. The “user” tag is no longer accurate, in my opinion, because the function of the “user” is to extract value from the experience. The “user” is now the “interactor”. In the new scenarios the value extraction (or injection) is enjoyed by the “publisher” or “controller”. [For “publisher” substitute “government”, “alphabet agency” or “despot”.] This is the emergent field of surveillance politics and mass manipulation.

The preferred “interactor” attitude – and ultimate acceptance of, or agreement with, ideas, opinions, reactions and points of view – can be engineered by programming avatar responses to concepts in the form of gestures and facial expressions (the simplest applications being “happy”, “sad”, “neutral”, and “angry” avatar responses).

When the “interactor” is exposed to a piece of subject matter, the VRSN can gauge their opinion in broad terms by analysing the “interactor’s” emotional responses via eye-tracking or emotion capture and – if the kinematic fingerprinting suggests that the “interactor” does not hold the “correct” opinion – send their avatar the preferred reaction, in line with the opinion the “controller” wishes the “interactor” to hold.
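As a thought experiment only, the “correction” loop just described boils down to a few lines of control logic. Everything in the sketch below – the emotion labels, the detect_emotion stub, the table of “preferred” reactions – is a hypothetical illustration of the mechanism, not any vendor’s actual system.

```python
# Hypothetical sketch of the VRSN "opinion correction" loop described above.
# All functions, labels and data are invented for illustration.

import random

PREFERRED_REACTION = {          # the reaction the "controller" wants displayed per stimulus
    "policy_announcement": "happy",
    "protest_footage": "neutral",
}


def detect_emotion(interactor_id: str, stimulus: str) -> str:
    """Stand-in for eye-tracking / emotion-capture inference."""
    return random.choice(["happy", "sad", "neutral", "angry"])


def correct_avatar(interactor_id: str, stimulus: str) -> str:
    """Gauge the interactor's response; if it is not the 'preferred' one,
    overwrite their avatar's expression with the controller's choice."""
    observed = detect_emotion(interactor_id, stimulus)
    preferred = PREFERRED_REACTION.get(stimulus, "neutral")
    if observed != preferred:
        # Log the "incorrect" opinion and push the corrective expression to the avatar.
        print(f"{interactor_id}: observed {observed!r}, broadcasting {preferred!r} instead")
        return preferred
    return observed


if __name__ == "__main__":
    for stimulus in PREFERRED_REACTION:
        correct_avatar("interactor_42", stimulus)
```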

The reality is that the VRSN’s actual knowledge of the “interactor’s” affiliations increases exponentially over time, as do the metrics showing the successful alteration / cleansing of those “opinions” and the A/B testing of experimental methods for producing that result in a “target”.

In an apparent contradiction, the VRSN in a sense returns to the “old world” school of line-of-sight observation of a surveillance “target” (replacing digital footprints), but with one major difference: the observation is paired with “alteration” capabilities, all delivered while you enjoy your leisure time playing in your VRSN. Brave new virtual world.

Authors of “The Convergence of Virtual Reality and Social Networks: Threats to Privacy and Autonomy”:

*Institute of Ethics, Dublin City University, Dublin, Ireland – Fiachra.obrolchain@dcu.ie
*Institute of Ethics, Dublin City University, Dublin, Ireland – tim.jacquemard@dcu.ie
*Insight Centre for Data Analytics, Dublin, Ireland – david.monaghan@insight-centre.org
*Insight Centre for Data Analytics, Dublin, Ireland – noel.oconnor@insight-centre.org
*Institute of Ethics, Dublin City University, Dublin, Ireland – pnovitzky@gmail.com
*Institute of Ethics, Dublin City University, Dublin, Ireland – bert.gordijn@dcu.ie
