
Does Legislation Stifle Innovation?

(From an article originally published in July 2017 on my peerlyst blog)

Does legislation stifle innovation? No. Why not? Because legislators mostly legislate in “catch-up mode”, and on those rare occasions when they do see something coming in advance (examples? I don’t actually have any) they fail to implement the legislation or to put in place checks and balances to monitor compliance.

Legislators are better at legislating for the abuse of data – the IP Act in the UK, for example – in favour of mass surveillance and warrantless, omnipresent spying and eavesdropping. It’s a catch-all bucket – much easier than putting your back into it and figuring it out with Privacy, Civil Liberties and Human Rights in mind.

Legislators are now looking to heavily regulate IoT. One wonders what their approach will be, since it would appear they have failed (or, more likely, chosen not) to legislate and police the most basic elements of Data Protection, despite some of the first statutes in Europe being enacted as far back as 1986.

Now we have the kerfuffle of the NIS Directive (compelling member states to “be appropriately equipped, e.g. via a Computer Security Incident Response Team (CSIRT) and a competent national NIS authority” – another agency, just when we thought that the ones we had were as bad as the disjointed, un-joined-up implementation of policy could get) and GDPR (which contains the bizarrely general statement in Recital 4 that “The processing of personal data should be designed to serve mankind”) – good luck implementing that.

Data Protection legislation for much of the intervening period was “lip-service” and PR driven. The Data Protection Act, 1988 was publicised by the Irish government as innovative, “first of its kind” legislation that would set the Republic of Ireland apart and create a “privacy regulated” USP for RoI as an FDI (foreign direct investment) destination.

The IDA boasts on its website: “We favour green lights over red tape, which is why we are one of the best countries in the world for ease of doing business (Forbes). New business is welcomed and supported by the flow of talent coming from our schools, universities and abroad, to work for high-performing companies across a range of cutting-edge sectors.”

What this really means is that regulation in Ireland with respect to Data Protection and Central Bank governance (both having a direct impact on the operations of the EU-headquartered tech giants based in Ireland – Google, eBay, Facebook, Twitter, HubSpot, etc. … pick a name, they are based in Ireland somewhere) was all about accommodating whatever these firms asked for, with little regard for what the privacy protections in the legislation actually dictated in terms of consumer / end-user protection.

Put the following statement in front of your local Data Protection commission and ask them to respond with their view on the best way to protect the consumer while enabling innovation – prepare for an answer characterised by vanilla, non-committal prose peppered with out-of-context TLAs.

“Dear Data Protection Commissioner, How Does Your Office Propose To Balance Classically-Conceived Privacy Concepts In Light Of The Business Imperative Of Providing The End User With Contextual Richness?”

The Office of the Data Protection Commission and the Central Bank of Ireland are widely regarded as complicit in the wholesale abuse of the data protection, privacy and tax obligations of tech companies operating in the country.

Understaffed, under-skilled and under-whelming, these outfits have presided over some of the most spectacular breaches of these obligations.

Now, they seek to add to their NP-complete task and their ever-expanding skills gap the area of IoT regulation.

They will be tasked with creating law to govern how companies should implement security protocols and data protection measures, how to control the people who use the information generated by IoT (or those who seek to acquire it illegally), and the application of Big Data, IoT, AI, data analytics and machine learning.

I have no faith that Ireland or Europe will stay on the edge of the curve of innovation in order to regulate its expansion in a controlled and understood manner. But I could be wrong. Do you think that I am wrong? I would love to hear counter-arguments to my usual cynical stance on these issues.

ENDS

Orwell 4.0: The Stealth Advance of Kinematic Fingerprinting & Emotion Detection for Mass Manipulation

I increasingly find myself developing a “Luddite” mindset where unregulated VRSNs (virtual reality social networks) are concerned. Digital footprinting is becoming passé. The core toolset of mass surveillance is beginning a fundamental shift whose focus is less about observation than about manipulation. I like to call it “Orwell 4.0”.

The “interpretative” and retrospective analysis of fibre-optic intercepts, metadata, watchwords and data mining for pattern matches in legacy, “delayed” time or real-time data to establish probabilities of certain types of subject behaviour is being augmented by Kinematic Fingerprinting, Biophysical Activity (and the sub-field of Thought Recognition), Emotion Detection, and Behavioural Biometrics.

[Data collection / mining apps in use by Alphabet Agencies have been well covered on this blog and include XKeyscore; PRISM; ECHELON; Carnivore; DISHFIRE; STONEGHOST; Tempora; Frenchelon; Fairview; MYSTIC; DCSN; Boundless Informant; BULLRUN; PINWALE; Stingray; SORM; DANCINGOASIS; SPINNERET; MOONLIGHTPATH; INCENSER; AZUREPHOENIX]

A sort of post-Orwellian “Big Bro” application of subliminal advertising is emerging, but this time around the subliminal message is directed not at the product preferences of a consumer but at the individual’s social, economic and political affiliations, opinions and reactions.

Where does this sit with the Federal Communications Commission finding, over forty years ago, that declared subliminal advertising “contrary to the public interest” because it involved “intentional deception” of the public?

It seems “intentional deception” is about to go mainstream with the support of the likes of Zuckerberg, but now with a far more sinister raison d’être.

Are You In A Virtual Police State?

A pretty loose and old list of factors that can help to determine where a nation lies in the Electronic Police State standings does serve to demonstrate the arrival of these new “tools” (by their complete absence from the list):

  1. Daily Documents – Requirement of state-issued identity documents and registration;
  2. Border Issues – Inspections at borders, searching computers, demanding decryption of data;
  3. Financial Tracking – The state’s ability to search and record all financial transactions: cheques, credit card use, wires, etc.;
  4. Gag Orders – Criminal penalties if you tell someone the state is searching their records;
  5. Anti-Crypto Laws – Outlawing or restricting cryptography;
  6. Constitutional Protection – A lack of constitutional protections for the individual, or the overriding of such protections;
  7. Data Storage Ability – The ability of the state to store the data it gathers;
  8. Data Search Ability – The ability to search the data it gathers;
  9. ISP Data Retention – States forcing Internet Service Providers to save detailed records of all their customers’ Internet usage;
  10. Telephone Data Retention – States forcing telephone companies to record and save records of all their customers’ telephone usage;
  11. Cell Phone Records – States forcing cellular telephone companies to record and save records of all their customers’ usage;
  12. Medical Records – States demanding records from all medical service providers and retaining the same;
  13. Enforcement Ability – The state’s ability to use overwhelming force (exemplified by SWAT teams) to seize anyone it wants, whenever it wants;
  14. Habeas Corpus – Lack of habeas corpus, the right not to be held in jail without prompt due process, or the overriding of such protections;
  15. Police-Intel Barrier – The lack of a barrier between police organizations and intelligence organizations, or the overriding of such barriers;
  16. Covert Hacking – State operatives removing (or adding!) digital evidence to/from private computers covertly; covert hacking can make anyone appear as any kind of criminal desired;
  17. Loose Warrants – Warrants issued without careful examination of police statements and other justifications by a truly independent judge.

The NextGen Countermeasures Are Proactive Before The “Thought” Emerges

The background to these “new” tools is broadly discussed in Developing Next-Generation Countermeasures for Homeland Security Threat Prevention (Advances in Information Security, Privacy, and Ethics; IGI Global, 1st edition, August 30, 2016; ISBN-10: 1522507035; ISBN-13: 978-1522507031) by Maurice Dawson, an Assistant Professor of Information Systems (Cyber Security) at the College of Business Administration, University of Missouri–St. Louis. Read the e-book abstract.

The author examines the concept of IoT in order to design “novel” (his word) security architectures for multiple platforms for surveillance purposes.

The traditional tools of mass surveillance lack one very frightening feature that the emerging tech delivers in abundance: interference, conditioning and “attitude” programming. This blog post was inspired by an article in The Intercept titled “The Dark Side of VR: Virtual Reality Allows the Most Detailed, Intimate Digital Surveillance Yet”.

Traditional mass surveillance will ultimately be relegated to a support role by the emerging tech of augmented and virtual reality, assisted by covert biometric data acquisition and by facial- and gait-recognition data extracted covertly from “innocuous” social media posts and AR/VR interactions on VRSNs.

[This is not a new field in Perception & Psychophysics – see Person Identification from Biological Motion – Structural and Kinematic – but the ability to “collect” this data in a more sophisticated and reliable way (in the form of 3D visualization via AR, VR and AI) makes it all the more useful for less progressive purposes.]

And of course the “carrot and stick” tools that will look to alter subjects’ attitudes and opinions by harvesting emotional responses (using retina-tracking, for example) and “cleansing” those attitudes and opinions into the “preferred” [state] response / opinion / attitude / reaction (or, more likely, lack of reaction).

[As one chief data scientist at an unnamed Silicon Valley company told Harvard business professor Shoshana Zuboff: “The goal of everything we do is to change people’s actual behavior at scale. … We can capture their behaviors, identify good and bad behaviors, and develop ways to reward the good and punish the bad.” – “The Secrets of Surveillance Capitalism”, Shoshana Zuboff, 05.03.2016]

A research team* at one of my alma maters, Dublin City University, wrote a paper in 2014 postulating that, with AR, VR and AI in VRSNs, subjects and their world view could be tweaked or changed.

The paper discussed how the field of VR is rapidly converging with the social media environment. The paper titled “The Convergence of Virtual Reality and Social Networks: Threats to Privacy and Autonomy” is summarized by the US National Library of Medicine National Institutes of Health in an abstract as follows:

[“The rapid evolution of information, communication and entertainment technologies will transform the lives of citizens and ultimately transform society. This paper focuses on ethical issues associated with the likely convergence of virtual realities (VR) and social networks (SNs), hereafter VRSNs. We examine a scenario in which a significant segment of the world’s population has a presence in a VRSN. Given the pace of technological development and the popularity of these new forms of social interaction, this scenario is plausible. However, it brings with it ethical problems. Two central ethical issues are addressed: those of privacy and those of autonomy. VRSNs pose threats to both privacy and autonomy. The threats to privacy can be broadly categorized as threats to informational privacy, threats to physical privacy, and threats to associational privacy. Each of these threats is further subdivided. The threats to autonomy can be broadly categorized as threats to freedom, to knowledge and to authenticity. Again, these three threats are divided into subcategories. Having categorized the main threats posed by VRSNs, a number of recommendations are provided so that policy-makers, developers, and users can make the best possible use of VRSNs.”]

Using VRSN Scenarios for Thought Manipulation & Conditioning

VRSN scenario manipulations are well suited to programming behaviour as well as altering opinion in the “target”, or what we used to call the “user”. The “user” tag is no longer accurate, in my opinion, because the function of a “user” is to extract value from the experience. The “user” is now the “interactor”. In the new scenarios the value extraction (or injection) is enjoyed by the “publisher” or “controller”. [For publisher substitute “government”, “alphabet agency” or “despot”.] This is the emergent field of surveillance politics and mass manipulation.

The preferred “interactor” attitude, and ultimate acceptance of or agreement with ideas, opinions, reactions and points of view, can be engineered by programming avatar responses to those stimuli in the form of gestures and facial expressions (simple applications being “happy”, “sad”, “neutral” and “angry” avatar responses).

When the “interactor” is exposed to subject matter, the VRSN can gauge their opinions in broad terms by analysing their emotional responses via eye-tracking or emotion capture and – if the kinematic fingerprinting suggests that the “interactor” does not hold the “correct” opinion – send their avatar the preferred reaction, in line with the opinion that the “controller” wishes the “interactor” to hold.
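The gauge-and-correct loop described above can be sketched in a few lines. This is a deliberately crude model under invented assumptions – a single “valence” score standing in for emotion capture, and a fixed tolerance standing in for the controller’s policy:

```python
from dataclasses import dataclass

# Hypothetical model of the loop:
# 1. gauge the interactor's reaction to a stimulus (emotion capture),
# 2. compare it with the opinion the "controller" prefers,
# 3. if they diverge, push the "preferred" reaction to the avatar.

@dataclass
class EmotionSample:
    stimulus_id: str
    valence: float  # -1.0 (strongly negative) .. +1.0 (strongly positive)

def gauge_opinion(samples: list[EmotionSample]) -> float:
    """Crude proxy for the interactor's opinion: mean valence."""
    return sum(s.valence for s in samples) / len(samples)

def avatar_reaction(observed: float, preferred: float,
                    tolerance: float = 0.3) -> str:
    """Reaction the controller wants the avatar to display."""
    if abs(observed - preferred) <= tolerance:
        return "neutral"  # opinion already "correct": nothing to cleanse
    return "happy" if preferred > observed else "sad"  # nudge toward preferred

samples = [EmotionSample("policy_x", -0.6), EmotionSample("policy_x", -0.4)]
print(avatar_reaction(gauge_opinion(samples), preferred=0.8))  # → happy
```

Note what never appears in the loop: any input from the “interactor” about what reaction they would actually like their avatar to display.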

The reality is that the VRSN’s actual knowledge of the “interactor’s” affiliations increases exponentially over time, as do the metrics showing the successful alteration / cleansing of those “opinions” and the A/B testing of experimental methods for producing that result in a “target”.
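That A/B testing needs nothing more exotic than the conversion-rate tooling every ad platform already runs, with “opinion shifted toward preferred” as the conversion event. A hypothetical sketch, with all rates and arm names invented:

```python
import random

# Hypothetical A/B test of two "opinion nudging" methods. Each arm records
# whether a target's measured opinion moved toward the "preferred" value
# after exposure. All numbers here are invented for illustration.

def shift_rate(outcomes: list[bool]) -> float:
    """Fraction of targets whose opinion shifted toward the preferred value."""
    return sum(outcomes) / len(outcomes)

random.seed(42)  # reproducible simulation
arm_a = [random.random() < 0.30 for _ in range(1000)]  # avatar-reaction nudges
arm_b = [random.random() < 0.35 for _ in range(1000)]  # curated-feed nudges

winner = "A" if shift_rate(arm_a) >= shift_rate(arm_b) else "B"
print(f"A: {shift_rate(arm_a):.2%}  B: {shift_rate(arm_b):.2%}  deploy: {winner}")
```

The banality is the point: the same experiment that optimises a checkout button can optimise an opinion.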

In an apparent contradiction, the VRSN in a sense goes back to the “old world” school of line-of-sight observation of a surveillance “target” (replacing digital footprints), but with one major difference: the observation is paired with “alteration” capabilities – all delivered while you enjoy your leisure time playing in your VRSN. Brave new virtual world.

The Convergence of Virtual Reality and Social Networks: Threats to Privacy and Autonomy – Authors*:

*Institute of Ethics, Dublin City University, Dublin, Ireland – Fiachra.obrolchain@dcu.ie
*Institute of Ethics, Dublin City University, Dublin, Ireland – tim.jacquemard@dcu.ie
*Insight Centre for Data Analytics, Dublin, Ireland – david.monaghan@insight-centre.org
*Insight Centre for Data Analytics, Dublin, Ireland – noel.oconnor@insight-centre.org
*Institute of Ethics, Dublin City University, Dublin, Ireland – pnovitzky@gmail.com
*Institute of Ethics, Dublin City University, Dublin, Ireland – bert.gordijn@dcu.ie
