
Monday, 19 September 2022

Neo-Medievalism and the New Latin

By Emile Wolfaardt

Medieval Latin (or Ecclesiastical Latin, as it is sometimes called) was the primary language of the church in Europe during the Dark Ages. The Bible and its laws and commands were all in Latin, as were the punishments to be meted out to those who breached its dictates. This left interpretation and application up to the proclivities of the clergy. Because the populace could not understand Latin, there was no accountability for those who wielded the Latin sword.

We may have outgrown the too-simplistic ideas of infanticidal nuns and the horror stories of medieval torture devices (for the most part, anyway). Yet the tragedy of the self-serving ecclesiastical economies, the gorgonising abuse of spiritual authority, the opprobrious intrusion into privacy, and the disenfranchisement of the masses still cast a dark shadow of systemic exploitation and widespread corruption over that period. The few who were born into the ranks of the bourgeoisie ruled with deleterious absolutism and no accountability. The middle class was all but absent, and the subjugated masses lived in abject poverty without regard or recourse. There was no pathway for them to change their station in life. It was effectively a two-class social stratification system that enslaved by keeping people economically disenfranchised and functionally dependent. Their beliefs were defined, their behaviour was regulated, and their liberties were determined by those whose best interest was to keep them stationed where they were.

It is the position of this writer that there are alarming and dangerous parallels to that abuse in our own day and age, and we need to be aware of them.

There has been a gargantuan shift in the techno-world that is obfuscatious and ubiquitous. With the ushering in of the digital age, marketers realised that the more information they could glean from our choices and conduct, the better they could influence our thinking. They started analysing our purchasing history, listening to our conversations, tracking keywords, and identifying our interests. They learned that people who say or text the word ‘camping’ may be in the market for a tent, and that people who buy rifles, belong to a shooting club, and live in a particular area are more likely to affiliate with a certain party. They learned that there was no such thing as excess data – all data is useful and can be manipulated for financial gain.
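To make the mechanism concrete, here is a minimal sketch of the kind of keyword-based interest inference described above. The keyword map, categories, and sample texts are hypothetical illustrations, not any marketer's actual taxonomy.

```python
# Toy keyword-to-interest inference: the kind of signal extraction described above.
# The keyword map and sample messages are invented for illustration.
from collections import Counter

INTEREST_KEYWORDS = {
    "camping": "outdoor_gear",   # someone texting about camping may need a tent
    "tent": "outdoor_gear",
    "rifle": "shooting_sports",
    "range": "shooting_sports",
}

def infer_interests(messages):
    """Count the interest categories hinted at by keywords in a user's texts."""
    interests = Counter()
    for message in messages:
        for word in message.lower().split():
            category = INTEREST_KEYWORDS.get(word.strip(".,!?'\""))
            if category:
                interests[category] += 1
    return interests

# Two casual texts are enough to tag this user as a camping prospect.
print(infer_interests(["We're going camping next weekend!", "Need a new tent"]))
# Counter({'outdoor_gear': 2})
```

Real systems fuse thousands of such signals across purchases, location, and contacts, but the underlying logic is the same: small, incidental traces are aggregated into a marketable profile.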

Where we find ourselves today is that the marketing world has ushered in a new economic model that sees human experiences as free raw material to be taken, manipulated, and traded at will, with or without the consent of the individual. Google's vision statement for 2022 is ‘to provide access to the world's information in one click’. Everything around you is garnering your data: your heart rate read by your watch, your texts surveyed by your phone’s software, your words recorded by the myriad listening devices around you, your location identified by twenty apps on your phone, your GPS, your doorbell, and the security cameras around your home. And we even pay for these things. It is easier to find a route using a GPS than a map, and the convenience of smart technology seems, at first glance anyway, like a reasonable exchange.

Our data is being harvested systematically and sold for profit without our consent or remuneration. Our search history, buying practices, biometric data, contacts, location, sleeping habits, exercise routine, self-discipline, the articles we pause our scrolling to peruse, even whether we use exclamation marks in our texts – the list continues almost endlessly – a trillion bits of data are recorded each day. They are then analysed for behavioural patterns, organised to manipulate our choices, and sold to help advertisers prise the hard-earned dollars out of our hands. All of this is written in a language very few people can understand, imposed upon us without our understanding, and used for financial gain by those who do not have our best interest at heart. Our personal and private data is then traded for profit without our knowledge, consent, or benefit.

A new form of economic oppression has emerged, ruthlessly designed and implemented by the digital bourgeoisie, and built exclusively on harvesting our personal and private data – and we gladly exchanged it for the conveniences on offer. As a society, we have been gaslit into accepting this new norm. We are fed the information they choose to feed us, we are subject to their manipulation, and we are simply fodder for their profit machine. We are indeed in the oppressive age of Neo-Medievalism, and computer code is the new Latin.

It seems to have happened so quickly, and permeated our lives so completely – and all without our knowledge or consent.

But it is not hopeless. As oppressive as the Dark Ages were, that period came to an end. Why? Because there were people who saw what was happening, who vocalised and organised themselves around a healthier social model, and who educated themselves about human rights, oppression, and accountable leadership. After all – look at us now. We were birthed out of that period by those who ushered in the Enlightenment and, ultimately, Modernity.

Reformation starts with being aware, educating oneself, speaking up, and joining our voices with others. There is huge value in this digital age we have wholeheartedly embraced. However, instead of allowing it to oppress us, we must take back control of our data where we can. We must do what is needed to maximise the opportunities it provides, join with those who see it for what it is, help others retain their freedom, and be part of the wave of people and organisations pressing for integrity, openness, and redefinition in the process. The digital age, with its AI potential, is here to stay. This is good. Let’s be a part of building a system that serves the needs of the many, that benefits humanity as a whole, and that lifts us all to a better place.

Monday, 25 April 2022

The Dark Future of Freedom

by Emile Wolfaardt

Is freedom really our best option as we build a future enhanced by digital prompts, limits, and controls?

We have already surrendered many of our personal freedoms for the sake of safety – and yet we are just on the brink of a general transition to a society totally governed by instrumentation. Stop! Please read that sentence again! 

Consider for example how vehicles unlock automatically as authorised owners approach them, warn drivers when their driving is erratic, alter the braking system for the sake of safety and resist switching lanes unless the indicator is on. We are rapidly moving to a place where vehicles will not start if the driver has more alcohol in their system than is allowed, or if the license has expired or the monthly payments fall into arrears.

There is a proposal in the European Union to equip all new cars with a system that will monitor where people drive, when, and above all, at what speed. The data will be transmitted in real time to the authorities.

Our surrender of freedoms, however, has advantages. Cell phones alert us if those with contagions are close to us, and Artificial Intelligence (AI) and smart algorithms now land our aeroplanes and park our cars. When it comes to driving, AI has a far better track record than humans. In a recent study, Google claimed that its autonomous cars were ‘10x safer than the best drivers,’ and ‘40x safer than teenagers.’ AI promises, reasonably, to provide health protection and disease detection. Today, hospitals are using solutions based on Machine Learning and Artificial Intelligence to read scans. Researchers from Stanford developed an algorithm to assess chest X-rays for signs of disease. This algorithm can recognise up to fourteen types of medical condition – and was better at diagnosing pneumonia than several expert radiologists working together.
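For readers curious what such a system looks like in practice, here is a minimal sketch of a multi-label chest X-ray classifier in the spirit of the work described above – not the Stanford team's actual code. It assumes PyTorch and a recent torchvision; the count of fourteen labels comes from the paragraph above, while the weights choice and image path are illustrative placeholders.

```python
# Sketch of multi-label chest X-ray classification (illustrative, not the
# Stanford model). Assumes PyTorch and torchvision are installed.
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

NUM_CONDITIONS = 14  # e.g. pneumonia, cardiomegaly, effusion, ... (placeholder list)

# Start from an ImageNet-pretrained DenseNet and swap in a fourteen-way head.
model = models.densenet121(weights=models.DenseNet121_Weights.DEFAULT)
model.classifier = nn.Linear(model.classifier.in_features, NUM_CONDITIONS)
model.eval()  # in reality the new head would first be trained on labelled X-rays

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def predict(image_path):
    """Return an independent probability for each of the fourteen conditions."""
    image = Image.open(image_path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)   # shape: (1, 3, 224, 224)
    with torch.no_grad():
        logits = model(batch)
    return torch.sigmoid(logits).squeeze(0)  # multi-label probabilities

# Usage (hypothetical file): probs = predict("chest_xray.png")
```

The design choice worth noting is the sigmoid output: each condition gets its own probability, so a single scan can be flagged for several diseases at once.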

Not only that, but AI promises both to reduce human error and to intervene in criminal behaviour. PredPol is a US-based company that uses Big Data and Machine Learning to predict the time and place of a potential offence. The software looks at existing data on past crimes and predicts when and where the next crime is most likely to happen – it has demonstrated a 7.4% reduction in crime across cities in the US and created a new avenue of study in Predictive Policing. It already knows the type of person who is likely to commit the crime and tracks their movement toward the place of anticipated criminal behaviour.
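As an illustration of the general idea – counting where and when past incidents cluster and ranking the hot spots – here is a toy frequency model. It is emphatically not PredPol's proprietary algorithm, and the incident data is fabricated for the example.

```python
# Toy hotspot-style crime prediction: bucket past incidents by grid cell and
# hour of day, then rank the busiest buckets. Not PredPol's algorithm; the
# incident data below is invented for illustration.
from collections import Counter
from datetime import datetime

# (latitude, longitude, timestamp) of past incidents -- fabricated examples.
incidents = [
    (34.052, -118.243, datetime(2022, 3, 1, 23, 15)),
    (34.053, -118.244, datetime(2022, 3, 8, 22, 40)),
    (34.051, -118.242, datetime(2022, 3, 15, 23, 5)),
    (34.100, -118.300, datetime(2022, 3, 2, 9, 30)),
]

def bucket(lat, lon, when, cell=0.01):
    """Snap an incident to a coarse grid cell and an hour of the day."""
    return (round(lat / cell) * cell, round(lon / cell) * cell, when.hour)

counts = Counter(bucket(lat, lon, when) for lat, lon, when in incidents)

# The most frequent bucket is the predicted 'when and where' of the next offence.
(cell_lat, cell_lon, hour), n = counts.most_common(1)[0]
print(f"Patrol near ({cell_lat:.2f}, {cell_lon:.2f}) around {hour}:00 "
      f"({n} past incidents)")
```

Commercial systems add far more signal, but the principle is the same: yesterday's data becomes tomorrow's patrol route.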

Here is the challenge – this shift to AI, or ‘instrumentation’ as it is commonly called, has been both obfuscatious and ubiquitous. And here are the two big questions about this colossal shift that nobody is talking about.

Firstly, the entire move to the instrumentation of society is predicated on the wholesale surrender of personal data. Phones, watches, GPS systems, voicemails, e-mails, texts, online tracking, transaction records, and countless other instruments capture data about us all the time. This data is used to analyse, predict, influence, and control our behaviour. In the absence of any governing laws or regulation, the Googles, Amazons, and Facebooks of the world have obfuscated the fact that they collect hundreds of billions of bits of personal data every minute – where you go, when you sleep, what you look at on your watch or phone or other device, which neighbour you speak to across the fence, how your pulse increases when you listen to a particular song, how many exclamation marks you put in your texts, and so on. And they collect your data whether or not you want or allow them to.

Opting out is nothing more than donning the Emperor’s new clothes. Your personal data is collated and interpreted, then sold on a massive scale to companies without your permission or remuneration. Not only are Google, Amazon, and Facebook (etc.) marketing products to you, they are altering your behaviour, based on their knowledge of you, so that you purchase the products they want you to purchase. Perhaps they know a user has a particular love for animals, and that she bought a Labrador after seeing it in the window of a pet store. She has fond memories of sitting in her living room talking to her Lab while ‘How Much Is That Doggie in the Window’ played in the background. She then lost her beautiful Labrador to cancer. And would you know it – an ad ‘catches her attention’ on her phone or her Facebook feed with a Labrador just like hers, with a familiar voice singing a familiar song that takes her back to her warm memories, and then the ad turns to collecting money for canine cancer research. This is known as active priming.

According to Google, an elderly couple were recently caught in a life-threatening emergency and needed to get to the doctor urgently. They headed to the garage and climbed into their car – but because they were late on their payments, AI shut their car down and it would not start. We have moved from active priming into invasive control.

Secondly, data harvesting has become so essential to the business model that it is already past the point of reversal. It is ubiquitous. When challenged about this by the US House recently, Mark Zuckerberg offered that Facebook would be more conscientious about regulating itself. The fox offered to guard the henhouse. Because this transition was both hidden and wholesale, by the time lawmakers started to see the trend it was too late, and too many Zuckerbucks had been ingested by the political system. The collaboration of big data has become irreversible – and now practically defies regulation.

We have transitioned from the Industrial Age, where products were developed to ease our lives, to the Age of Capitalism, where marketing was focused on attracting our attention by appealing to our innate desire to avoid pain or attract pleasure. We are now in what is defined as the Age of Surveillance Capitalism. In this sinister market, we are surveilled and adjusted to buy what AI tells us to buy. While it used to be true that ‘if the service is free, you are the product,’ it is now more accurate to say that ‘if the service is free, you are the carcass, ravaged of all of your personal data and freedom to choose.’ You are no longer the product; your data is the product, and you are simply the nameless carrier that funnels the data.

And all of this is marketed under the reasonable promise of a more cohesive and confluent society where poverty, disease, crime, and human error are minimised, and a Global Base Income is promised to everyone. We are told we are now safer than in a world where criminals have the freedom to act at will, dictators can obliterate their opponents, and human errors cost tens of millions of lives every year. Human behaviour is regulated and checked when necessary, disease is identified and cured before it ever proliferates, and resources are protected and maximised for the common betterment. We are now only free to act in conformity with the common good.

This is the dark future of freedom we are already committed to – albeit unknowingly. The only question remaining is this – whose common good are we free to act in conformity with? We have come a long way down the road of subtle and ubiquitous loss of our freedoms, but it may not be too late to take back control. We need to educate ourselves, stand together, and push back against the wholesale surrender of our freedom without our awareness.