Thursday, July 08, 2021

A Jumbled Thought

Consent, transparency, and surveillance: the big trio of things to worry about with ubiquitous computing. In some respects, it seems that we have been blind-sided by consent and transparency being stomped all over in the bid to quietly build up a massive surveillance infrastructure under everyone's noses. But is it really a blind-siding moment, or is it a deliberate application of apologising instead of seeking permission?

I think an important factor that makes the modern digital-centric world different from earlier ones is the separation of the ego from the material body it is strongly associated with, and the far-flung nature of its influence in both time and space. Let me break what I am saying down into smaller components.

``Separation of the ego... associated with''---this refers to how the ``I'' in my head (the ego) is less related to the ``I'' that is my body. The ego can exist devoid of the context of the body through the pseudonymous world that is the information networks; indeed this is an artefact that had been understood early on, with aphorisms like ``On the Internet, no one knows if you are a dog''. So a paraplegic person need not be confined to the wheelchair to lead a rich and fulfilling life, assuming that the person is comfortable with such a rich and fulfilling life happening mostly with the ego and not with the body. Information networks are fast becoming a reality of their own, and to see them otherwise is a great way to get trapped in fallacies that drive much of what is wrong with the world.

``Far-flung nature of its influence in both time and space''---this refers to the proliferation of the knowledge of any particular person's ego across geographical space (someone in Russia may well know the nature of someone in Indonesia despite the two never having shared a common geographical space while interacting), and across time (someone who left a series of comments in the past will have those comments available for reading in the future, allowing someone from the future to know intimately about the person from the past in ways that go beyond regular archaeological reconstruction). Contrast this with the pre-Internet world, where knowledge of a person decreases quite substantially with every ten kilometres from where that person lives, and where past [mis-]deeds are inadvertently forgotten, unless the said [mis-]deeds are sufficiently catastrophic for a large number of [angry enough] people.

In the pre-mass surveillance world, the three items I stated in the beginning had the following properties:
consent
Explicit consent is required, with the ``analogue'' nature meaning that the consent is more likely to be informed than not.
transparency
There is little incentive to actually be transparent about most processes since there isn't a good way of transmitting the associated information.
surveillance
Surveillance is targeted due to the vast amount of effort required to maintain it. The targeting is usually well thought out to improve the return on investment [of effort] ratio.

Comparatively, with the effects of the modern digital-centric world, we see the properties of the three items evolve into this instead:
consent
Consent is largely implied, and even if it were explicit, there is usually enough carefully constructed detail to confuse and obfuscate rather than to inform, leading to the paradoxical state of an ``informed'' consent that does not necessarily inform anything.
transparency
Processes can be transparent, but they can be made sufficiently convoluted that the transparency no longer serves understanding. Sometimes this isn't deliberate---the processes could be machine-driven/machine-designed, which adds an opaqueness that isn't intentionally meant to obfuscate (but does so anyway).
surveillance
Surveillance is cheap and widespread, since information in its pure form of digital data is easily observed, copied, and retained without ever affecting the original transmission. It comes ``naturally'' with the original intention of optimising the system (which often degrades into monetisation) and with implicit consent through de facto use of a system due to its popularity.

So what does all this mean?

I don't know. Maybe it means the end of individuality as we know it, since mistakes are no longer permitted to be made when one is young: they will be recorded and retained, and definitely used against one in the future by whoever has the incentive to do so, be it a government body screening candidates for its civil service, a company picking its employees, or even a landlord screening potential tenants. Combine mass surveillance with machine-assisted artificial ``intelligence'' and it becomes a great way to suppress individuality, since it is hard to argue against the ``hard numerical facts'' as presented by a machine.

A mass surveillance society run by algorithms is going to be a discriminative one more than an inclusive one, because algorithms have no morality and, more importantly, do not share the values that we treasure as a society. Some may claim that it is possible to train the algorithms to learn our values, but I will point out that only past examples are used to train algorithms---we are still making up the rules on how inclusion ought to work in order to right the wrongs of the past, and that isn't going to make it into the algorithms soon enough.
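As a purely illustrative aside (the records, attribute names, and numbers below are all invented for this sketch and not taken from any real system), here is roughly what ``learning from past examples'' amounts to: a toy screening function fitted only on historical hiring records will simply replay whatever pattern of exclusion those records contain, regardless of whether that pattern is one we still endorse.

    # Purely illustrative sketch: a "screening algorithm" fitted only on past
    # hiring decisions. The records and attribute names are invented.

    # Historical records: (years_of_experience, attended_elite_school, was_hired)
    history = [
        (5, True, True),
        (2, True, True),
        (7, False, False),  # a past bias: strong candidate rejected anyway
        (3, False, False),
        (6, True, True),
        (4, False, False),
    ]

    def hire_rate(records, elite):
        """Fraction of past candidates with this attribute who were hired."""
        outcomes = [hired for (_, school, hired) in records if school == elite]
        return sum(outcomes) / len(outcomes) if outcomes else 0.0

    def screen(attended_elite_school):
        """'Learn' from history: recommend whoever historically got hired."""
        return hire_rate(history, attended_elite_school) >= 0.5

    print(screen(True))   # True  -- the old pattern is replayed
    print(screen(False))  # False -- experience never even enters the picture

Nothing in the sketch knows whether the historical pattern was fair; it only knows that the pattern existed, which is precisely the problem with trying to teach values from past examples alone.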

Alright, that's enough depressing thought. Going to read more Tintin, and watch Reine's start in her playthrough of Virtue's Last Reward. Till the next update.
