Monday, October 11, 2021

Manzi

From Jim Manzi, then President & CEO of Lotus Development Corp. (``The Productivity MacGuffin'', BYTE Magazine 1992:8 p360):
Over the last 10 years, many companies have fallen into the trap of technology for technology's sake. This has not been entirely their own doing. They have had ample encouragement from members of our industry, who have never hesitated to trumpet their latest product release with no thought to customers' business goals. In all the excitement, people lost sight of a few simple economic truths.

The first is that real value in an organization is generated not by machines, but by people. It applies not just to computers, but to the steam engine, the printing press, and the pencil. The second principle is that the greatest value is generated not by individuals, but by teams.
What he said is still relevant today, nearly 30 years later. The ``AI Winter'' of yesteryear seems to have given way to an ``AI Spring'', but the harsh truth is that even in the current ``AI Spring'', the prevalence of snake oil is still far too high. Taking a sober look at technology can often seem to reek of conservatism, but that cannot be helped---even this technology leader, in the second paragraph above, acknowledges the point.

Machines, thinking or otherwise, allow us to do things faster and, hopefully, cheaper. In the engineering rule of thumb of ``cheap, fast, good; choose two out of three'', machines can be seen as a way of letting us concentrate on pursuing quality while using automation to ``cheat'' our way to the other two, a means of beating the aphorism. The thing is, we are still fundamentally a human-centric society, which means that at the core of it all, the human ought to be paramount in any kind of decision. That is what the second paragraph of the quote above tries to capture: the value of an organisation (be it a company or a society) is an evaluation of the value generated by the people who make it up. If everything becomes a race to produce more with cheaper and faster techniques and no innovation whatsoever, then the [short-term] winners will be those with a predominance in machines.

The second principle highlighted in the second paragraph is a caution against treating individual people as disembodied lists of attributes the way ``human resources'' departments have been championing, and an exhortation to consider how the person, as a whole, can contribute to an existing team. This means that personnel decisions in organisations ought to be made with team fit playing a larger role than the mere matching of some reductionist collection of attributes. It also means that staff retention, in the sense of maintaining effective teams, should take priority over the greed of hiring the absolute cheapest people and cobbling them together in the hope that they form an effective team.

Now, I'm not saying that Manzi is absolutely right on all things. Following the history of Lotus Development should allow one to draw one's own conclusions about Manzi's leadership capabilities and how they relate to his perspective. But on how technology, business, and people relate, what he says has a certain ageless truth to it, for as long as our societies are to be lived in by humans.

The unfortunate thing is that in this day and age, I am not even sure we are running societies for the benefit of humans any more. In more ways than one, it seems we have steadily moved our societies towards supporting corporate personhood over the individual. Laws across many countries tend towards granting [substantial] benefits to these corporate persons over looking out for the humans who make them up, and really, no one should be surprised at this from a purely utilitarian perspective: that single corporate entity may be ``one'', but it has clout (capital, which eventually translates into socio-political power) that can rival some of the smaller countries in the world. In other words, for any legislator/enforcer looking for a so-called ``great return on investment'' or ``low-cost, high-impact'' deal, the easy thing to do is to pander to these corporations instead.

I would normally claim that this is a major fallacy, reasoning that the corporate entity is still a legal fiction: if all the people in it are absent, or if all the people who trade with it are gone, it too will ``die out''. But with massive automation on both the physical (think robotics) and the mental (think reasoning/``AI'') fronts, it becomes increasingly possible to create corporate entities that can literally take on a life of their own, devoid of most human intervention.

Think about it: a self-sustaining network (or ``society'') of corporate entities that trade information/material/logistics with each other, accruing capital among themselves, needing only a minimal number of people to supply the missing innovation until they too can be replaced. Even their ``owners'' can eventually consist of nothing but other corporate entities, as the bootstrapping human leaders get ``digitised'' into pure institutional knowledge before dying out from old age. A world with the biggest class divide of all: the humanly untameable corporate entities who own everything, and the humans who can only hope to lease, at ever-increasing prices, an ever-dwindling pool of resources.

I think we are already part-way there. It seems incredible that after fifty years of exponential improvements in technology, we end up having to work more hours than our predecessors for a chance at a lesser quality of life, and get berated as though we were somehow the greater ne'er-do-wells.

Then what do we invest so much into innovation for, if the end result is more misery? Just who benefits from all this innovation? It can't be for God, despite what many might want to claim---everything that we have, He gave to us first. Besides, God is all-powerful---there is nothing we produce that He wants/needs.

So, why then?
