Oooo... would you look at it? Stupid o'clock!
Man, it's rare to have a stupid o'clock entry, especially in the middle of NaNoWriMo. All those wasted uncounted words! What am/was I thinking?
Well, anyway, here we are again. I'm woozy because it is late at night, and I'm running towards the end of my sabbatical. Some big issues have been resolved in my head, but there are still others that lurk about, never truly having a proper answer, just because of the way things work. But that is life, isn't it? If everything were completely predictable from the get-go, then there would be no need to go through the entirety that is life itself. There are always some portions of it that we will never have control over, direct or otherwise, mostly because it is emergent behaviour that requires an intelligence far superior to our own to know and manage the complexity (looking at you, God).
I was reading an old article about software engineering (No Silver Bullet---Essence and Accidents of Software Engineering by Frederick P. Brooks Jr. circa 1987, for those who want to keep score), and it succinctly summarises the issues that I had once tried to explain to a bunch of middle managers about how software is much more complex than hardware, despite the fact that it seemingly takes less specialised physical processes to create. My explanation revolved around the analogy of how in software, we need to create the physics of the system in addition to working out how the business rules need to be implemented, whereas in hardware, regular physics already provides a natural limiter that takes away much of the complexity that would otherwise appear within the solution.
Brooks is even more blunt. He points out that the real complexity of software engineering is that it is an entire discipline that takes away a human's strongest reasoning power: the ability to literally visualise---only in the most trivial or restricted of circumstances can any graphic come close to illuminating the software it depicts. To Brooks, newer and higher-level programming languages solve what he calls the ``accidental'' aspects of software engineering---examples of which include the lack of machine power (either in computational cycles or in memory), which dictates how many issues a software engineer must handle that are not directly about solving the problem at hand (like the lack of garbage collection, or of abstract data types and their associated operations). The essence itself was still not an easy problem to solve as of the late 1980s, and it remains true today: specifications are the hardest things to get right, for the reason that even the client does not know what they want.
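To make the accident/essence split concrete, here is a toy sketch of my own (not Brooks's example): counting word frequencies. In a garbage-collected language with built-in abstract data types, the ``accidental'' work simply vanishes, and only the essential logic remains.

```python
# Toy illustration of accidental vs essential complexity.
# In C, this task would also require hand-rolling a hash table,
# allocating, resizing, and freeing it --- pure accident, no essence.
from collections import Counter

def word_frequencies(text):
    """The essential logic: split the text and count occurrences.
    Memory management and the map data structure are handled by
    the runtime, not by the programmer."""
    return Counter(text.lower().split())

print(word_frequencies("the cat sat on the mat"))
# Counter({'the': 2, 'cat': 1, 'sat': 1, 'on': 1, 'mat': 1})
```

The language removed the accidental burden, but note that it did nothing to tell us *what* to count or *why*---that essential part still has to come from a specification.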
In this day and age, that problem arises even before we start throwing in more nonsense that is only useful for gargantuan companies like Google and friends with their planetary-scale information systems. I cringe when I think about how small and medium-sized companies want everything to run on The Cloud using the latest in micro-service architecture just to save money on running and maintaining their own servers, only to discover that the true cost is the programmers' time, because much of their work is about translating business rules into program code (the dreaded ``enterprise-level CRUD'' type of programming). And the bulk of that time is spent in understanding and formalising the arcane business rules that no one in their right mind would sit down and codify completely. Which brings up the other cringe-worthy aspect: outsourcing everything to allegedly save on manpower costs, only to discover that the lack of a close working relationship between the software engineer and the system users causes project schedules to overrun, software quality to suffer, and blame to be thrown at everyone except those who have mismanaged through myopic penny-wise pound-foolish choices.
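To show what I mean by formalising arcane business rules, here is a hypothetical rule I made up for illustration: ``orders over $500 get a 10% discount, unless the customer is on credit hold, except for government customers, who are never on hold''. Even this toy rule hides decisions the client never wrote down---is ``over $500'' strict? Does the government exemption override the hold flag?

```python
# A hypothetical business rule, invented for illustration.
# Writing it as code forces the unstated edge cases into the open.

def order_discount(total, on_credit_hold, is_government):
    """Return the discount in dollars for an order.

    Decisions the original rule left unstated:
    - 'over $500' is treated as strictly greater than 500;
    - government customers are exempt from credit hold,
      even if someone flagged them as on hold anyway.
    """
    if is_government:
        on_credit_hold = False
    if total > 500 and not on_credit_hold:
        return round(total * 0.10, 2)
    return 0.0
```

Every one of those branches is a question that someone had to answer; multiply this by the hundreds of rules in a real enterprise system, and you see where the programmers' time actually goes.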
Well, so what do we do then?
Like many things, giving people a proper education with the ability to think critically is always a good baseline for everything. With critical thinking abilities, it is slightly easier to have clients who can be a tad clearer about what they want, and who can understand things better when the software engineer points out some of the contradictions/edge cases that their own incomplete rules can generate, instead of playing the blame game.
The other thing is what I have been advocating for a while: use the right tool for the job. It is important not to re-invent the wheel where possible, but the cost of adapting an existing tool to fit the rest of the system needs to be taken into consideration too. Just because something is new doesn't mean that the old is no longer applicable---all things need to be evaluated critically. And above all, system design knowledge should be prized. Note that I did not say ``knowledge of frameworks should be prized''---I am literally pointing to something different from the type of job description that says something like ``Must know XYZ framework''. What is ``know'' here? Know of its existence? Know how to use it? Know its design principles? Know whether it is the right tool for the job?
Maybe I'm just curmudgeonly, but I think that anyone who believes their three-month intensive NodeJS coding bootcamp is sufficient for them to call themselves a system designer is guilty of sheer hubris. But then again, that's what the HR drones look for, so I guess it is just a way to play the game.
After all, who doesn't like a slice of that juicy infocomm tech pie, am I right? And companies love these people because they can be paid much less (but still a lot compared to other industries) due to being ``junior staff''.
Urgh. I hate the way this world works. Sometimes I wonder why I have not gone into the mountains to be a monk of some sort.
Till the next update.