5 Tech Myths and Why They Are Dangerous
Tech Is Not Immune to Myth
A myth is a widely held (but false) belief or idea. In engineering circles, we try to believe in objectivity, but we are still emotional human beings prone to flaws in our thinking. It’s stunningly common for a persuasive, confident person to convince teams of people about all sorts of silliness.
Myth #1: Software is Very Efficient
What people really mean is that software is “fast”, not necessarily that it’s more “efficient”. We’ve built our modern society on this idea that computers can do things faster than we can, and it’s not a big stretch to assume that because computers are so much faster, they are the most efficient tool.
Like most things in the world, it isn’t that simple. To be fair, when people talk about “efficiency” they aren’t usually talking about it in a granular, scientific way. They mean “doing the same or more work with fewer resources”.
For organizations, “resources” just means “money”...but what if we want to look at this beyond just money? From a technical perspective, efficiency is the ratio of useful work done to the energy expended.
We have to understand that the human brain is orders of magnitude more efficient than any computer -- we can power 100 billion neurons by shoveling some corn into our mouths. That’s efficiency.
By contrast, ChatGPT has a similar number of “neurons”, but it supposedly cost around $100 million to train the massive thing. It’s so energy intensive that some estimates suggest a single query uses enough energy to power a 5W light bulb for over an hour.
Does that sound efficient to you? It’s fast. It’s effective. But is it really efficient? For one query to expend so much energy, it sure doesn’t sound like it.
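To make the comparison concrete, here’s the back-of-envelope math (a rough sketch; the ~20W figure for the brain is a commonly cited estimate, and the rest comes from the numbers above):

```typescript
// Rough conversion of the light bulb comparison into "brain time".
// These are loose estimates, not measurements.
const queryEnergyWh = 5 * 1; // a 5W bulb running for 1 hour ≈ 5 Wh per query
const brainPowerW = 20;      // commonly cited estimate for the entire human brain
const brainMinutes = (queryEnergyWh / brainPowerW) * 60; // = 15 minutes

// One query costs roughly as much energy as running a whole human
// brain -- all 100 billion neurons -- for 15 minutes.
```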
Similarly, consider that every online application requires servers to run. Even the most efficient servers only use about 50% of their CPU capacity at any given time. That’s understandable to anyone who’s worked with infra, but it sure as hell isn’t efficient.
Estimates suggest that data centers use about 40,000 GWh/year in the US alone. In reality, most servers aren’t even running at 50% capacity (10 to 30% is more common) and studies suggest that idle servers still burn 60% as much energy as loaded ones.
So…when we’re talking about how efficient code really is, imagine these massive data centers filled with servers that are working at about 10% of capacity but still eating up huge quantities of energy.
To put this in perspective, an average coal power plant produces about 3500 GWh of electricity per year. Given these numbers, it’s entirely possible that 6-7 coal power plants have to spend their entire year burning coal just to supply idle computers with power.
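If you want to check that claim, the arithmetic is simple (a rough sketch built entirely from the loose estimates above):

```typescript
// Back-of-envelope check of the "6-7 coal plants" claim.
const usDataCenterGWhPerYear = 40_000; // estimated US data center consumption
const idleEnergyFraction = 0.6;        // idle servers burn ~60% of loaded power
const coalPlantGWhPerYear = 3_500;     // average coal plant annual output

// Treat ~60% of total consumption as idle burn -- a crude upper bound,
// since real fleets are a mix of idle and loaded machines.
const idleBurnGWh = usDataCenterGWhPerYear * idleEnergyFraction; // 24,000 GWh
const plantsForIdle = idleBurnGWh / coalPlantGWhPerYear;         // ≈ 6.9 plants
```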
The fact that this waste is obfuscated makes it easy to believe it doesn’t exist. As engineers, we can easily believe in the efficiency of computers because we write code…we see how computers can “think” on a time scale completely different from that of our measly human brains.
It is easy to forget that despite all the advancements, computers are still relatively inefficient machines, especially compared to the human brain.
People don’t really mean that software is “more efficient” than humans, they mean that it’s much faster than humans. It’s more efficient in terms of money, not necessarily energy.
So…why does this matter? Who cares? It matters because we all live on this planet and climate change is a scientific fact. Maybe there’s a better option than throwing more hardware at your problems; yes, hardware is cheap, but it’s worth understanding exactly what impact you have when you scale up hardware that isn’t needed.
Sometimes there isn’t a choice, and that’s just part of working in the real world. Still, I think it’s worth questioning the assumptions that many engineers work from every day. This idea that computers and software are oh-so-efficient is not so clear-cut.
I hope no one thinks I’m saying that computers and software are bad and that we should all go back to paperwork…but we should care about our hardware resources and strive to utilize as much of our CPUs as possible, both because it saves hardware costs and because doing as much as possible with as little hardware as possible is (slightly) better for the environment.
Myth #2: Software Engineering is Pure Science
Hopefully, this is an obvious myth. Software engineering is not just about writing commands to be fed into a CPU, not anymore -- otherwise we’d all be working in assembly. As software has become more and more high-level (abstracted), our focus as engineers is less about how we speak to the computer and more about how we speak to each other.
Software engineers have more in common with novelists than astrophysicists. We have to think about what we’re writing and how it’s structured. We have to translate English into code, interpreting a spec and building logic based on that understanding.
I’ve personally seen philosophies around this change over the years. When I was first learning, object-oriented programming was the only modality and you were not allowed to question the logic of deeply nested hierarchies. Really, you weren’t.
Many engineers were taught that this was the only way to program, that this was the best way to represent real-world concepts in code. This is where software engineers sometimes fail to be scientists -- we’re too keen to accept popular notions just because they’re endorsed by “smart” people in the field.
I’m not saying that OOP is “bad” or that it isn’t useful when it clearly is, but it’s important to remember that it’s just one idea about how to structure an application -- it’s a philosophy, not a scientific fact.
The same is true of Agile development. It’s a long list of inherently arbitrary rules that few organizations really follow to the letter. Nor should they, because it’s rather stupid to follow a set of rules without question. That’s not science. The inventors of Agile are not gods whose rules must be followed, and they wouldn’t suggest you treat Agile as such, either.
It’s dangerous to buy into the idea that software engineering is a pure science; it makes us weaker as engineers. Embrace the fact that writing software requires a level of subjectivity; ask questions, be skeptical, and don’t accept popular modalities at face value just because “smart people” say so. Everyone is selling something in the end, and it’s up to us to make value judgments on a case-by-case basis. That’s just how real life works.
Myth #3: Young Coders Don’t Care About Performance
I’ve seen this argument a lot from older engineers: “young people don’t care about performance and don’t want to write anything themselves”. As an older engineer myself, I can’t help but roll my eyes at this value judgment.
First, “performance” needs to be defined. Are you talking about speed? Efficiency? Accuracy? These are three different facets of “performance”, but people are usually talking about speed and efficiency.
We’ve already seen how “efficiency” is such a loaded word and skewed concept. Do you mean efficient only in terms of compute speed? Do we not care about developer time or opportunity cost?
It wasn’t long ago that most coding was done in C or C++, which are considered low-level languages today. Now, many engineers don’t even learn about pointers! That’s not something to lament as “skills lost”, really; it’s just part of this industry growing up.
In other words, the trend toward higher- and higher-level languages has been ongoing since the advent of computing. It isn’t that young people “don’t care” about low-level details or that they “only know how to add packages”. It’s that the industry as a whole has been moving in this direction.
This is a natural evolution driven by many factors. Firms want more output with fewer resources (human beings); higher-level languages let us do more with fewer people. Third-party packages are the same.
The push to spend less time writing your own code doesn’t come from young people; it comes from market forces. Be real: capitalism is a thing. It does not reward the people who make stuff; it rewards the people who sell stuff. The less skilled labor you need to hire, the better. Market forces matter, they impact engineering every day, and many software engineers are very poor at contextualizing the field within the real world.
For example, when people say that AI won’t eliminate software engineers but will just make the work more “efficient”, “efficient” means “hire fewer people”. That should be obvious!
It isn’t just code that’s becoming higher-level; it’s platforms. How many e-commerce firms have ditched their development teams entirely because they migrated to Shopify and merely use apps? Why hire sysadmins when you can use serverless? Why hire someone to write docs when you can use AI to ask questions?
The history of people who “make stuff” is a history of automation. What older coders see in this “lack of care” about performance isn’t a bad attitude or a lack of competence; it’s the industry changing around them, as all industries always do.
To be blunt, long-term…capitalism will always reward people who sell stuff more than people who make stuff. Sometimes you need to adapt to this reality.
There might always be a need for low-level coders, don’t get me wrong, but the general goal for the industry is always going to be to build the product faster using fewer expensive engineers. This isn’t a good thing for anyone, really, but there’s no fighting it, either. This is how history works: people who produce stuff are a liability for companies. Computer science isn’t “special” just because the work is digital…we’re still people in a factory making widgets. It shouldn’t be a surprise that market forces demand we make more widgets with fewer resources. Packages, higher-level code, and platforms are a response to those market forces.
Software engineering in the next twenty years will only take this to more extremes because that’s how industry has always worked. Grumble all you want, but stop with the derogatory rants about how smart you are to write all your own code while these youngsters do their npm.
Yes, there’s absolutely merit in being careful with bloated third party code or platforms that don’t do their jobs well…but you have to accept the market forces at play around you.
Or you don’t, but someday you’ll wake up to see this industry is completely foreign to you and that firms don’t care about the things you care about. Those young people and all their damned packages just might be your boss someday soon.
Myth #4: All Engineers Love Their Job
Growing up, I had an uncle who did computer programming for early voice recognition systems. And I do mean early, before smartphones were even an idea.
He didn’t particularly like his job; his passion was hunting and outdoorsy crap. He never once talked about programming. It was a good job that provided for his family, but it wasn’t his passion.
Personally, I love engineering. It is my passion. Still, I think it can be dangerous and unfair to presume that every engineer loves their work. First, there’s a practical reason why you shouldn’t buy into this assumption: it will mean you get paid less.
This is one reason why careers in the video game industry are so notoriously crappy -- the demand to work in that field is so high that wages are lower and constant crunch is expected. It’s an unfair way to reward some of the most talented engineers, but that’s just how our society works.
Second, people that don’t have a “passion” for their work still deserve a job. They can be great workers that do great work! Also, do you really want someone at your office whose only passion is software engineering…?
Well-rounded people make for more interesting teams, and not just because they have something to talk about other than coding. Different experiences are a valuable thing to have -- why bother building a team at all if everyone thinks in a similar way?
For firms, hiring people who are intensely passionate about software engineering can actually be a double-edged sword. Is the work interesting? If their main concern is having fun and being challenged, they’ll burn out when faced with boilerplate or business-as-usual.
Similarly, if they love engineering so much that they are constantly doing side projects and research…will they stay at your firm for more than a year or two? Or will they be seeking to leverage their knowledge into a gig with more prestige, pay, or challenge?
Will passionate engineers be as efficient with their time as possible, or will they want to play with new tech because it’s neat and they want to learn? If their only passion is engineering, are they product-focused enough to make good choices?
Engineers that show up because they want to get paid are not evil or bad. Sometimes you need workers that view work as work, because not everything is some fun challenge. Sometimes you need to grind shit out, and that’s when “passion” can become counterproductive.
To be clear…I like working with passionate people and believe it’s more an asset than a liability, but let’s dispel this notion that every software engineer is “just happy to be doing the work”.
If I were building a team? Yes, I’d want passionate people that love their work…but if you just view it as work? That isn’t as unhealthy or unproductive an attitude as some might think. If anything, sometimes that perspective is exactly what’s needed to support the product!
Myth #5: Everything About Page Speed and Lighthouse
This one is probably familiar to a lot of developers. Someone in your organization runs a Lighthouse report on your site and tells you the site is way too slow. Fix it and get the score up to a 90, please!
Stop with this silliness! If you’re basing big engineering decisions on a score you probably don’t fully understand, that’s not good. Everything in engineering requires nuance -- and a single big scary red number isn’t nuance.
Don’t blindly trust Lighthouse just because it says “Google”. Don’t put that onto your engineers as if “fixing” the page speed will actually do anything.
First, understand that Lighthouse does not use real-world data. It’s entirely lab-based. Do you really know what this number is? Do you really want to dictate engineering goals based on it?
That number depends on when and how the Lighthouse test was run. The “when” matters because the algorithm is always changing: your score today might not be your score tomorrow, because that one number is derived from a weighted combination of other metrics.
The fact that Google is constantly tweaking how much weight to give to which metric should convince you that this overall number is fairly useless! It’s a subjective interpretation based on a combination of other weighted metrics.
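To illustrate, here’s roughly how that single number is built. The metric names are real Lighthouse metrics, but the weights below only approximate one recent version (v10) and will drift as Google adjusts them -- which is exactly the point:

```typescript
// Sketch of how Lighthouse blends individual metric scores into one
// performance score. Weights approximate Lighthouse v10; Google changes
// them between versions.
const weights: Record<string, number> = {
  firstContentfulPaint: 0.10,
  speedIndex: 0.10,
  largestContentfulPaint: 0.25,
  totalBlockingTime: 0.30,
  cumulativeLayoutShift: 0.25,
};

// Each metric is first normalized to a 0..1 score by Lighthouse’s own
// scoring curves (not modeled here); the final score is a weighted sum.
function performanceScore(metricScores: Record<string, number>): number {
  let total = 0;
  for (const [metric, weight] of Object.entries(weights)) {
    total += weight * (metricScores[metric] ?? 0);
  }
  return Math.round(total * 100); // the big scary number
}
```

Shift the weights and the exact same page gets a different score, which is why chasing the number rather than the underlying metrics is a losing game.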
Further, if someone in your company runs a Lighthouse test on their fast desktop with blazing-fast Internet, it will yield different results than a test run from a slower machine. It’s actually telling you very little about how the site performs for your users.
This algorithm isn’t objective, because like all algorithms it can be gamed. In other words, a perfect Lighthouse score doesn’t necessarily mean that your page is actually loading quickly, just that the algorithm thinks it is. There are companies that specialize in gaming this algorithm, because so many business leaders believe in the (evidence-free) idea that this score directly reflects performance. Or they falsely believe that “Google thinks my site is slow and is therefore penalizing me”. That’s not what Lighthouse is actually saying! Not at all!
Lighthouse isn’t bad tech, but it is useless if you believe it will give you a simple, objective score about how slow your website is. Your complex e-commerce site is probably never going to score a 100 and trying to bend it to do so would be a mistake.
How exactly do you think your engineers are going to “make things faster”? This isn’t some magic trick where they can always just “optimize” stuff to make it load faster, especially for front-end performance. You’ll likely need to kill features or drop some analytics to improve that score. Is it truly worth it? Have you really considered the implications?
The best way to understand if your site is slow is to measure it using something like PageSpeed Insights (which includes real-world field data) or monitoring software like New Relic. That way, you’re understanding how the site actually performs in the wild. I’ve seen applications with very poor Lighthouse scores that score very highly on real-world metrics.
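If you want to pull that field data yourself, here’s a minimal sketch against the Chrome UX Report API, which is the same real-user dataset PageSpeed Insights surfaces. YOUR_API_KEY is a placeholder, and the response shape below matches the public docs at the time of writing -- verify against the current CrUX API reference before relying on it:

```typescript
// Query real-user ("field") data for an origin from the Chrome UX Report
// API. This is data from actual Chrome users, not a lab simulation.
const CRUX_ENDPOINT =
  "https://chromeuserexperience.googleapis.com/v1/records:queryRecord?key=YOUR_API_KEY";

async function p75Lcp(origin: string): Promise<number | string> {
  const res = await fetch(CRUX_ENDPOINT, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      origin, // e.g. "https://example.com"
      metrics: ["largest_contentful_paint"],
    }),
  });
  const data = await res.json();
  // 75th-percentile Largest Contentful Paint across real users, in ms.
  return data.record.metrics.largest_contentful_paint.percentiles.p75;
}
```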
Do people complain about site speed? Does the site seem responsive and quick when you’re using it on various devices and from various locations? I’ve worked at firms where the site is very responsive even on slower connections; actual performance matters a lot more than this silly lab-generated score. Google itself notes that a bad Lighthouse score doesn’t necessarily mean that “Google thinks your site is slow and will penalize you”.
If you don’t understand every metric contained in the Lighthouse report, don’t use it. Don’t send it to devs saying that Google thinks your site is slow. That’s not what Lighthouse is and that’s not what it means.
The goal of Lighthouse is exactly what’s implied in the name…it’s a guiding light, not perfect night vision! If you’re a stakeholder, you should probably stay away from Lighthouse -- ask questions, but don’t send people that big scary number and demand changes without taking the time to understand the broader context.