Most of us have fallen into the golden hammer trap. This cognitive bias is probably why many CTOs pick the stack they do rather than the one that makes more sense for their domain. You have probably heard it in this form: to a hammer, everything looks like a nail.
When is overreliance justifiable?
When should we pay the upfront cost of investing in a new tool?
Let’s embark on a new project: build an application that will eventually catalog and index large amounts of data. This data needs to paint a picture, meaning it must be easily aggregable, searchable, and actionable. We know Azure and other cloud providers have services you can use to do some of the heavy lifting. At the beginning, cost is not a concern: user conversion is slow, and you are treating this as a proof of concept anyway. But as users start spreading the word, you suddenly see that the cost of running all these AI services is pushing your operational budget into the negative.
You reach out to a few friends, and each proposes a different approach. From hosting your own LLM, such as Llama, and training it on your data, to other open-source solutions, you start seeing more possibilities.
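As a concrete illustration of that kind of swap, here is a minimal sketch of hiding the AI step behind one narrow interface so the backend can change without touching the rest of the app. Everything here is assumed for illustration: the endpoint URLs, request payloads, and response shapes are hypothetical, not any real service’s API.

```python
# A minimal sketch of a provider abstraction. The URLs, payload shapes, and
# response fields below are assumptions for illustration, not real APIs.
from abc import ABC, abstractmethod

import requests


class Summarizer(ABC):
    """One narrow interface for the AI step, so the backend can be swapped."""

    @abstractmethod
    def summarize(self, text: str) -> str: ...


class ManagedCloudSummarizer(Summarizer):
    """Day one: lean on a managed cloud service and ship fast."""

    def __init__(self, endpoint: str, api_key: str):
        self.endpoint = endpoint  # hypothetical managed-AI endpoint
        self.api_key = api_key

    def summarize(self, text: str) -> str:
        resp = requests.post(
            self.endpoint,
            headers={"Authorization": f"Bearer {self.api_key}"},
            json={"input": text},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()["summary"]  # assumed response shape


class SelfHostedLlamaSummarizer(Summarizer):
    """Later: point the same interface at a self-hosted Llama server."""

    def __init__(self, base_url: str = "http://localhost:8080"):
        self.base_url = base_url  # assumed local inference server

    def summarize(self, text: str) -> str:
        resp = requests.post(
            f"{self.base_url}/v1/completions",  # assumed server route
            json={"prompt": f"Summarize:\n{text}", "max_tokens": 256},
            timeout=60,
        )
        resp.raise_for_status()
        return resp.json()["choices"][0]["text"]  # assumed response shape
```

The design choice is boring on purpose: the rest of the application depends only on the narrow Summarizer interface, so when the managed bill starts to hurt, only the line that constructs the summarizer changes.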
Depending on whom you ask, they’ll have different takes on this problem. One might be that the developer should have done more research and gone with the optimal approach from the onset; a valid point, and depending on the project’s type and scope it might be the best one. Another might be that because preexisting services and APIs were used, the bottlenecks and pain points were identified early, providing more insight into which modules require flexible scaling.
This might be a contrived problem to design a solution for, but it harps on one of the most basic tenets of software development: there is no shortage of solutions to a single problem. Unlike the scene where Dr. Strange sees millions of possible futures with only one winning outcome, in software you can have a winning outcome even if the underlying architecture is riddled with technical debt and scotch tape (tech debt: replace scotch tape with duct tape). To me this means that, at least in software development, there is no single right tool for the job. By today’s standards, I believe the tool that can get you there faster has a slight edge over the “right” tool for the given job.
Let’s take Facebook as an example of practicality and pragmatism. It was originally created with PHP because PHP provided the ease and flexibility to iterate quickly. It wasn’t until the fork in the road, when more performance was needed, that they built new tooling and extensions (the HipHop compiler, and later HHVM and the Hack language) to make it faster.
Here is how I would answer the questions stated above.
When is overreliance justifiable? As often as needed to build something useful. You’ll know it is useful when you have to concern yourself with performance and scaling! Before that, get something working and use iterations to polish the areas that need it most.
When should we pay the upfront cost of investing in a new tool? Only when you have data that clearly defines a roadmap, or an existing user base. I like shiny new toys as much as the next developer, but we shouldn’t reach for them unless they make sense as a solution. Planning for today’s Facebook-sized user base while you are at the stage of Facebook’s beginnings will just add rigidity and complexity you can’t afford at that point.
I’m looking forward to other perspectives on projects and issues similar to, or different from, the stated problem.
Photo by Pixabay: https://www.pexels.com/photo/black-claw-hammer-on-brown-wooden-plank-209235/

