Rex Kerr
2 min read · Apr 20, 2022


This is often true, but personally I run into many more cases where people cavalierly assumed they could avoid optimization and... guess what... now the system needs a ground-up rewrite because some critical aspect of performance wasn't planned in. For instance, pretty much everything about Java's BufferedImage makes it unsuitable for anything beyond toy applications, because performance is designed out from the start. Any serious use of images requires an alternate framework (e.g. ImageJ's ImgLib2), and every program that doesn't reach outside for one is hampered... and if you're using the Java UI stack, you pretty much have no choice but to touch BufferedImage. The architecture is the problem: they went for modularity and encapsulation first, which makes it practically impossible to do what you need to do to get good performance (despite the fact that basically anything involving images benefits from good performance). This isn't even one of the particularly bad cases; it's just one I run into a lot.
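To make the BufferedImage complaint concrete, here is a small sketch (my own illustration, not from the original discussion): the public getRGB API routes every pixel access through ColorModel/Raster indirection, while the fast path requires casting into the backing DataBufferInt array — an escape hatch that only works for certain image types (e.g. TYPE_INT_RGB) and can defeat Java2D's managed-image acceleration.

```java
import java.awt.image.BufferedImage;
import java.awt.image.DataBufferInt;

public class PixelSum {
    // Per-pixel access through the public API: each getRGB call goes
    // through the ColorModel/Raster indirection, so hot loops pay for
    // the abstraction on every single pixel.
    static long sumViaGetRGB(BufferedImage img) {
        long sum = 0;
        for (int y = 0; y < img.getHeight(); y++)
            for (int x = 0; x < img.getWidth(); x++)
                sum += img.getRGB(x, y) & 0xFFFFFF; // mask off alpha
        return sum;
    }

    // Reaching "inside" the image: grab the backing int[] directly.
    // This only works when the image really is backed by a DataBufferInt
    // (TYPE_INT_RGB / TYPE_INT_ARGB), and touching the raw array can
    // disable Java2D's managed-image acceleration -- exactly the kind
    // of awkward escape hatch the encapsulation-first design forces.
    static long sumViaDataBuffer(BufferedImage img) {
        int[] data = ((DataBufferInt) img.getRaster().getDataBuffer()).getData();
        long sum = 0;
        for (int px : data) sum += px & 0xFFFFFF;
        return sum;
    }

    public static void main(String[] args) {
        BufferedImage img = new BufferedImage(256, 256, BufferedImage.TYPE_INT_RGB);
        for (int y = 0; y < 256; y++)
            for (int x = 0; x < 256; x++)
                img.setRGB(x, y, (x << 16) | y); // deterministic test pattern
        // Both paths compute the same sum; only their cost differs.
        System.out.println(sumViaGetRGB(img) == sumViaDataBuffer(img)); // prints true
    }
}
```

The point isn't that the fast path is impossible, but that it lives outside the type-safe API, breaks silently if the image type changes, and interacts badly with the rest of the toolkit.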

The problem isn't optimization. It's appropriateness. For whatever reason, a fair number of software engineers get a little silly in the head when they think about the appropriateness of optimization or automation: do it always, or do it never. You wouldn't add a highly scalable NoSQL database to a project just to hold a half dozen user config settings. But people do sometimes resort to elaborate tricks to speed up parsing of command-line arguments, making the parsing less robust without actually improving anything for anyone.

So, optimization is critical...when needed. The trick is to know when, preferably in advance, so you can design appropriately.
