jantolentino

No measurement, no optimization

This message has been echoed a lot within the programming community.

Don't optimize what you haven't measured yet; or, put another way, only optimize what can be measured.

You can only truly appreciate this advice once you've experienced it yourself. And I did.

At work, I always look forward to projects with ample time to optimize the code after the initial release. There's satisfaction in refining code, making it cleaner and more efficient. However, I've learned that optimizing without measuring the change in performance can lead to disastrous results, especially with my limited experience.

Most of our performance issues stem from database queries rather than our program's architecture.

Following the initial release, there were two queries that I had unfortunately written separately due to their complexity. Optimizing them was already part of my post-release plan: I figured combining the two queries into one would be more efficient. I tested the results, and everything seemed fine, with the same output and no issues.
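
To make the idea concrete, here's a minimal sketch of the kind of change I'm describing. The models, columns, and query shapes are hypothetical stand-ins for illustration, not the actual queries from the project:

```php
use App\Models\Order;   // hypothetical models, for illustration only
use App\Models\Payment;

// Before: two separate queries, each simple on its own.
$orderIds = Order::where('status', 'completed')->pluck('id');
$totals   = Payment::whereIn('order_id', $orderIds)
    ->selectRaw('order_id, SUM(amount) AS total')
    ->groupBy('order_id')
    ->get();

// After: one combined query with a join. Fewer round trips on paper,
// but the database now has to plan a more complex statement.
$totals = Order::where('orders.status', 'completed')
    ->join('payments', 'payments.order_id', '=', 'orders.id')
    ->groupBy('payments.order_id')
    ->selectRaw('payments.order_id, SUM(payments.amount) AS total')
    ->get();
```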

Feeling satisfied with my work, I submitted a merge request late that day and waited for feedback. The next day, eager to check on it, I found that no one on the team had reviewed it yet. Curiosity struck, and I decided to measure the performance gains myself, hoping the numbers would validate my efforts. They didn't.

I opened Postman and ran 100 requests to calculate the average response time of each implementation. Overall, the new implementation had a higher response time. While some requests were faster than with the previous implementation, there were also significant spikes. Ultimately, it wasn't any better overall; I had made it worse.
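
Postman's Collection Runner handles this kind of repetition well, but the same rough measurement can also be scripted. Here's a minimal sketch in plain PHP, with the endpoint as a placeholder:

```php
<?php
// Rough benchmark: fire 100 sequential requests and report the average
// and worst-case latency. The URL is a placeholder, not a real endpoint.
$url   = 'https://example.test/api/endpoint';
$times = [];

for ($i = 0; $i < 100; $i++) {
    $start = microtime(true);
    file_get_contents($url);
    $times[] = (microtime(true) - $start) * 1000; // milliseconds
}

printf("avg: %.1f ms, max: %.1f ms\n", array_sum($times) / count($times), max($times));
```

Tracking the maximum alongside the average is what makes the spikes visible; an average alone can hide them.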

I closed my merge request, noting that the code was flawed and could cause a performance regression. Upon further investigation, I realized I had overlooked the types of the data being queried, which impacted performance.

While it may have been an unfortunate attempt at optimization, it was nonetheless an enriching experience. It gave me the opportunity to learn more about Laravel's performance utilities, to get more comfortable with Postman, and to gain experience researching potential performance regressions.
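
For anyone curious about those utilities: one that is handy for this kind of investigation is Laravel's `DB::listen`, which reports each executed query along with its timing. A minimal sketch, placed in a service provider's `boot()` method:

```php
use Illuminate\Support\Facades\DB;

// Log every executed query with its duration so slow statements
// stand out individually instead of hiding in an averaged response time.
DB::listen(function ($query) {
    logger()->debug($query->sql, [
        'bindings' => $query->bindings,
        'time_ms'  => $query->time, // execution time in milliseconds
    ]);
});
```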

Finally, optimizing based on instinct is just the beginning. You still need to back it up with accurate measurements. As they say, you can only call it optimized if you have benchmarks to support it!
