Warnings are errors

I recently assisted on a project where my goal was to add a new feature to the application. Fortunately, my task was simple: the latest version of the framework the team was using had an API that did most of the work for me. Unfortunately, the team informed me that we could not upgrade to the latest version because it would "break" the application. "How bad could it be?" I thought to myself as I spun up the application... I immediately saw the problem: dozens of deprecation notices and code style warnings.

Software warnings are much worse than you think

Warnings don't technically break your application, because the application can still run with them, but they do break your product. The product, in this case, being the more ethereal version of the project you're working on: the one you have to continue to maintain and extend for years. Applications with little problems like a deprecation notice here and there tend to accrue more of those little problems until they become big problems. And these problems are not limited to upgrading frameworks. Lots of little problems like this can make software slow and unreliable. They can make debugging difficult. They can lead to your project as a whole becoming so hard to maintain that you have to refactor or start over from the beginning.

A solution

Proactively fixing warnings as soon as they are introduced is the only way to prevent them from becoming a larger problem later. And warnings don't just apply to server-side code. Make sure the console is completely empty in the browser. Take care of warnings in your CSS, optimize your JavaScript, and kill warnings in your compile/build process across the board. Clean, up-to-date code is the only code programmers really want to maintain in the long term.
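The article doesn't name a specific language or framework, but as one illustrative sketch of "warnings are errors" in practice, Python's standard `warnings` module can promote every warning, deprecation notices included, to an exception, so a new warning fails your test suite immediately instead of scrolling past unnoticed:

```python
import warnings

# Treat every warning as an error. Run this once at test-suite startup
# (or pass `python -W error`) and any new warning raises an exception.
warnings.simplefilter("error")

def call_old_api():
    # Stand-in for a framework emitting a deprecation notice.
    warnings.warn("old_api() is deprecated; use new_api()", DeprecationWarning)

try:
    call_old_api()
except DeprecationWarning as exc:
    # With the "error" filter active, the warning surfaces as an exception
    # that a CI run cannot silently ignore.
    print(f"caught: {exc}")
```

Most ecosystems have an equivalent switch, such as a compiler's warnings-as-errors flag or a linter's strict mode, so the same policy can be enforced across the whole build.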