Refactoring Legacy Code: Remove future proofing

The legacy code I'm playing with at work had some nice future proofing. It wasn't quite Big Design Up Front, but it was future proofing all the same. I got some of it out, but there's still a whole level of abstraction pending removal.

I'm not here to write about what I haven't done, though. Let's talk about the small bit I did get to rip out.

There were two bits of future proofing I was able to remove. The first was simple: a boolean that was always true. Literally set to true at instantiation and never changed.
The product does some filtering of input, and there was an expectation that we'd have multiple types of filtering, Coarse and Fine. Maybe the second type existed at one point... I don't know.

We had a variable that was used in a dozen places and was ALWAYS true. That's a dozen tests that were testing something the system WOULD NEVER DO.
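
To make the shape of that concrete, here's a hypothetical sketch; the real names and filtering logic are different, but the pattern is the same:

public sealed class InputFilter
{
    // Set to true at instantiation and never changed anywhere in the codebase.
    private readonly bool _useCoarseFiltering = true;

    public string Apply(string input)
    {
        // Every one of the dozen call sites branched on a value that was always true.
        return _useCoarseFiltering
            ? CoarseFilter(input)
            : FineFilter(input); // Dead branch: tests covered it, the system never ran it.
    }

    private static string CoarseFilter(string input) => input.Trim();

    private static string FineFilter(string input) => input; // The "Fine" filtering that never was.
}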

I really want to say it was a simple refactor to remove the variable and everything that used it. I REALLY want to... In the end, it was. My first attempt didn't go so well.

"It so simple" I tell myself, I'll just rip it all out and get fix the broken tests... Nope. I don't know why 1/2 the tests aren't passing anymore. sigh

I throw it all out (the best thing you can do is learn to love throwing away work) and start over: remove usages one at a time, fix tests one at a time.

REMEMBER: When you refactor, KEEP THE SYSTEM COMPILING EVERY STEP OF THE WAY! This makes life SO MUCH BETTER. Make one small change, run the tests. Fix the broken ones. Doing this keeps the system functional and allows you to stop at basically any time.
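
Here's roughly what "one small change" looked like for the always-true flag (hypothetical names again): inline a single check, recompile, run the tests, then move to the next call site.

public string Apply(string input)
{
    // Was: return _useCoarseFiltering ? CoarseFilter(input) : FineFilter(input);
    // The flag is always true, so only the coarse branch ever ran.
    return CoarseFilter(input);
    // The field and the other call sites are untouched for now,
    // so the system still compiles and I can stop at any point.
}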

My second pass following the above reminder worked much better and was, in fact, quite easy.

There was a second bit of future proofing around how the filters work. The "evaluate" method took in some "Filter Data" that was cast to a specific type. Not even an interface; just a C# generic on the method.

bool Eval<TFilterData>(TFilterData data);

This allows the data coming in to be ANYTHING. There is no constraint on what "TFilterData" could be.
Each filter had to cast the data to the type it was hoping to get:

bool Eval<TFilterData>(TFilterData data)
{
    // Hope the caller passed the right thing; bail out if it didn't.
    var expected = data as ExpectedType;
    if (expected == null) return false;
    // ... evaluate using the now strongly-typed 'expected'
}

This is some crazy future proofing. At the time the code was written, they knew exactly what the data types were. Using this level of generic to allow ANYTHING... is setting up the code to be able to build a filter around ANYTHING.

Writing a method definition so you can do anything with it is a wee bit of future proofing.

Getting rid of this was fairly straightforward, but it used some techniques I've refined working with MicroObjects. In the end, whatever had been passed in as TFilterData was constructed by the filter object itself instead of being passed in. A couple of filters even ditched the data object entirely and just did the evaluation internally.
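
The "after" shape ended up looking roughly like this (a sketch with made-up types, not the actual filters): no generic parameter, no cast, and the filter builds or owns whatever data it needs.

using System.Collections.Generic;
using System.Linq;

public sealed class CoarseFilter
{
    private readonly IReadOnlyList<string> _blockedTerms;

    public CoarseFilter()
    {
        // The data the filter needs is constructed here, by the filter,
        // instead of arriving through an unconstrained generic and a cast.
        _blockedTerms = new[] { "spam", "junk" };
    }

    public bool Eval(string input)
    {
        // No cast, no null check - the filter knows exactly what it's working with.
        return !_blockedTerms.Any(term => input.Contains(term));
    }
}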

The more generic and flexible the method you're creating is allowed to be, the more likely it's future proofing and complicating the code you're writing now.

With the future proofing of the Filters, there were multiple levels of abstraction that weren't needed. The simple effort of removing the future proofing got rid of those levels of abstraction and multiple instances of "yo-yo"ing through the code to create, use, and evaluate the filters.

Clean it up

The first bit of future proofing I mentioned had existed since almost the beginning of the project. There was a comment, with a date, asking, "This is always true, can we remove it?"

It wasn't cleaned up. I think there are two big aspects to what allowed this future proofing to survive YEARS in the code.

Complexity

The code was complex. It was hard to tell if the flag was actually used or not. Some Dark Black Magic of reflection loading values from the database caused hesitation about removing any fields.
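
To give a flavor of that hesitation (a hypothetical sketch, not the actual code): when fields get populated by name via reflection, "find all references" can't tell you whether deleting one is safe.

using System.Collections.Generic;

public static class RowLoader
{
    // Populates any public field whose name matches a database column.
    // A field with zero compile-time references can still be loaded and read
    // through reflection somewhere else - so nobody dared delete anything.
    public static T Load<T>(IDictionary<string, object> row) where T : class, new()
    {
        var result = new T();
        foreach (var field in typeof(T).GetFields())
        {
            if (row.TryGetValue(field.Name, out var value))
            {
                field.SetValue(result, value);
            }
        }
        return result;
    }
}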

Mindset

The early developers of the project didn't have a refactoring mindset. They did TDD and had a well-tested system, but there wasn't a mindset to refactor ruthlessly. Without ruthless refactoring, code complexity grows.

You're either removing complexity or adding it. There is no middle ground.

No Future

The future of the code is only known in the future. Code for what you need NOW. Poorly maintainable code wants to know everything in advance so it never has to change, because it's not maintainable.

Maintainable code wants nothing but what's needed RIGHT NOW because that's all we can be sure about. That's all we KNOW. Everything else is a guess, and we all guess wrong. A guess adds complexity to the code that's not even required yet. It's a cost in the code, in complexity, in maintaining it, in testing it, and it provides ZERO business value.

When we write code for what might happen, we're costing the business money. If we write ONLY for what's needed (and refactor ruthlessly) we'll be able to add in what's needed when it's needed.

Only when it's needed now should we write the code for it.