Tool Impact on Developer Discipline - DI Framework

There are a lot of tools that exist to make the things developers do easier. If it's something we do a lot, we tend to find a way to have something do it for us.
I've done it. I'll probably do it again. I built a little tool to automate the generation of Android Services from JSON because I spent so much time copy/pasting existing services and editing them into the new service. Copy and paste means you're missing an abstraction - I couldn't find it, so I built a tool to deal with it.

We find challenges and difficulties accomplishing what we want to do, so we find things we can do to reduce that pain, to simplify the process. It's one of the things I really enjoy about being a developer - I can simplify things I want to do.

The downside of the things that get built is that they get built for the way developers already write code, and that's very often not in line with good technical practices. These tools promote writing code in the fashion they were built to simplify. I find that they make it very hard to write good object oriented code. It's simpler to write (not maintain) bad OO code, especially with these tools.

Dependency Injection Frameworks

I'm clearly against Dependency Injection Frameworks (DIs). I've used them before and they were fantastic... for my bad design abilities. They made my instantiation easy... for my poor OOP abilities.


I see DIs used in two major ways. The first is Top Down Injection (there are probably real terms for this that I don't know - don't care while I'm typing this, but let me know if you do), like MVC apps use for their controllers.

The controller can have a non-empty constructor that takes parameters (ideally interfaces). In the DI's configuration, you register which class is instantiated and provided for which interface.
Great... unless your interface is implemented by more than one class.

When it's implemented by multiple classes and those have to be injected in different places... it gets janky. It can definitely be done... with hidden coupling.
The DI's configuration now couples ClassInjectInto with ConcreteImpl. Each DI framework has its own syntax, so it's something to be found and followed, not always understood.
The coupling between ClassInjectInto and ConcreteImpl is hidden. It's tucked away in the DI's config... on some line, somewhere. Unless you break up the configuration... it's just going to grow. A DI's configuration only becomes more complex, and the less understood it is, the less it can be pruned.
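To make that hidden coupling concrete, here's roughly what it looks like in an AutoFac-flavored configuration. This is a sketch from memory of AutoFac's registration API; the SqlAuditLog, FileAuditLog, IAuditLog, and ClassInjectInto names are hypothetical stand-ins, and the exact syntax varies by framework:

```csharp
var builder = new ContainerBuilder();

// One interface, two implementations - each registration now needs a name.
builder.RegisterType<SqlAuditLog>().Named<IAuditLog>("sql");
builder.RegisterType<FileAuditLog>().Named<IAuditLog>("file");

// The consuming class gets wired to ONE of them here, far away from
// either class. ClassInjectInto is now coupled to FileAuditLog, but
// nothing in ClassInjectInto's own source tells you that.
builder.RegisterType<ClassInjectInto>()
       .WithParameter(new ResolvedParameter(
           (pi, ctx) => pi.ParameterType == typeof(IAuditLog),
           (pi, ctx) => ctx.ResolveNamed<IAuditLog>("file")));
```

Reading ClassInjectInto alone, you'd never know which implementation it runs with. You have to find this block, in this file, and understand this framework's syntax.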

DIs not understood

An example of that is a project I joined. It didn't have a huge DI configuration; maybe 100 lines.

Given that I joined the project with a focus on refactoring to GOOD OOP, and DI does not fit that, I wanted to extract it. This is typically an easy refactor since it can be done in small bits. It doesn't need to be done all at once.

I start with the registrations that specify the class/concrete relationship. Those are easy to convert to dependency constructors since the class to use is right there in the config.

In this particular case, we didn't have any of those... I'd done a few before, but that was right when I was developing the aversion to DIs. They're easy to remove by using dependency constructors. All DI usage is removable via dependency constructors.
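For reference, here's a minimal sketch of the dependency constructor pattern - hypothetical names, not from that project. The second constructor takes the collaborator; the first chains to it with the default concrete class, which is exactly the line the DI registration used to own:

```csharp
public interface IGreetingSource
{
    string Greeting();
}

public sealed class HelloSource : IGreetingSource
{
    public string Greeting() => "Hello";
}

public sealed class Greeter
{
    private readonly IGreetingSource _source;

    // Production convenience constructor: chains to the dependency
    // constructor with the default concrete implementation. This one
    // line replaces the DI framework's registration.
    public Greeter() : this(new HelloSource()) { }

    // Dependency constructor: the only place collaborators come in.
    public Greeter(IGreetingSource source) => _source = source;

    public string Greet(string name) => $"{_source.Greeting()}, {name}!";
}
```

Production code says new Greeter(); tests say new Greeter(fakeSource). The coupling to HelloSource still exists - it's just sitting in plain sight, in the class that needs it.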

It's a SIMPLE mechanism that eliminates the mystery DIs represent for most engineers. I view a DI framework as one of those technologies you learn about when you need it... forget... and then have to RELEARN to update. So instead of learning it well enough to refactor the configuration, it's learned just well enough to know what and where to copy and paste.

Which gets us back to the DI config I was going through... I search for the interface being registered... It's used in a couple of class constructors. Perfect... How is that constructor being used? ... By passing values in directly. Well... So... it's not actually using the dependency injection.

It feels great to delete code.

I proceeded through the whole configuration file... None of the registrations were actually used. The file had the current feature's configurations being added to AutoFac because... it's what was there.
This is a danger of DIs - they aren't understood. Sure, we can understand them better, but it's black magic unlike anything else we deal with frequently. Maintaining the knowledge to be effective with them is a losing battle.

DIs are dangerous to the system because they resist being refactored, which leads to acceptance of refactor resistance... which kills products.

DIs block refactoring

An earlier project I was involved with used DI... funny how those projects keep popping up in a post about DI... :P

They had a few paths through the code that were SUPER similar. The engineers on the project followed a lot of the MicroObjects technical practices. The big one that impacted us was interfaces. They interfaced everything... YAY! Except for how they got to the interfaces: largely by creating the concrete class and then extracting the public methods.

This is fantastic though. Really. They had interfaces. That creates a simple way to look at method signatures; ones that look the same might be related. No guarantees though; you've gotta look at what they represent. It was a lot of DoThingXForApi1, DoThingXForApi2, DoThingYForApi1, DoThingYAndZForApi2. The naming was great as well. To my well-tuned refactoring nose they really stood out as candidates, even though the method names were rarely the same.

The project had a couple of related flows: from the API to the database, some number crunching, and returning a payload. A small, intern-able project.

I looked over the code and saw A LOT... I mean, A LOT of duplication. This was true duplication: the intent of the code was the same, and it would have to change for the same reasons.
We'll skip over all of the refactor bits that got the similarities IN YOUR FACE!
Using the super-similar interfaces as a guide, we refactored the classes to use a single interface. Wonderful. Continue the refactor so the parent classes use that interface instead.
Until we got to a class wired up through the DI framework. Suddenly we go from registering an interface and an implementation to having to specify which class gets which implementation for the interface.

Doable? Yes - but why do that instead of double constructors? It's just as much coupling, but there's ZERO hiding, ZERO black magic, ZERO god classes... It's a cleaner design.
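Here's a sketch of what that looks like with double constructors - hypothetical names, not the project's actual code. Two classes share one interface, each needs a different implementation, and each states its own default right where it's used:

```csharp
public interface IPayloadBuilder
{
    string Build();
}

public sealed class Api1Payload : IPayloadBuilder
{
    public string Build() => "api1";
}

public sealed class Api2Payload : IPayloadBuilder
{
    public string Build() => "api2";
}

public sealed class Api1Endpoint
{
    private readonly IPayloadBuilder _builder;
    public Api1Endpoint() : this(new Api1Payload()) { }  // coupling, in the open
    public Api1Endpoint(IPayloadBuilder builder) => _builder = builder;
    public string Respond() => _builder.Build();
}

public sealed class Api2Endpoint
{
    private readonly IPayloadBuilder _builder;
    public Api2Endpoint() : this(new Api2Payload()) { }  // coupling, in the open
    public Api2Endpoint(IPayloadBuilder builder) => _builder = builder;
    public string Respond() => _builder.Build();
}
```

The class/interface/implementation triple the DI config would have held is the same; the difference is you can see it from inside each class.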

Global Singleton

The other way DIs get used is as a global singleton. This is an unforgivable way to use them. I can forgive the Top Down DI usage - it's easy to remove, and it starts the code down a better path.

Using a global singleton to retrieve a concrete instance of a given interface... There's no forgiveness for this usage.

From AutoFac

 using (var scope = Container.BeginLifetimeScope())
 {
     var writer = scope.Resolve<IDateWriter>();
     writer.WriteDate();
 }

This is BAD practice.

The reason this is bad is that it's still a new inline. We don't want to tightly couple our code, and for more reasons than that it's hard to test. This becomes more testable because tests can inject a mock...
...but the test is then tightly coupled to the implementation details.
And the tests MUST run in series. We're working with a Global resource... shudder

Good OOP does not new up collaborators, because that prevents good design. Using a DI framework to replace the new and declaring victory because we can write tests is a band-aid on a much deeper issue in the code.

Using DIs like this causes massive resistance to refactoring, coming from the tests. (You have tests now, right?)
The tests become invalid when you try to refactor the code... Refactoring becomes unsafe when the tests break because of it. Refactoring won't happen in this situation.

The way to get away from new inline isn't to use factories or DIs - it's to not get the collaborator during behavior execution. You'd better have your collaborators by the time your method is invoked, because you shouldn't have to go get them.

A simple reason for this is SRP: a method should have a single responsibility. When the responsibility becomes "Get X, Get Y, Use X in Y", we're doing A LOT. OK, compared to most of the industry's methods that's nothing. But why "get" at all? If we can instantiate what we need in the constructor, then that's the correct time to instantiate. The object then HOLDS its collaborators.
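As a sketch of holding collaborators - borrowing the IDateWriter name from the AutoFac snippet above, with everything else hypothetical, and returning a string so the behavior is visible:

```csharp
using System;

public interface IDateWriter
{
    string WriteDate();
}

public sealed class TodayWriter : IDateWriter
{
    public string WriteDate() => DateTime.Today.ToShortDateString();
}

public sealed class DateReport
{
    private readonly IDateWriter _writer;

    // The collaborator is instantiated at construction time...
    public DateReport() : this(new TodayWriter()) { }
    public DateReport(IDateWriter writer) => _writer = writer;

    // ...so by the time Run() executes, there's nothing left to go get.
    // Its single responsibility is using the collaborator, not locating it.
    public string Run() => _writer.WriteDate();
}
```

No scope, no Resolve, no global container. Run() never reaches outside the object for anything.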

This forces changes to the design of the system. It FORCES a better design. Let's pretend "better" is subjective in this case... Instantiating in the constructor forces a more flexible, maintainable, testable, refactorable and understandable class.
I strongly feel this makes the design that's forced into the code objectively better.

I've fortunately avoided code that did this extensively. The "Service Locator" pattern is an anti-pattern.

It will kill your code.
