
Thoughts on Embedded Test Driven Development (part 1 of 2)

This 2-part series covers my experience trying out TDD on an embedded system at work. Part 1 captures my thoughts prior to attending a 3-day training course on embedded TDD, and part 2 covers my attitude after taking the course and doing more extensive TDD exercises.

I've spent almost my entire professional career in the embedded software realm, occasionally admiring the purer software roles and all the "nice things" they have: automated builds, automated tests, easy debugging, lots of memory, and few (if any) requirements to test real-time behavior. I thought that maybe some day the embedded world would get some decent tools that are popular, cheap, and friendly.

After hearing about some TDD work being done by some colleagues, I decided to take on a small TDD pilot for one of the modules I was working on. A few quick googles led me to James Grenning's book on embedded TDD, so I bought it and sped through the first several chapters. Our project uses a mix of C and C++ and I felt pretty comfortable with everything I read (which is rare for me). Overall it seemed pretty straightforward, and the thought of testing code without needing hardware really appealed to me.

Test-Driven Development for Embedded C

I consider myself a pretty skeptical person and in general I need a decent amount of data before I change my mind. Fast-forward 12 hours from reading those first few chapters and I was blown away by how much code I could actually test prior to even looking at my debugger. Dependency injection is a good thing and you should be doing it - but that's another post.
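
To make that concrete, here is a minimal sketch of the kind of dependency injection I mean (the ADC interface and temperature function are made-up names, not my actual module): the code under test takes its hardware dependency as a parameter instead of calling the driver directly, so a fake can stand in on the host.

```cpp
#include <cstdint>

// Hypothetical interface the module depends on, instead of a
// concrete hardware ADC driver.
struct AdcReader {
    virtual ~AdcReader() = default;
    virtual uint16_t read(uint8_t channel) = 0;
};

// Production code receives the dependency rather than reaching out
// to the hardware itself (illustrative scaling, not real firmware).
int32_t read_temperature_c(AdcReader& adc) {
    uint16_t raw = adc.read(0);
    return (static_cast<int32_t>(raw) * 500) / 1023 - 50;
}

// On the host, a fake stands in for the real ADC, so tests run
// without any hardware in the loop.
struct FakeAdc : AdcReader {
    uint16_t next_value = 0;
    uint16_t read(uint8_t) override { return next_value; }
};
```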

Once I automated the build and execute pieces of the process (also another post), I was hammering out tests by the dozen. In 20 minutes I had covered all of the 'bad input' tests on my list - something I had earmarked an afternoon for, thinking I would need to be in the lab with the hardware, building a monstrous test-main file. With all this excitement in my head, I quickly moved on to some more interesting functionality of my module. After writing one test and the code to make it pass, I thought, "I know I'm going to need this other functionality in here, why not save some time and put it in now?" This won't surprise anyone more experienced in TDD, but it wasn't long before I had a bunch of code written with no failing tests and zero confidence in what I had just produced. It was then that I realized how disciplined you have to be to truly commit to TDD.
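
For illustration, here's a gtest sketch of what one of those 'bad input' tests can look like. The ring-buffer module is hypothetical and inlined so the snippet stands alone; in a real project the function under test would live in its own C module, and the tests would link against gtest_main for a runner.

```cpp
#include <cstddef>
#include <cstdint>
#include <gtest/gtest.h>

enum BufferStatus { BUFFER_OK, BUFFER_ERR_NULL, BUFFER_ERR_SIZE };

// Stand-in for the module under test: reject bad input up front.
BufferStatus buffer_init(uint8_t* storage, size_t size) {
    if (storage == nullptr) return BUFFER_ERR_NULL;
    if (size == 0) return BUFFER_ERR_SIZE;
    return BUFFER_OK;
}

TEST(BufferBadInput, NullStorageIsRejected) {
    EXPECT_EQ(BUFFER_ERR_NULL, buffer_init(nullptr, 16));
}

TEST(BufferBadInput, ZeroSizeIsRejected) {
    uint8_t storage[16];
    EXPECT_EQ(BUFFER_ERR_SIZE, buffer_init(storage, 0));
}
```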

I wound up telling myself again and again (out loud, drawing some strange looks from the people who sit by me) to code only what's needed to make the test pass. It sounds simple, but it's not easy. For about a half hour afterwards I kept forcing myself to delete code I had written that wasn't needed to make a test pass. Then I would go back and write another test. Then I would put the code back in, usually modifying it a decent amount to help facilitate the testing. It seemed tedious at the time, but I stuck with it for the entire module I was coding, which was about 3 days' worth.

When I was done, I had an overwhelming feeling of satisfaction, and yet a tiny voice in the back of my head kept saying "is that all?" I had ensured that the algorithms in my module were doing exactly what they were supposed to be doing. I had ensured that nothing crazy would happen if some other module gave me bad input. What I had not done was verify that my code even compiled with my target toolchain. I also had done nothing to verify that some asynchronous callbacks would get executed properly. Back to the book to look for some answers.

I should have known that you need to regularly compile for your target hardware. I just got carried away writing code in the spirit of 'fail -> pass -> refactor.' On top of that, it never even occurred to me to try running my newly minted tests on the target hardware itself, as Grenning recommends. I haven't done that yet, and I'm not sure it's worth attempting on our little 16-bit MCU, but it's a great idea.

Trying to somehow capture the asynchronous calling of a function brought mocks into my TDD world. I was already using gtest, and luckily Google merged gtest and gmock into one repository a while back, so the makefile included with the repo can easily be configured to build both projects (although I believe gmock's headers pull in gtest's, so you don't have to include both sets in your tests/mocks). It wasn't long before I was mocking stuff all over the place and... honestly... not really needing to. I realized that I was mocking functions I had no intention of calling in my tests, and setting expectations for a wrapper to call the method it was wrapping. Neither of those adds much value.
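
As an example of a mock that does earn its keep here, a gmock sketch that checks an asynchronous callback actually gets registered. The timer interface and module_init are hypothetical stand-ins, and note that gmock/gmock.h pulls in the gtest headers for you.

```cpp
#include <gmock/gmock.h>  // also brings in gtest

using Callback = void (*)();

// Hypothetical timer service the module registers its callback with.
struct TimerService {
    virtual ~TimerService() = default;
    virtual void register_callback(Callback cb) = 0;
};

struct MockTimerService : TimerService {
    MOCK_METHOD(void, register_callback, (Callback cb), (override));
};

// Stand-in for the module: it registers its periodic work during init.
void module_init(TimerService& timer) {
    timer.register_callback([] { /* periodic work would go here */ });
}

TEST(ModuleAsync, RegistersItsTimerCallback) {
    MockTimerService timer;
    EXPECT_CALL(timer, register_callback(testing::_));
    module_init(timer);  // the expectation captures the registration
}
```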

I have since come to appreciate mocks for the flexibility they give you in establishing a particular state while also allowing you to capture the execution of depended-on code, or "DOC," in your tests (in full disclosure, I stole that term from Grenning). Beyond that, I don't see what else mocks are good for, but that's part of why I'm taking the in-class training course. I received an email today, and it turns out that Grenning is actually giving the training himself. Sweet. I should probably read the rest of that book.
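
A rough sketch of those two uses together, with a flash driver invented for this post: WillOnce(Return(...)) puts the fake dependency into a known state, and the EXPECT_CALL itself captures that the depended-on code was actually exercised.

```cpp
#include <gmock/gmock.h>

using ::testing::Return;

// Hypothetical depended-on code: a flash driver interface.
struct FlashDriver {
    virtual ~FlashDriver() = default;
    virtual bool is_busy() = 0;
    virtual void erase_sector(int sector) = 0;
};

struct MockFlash : FlashDriver {
    MOCK_METHOD(bool, is_busy, (), (override));
    MOCK_METHOD(void, erase_sector, (int sector), (override));
};

// Stand-in module logic: only erase when the flash is idle.
void maybe_erase(FlashDriver& flash, int sector) {
    if (!flash.is_busy()) flash.erase_sector(sector);
}

TEST(FlashModule, ErasesWhenIdle) {
    MockFlash flash;
    EXPECT_CALL(flash, is_busy()).WillOnce(Return(false));  // establish state
    EXPECT_CALL(flash, erase_sector(3));                    // capture the DOC call
    maybe_erase(flash, 3);
}
```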