Five Questions With Scott Louvau
Scott Louvau writes software for Microsoft. He doesn't write any software you have ever used, though, unless you have ever worked on his team. The software Scott writes, you see, is the software his team uses to test their product. To paraphrase the commercial: Scott doesn't test the software you use; he makes the testing of the software you use better.
Scott tells me that if he were not in software he would be some sort of craftsman because "I really like the idea of doing a job you can get better at every day". He recounts a Discovery Channel special on how aircraft carriers are built, a process that involves welding together giant panels of steel. The host watched as a welder took a panel that had buckled from the heat of welding and completely flattened it with a single hit of his hammer. When the host asked the welder how he knew where and how to hit the panel, the welder replied "Twenty-eight years on the job".
Although Scott has only six years on the job so far, I am sure his teammates would agree that he knows where and how to hit their buckled panels in order to flatten them out. Here is what Scott has to say:
DDJ: What was your first introduction to testing? What did that leave you thinking about the act and/or concept of testing?
SL: Actually, my introduction to testing was an introduction to not testing. I wrote my first database application for my first real job, and excitedly told my boss it was ready to be released. Within two minutes he'd found a bug, and within an hour he'd found one bad enough that the application wasn't usable by customers.
I learned that if you haven't tested it, it doesn't work. If you wonder if something works and try it and it does, the odds are very good that it works because someone earlier wondered the same thing.
DDJ: What has most surprised you as you have learned about testing/in your experiences with testing?
SL: I guess it's that testing is so easy, and yet so difficult. It's very easy to break something, and easy to think of the basic things a system should do and that you should try. It's not so easy to quickly find bad bugs in a relatively good product and explain why they are important. It's incredibly hard to choose the right tests from all of the possibilities and get them running more cheaply than the code they test.
DDJ: What is the most interesting bug you have seen?
SL: There are so many possible categories.
The worst bug I found caused Visual Studio to crash and delete as much of itself as it could.
The most entertaining bug I found had the offending developer's name burned into it: his My Documents folder became the default project location for everyone.
The subtlest bug I found was one where the arrow keys didn't work in one particular combobox in one dialog for some still unknown reason.
I can't sit down and look for bugs for more than half an hour without smiling. It's amazing how many little things there are to go wrong.
DDJ: How would you describe your testing philosophy?
SL: I guess I would say "to be pragmatic". Ultimately my goal is to get the most useful possible product out. I start with the things my product must be able to do and work my way into the soft spots I find as I go. I try to blend automating to prove the product works every day with exploring to find the areas that need more polish to really be useful to people. It's an interesting balance.
DDJ: What do you think is the most important thing for a tester to know? To do? For developers to know and do about testing?
SL: I guess it's to remember that we all have the same overall goal - to produce the right product for our customers. I found the process of triage (deciding which bugs to fix and which not to) very frustrating until I really internalized the idea that we're building the same thing.
Remember that many users will have to use your product when you release it, and will cope with the bugs you've created for them every day. Remember that they are also dealing with not having your product every day it isn't there.
DDJ: Is there something which is typically emphasized as important regarding testing that you think can be ignored, is unimportant?
SL: Metrics. They can be useful, but are so often abused to try to force things to be "done". Ultimately it's up to the people working on a project to really decide when it's done, and the metrics should only be potential tools to guide their efforts. If you would not feel good signing your name on the box of the product you've built, don't release it yet.
DDJ: What do you see as the biggest challenge for testers/the test discipline for the next five years?
SL: I think our hardest challenge will be to cope with the multi-core/many-core transition that's happening. Across computing I don't think we're great at writing massively multi-threaded code, testing it effectively, and reproducing and fixing defects. We will need to be.
DDJ: Going meta (to channel Jerry Weinberg), what else should I ask you? What would you answer?
SL: Three questions:
What do you consider the key(s) to your success?
I think two things have served me very well. First, I don't like to just use something that works. I want to know how it works and why it works. Second, I try to do something better every day - automate something I've done manually, figure out a little code design problem - something. It adds up. =)
What is the most important thing you've learned as a tester?
The same thing I said for "What do you think is the most important thing for a tester to know" - we're all working toward the same goal.
Why do you like being a tester?
I think that software testing is much more of a "frontier" than development. We collectively know much more about how to write services and applications and utilities than about how to write software tests. I think testing is also much more open ended - anything you can think of to decide whether the code is good is fair game. I guess I'm also just fascinated to see my programs using other programs like a user would.
[See my Table Of Contents post for more details about this interview series.]