This is a guest post by Peter G Walen.
In June of 2013 I was sipping a cup of coffee and scanning Twitter before launching into my day. I often flip through Twitter posts each morning, except in election years, looking for interesting things friends and colleagues may have posted about technology, particularly software development.
This specific morning I saw a tweet from Jim Holmes announcing that his new book was available on LeanPub. Sadly, it is no longer available. Jim’s message was clear and concise: “There is one best practice. Think.”
With the growth of tools to aid testing and improve information flow since 2013, I thought it might be of value to reconsider Jim’s message.
So, Testing?
I’ve worked in organizations following paths very much like this. I can recall four very carefully documented work models and approaches. I still have copies of the documentation explaining each work model and the great benefits promised by that “new” model.
Practices and Tools
The demand for standards and standardization has continued to grow. Not exactly a “surge,” at least in the US, but the presence is notable. Management tools and controls seem to be everywhere and growing in number. People continue to search for the magic metric bullet that will solve all the problems.
Practices are being discovered or rediscovered. Some get pushed forward because they are “new,” while others that work well are tossed aside as “impractical.”
The quest for a technology-driven tool to fix the problems of making, testing, and delivering software continues. While “templates” for directing testing are dropping in popularity, they are being replaced by other ideas.
Test “automation” tools promoted as “codeless” are growing in number. Often, these are aimed at “non-technical” people as a way for them to “start” in test automation. This might be a reasonable approach if there were proper teaching and learning of good testing practices.
The challenge for many organizations stems from focusing on the tool and less on the reason for using it. Whether the tool requires some level of coding on the part of testers or automation testers, or whether it is one of the “codeless” test tools, there tends to be a bias in most organizations toward experience in using the tool or writing code in the needed language.
There tends to be little attention given toward an issue Jim described in his book. I have seen the same issue time and again in practice in the years since then. People focus on applying the tool. Companies focus on experience with the tool, language or “framework.” This leads to other, deeper issues.
It matters very little if the “tests” are “automated” or “manual.”
All “tests” must deliver value to the development team and entire project team. If they do not, they are wasting time and giving a false sense of the status of the project.
Not About Smarter Tools
Each test management tool can be used wisely or poorly. Each test automation tool can be misapplied, and often is, especially when it is the organization’s “preferred” tool. When tools are used well, in their intended way, they can be extremely powerful and empower development teams to work better and faster.
When tools are used poorly in ways never intended, they will impact the team’s efficiency and slow delivery down.
It is important to notice something. The salespeople for the tools are likely going to promise whatever they believe they need to say to “make the sale.” They may say “of COURSE it can do {X},” but do your own research.
It may be that the tool you are looking at can indeed do {X}. It is also possible the tool can sort of do {X}, if you squint and look from just the right angle. See what other people using the tool are experiencing. Not the managers who decided to get it, but the people tasked with using it.
Look at the technical information. Was it designed with {X} as a key feature? Are “modifications” needed to make it work? Are additional licenses required to “enable” feature {X}?
Does it really do what you need to do? Or does it sort of do what you need to do?
You might look for reviews and comments not put out by the vendor. Look for how people perceive the tool. Look for how successful they were using the tool to do the things you need it to do.
In short, test the assertions made by the vendor against the experience of people who use the software. Be as certain as you can be to get the right tool or tools for what your organization needs to be successful.
A smarter tool will not help your organization if it becomes shelf-ware. If it does not get used for its intended purpose, no tool is smart.
Understand the Needs
I suspect part of the reason for this is that there is little to no understanding of what work needs to be done. Directives like “automate everything” or “track everything” are simple, non-answer soundbites. They sound like direction but give no real meaning or context.
It might be in the organization’s interest to consider what the needs are before defining the tools to use. I understand that many organizations have tools in place. Replacing them could be an expensive proposition.
Might that cost be less than the cost of forcing people to work in ways that impair the quality of their work?
This should be revisited reasonably regularly. The needs of the teams can shift, just as the needs of the organization can shift. The tools used need to shift with the changing needs.
These needs can range from running complex regression scenarios, to API calls, to UI checks, to a variety of function and integration tests. They might be unit tests or tests for a CI/CD environment.
The tools selected need to fill the needs of the teams doing the development and deployment work. The challenge is to understand how different teams can have very different tooling needs.
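As a concrete, deliberately tiny illustration of two of the test kinds mentioned above, the sketch below contrasts a unit-level check with an integration-style check. All names here (`apply_discount`, `CartService`) are hypothetical examples, not anything from a particular tool or team:

```python
# Minimal sketch: a unit test vs. an integration-style test.
# The function and class under test are hypothetical examples.

def apply_discount(price: float, percent: float) -> float:
    """Hypothetical function under test: apply a percentage discount."""
    return round(price * (1 - percent / 100), 2)


class CartService:
    """Hypothetical component used for an integration-style check."""

    def __init__(self) -> None:
        self.items: list[float] = []

    def add(self, price: float) -> None:
        self.items.append(price)

    def total(self, discount_percent: float = 0) -> float:
        # Delegates to apply_discount, so this exercises two pieces together.
        return apply_discount(sum(self.items), discount_percent)


# Unit test: one function, isolated, fast.
assert apply_discount(100.0, 15) == 85.0

# Integration-style test: two components working together.
cart = CartService()
cart.add(40.0)
cart.add(60.0)
assert cart.total(discount_percent=10) == 90.0
```

The point is not the assertions themselves but that each kind of check answers a different question; a tool chosen for one kind may be a poor fit for the other.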
Learning
Once a tool or suite of tools is installed for teams to use, the teams will need training if the tools are new to the organization. Part of this is helping them become familiar with the tool.
It is likely that the more common tools will need little or no training. However, it might be worth considering training people on the recommended use. The problem I have seen at many shops is that people had experience with a given tool, but each had used it differently. There was no clear, shared understanding among the teams of how to get the most out of the tool.
Part of the training should include having teams work with the tools. Letting them gain familiarity while exploring features and functions will help teams experience early, small success.
Small wins often help the acceptance of new tools and practices, moving teams from grudging acceptance to recognizing the potential for improvement. This individual and team-based learning can make all the difference in gaining ready acceptance of a new tool suite, or any practice.
Whichever software tool is selected, there is one tool which must be employed diligently.
The Most Important Tool
There is one tool available that can help maximize any other tool selected for use. This tool can help people identify the practices they need to follow to build the best software they can build.
This same tool has been hinted at throughout this piece. It is what drives success. It is central to making decisions.
We must use our brains to think. Our minds help drive the creative spark which helps find solutions to problems and answer questions which need to be answered.
The most important tool, which often gets forgotten, is the human capacity to think in abstract terms. If people are told to blindly follow scripts, then that is what will happen. When they are allowed to think and explore, amazing things can result.
Peter G. Walen has over 25 years of experience in software development, testing, and agile practices. He works hard to help teams understand how their software works and interacts with other software and the people using it. He is a member of the Agile Alliance, the Scrum Alliance, and the American Society for Quality (ASQ), an active participant in software meetups, and a frequent conference speaker.