In this podcast I discuss test automation biases and how to avoid them.
This was originally released to Patreon supporters, with a full transcript, a week early.
00:00 What is an Automation Bias?
These are the kinds of statements that you see online. People throw them up on social media, presented as “This thing is Good”, “This thing is Bad!”.
00:29 Treat all opinions as biased
Fundamentally, anything that you read or hear about automating, treat it as though it’s biased, because it is! Someone out there is saying something because they feel strongly about it; they’re biased towards it or away from it.
It’s really important to try and focus on the practical presentation of it rather than the theories and opinions, and to treat theories and opinions with suspicion unless they are backed up with some sort of experience.
01:50 Automating a GUI is Slow and Flaky
Certainly I have seen a lot of people create automated execution through the GUI that is slow and flaky.
I have also seen automated execution through the GUI that is fast because it relies on data in the database. It finds data in the database rather than creating it all the time. It jumps into the middle of processes because it has a good structure: it can go to particular URLs and do things in tight sequence. It’s not flaky because it synchronizes well. Certainly I have created automated execution that is not flaky. It synchronizes very well and I don’t have to amend it very much.
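That difference comes down to a few concrete tactics. As a minimal sketch of two of them, assuming Selenium WebDriver 4 and using a hypothetical URL and element id, jumping straight into the middle of a process and synchronizing on an explicit condition looks something like this:

```java
import java.time.Duration;

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;

// Minimal sketch: jump straight to the page under test via its URL, then
// synchronize on an explicit condition rather than sleeping.
// The URL and element id are hypothetical.
public class FastGuiCheck {
    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver();
        try {
            // Jump into the middle of the process instead of clicking through from the start.
            driver.get("https://app.example.com/orders/12345/edit");

            // Synchronize on the element we need - this is what keeps the check fast and not flaky.
            new WebDriverWait(driver, Duration.ofSeconds(10))
                    .until(ExpectedConditions.visibilityOfElementLocated(By.id("order-status")));

            System.out.println(driver.findElement(By.id("order-status")).getText());
        } finally {
            driver.quit();
        }
    }
}
```

The explicit wait is what removes both the flakiness and the arbitrary sleeps that make GUI execution slow.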
03:36 Automate through the API
Which is great, if you have an API. If you don’t have an API, the option isn’t open to you, so what are you going to do? Are you going to go to the dev team and say, “We need an API so that we can automate through the API and not the GUI”?
No, you work with what you’ve got, you test with what you’ve got, and you automate within the time and the constraints that you are faced with. I’ve seen applications that have an API and a GUI where the API is an entirely different path through the application than the GUI.
Technical knowledge and understanding are required to make a decision around this.
I do automate through the API. I do automate through the GUI. It depends what’s most appropriate and will get me the results that I’m looking for.
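As a minimal sketch of the API side, assuming REST Assured and JUnit 5 are on the classpath and using a hypothetical endpoint:

```java
import static io.restassured.RestAssured.given;
import static org.hamcrest.Matchers.equalTo;

import org.junit.jupiter.api.Test;

// Minimal sketch of automating through the API, assuming REST Assured and JUnit 5
// are available. The base URI, endpoint, and fields are hypothetical.
public class OrderApiTest {

    @Test
    public void createdOrderCanBeReadBack() {
        // Create an order through the API...
        String orderId =
            given()
                .baseUri("https://api.example.com")
                .contentType("application/json")
                .body("{\"item\":\"widget\",\"quantity\":2}")
            .when()
                .post("/orders")
            .then()
                .statusCode(201)
                .extract().jsonPath().getString("id");

        // ...then read it back, without touching the GUI at all.
        given()
            .baseUri("https://api.example.com")
        .when()
            .get("/orders/" + orderId)
        .then()
            .statusCode(200)
            .body("quantity", equalTo(2));
    }
}
```

Whether this or a GUI flow is the more appropriate level is, as above, a decision that needs technical knowledge of the application.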
05:23 Code Free Automating is Bad
I was particularly biased against code free tools until I used some of them and then saw that they can be very capable at what they’re doing. They require a degree of skill in order to create abstractions, depending on the code free tool.
If you do not have any coding skills on your team, then you are going to be thrown towards code free tools.
06:17 Tool X is Better than Y
Well, it might be. It depends on your context. It depends what application you’re testing. It depends what your skill sets are.
There is no one best tool that rules them all. If you think in that way, you are going to limit your testing.
07:04 Postman is Better than Insomnia
Well, they do different things. Why would it be better?
Postman is trying to implement more and more features to span the coverage of what people want to do on projects within API testing. One of those is more code-based manipulation of the requests, and Postman supports that activity within the Postman GUI itself. On some projects I might view that as a risk, because it is in the Postman GUI and limited by whatever libraries we can import into Postman.
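If that limitation matters on a project, the same kind of manipulation can live in plain code instead. As a minimal sketch, assuming Java 17’s built-in HttpClient and using a hypothetical endpoint and header name, computing a value and attaching it to a request before sending looks like this:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.HexFormat;

// Hypothetical example: the kind of per-request manipulation that might otherwise
// sit in a Postman pre-request script, done in plain Java so any library can be used.
public class SignedRequestSketch {
    public static void main(String[] args) throws Exception {
        String body = "{\"item\":\"widget\",\"quantity\":2}";

        // Compute a checksum of the body to send as a header (hypothetical requirement).
        String checksum = HexFormat.of().formatHex(
                MessageDigest.getInstance("SHA-256").digest(body.getBytes(StandardCharsets.UTF_8)));

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://api.example.com/orders")) // hypothetical endpoint
                .header("Content-Type", "application/json")
                .header("X-Body-Sha256", checksum)                 // hypothetical header name
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response =
                HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString());

        System.out.println(response.statusCode() + " " + response.body());
    }
}
```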
I quite like using Insomnia because it doesn’t do all the things that Postman does.
Postman is a good tool. It’s just very big for some applications.
08:54 Python is Better than Java
My bias is towards whatever language is used to develop the application: you use that to automate the application.
Some languages have great library support for some functionality and some don’t.
So you have to consider what you will have to write and what you will have to add on. Do you have the skills for doing that or not?
10:24 Seeing Through Biases
One of the easiest ways to overcome biases, or to tackle them, is to know a couple of things, right? What are you trying to achieve? What outcome do you want? What is your skill level? What is your experience? What experience can you draw upon in the environment? How much time and money are you going to put into this? There are more questions to ask yourself.
12:52 Try it, then decide
Try them all and keep revisiting your decision.
Ultimately whatever people out there say, as soon as you start implementing it, as soon as you start automating with it, you and your team are responsible for what happens next.
13:17 Page Objects vs Screenplay
On some projects we’ve run parallel experiments to try different approaches. I remember we were having discussions about page objects versus the screenplay pattern, so I did an implementation with page objects.
Someone else did an implementation with the screenplay pattern, and we compared the approaches to see the outcome.
I have seen both work. I have seen both fail. Also bear in mind that you can implement the screenplay pattern and the screenplay pattern can be an abstraction on top of a bunch of page objects. The screenplay pattern is a user-focused abstraction built around the tasks that the user performs. Page objects is a very structural, implementation-based pattern, which might be appropriate for certain levels of testing.
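As a minimal sketch of the difference, with hypothetical class names and none of the machinery a real screenplay implementation such as Serenity provides:

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;

// Hypothetical sketch of the two patterns side by side.

// Page object: structural and implementation-based - one class per page,
// exposing what can be done on that page.
class LoginPage {
    private final WebDriver driver;

    LoginPage(WebDriver driver) {
        this.driver = driver;
    }

    void loginAs(String user, String password) {
        driver.findElement(By.id("username")).sendKeys(user);
        driver.findElement(By.id("password")).sendKeys(password);
        driver.findElement(By.id("submit")).click();
    }
}

// Screenplay: user-focused tasks performed by an actor.
interface Task {
    void performAs(Actor actor);
}

class Actor {
    final WebDriver driver;

    Actor(WebDriver driver) {
        this.driver = driver;
    }

    void attemptsTo(Task... tasks) {
        for (Task task : tasks) {
            task.performAs(this);
        }
    }
}

class Login implements Task {
    private final String user;
    private final String password;

    private Login(String user, String password) {
        this.user = user;
        this.password = password;
    }

    static Login withCredentials(String user, String password) {
        return new Login(user, password);
    }

    @Override
    public void performAs(Actor actor) {
        // The task is the user-focused layer; here it delegates to a page object underneath.
        new LoginPage(actor.driver).loginAs(user, password);
    }
}

// Usage: new Actor(driver).attemptsTo(Login.withCredentials("bob", "secret"));
```

The task describes what the user is doing; the page object describes how the page is driven, and the task can delegate to it.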
14:52 Take Responsibility
If things go pear-shaped later on, you can’t turn around and go, “Well, that person wrote a blog post saying we should do this.” That’s not going to cut it at that point. You own this, you made the decision, so you had better know what you’re doing.
16:11 External Experience
Unfortunately, the best way to start with automating is to have someone on your team who has done it before. There’s a danger that they come loaded with their biases, and they may build what was good for what they did before, based on their experience, or good for them, but which may not be appropriate long-term for your environment, for your project, for the way that you work, or for your budget constraints.
17:02 Start Small, Make Progress
Start small. Get something that works. Build on it.
17:50 Do not ignore issues
This is a key thing with automating. If you are starting to automate and you go, “Well, that bit is quite hard to automate, let’s do it later”, you may find that later doesn’t happen. Maybe the tool set that you’ve chosen can’t automate it at all. Fix those issues now, before they become an issue. If your tests are running flaky, figure out why and fix it, because that will bite you later. It’s almost guaranteed.
20:34 Be Real rather than Believe
I’ve seen companies that end up with really limiting frameworks and approaches because someone covered up when things went wrong and someone was able to bluff and bluster their way through it.
21:18 Keep Options Open
At some point, if you’re automating and you’re starting to see benefit from this, you’ll probably be committing strategically to a particular solution. Don’t do that until you have explored other options.
One of the best times to explore other options is just at that point where you are about to commit to something, because by now you’ve developed a lot of experience.
22:48 Be Aware of your biases
We are all biased. I am biased. I am biased based on my experience and what I’ve seen work.
I try not to force my biases on other people, because they may not have the same experience or set of skills in their environment to implement my biases. I try to explain how I am biased, why I am biased, and why I choose to do things a certain way. But I am aware that there are alternatives, and that those alternatives may be perfectly valid for other people.
23:19 It takes time
It takes time to gain experience. Give yourself, your project, your team, and your company the time to build that experience. Be realistic.