
Webinar Recap: Getting Test Automation Right!


Thank you to everyone who registered and attended our webinar on "Getting Test Automation Right!" With the growth of agile methodologies and the emphasis on rapid testing, test automation is a topic that quality assurance professionals constantly want to learn about. While manual testing can never be completely replaced, automated testing is something every QA professional needs to become familiar with. We were thrilled with the number of registrants and attendees for this webinar and are so happy to see that you find this content valuable!

We learned from the 2014-15 World Quality Report that 41% of companies lack the right tools and methods to implement automated testing effectively. There are still a lot of misconceptions regarding test automation, and we wanted to clear those up and lay out a practical path for evaluating whether automated testing is a good fit for your situation. This topic was especially popular: registration and attendance were the highest we've had for any webinar we have organized, with more than 2,100 registrants and 700 attendees for yesterday's session!


We started our discussion with a poll to see how much automated testing is taking place within your teams and organizations. We appreciate the honesty of your answers: an overwhelming 60% of respondents said that less than 20% of their test cases are automated! We weren't surprised to see this figure, but we hope the information and ideas we shared will help drive the share of automated test cases up into the 45-70% range.

We ran a second poll to gauge the biggest risks you see concerning automated testing, and your answers fell right in line with the statistics provided in the World Quality Report. The top answer was "Not enough resources (knowledge and labor)" at 47%, followed by "test breaking (maintenance nightmare)" at 27% and "not enough time (slowing down development)" at 21%. These answers came out exactly as we expected, and they really highlight how developers need to be involved in the automation process as early as possible.

We would like to thank John Sonmez from SimpleProgrammer, Joe Colantonio from TestTalks and Dave Haeffner! Their participation and expertise were invaluable in putting this webinar together!

Below you can find answers from Joe to the questions asked during the webinar:

  • Q: To what extent should developers be involved in automation? Some organizations put all the weight on QA. What is your comment on this?
  • A: I think in order to really be successful with test automation you need to make developers more responsible for some of the testing. If testers are the only ones responsible for creating an automation framework and writing tests, it’s going to slow down your development process. Distributing testing to developers makes a lot of sense from the perspective of helping to expedite your efforts. A good way to start getting developers involved is to make them responsible for the majority of your test automation framework creation.
  • Q: I am a manual tester, and my test automation team wants me to identify the test cases that can be automated. What do I need to look for in my test cases to recommend that? Thanks
  • A: The criteria I would use to identify the ones to start with are: the test cases need to be repeatable and easy to follow, and the expected results must be consistent and easily verified. Test cases should also be independent of one another and not have any dependencies. (A minimal sketch of a test case that fits these criteria follows this Q&A list.)
  • Q: Of the risks of automated testing you listed, how do you usually mitigate those risks?
  • A: The biggest one I can think of is the need for code reviews. You should be code reviewing all of your test automation artifacts just like you would any other development effort. Having your QA resources involved in your review process is extremely important to make sure that all the automated tests are testing the correct thing, that no important scenarios have been missed, and that the correct verification checks are in place.
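The criteria Joe describes can be illustrated with a short, hypothetical example. The sketch below (our illustration, not part of the webinar) shows a pytest/Selenium test that is repeatable, easy to follow, has a consistent and easily verified expected result, and does not depend on any other test; the URL, locators, and credentials are placeholders.

```python
# Hypothetical sketch of a test case that fits the criteria above:
# repeatable, easy to follow, consistent expected result, and no
# dependency on any other test. URL, locators, and credentials are
# placeholders, not details from the webinar.
import pytest
from selenium import webdriver
from selenium.webdriver.common.by import By


@pytest.fixture
def browser():
    # Each test gets its own browser session, which keeps tests independent.
    driver = webdriver.Chrome()
    yield driver
    driver.quit()


def test_valid_login_shows_dashboard(browser):
    browser.get("https://example.com/login")  # placeholder URL
    browser.find_element(By.ID, "username").send_keys("demo_user")
    browser.find_element(By.ID, "password").send_keys("demo_pass")
    browser.find_element(By.ID, "login").click()
    # The expected result is deterministic and easy to verify.
    assert "Dashboard" in browser.title
```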

Below you can find answers from Dave to the questions asked during the webinar:

  • Q: Can you give an example of a one-off test that should not be automated?
  • A: A helpful example would depend on your context. A better answer is to think in terms of value. If you've identified something that you think should be automated (e.g., refer to my four questions to ask post) and have the time and skills to automate it, then it's probably OK. But if the thing you're automating will only be used once (or seldom), then it might not be worth it (unless automating it will obviously save you time or reduce likely errors compared to doing it manually). Also, if the thing you're trying to automate is difficult to automate, and it's simpler for a human to check it, then that's probably not a good investment of automation effort.
  • Q: Many say that it is advised to execute all the tests manually at least once and then go with automation for regression. Do you agree or disagree? 
  • A: Sadly, it's not as clear cut as that. I would challenge your definition of "all tests". Just because there are tests does not mean that they are valuable/necessary. I would make sure that they are actually exercising portions of the application that are relevant to the business, are what users use, and are potentially things that have broken before (see my post on this here). Once that's determined, through the process of automating them, they would effectively be stepped through manually. That's because in order to automate them you need to find markup on the page to interact with/check and figure out the timing of your application so you can make your tests work well with it.
  • Q: Would it be reasonable to automate an app that is under development and is not stable enough? 
  • A: Not typically. A better approach is to document the behavior of the application from the user's perspective (e.g., gherkin/BDD -- "Given, When, Then") without talking about implementation details (e.g., see imperative vs. declarative arguments to see examples and trade-offs). That way, at a high level, everyone understands what the application is supposed to do (from a behavior perspective, not a pedantic UI implementation perspective), and can back automation into these gherkin specifications as the features are being built. But with great power comes great responsibility (see Liz Keogh's prolific post on this for more).
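To make the imperative vs. declarative distinction a little more concrete, here is a hedged sketch (our illustration, not Dave's code). The test body reads like the Given/When/Then behavior, while the UI details would live behind page objects; every class and method named here is hypothetical.

```python
# Hypothetical sketch of a declarative, behavior-level test. LoginPage and
# DashboardPage are made-up stand-ins for page objects that would hide the
# UI implementation details (locators, waits) in a real suite.

class DashboardPage:
    def is_displayed(self) -> bool:
        return True  # placeholder check; a real page object would inspect the UI


class LoginPage:
    def sign_in(self, username: str, password: str) -> DashboardPage:
        # Real UI interaction (find fields, type, click) would happen here.
        return DashboardPage()


def test_registered_user_can_log_in():
    # Given a registered user on the login page
    login = LoginPage()
    # When they sign in with valid credentials
    dashboard = login.sign_in("demo_user", "demo_pass")
    # Then they land on their dashboard (behavior, not UI mechanics)
    assert dashboard.is_displayed()
```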

Below you can find answers from John to the questions asked during the webinar:

  • Q: On what factors does automating a project mainly depend?
  • A: The biggest factor, in my opinion, is having a good automation framework to make sure that the tests are not fragile and are easy to read and create. Without this, I haven't ever seen an automation effort succeed. Beyond that, I would say that having developer support is critically important: you need a development team that is helping to create and maintain the framework. Secondary to these things is the application itself. Today, we can automate just about any application. It's easier if the application developers build the application with testability in mind, but it is not critical. I'd also say that not having a continuous build that runs tests automatically will almost always result in the automation tests being ignored and eventually being discontinued.
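As a rough idea of what a framework that keeps tests from being fragile can look like in practice (our example, not something from the answer above), here is a small Selenium helper that centralizes explicit waits so individual tests never hard-code timing; the class and method names are hypothetical.

```python
# Hypothetical framework-layer helper: tests call click()/type_text() here
# instead of talking to WebDriver directly, so waiting and locator handling
# live in one place and a UI timing change does not break every test.
from selenium.webdriver.remote.webdriver import WebDriver
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC


class BasePage:
    def __init__(self, driver: WebDriver, timeout: int = 10):
        self.driver = driver
        self.wait = WebDriverWait(driver, timeout)

    def click(self, locator):
        # Wait until the element is clickable before interacting with it.
        self.wait.until(EC.element_to_be_clickable(locator)).click()

    def type_text(self, locator, text):
        field = self.wait.until(EC.visibility_of_element_located(locator))
        field.clear()
        field.send_keys(text)
```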


  • Q: Should automation start after Page/UI design or wait until back end logic is done?
  • A: Automation should start at the Page/UI design level. It takes a long time to build most automated tests, so you want to begin as early as possible. In fact, I like to use the automation tests to drive the creation of the backend logic for an application. If you start creating the automation tests first, you can run them as the backend code is being written to be sure that it is correct.
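A hedged sketch of that test-first flow (our illustration, not part of the answer): the automated check below describes the expected backend behavior before the endpoint exists, fails until it is implemented, and then serves as a running definition of correct. The URL and response shape are placeholders.

```python
# Hypothetical example of writing the automated check before the backend
# exists: it fails until the endpoint is implemented, then confirms the
# behavior as the backend code is written. URL and JSON shape are placeholders.
import requests


def test_create_order_returns_new_order_id():
    payload = {"sku": "ABC-123", "quantity": 2}
    response = requests.post("http://localhost:8000/orders", json=payload)
    # These assertions describe the backend behavior we expect to be built.
    assert response.status_code == 201
    assert "order_id" in response.json()
```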


  • Q: What percentage of budget should be allocated to the automation when the budgeting is done at the beginning of the QA project?
  • A: I would allocate about 75% of the budget to the automation efforts, especially if you are first starting out. I know this seems like a ridiculously large amount, but believe me, I'm not crazy. The thing is that most of your effort should end up being spent on automation, because eventually 90% or more of your tests will be executed by automation if your automation efforts are successful. You should also see the overall investment decrease over time: once a stable framework is in place, you are able to produce new automated tests at a lower and lower total cost.

Be sure to keep checking back as we have these thought leadership webinars every month!