Hi, welcome to the last chapter
in the introduction to our test automation essentials.
We're gonna wrap up some basics here.
We're gonna sum up some tips
for getting your whole team involved,
setting short term achievable goals,
creating tests with the value you need,
and keeping up your motivation.
Remember my story about
dividing the manual regression checklist
among everybody in the team, not only the testers?
That kind of shared effort gives the whole team motivation to think about testability.
How can we make automation easy?
Draw a test automation pyramid or other model
on the whiteboard.
Start talking about
what levels you can automate tests at now,
and where you would like to move towards.
Learn about your product together.
Make sure people in different roles share the same vision
of how each feature will behave.
You can also work across roles on the team
to build the infrastructure your team needs
for continuous integration:
executing automated tests
and deploying build artifacts for your testing.
Having developers and architects involved
means they can find ways to make the production code
more suited for test automation,
especially at the unit and API levels,
those lower levels behind the user interface.
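One common way developers make production code more automation friendly at those lower levels is to keep business logic in small, pure functions that tests can call directly, without going through the user interface or a database. Here is a minimal sketch of that idea; the function and field names are hypothetical examples, not from the course:

```python
# Hypothetical example: a pure price-calculation function is easy to test
# at the unit level, because it takes plain data in and returns a value,
# with no UI, database, or network dependency hidden inside.

def total_price(items, tax_rate):
    """Sum item prices times quantities, then apply a tax rate."""
    subtotal = sum(item["price"] * item["qty"] for item in items)
    return round(subtotal * (1 + tax_rate), 2)

# A unit test can exercise it directly, below the user interface:
cart = [{"price": 10.0, "qty": 2}, {"price": 5.0, "qty": 1}]
print(total_price(cart, 0.10))  # → 27.5
```

Code structured this way can be covered quickly at the unit level, leaving only a thin layer to check through the UI.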
Make your goals attainable
and find ways that you can measure progress
toward each one
so that you can use your retrospectives
to change your approach as needed.
Address the most risky areas,
or perhaps address parts of the application
where it's easy to automate tests right now
and see some benefits.
For example, if you have legacy code
with no automated tests so far,
you may decide to refactor that legacy code
to make it easier to automate tests,
adding those tests as you do the refactoring.
Or you may decide to build all your new features
going into the future in a brand new architecture
that is layered to be more automation friendly
at different levels.
Decide how you want your tests to look,
for the benefit of everyone on your team
who needs to work with those tests.
Choose tools that work for the whole team.
You have valuable automated tests
when you see a test fail
and you know, perhaps just from the name of the test,
exactly what has broken,
exactly where in the production code a change was made,
so you can fix it fast.
Using good coding practices
combined with well specified tests for good coverage
means a valuable automated test suite.
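A descriptive test name can carry that diagnostic value on its own. Here is a hedged illustration with a hypothetical feature and made-up names, showing a test whose name alone tells you which behavior broke:

```python
# Hypothetical production code under test: a discount that is clamped
# to the 0-100 percent range so a price can never go negative.

def apply_discount(price, percent):
    percent = max(0, min(100, percent))
    return price * (1 - percent / 100)

def test_apply_discount_clamps_percent_above_100_to_free():
    # If this fails, the name says exactly what broke: discounts
    # over 100% should make the item free, never negative.
    assert apply_discount(50.0, 150) == 0.0

test_apply_discount_clamps_percent_above_100_to_free()
print("ok")
```

When a test like this fails in your build, the name points you straight at the clamping logic in `apply_discount`, so you can fix it fast.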
Remember, you can do all the exploratory testing you need
to feel confident in a new feature.
And, you can even use automation
for some of that exploratory testing.
But only keep the minimum automated regression tests
that give you adequate coverage,
because there's always a cost to maintaining them.
Having tests running in your continuous integration
means fast feedback, continually.
You may not see the benefits overnight,
but, as you automate more tests,
you will save effort that will let you
move into more valuable testing activities.
Maybe automating even more tests,
maybe doing exploratory testing, accessibility testing,
whatever your team needs.
Remember that failure is just learning,
so take a bow and move on.
Each of us has a unique context.
Our team is different, our company is different,
our business domain is different,
the risks we face are different.
So, in this introduction, I've explained a lot of
practices and principles
that have worked in many contexts.
But be aware that you have to see for yourselves
what works for you.
So experiment and adapt as you go.
In the intermediate section of the course,
we'll look at more ways
to get fast, reliable feedback
from your automated tests.