Hi! Welcome to Chapter 2.
A lot of people run into the same or similar problems when they try to automate regression tests.
We're going to look at some of the common roadblocks that individuals and teams face, and start thinking about how we might overcome them.
We'll talk about what you can expect as you begin a new automation effort.
Stop and think about this and talk to your colleagues about it.
Let's face it: test automation is hard.
If it were easy, we wouldn't be here.
What has held you and your team back up to now? Maybe you've tried to automate tests before and never got traction on it. Or maybe you automated a lot of UI tests, then did a major redesign of your product's UI, and all those tests needed to be changed, and it just felt like too much effort.
So pause the video for a few moments and think: "What's the one biggest thing in my way for test automation?" Then we can think about how to overcome it.
I'm guessing you may have thought about some of this.
Yes, even after decades of test automation, when I start a new automation project I do feel a little fear; that's normal.
There are a lot of unknowns, and we don't know what problems we're going to run into.
But we can be confident: we can learn the skills to overcome them.
And yes, it takes time and money to learn the skill, even if you're using an open source tool that you didn't have to pay money for, you're still paying money to learn it, you're sitll paying to learn how to do good practices for maintainable tests for your testable code, your production code needs to be testable, so a lot of components to that.
And yes, we do need to get new skills and tools, and we need to think about what happens if we're automating UI tests and that UI changes. With modern UI tools, we can overcome these things.
And it is an investment, but it is going to pay off over time if we take the time to do it well and build a good, solid basis.
This diagram is something from Brian Marick, from years ago, and he called it the Hump of Pain.
And this particular illustration is from Gerard Meszaros' wonderful book, xUnit Test Patterns.
So when we start doing test automation, whether it's test-driven development or automation at levels other than the unit level, it's all going to take us more time. We're learning new things, we don't already have a library of reusable components to draw on, and it's more work on top of what we were already doing; we're probably still doing manual testing at the same time. Especially if we're working on legacy code that has no automation right now, and is perhaps designed poorly and not very testable, it can seem impossible.
But, I promise you there are ways to do it.
Still, you may run into some team members who say, "You know, it's too much work. Let's just manually test it once and never test it again."
Well, if you're never going to change that code again, that may be a viable approach; you have to think about it.
But you probably are going to change it, and you probably are going to benefit from some automation.
But notice: once we acquire the skills, get good with the tools we're using, build up a library of reusable components, and learn to make our code more testable, it gets easier and easier to automate.
Now it takes us less time to deliver the same code to production, supported by automated tests, and we have extra time for other testing activities. We can use it to study production usage and learn what customers are doing and what's valuable to them, or spend time up front in design efforts with customers. There are lots of other things we can do with that time, so it's really going to pay off.
As I say, automation does require a significant investment in most contexts. We talked in the first chapter about setting goals, so set a small, realistic, measurable goal around test automation to start with.
Maybe your first step is going to be to put together a test strategy for test automation, and we're going to talk about that in future chapters.
Maybe your team gives itself a couple of weeks to come up with a basic strategy, and then you have some kind of measurement to know "yes, we think we have a strategy we can build on and keep going," or "oh, we don't like our strategy, we need to redo it." That's okay, because we want to invest in the right things so we can go faster later on.
So, decide what you want to do, how you'll measure your progress, and figure out if you got your desired outcome.
Nobody can afford to automate everything all at once; it just cannot happen. So give yourself time to figure out a good approach for your team.
Maybe there are some parts of your application that are notoriously buggy, and you're getting bugs reported in production all the time. Maybe they're not that bad, but they're annoying, and you're spending a lot of time fixing them.
Maybe that's where you want to put a safety net of automated regression tests first, so that further changes don't break those areas.
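To make the "safety net" idea concrete, here's a minimal sketch (my own illustration, not from the course): a regression test can be as small as a single automated check pinned to a bug that already escaped to production. `apply_discount` is a hypothetical function standing in for your buggy code path.

```python
# Minimal sketch of a regression safety net: one small, fast test
# pinned to a past production bug, so the fix can never silently regress.
# `apply_discount` is a hypothetical stand-in for the buggy code path.

def apply_discount(price: float, percent: float) -> float:
    """Return the price after applying a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

def test_full_discount_is_free():
    # Regression check: imagine a 100% discount once produced a bad price.
    assert apply_discount(50.0, 100) == 0.0

def test_typical_discount():
    assert apply_discount(80.0, 25) == 60.0
```

A handful of checks like these around the buggiest area is often enough to start; the point is protecting known-fragile behavior, not total coverage.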
Or maybe there are some parts of your product where your business stakeholders say, "These absolutely are critical and have to work; our customers can't live without them, even for a few minutes." That's where we want to put our first test automation efforts: in those critical areas.
Wherever you can add the most value at first, start with that.
And at the same time, as I said, make a small goal and do a small increment. Maybe just automate a small number of tests at first, and you're going to see some benefits immediately.
Don't rush the process.
As you get going and learn more, you will be able to go faster.
And we'll talk about this more in later chapters.
So I can give you a really good tip on starting a new test automation effort; I've done this now with, I think, four different teams.
Get with your business stakeholders and find out what's really important and has to work in your product. Write down your manual release regression checklist, and every time you need to release to production (maybe you're on a team releasing every two weeks), split those checklists and tests up among everyone on your delivery team.
Product owner, DBA, developers, analysts, everybody.
And say, "Okay, we're going to spend a few hours now doing this release regression testing so we can feel good about releasing."
It's highly motivating; people see how tedious it is to have to do these tests over and over.
And it's really hard to make sure that you follow all the instructions, and if you see a problem, it can be hard to reproduce it.
So they're going to start thinking, "How can we design our code to be testable? How can we start doing some test automation so that we don't have to do this really boring, tedious work, and free up our time for the more important work where we really feel we can add value?" That's one way to share the pain, and sharing the pain means that change happens a little faster.
In the next chapter, we're going to look at more ways to engage your whole team in automating tests and discuss why that's important.
In this video, two important things are going to happen.
First, we're going to install the mabl trainer,
which is a Chrome extension that we'll use to create tests,
and second, we're going to get our first tiny glimpse
at just how easy mabl makes it to build, configure, and execute
your automated tests.
To get started, log into your workspace at app.mabl.com.
The first thing we need to do is install the mabl trainer.
Do that by clicking the Install Extension button.
You can also install it by going to chrome.google.com, clicking Extensions, and then searching for "mabl".
If you already have the trainer installed, just skip this part.
After clicking the Install Extension button,
the Chrome webstore launches in a new tab and loads the extension's page.
Click the Add to Chrome button.
The extension will then request permission to run.
To allow that, click the Add extension button.
In a moment, you should see an icon in the browser's toolbar
indicating that the mabl trainer was successfully installed.
When the mabl dashboard reloads,
you should also see an additional confirmation that the trainer was installed.
Now that that's done, we can move on to training our first journey.
You can think of a journey just like a test case.
It's a series of interactions with an application, recorded with the mabl trainer,
that can then be replayed on all the different web browsers we support.
Think of it like a user going on a journey through your application
and interacting with all the different features.
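To make that mental model concrete in code (this is purely an illustrative sketch of the concept, not mabl's internals), a journey is an ordered list of recorded steps that can later be replayed against an application:

```python
# Illustrative model of a "journey": record steps once, replay them later.
# This is a conceptual sketch, not how mabl actually works internally.
from dataclasses import dataclass, field

@dataclass
class Step:
    action: str      # e.g. "click" or "type"
    target: str      # a selector or element name
    value: str = ""  # text to type, if any

@dataclass
class Journey:
    name: str
    steps: list = field(default_factory=list)

    def record(self, action, target, value=""):
        self.steps.append(Step(action, target, value))

    def replay(self, app):
        # `app` is anything exposing click()/type() -- a stand-in
        # for a real browser session.
        for step in self.steps:
            if step.action == "click":
                app.click(step.target)
            elif step.action == "type":
                app.type(step.target, step.value)

class FakeApp:
    """Logs the interactions it receives, like a toy browser."""
    def __init__(self):
        self.log = []
    def click(self, target):
        self.log.append(("click", target))
    def type(self, target, value):
        self.log.append(("type", target, value))

journey = Journey("login")
journey.record("type", "#email", "user@example.com")
journey.record("click", "#submit")
app = FakeApp()
journey.replay(app)
# app.log now mirrors the recorded steps, in order
```

The trainer does the "record" half for you by watching your clicks; replaying across Chrome and Firefox is the same idea with real browsers instead of `FakeApp`.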
To create a new journey, click the Train Journey button.
You can also click "New Journey" in the left nav.
Before you can begin recording, you have to give your journey a name.
Let's do that.
In the next section, you can add your journey to a plan,
but what does that mean?
Plans are how we configure groups of journeys and how they should be executed,
whether that's sequentially, in parallel, or in stages,
or some combination of those,
which browsers to test against,
whether they should be run on a particular schedule,
or as part of a deployment process, all kinds of things.
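As a rough sketch of the idea (the field names here are my own assumptions, not mabl's actual configuration schema), you can picture a plan as structured data grouping journeys with their execution options:

```python
# Toy model of a "plan": which journeys to run, on which browsers, and how.
# Field names are illustrative assumptions, not mabl's real schema.
from dataclasses import dataclass

@dataclass
class Plan:
    name: str
    journeys: list
    browsers: list
    execution: str = "parallel"   # "sequential", "parallel", or "stages"
    schedule: str = ""            # e.g. "every 4 hours"
    run_on_deploy: bool = False

    def runs_per_trigger(self) -> int:
        # Each journey runs once per configured browser.
        return len(self.journeys) * len(self.browsers)

smoke = Plan(
    name="smoke",
    journeys=["login", "checkout"],
    browsers=["chrome", "firefox"],
    schedule="every 4 hours",
    run_on_deploy=True,
)
# smoke.runs_per_trigger() -> 4 (2 journeys x 2 browsers)
```

The useful takeaway is the multiplication: adding a browser or a journey to a plan multiplies the runs each trigger kicks off.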
We'll dig deeper into plans in other videos, so for now, let's just keep it simple.
Just like journeys, plans also need a name. Let's do that.
In the application field, enter a name for the application being tested.
For now, we'll be using the mabl Sandbox.
For environment, this could be something like Staging or Production,
but to keep our results separated from other environments you may already have,
let's just call it "Sandbox".
And finally, we need to enter the URL of our application.
Since we're using the mabl Sandbox, that URL is sandbox.mabl.com.
Be sure to enter the full URL, including http:// or https://.
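That "full URL" requirement is easy to check mechanically. Here's a small sketch (my own illustration, not mabl's validation) using Python's standard library:

```python
# Sketch of a "full URL" check: require an http/https scheme and a host.
# Illustrative only; this is not mabl's own validation logic.
from urllib.parse import urlparse

def is_full_url(url: str) -> bool:
    parsed = urlparse(url)
    return parsed.scheme in ("http", "https") and bool(parsed.netloc)

# Without the scheme, the host is parsed as a path, so the check fails.
assert is_full_url("https://sandbox.mabl.com")
assert not is_full_url("sandbox.mabl.com")
```

This is exactly why "sandbox.mabl.com" on its own isn't enough: without the scheme, tools can't tell the host from a relative path.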
If you make a mistake, just click the X icon to the right of the field.
All we need to do now is click the Create Journey button.
When we do that, the web page will open, and the mabl trainer will launch.
It's usually a good idea to clear cookies for the app before training.
You can quickly do that by clicking the link here.
Once that's done, just click the button and we'll be ready to go.
You should see the mabl trainer window now, and you'll notice the Live button is highlighted in red.
That means the trainer is recording.
Let's click a few things and see what happens.
It recorded my button click.
And now I'm going to record clicks on each of these radio button labels.
Great. Now, let me show you how you can replay steps in the trainer.
When you hover over a step, you'll see some icons appear.
Click the ellipsis icon on the far right,
then click "Move cursor here".
You'll see the purple line move from beneath the last step I recorded
to just above the step I selected.
Now I'll click the play icon up top to replay these three clicks.
Notice as each click is replayed, the element receiving that click is highlighted.
That's good enough for now.
Let's save this journey, wait for the green success confirmation, and then click Close.
The tab with the application and trainer will close,
and then we'll see the journey details page for the journey we just created.
When the journey details page finishes loading,
you'll see the new journey has already begun to run.
To take a closer look, click View Output.
Now we're on a journey output page.
This is where we can see the output of a journey, both while it's running
and of course after it's completed.
To see only the screenshots, click the Thumbnails icon.
You can then click a thumbnail to see that screenshot.
Just like in the trainer, you'll see the element being interacted with is highlighted.
For each step, you can see a snapshot of the DOM at the moment that action was performed.
It can be really helpful in debugging.
We also capture a HAR file for each step.
And if you don't know what that is, don't worry, we'll get there.
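If you'd like a head start: a HAR (HTTP Archive) file is just a JSON document describing the network requests a page made. As a quick illustrative sketch (the sample data below is made up), you can summarize one with Python's standard library:

```python
# Sketch: a HAR file is JSON with a "log" -> "entries" structure.
# The sample document here is hand-made for illustration.
import json

har_text = json.dumps({
    "log": {
        "entries": [
            {"request": {"url": "https://sandbox.mabl.com/"},
             "response": {"status": 200}, "time": 120.5},
            {"request": {"url": "https://sandbox.mabl.com/app.js"},
             "response": {"status": 404}, "time": 30.2},
        ]
    }
})

def failed_requests(har_json: str):
    """Return URLs of entries whose HTTP status indicates an error."""
    entries = json.loads(har_json)["log"]["entries"]
    return [e["request"]["url"] for e in entries
            if e["response"]["status"] >= 400]

# failed_requests(har_text) -> ["https://sandbox.mabl.com/app.js"]
```

Scanning a step's HAR for 4xx/5xx responses like this is a common way to spot why a page misbehaved during a test run.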
You can also download both of those, as well as a PDF of the journey steps,
along with the final screenshot.
If, while you're looking at the journey output, you realize you need to edit that journey,
you can do that quickly from here.
Let's take a look at the plan details
to learn a little bit more about what's in there.
On the plan details page, we can see
information about what application, environment, and URL this plan is targeting.
These are the plan triggers.
By default, this plan is set to run every four hours and on every deployment.
And over here, we can see this plan is set to use Chrome and Firefox.
Down below, we can see information about recent plan executions.
Clicking a row will expand it to display some info
and let us view the output for each browser.
To edit a plan configuration, click the Edit icon at the top of the screen.
This is where you can choose web browsers,
make changes to application URLs or add additional ones,
configure plan stages and sequential or parallel execution,
and when your journey should run.
I'll make a couple of changes, but there's a lot more here that we'll go into in subsequent videos.
Saving these changes will bring us back to the Plan details view.
And my changes to the schedule and browser configuration are visible here.
The navigation panel can be expanded and collapsed by clicking
this icon in the bottom left of the window.
Insights are notifications we surface about the behavior of your application.
Right now, it's just letting us know that our first plan is good to go,
but you'll get information about things like auto-heals.
And if you like, these can also be sent to a Slack channel.
Let's go back to the dashboard and see what's there now.
A graph of recent journey runs is starting to form.
Here I can see a pass and fail count within a given environment,
and links to the journey output are accessible here as well.
From here I could also launch the trainer and edit this journey,
perform a quick edit, or see a summary of the journey results.
It looks like we've got everything set up and working correctly.
Let's review what we just did:
we installed the mabl trainer, we recorded our first journey,
we ran it across two browsers in parallel,
we looked at plan configuration options and even got our first insights from mabl.
Not bad for eight minutes, but it is a lot.
If you need to rewatch this video, go ahead and do so.
That's the great thing about videos.
Stay tuned for more.
After you've shown your team the potential value of test automation, you need to know how to use the particular skills of your team members to maximum effect, which is what we're covering in the next lesson.