Case Study: Improving Verity Nature’s Field Apps Using Apptest.ai

In this case study, we’ve interviewed Justin Baird, Verity Nature’s (https://www.veritynature.com) CTO, about how he and his team have been using Apptest.ai to improve test coverage and quality assurance.

Thanks for joining us, Justin. First of all, could you tell us about Verity Nature?

Thanks for having me! At Verity Nature, we are building applications that empower technicians and project developers to digitize every step of the measurement, reporting, and verification process required for nature-based carbon sequestration projects. By doing so, we are helping to deliver transparency and integrity to the voluntary carbon market.

Many of our projects operate in remote wilderness areas with no Internet connection. With users out in the field and offline, the apps have to work flawlessly across devices so that the measurements they capture remain accurate and reliable. Given these requirements, our software quality assurance program operates to uncompromising specifications, and we are always on the lookout for innovative solutions that can make our products even more robust.

Cool, so how did you hear about Apptest.ai and what do you like about it?

As our product development has progressed and we have added more features across a number of different user types, we’ve needed to move beyond our existing testing regime. I was looking for automated solutions, but I also wanted to see whether AI could help us deploy tests faster and cover more scenarios than just the ones we come up with ourselves.

A friend sent me a link to Apptest.ai, and I’ve been impressed with the solution. It pretty much does what I had in mind regarding using AI to test apps—it’s all already in there. What I like about it is that it has enabled our test team to do more and achieve higher quality results faster, with the same number of testers.

When you’re testing new functionality, you can’t just go and immediately automate the whole process. A tester has to play around, get a feel for things, and see how the new functionality behaves on one device compared to another. This kind of experimentation is necessary, but it is also time-consuming. What I really like about Apptest.ai is that it does this kind of work for you.

It runs a bunch of iterations, figuring out where the buttons are, where to log in, what to press to go to the next page, and so on. The dashboard shows every one of these iterations, with screenshots of each step, logs of the entire testing process, and views of app performance on every device you test on.

Figure: Apptest.ai automatically identifies and recognises all the user interface elements on a page, detecting clickable items such as menus, buttons, and search boxes.
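
For readers who want a feel for what this kind of exploratory loop involves, here is a minimal sketch written with the open-source Appium framework. To be clear, this is not Apptest.ai’s engine (which is proprietary); the device name, app path, server URL, and the naive random-click strategy are all illustrative assumptions.

    # Illustrative only: a crude random-exploration loop built with Appium.
    # Apptest.ai's actual engine is proprietary; the device name, app path,
    # and Appium server URL below are placeholder assumptions.
    import random

    from appium import webdriver
    from appium.options.android import UiAutomator2Options
    from appium.webdriver.common.appiumby import AppiumBy

    options = UiAutomator2Options()
    options.platform_name = "Android"
    options.device_name = "Pixel 6"          # hypothetical test device
    options.app = "/path/to/field-app.apk"   # hypothetical app binary

    driver = webdriver.Remote("http://127.0.0.1:4723", options=options)
    try:
        for step in range(20):  # explore for a fixed number of steps
            # Capture a screenshot of each step, as the Apptest.ai dashboard does.
            driver.get_screenshot_as_file(f"step_{step:02d}.png")

            # Enumerate everything currently tappable on screen.
            clickable = driver.find_elements(AppiumBy.XPATH, "//*[@clickable='true']")
            if not clickable:
                break

            # A real AI explorer would choose its next tap intelligently;
            # this toy version just picks a clickable element at random.
            target = random.choice(clickable)
            print(f"step {step}: tapping {target.get_attribute('class')}")
            target.click()
    finally:
        driver.quit()

Even this toy version shows why the manual equivalent eats tester time: every step has to be observed, captured, and repeated per device, which is exactly the grind Justin describes the AI taking over.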

This is where having an AI do all of this experimental and iterative testing lets our testers focus their time on more meaningful work. Also, the mobile test farm is really cool.

Mobile test farm, what’s that?

Oh yeah, so that’s my own name for Apptest.ai’s testing platform. At the Apptest.ai data center, they have a big rack with a huge number of devices. You take your app, upload it to the service, and then run it on dozens of devices at the same time. This, in itself, saves a huge amount of time and investment: we didn’t have to buy every flavor of Android phone out there to test our app ourselves—we can use Apptest.ai’s devices instead. (The actual name of the system is the Device Farm.)
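
For those curious what driving a device farm programmatically tends to look like, here is a hypothetical sketch of the upload-then-fan-out workflow Justin describes. The service URL, endpoint paths, field names, and device identifiers are invented for illustration and are not Apptest.ai’s actual API.

    # Hypothetical sketch of a device-farm workflow. Every URL, endpoint,
    # field name, and device identifier below is an illustrative assumption,
    # NOT Apptest.ai's actual API.
    import requests

    BASE = "https://devicefarm.example.com/api"         # placeholder service URL
    HEADERS = {"Authorization": "Bearer YOUR_API_KEY"}  # placeholder credential

    # 1. Upload the app binary once.
    with open("field-app.apk", "rb") as apk:
        upload = requests.post(f"{BASE}/apps", headers=HEADERS, files={"file": apk})
    app_id = upload.json()["app_id"]

    # 2. Fan the same build out to many devices in a single request.
    run = requests.post(
        f"{BASE}/test-runs",
        headers=HEADERS,
        json={"app_id": app_id, "devices": ["galaxy-a03", "pixel-4a", "redmi-9a"]},
    )
    print("test run started:", run.json())

The design point is the one-to-many fan-out: one upload runs on many devices at once, with no hardware purchasing on the customer’s side.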

Also, we have some apps that need to be tested on low-end devices for certain projects. Working with the Apptest.ai team, I sent them three of our low-end target devices, and they added them to their mobile test farm for us. Now we can test with those devices along with all the others, and let the AI figure out the issues for us 🙂

Anything you think we can improve to make testing even easier for you?

Yes, while we have definitely saved a lot of time, getting things up and running at the very beginning was a bit challenging! A lot of things need to be configured and set up correctly before everything works well. Testing like this is a bit of a new paradigm, so it took a little while for our testers to get used to having an AI assistant and to understand how best to leverage this new resource.

Also, we build tablet PC apps, and these were more of a struggle to test with the AI: the mobile device rack doesn’t have tablet PCs available, so we had to run the tests locally using Apptest.ai’s Stego application. I think adding tablet PCs to the mobile test farm would be a great enhancement.

Thanks, Justin. Any final thoughts?

Sure, so Apptest.ai has helped us save a lot of time and headaches, and has made our test team more efficient. But the things I’ve found most valuable are the automated test scenarios and the comprehensive coverage across nearly any device you’d ever need to test on. That’s the really cool part—it saved us a lot of expense by not having to buy all those phones ourselves. I also think it’s a great tool for any startup to try, as well as for mobile app development agencies.
