
Eurostar Mobile Deep Dive 2015

I presented a session on mobile testing entitled "Technical Mobile Testing - Risk, Issues and Experiences" in Maastricht on 6th November 2015.

The full version of this talk is available in Evil Tester Talks.


I attended the Eurostar Mobile Deep Dive in November 2015 to present a session on Technical Mobile Testing - Risk, Issues and Experiences.

Mobile testing offers numerous challenges and the potential for massive scope creep, with so many device combinations and tool choices available. We can’t afford to test everything, so we make decisions on scope. We have to learn to use risk as part of the decision-making process: business risk and technical risk. In this talk, Alan will describe his experiences of hands-on testing of mobile native applications and mobile web applications, and the application of technical and risk-based thought processes in deciding what, and how, to test.

Expand your knowledge of mobile technology and improve your mobile testing process. Learn to apply important techniques: viewing traffic via proxies and Wireshark, using local wi-fi hotspots, multiple ways to screen capture for defect reporting, how to use emulators and browser developer tools, and how to assess the risk that each new tool adds to your process.
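As a small illustration of the proxy idea (this is my own sketch of a typical setup, not something from the talk itself), a test script can route its HTTP traffic through a local intercepting proxy such as mitmproxy so you can watch the requests and responses as they happen. The proxy address, endpoint URL and certificate handling below are assumptions.

```python
import requests

# Assumption: an intercepting proxy (e.g. mitmproxy, Fiddler or Charles)
# is listening locally on port 8080.
PROXIES = {
    "http": "http://127.0.0.1:8080",
    "https": "http://127.0.0.1:8080",
}

# Route the request through the proxy so the traffic can be inspected.
# verify=False because the proxy re-signs HTTPS traffic with its own
# certificate; in real use, install and trust the proxy's CA instead.
response = requests.get(
    "https://example.com/api/status",  # hypothetical endpoint
    proxies=PROXIES,
    verify=False,
)
print(response.status_code, response.headers.get("Content-Type"))
```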

I drew upon my experience of hands-on, device-based testing, and considered the tools we used, why we used them, what we gained and the risk we adopted as a result. I also explained the variety of decisions that we faced. I didn’t touch much on automating in this talk, as that was well covered by other talks at the conference. In the main, I focused on our test approaches and how we introduced ever more technical knowledge into our approach.

I did explain how we used tactical automating to hit a web page with headers scraped from UserAgentString.com to check server-side adaptability.
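As a rough sketch of that idea (not the exact script from the talk), you can send requests with different User-Agent headers of the kind listed on UserAgentString.com and compare what the server returns for each. The URL and the specific User-Agent strings below are illustrative assumptions.

```python
import requests

# Example User-Agent strings of the kind listed on UserAgentString.com
# (illustrative values, not the exact ones used in the talk).
USER_AGENTS = {
    "iphone": "Mozilla/5.0 (iPhone; CPU iPhone OS 9_1 like Mac OS X) "
              "AppleWebKit/601.1.46 (KHTML, like Gecko) Version/9.0 Mobile/13B143 Safari/601.1",
    "android": "Mozilla/5.0 (Linux; Android 5.1; Nexus 5 Build/LMY47I) "
               "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/46.0.2490.76 Mobile Safari/537.36",
    "desktop": "Mozilla/5.0 (Windows NT 10.0; WOW64) "
               "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/46.0.2490.86 Safari/537.36",
}

URL = "https://example.com/"  # hypothetical page under test

for name, user_agent in USER_AGENTS.items():
    # Hit the page pretending to be each device and compare the responses
    # to see how the server adapts its content.
    response = requests.get(URL, headers={"User-Agent": user_agent})
    print(f"{name}: status={response.status_code}, "
          f"length={len(response.text)}, "
          f"vary={response.headers.get('Vary')}")
```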

Slides

Main Points

  • We currently test out of fear; we could test to address technical risk.
  • Learn to understand the technicalities of your application or website so you can target the technical risks of cross-platform usage when deciding what to test, rather than testing ‘everything’ on a subset of devices out of fear.
  • Poor ergonomics add risk to my test process, so I add keyboards; that workaround might itself add risk to the test process. In fact, all such workarounds add risk, so we need to technically understand and communicate that risk to manage it in the team.
  • The fact that we are testing on a device at all should be managed as a risk, because it means we are building something that we don’t fully understand or trust.
  • Build apps which use all the libraries and flows of your main app but don’t have a ‘GUI’, and which self-check the libraries and interactions and report back to a main server. We could then deploy the ‘test app’ to multiple cloud devices and quickly receive compatibility information without the lag of manual interaction with the device (see the sketch after this list).
  • Focus on pain in your process and remove the pain. For example, typing on devices is error prone, and sometimes I don’t have back and forward cursor keys, so fixing the errors is painful - I could add a custom keyboard to the device, or I could add a physical keyboard to the device. By addressing the ‘pain’ in my process, I introduce a technical workaround that might introduce risk, but makes my process easier, and that risk is something we can investigate, discuss, assess and mitigate. Pain in the process is anything that gets in the way and stops you being effective - you may be so used to it that you don’t even notice it - and that’s dangerous.
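The self-checking ‘test app’ above is only an idea, so the following is just a sketch of its shape: run a set of checks against the shared libraries on the device and post the results back to a central server. The check names, reporting endpoint and payload format are all my own hypothetical choices.

```python
import json
import platform
import urllib.request

REPORT_URL = "https://example.com/compatibility-reports"  # hypothetical endpoint


def check_login_flow():
    # Placeholder for a real check that exercises the shared login library.
    return True


def check_payment_library():
    # Placeholder for a real check that exercises the shared payment library.
    return True


CHECKS = {
    "login_flow": check_login_flow,
    "payment_library": check_payment_library,
}


def run_checks():
    # Run each library check on the device and record pass/fail/error.
    results = {}
    for name, check in CHECKS.items():
        try:
            results[name] = "pass" if check() else "fail"
        except Exception as error:  # a crash is also useful compatibility data
            results[name] = f"error: {error}"
    return results


def report(results):
    # Send the results to the central server, so compatibility information
    # arrives without manual interaction with the device.
    payload = json.dumps({
        "device": platform.platform(),
        "results": results,
    }).encode("utf-8")
    request = urllib.request.Request(
        REPORT_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request)


if __name__ == "__main__":
    report(run_checks())
```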

Conference Notes

I visited Eurostar 2015 and popped in to a few sessions, but was mainly there for the Mobile Deep Dive.

Alan Page started with an opening keynote on using instrumentation at the AOSP level to provide usage and defect information automatically for an emulation layer that Microsoft had been working on. He also mentioned that his team aims for “More value. Better Quality. Every week” which sounds like a perfectly splendid goal to remind yourself of every morning, and revisit every evening to reflect on how you are doing.

Julian Harty’s talk built on this with an in-depth exploration of how much analytics can be used to help the testing process. I particularly liked the example where a study had examined the ‘phones used by the people most likely to leave a review’ on the app store, and then used those phones as the subset of devices for on-device testing.

Karen Johnson asked (and I’m paraphrasing from memory rather than a written quote) “do you analyse this as a human or a machine? Or both?” Only about 3 of us put our hands up to both. Karen provided a walkthrough of an example thought process for approaching the scenario and condition creation involved in constructing an automated walkthrough of a mobile app she had worked on. Karen also reminded us that systems change, so the automated solutions we create will also change, and not to get hung up on the permanence of an artefact - remove it when it ceases to add value.

I popped in to the Wearables talk by Marc Van’t Veer and saw him using iOS to QuickTime streaming to capture the camera from his iOS device and take pictures of testing on his smart watch. He was also using Mobizen to stream the display of his device onto a desktop for real-time screen capture. Always good to see people using technological workarounds to improve their testing.

Jeff Payne provided the funniest talk of the deep dive conference which was also filled with valuable information. I made notes that I need to learn how to examine the memory of a running mobile phone, and examine the temporary files, cache files and databases on the phone itself.

Jeff’s quote “Test where easiest and most effective” struck a chord since it also relates to knowing where you are going to get the most benefit from testing a risk because you understand where and how you can test it, rather than just testing on every device.

I found this day a useful addition to the Eurostar Conference and was fortunate to have a whole series of good discussions during the day, and in the evening.

Social Media

Fortunately some gentle folk posted social media comments during the talk.

My #sketchnote from @eviltester’s #esconfs presentation “Technical Mobile Testing” #MobileDeepDive pic.twitter.com/YpV8z0czRu

— Zeger Van Hese (@TestSideStory) November 6, 2015

Benefits of using a proxy #mobiledeepdive @eviltester pic.twitter.com/eQioOTIJkW

— karennjohnson (@karennjohnson) November 6, 2015

automate mobile tactically, use user agent strings to simulate devices. @eviltester #MobileDeepDive #esconfs

— karennjohnson (@karennjohnson) November 6, 2015

“It was a stupidly big phone before we all had stupidly big phones” @eviltester at the mobile deep dive

— Alan Page (@alanpage) November 6, 2015

Alan Richardson @eviltester I don’t like testing mobile #esconfs

— Thomas Ponnet (@ThomasPonnet) November 6, 2015

@eviltester talking about technical testing and proxy tools @esconfs #MobileDeepDive pic.twitter.com/WHOtNSm34n

— Alison Wade (@awadesqe) November 6, 2015

#esconfs #MobileDeepDive @eviltester Totally agree: “Our test approaches in mobile testing are based on fear”. Fear is never a good guide!

— Ruud Teunissen (@RuudTeunissen) November 6, 2015


This talk is also available, with bonuses (e.g. transcript, extra videos, exercises and resources), in Evil Tester Talks: Technical Testing.