CAST 2013 has ended

What we learn from our experiences helps shape us as human beings. What we learn from the experiences of others can give us new insight and new perspectives. At this year’s conference speakers will share with you the lessons that they have learned in Software Testing, as well as how these lessons influence the way that we approach testing both now and in the future.

Hall of Ideas G
Monday, August 26
 

9:00am CDT

Software Test Attacks for Mobile and Embedded Devices
Today's expectations for many software testers include addressing mobile and embedded devices. Unfortunately for many companies, churning out complex or critical mobile and embedded applications while keeping pace with emerging technologies is fast becoming the norm rather than the exception it was just a few years ago. Competitive pressures place a burden on software testing resources to succeed with shortened project schedules, minimal strategic planning and/or staff new to mobile and embedded software.

In the style of James Whittaker’s books on breaking software, Jon Hagar and Jean Ann Harrison will provide specific, in-depth test attacks aimed at uncovering common mobile and embedded software bugs. The session provides a basic introduction to a series of attacks based on an industry error taxonomy. Exercises to test for bugs within software on real devices will give attendees hands-on testing experience. The attacks are applicable to software systems including mobile smartphones, medical systems, automotive devices, avionics systems, and industrial devices.

The tutorial is hands-on, so bring your mobile devices (smartphones, tablets, or any mobile device). We will also provide some devices (robots and games) so attendees can practice some attacks. The goal of the session is to give attendees practical test attacks for use on their future mobile and embedded software projects.

Facilitators
avatar for Markus Gärtner

Markus Gärtner

it-agile GmbH
Markus Gärtner works as a testing programmer, trainer, coach, and consultant with it-agile GmbH, Hamburg, Germany. Markus, author of ATDD by Example - A Practical Guide to Acceptance Test-Driven Development, a student of the work of Jerry Weinberg, founded the German Agile Testing... Read More →

Speakers
avatar for Jon Hagar

Jon Hagar

Systems Software Engineer, Grand Software Testing
Jon Hagar is a systems-software engineer and tester consultant supporting software product integrity, verification, and validation with a specialization in embedded and mobile software. Jon has worked in testing for over thirty years. Embedded projects he has supported include... Read More →
avatar for Jean-Ann Harrison

Jean-Ann Harrison

System/Software QA Engineer, Thales Avionics
Jean Ann has been in the Software Testing and Quality Assurance field for over 15 years, including 7 years working within a regulatory environment and 8 years performing mobile software testing. Her niche is system integration testing with a focus on multi-tiered system environments involving... Read More →


Monday August 26, 2013 9:00am - 6:00pm CDT
Hall of Ideas G One John Nolen Drive Madison, WI 53703
 
Tuesday, August 27
 

11:05am CDT

Exploratory Automated Testing

When most people think of automated tests, they picture automating what human testers do when running their tests. Sometimes this is what we desire, but it isn’t the most powerful way to use test automation. Exploratory automated testing is an approach that uses the power of the computer to look for bugs that functional testing misses. Unlike regression tests, which do the same thing each time they run, exploratory tests do something different each time. The key to this type of testing is the test oracles – checking for abnormal behavior.
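
As a minimal sketch of the idea (using a toy sort routine as the system under test; the names here are illustrative and do not come from the talk), an exploratory automated test generates fresh random inputs on every run and checks invariant-style oracles instead of exact expected outputs:

```python
import random
from collections import Counter

def sut_sort(xs):
    """Stand-in system under test; in practice this would be the
    application behavior being exercised."""
    return sorted(xs)

def oracle(inp, out):
    """Invariant checks: we never predict the exact output, only
    properties that any correct run must satisfy."""
    assert all(a <= b for a, b in zip(out, out[1:])), "output not ordered"
    assert Counter(out) == Counter(inp), "elements gained or lost"

# Each run generates different inputs, so each run explores new territory.
for _ in range(1000):
    data = [random.randint(-100, 100) for _ in range(random.randint(0, 20))]
    oracle(data, sut_sort(data))
```

Because the inputs differ on every run, repeated runs probe different corners of the input space, and the oracle flags any run where the invariants fail.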

Speakers
avatar for Doug Hoffman

Doug Hoffman

BACS, MSEE, MBA, ASQ Fellow, ASQ-CSQE, ASQ-CMQ/OE (quality management), Software Quality Methods, LLC.
Douglas Hoffman is an independent consultant with Software Quality Methods, LLC. He has been in the software engineering and quality assurance fields for over 25 years and now is a management consultant in strategic and tactical planning for software quality. He is past section chairman... Read More →


Tuesday August 27, 2013 11:05am - 12:20pm CDT
Hall of Ideas G One John Nolen Drive Madison, WI 53703

1:30pm CDT

How to find good testers in the Rust Belt
This is an experience report from a test manager discussing the hiring of testers over the past eight years in a tertiary market, with details on what has and has not worked for me (so you can get ideas that might work for you). If you don't happen to work in one of the top 10 tech markets and you still need to hire testers, this session is for you.

This session will offer the following key takeaways:

  • Why there aren't enough testers out there (who know they are testers).
  • The pros and cons of different backgrounds (yes, including CS majors) and why each made good candidates for me.
  • Why hiring based on abilities and mindset over credentials and degrees can bring good candidates in the door, why this likely means "losing" more people to other departments, and why it's OK to stop being so greedy.
  • Ways to change your “getting applications in” process, or "How to keep HR from throwing out all the good candidates".
  • Why you need to get out and hunt down candidates instead of hoping they find you.

Speakers
avatar for Erik Davis

Erik Davis

Manager of Testing, Hyland Software, Inc.
Erik is currently responsible for the overall testing effort of a team of 170 testers. He owns, reviews, and finds ways to improve the way testing is done, including bringing new ideas to the team, finding ways to engage testers in testing as a career, and building a stronger community... Read More →


Tuesday August 27, 2013 1:30pm - 2:45pm CDT
Hall of Ideas G One John Nolen Drive Madison, WI 53703

3:00pm CDT

Exploratory combinatorial testing
The promise of Combinatorial Test Design is that, when used thoughtfully, it often results in:

  • Increased variation between tests (which helps find more bugs),
  • Decreased repetition between tests (which improves tester productivity), and
  • Very efficient coverage of user-specified thoroughness goals (which helps testers maximize both their thoroughness and efficiency).

The reality is rarely so straightforward, particularly when Exploratory Testers try to apply this test design approach.
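
To make the efficiency claim concrete, here is a small sketch of pairwise (2-way) combinatorial selection using a greedy set-cover heuristic. The parameter names and values are hypothetical, and this illustrates the general technique only, not any particular tool or the speaker's method:

```python
from itertools import combinations, product

# Hypothetical test parameters for a web application (illustrative only).
params = {
    "browser": ["Chrome", "Firefox", "Safari"],
    "os": ["Windows", "macOS", "Linux"],
    "locale": ["en", "de", "ja"],
    "network": ["wifi", "4g", "offline"],
}

names = list(params)
all_tests = [dict(zip(names, vals)) for vals in product(*params.values())]

def pairs_of(test):
    """All parameter-value pairs a single test covers."""
    return {frozenset([(a, test[a]), (b, test[b])])
            for a, b in combinations(names, 2)}

# Greedily pick the test covering the most still-uncovered pairs.
uncovered = set().union(*(pairs_of(t) for t in all_tests))
suite = []
while uncovered:
    best = max(all_tests, key=lambda t: len(pairs_of(t) & uncovered))
    suite.append(best)
    uncovered -= pairs_of(best)

print(len(all_tests), len(suite))
```

For four parameters with three values each, exhaustive testing needs 81 tests, while the greedy pairwise suite covers every value pair with far fewer, which is the variation-for-less-repetition trade the bullets above describe.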

In this presentation, Justin Hunter:


  • Expands upon concepts that have been laid out by Jon Bach and Rob Sabourin

  • Acknowledges "the elephant in the room" (i.e., that practitioners often use Combinatorial Test Design methods to try to create highly-detailed test scripts, a repugnant goal for Exploratory Testers)

  • Describes practical ways that testers have successfully blended Exploratory Testing strategies and Combinatorial Test design

  • Highlights some of the significant challenges that Exploratory Testers face when applying Combinatorial Test design


Key ideas/outcomes shared with attendees:

  • Combinatorial test design strategies can be used in many more places than Exploratory Testers probably realize,
  • These strategies can be applied successfully at the "test charter" level in addition to the test case level, and
  • Combinations can create engagement through priming effects.


Speakers
avatar for Justin Hunter

Justin Hunter

CEO, Hexawise
Justin Hunter, Founder and CEO of Hexawise, is a test design specialist who grew up in the fine town of Madison, Wisconsin, and has enjoyed teaching testers on six continents how to improve the efficiency and effectiveness of their test case selection approaches. The improbably circuitous... Read More →


Tuesday August 27, 2013 3:00pm - 4:15pm CDT
Hall of Ideas G One John Nolen Drive Madison, WI 53703

4:45pm CDT

Testing under Pressure
For most of us, testing shares more in common with emergency response than with airplane maintenance. In a perfect world we’d check the torque on every bolt, and leave the runway with 100% certainty every flight. Most testers don’t have that luxury; we’re thrown at problems, and have to solve them as quickly as we can, with whatever tools we have. We’re expected to quickly understand new contexts, to deal with high pressure, low resources, and rapidly evolving situations. I’ll be comparing my experience as a firefighter to my experience with testing. We have to imagine the worst case: we enter a scene with little or no information, an urgency of action, and limited resources. It’s imperative to get in and out quickly, to prioritize the critical, high impact response, and to handle whatever unexpected challenges the job is going to throw at you. Every situation is different, and there’s never enough information, so how do you prepare for the unknown?

Speakers
avatar for Geoff Loken

Geoff Loken

Quality Analyst, Athabasca University
I'm currently charged with the oversight of software testing at the Athabasca University. Before that I spent a bit of time doing testing at Bioware, out of Edmonton. In my copious spare time I train, blog, and attend conferences about QA, and will be speaking for the second time... Read More →


Tuesday August 27, 2013 4:45pm - 6:00pm CDT
Hall of Ideas G One John Nolen Drive Madison, WI 53703
 
Wednesday, August 28
 

11:05am CDT

Elephant Whisperer-inspired lessons learned in software testing in South Africa
A selection of lessons learned in software testing in the Financial Services industry in South Africa is compared to the experiences of a conservationist who undertakes the mammoth task of settling a herd of badly behaved, traumatised elephants onto a private game reserve. The size and complexity of a test environment, the execution and reporting techniques used, and the development of context-driven and exploratory testing principles and philosophies are explored. Cindy Carless uses the deeply moving and entertaining account of settling the elephants while allowing them to remain wild to support her philosophy. She asserts that the real value of software testing is to provide meaningful information to decision makers. This information is used to determine the readiness of the software to add value to the business it is being used to support and enhance.

Speakers
avatar for Cindy Carless

Cindy Carless

Test Analyst, Micro to Mainframe
Cindy Carless is new to the software testing field and is enjoying being exposed to the new skills and philosophies that so resonate with her. She has experienced a diverse career that started with a BCom qualification that took her into Financial Management. She then moved into... Read More →


Wednesday August 28, 2013 11:05am - 12:20pm CDT
Hall of Ideas G One John Nolen Drive Madison, WI 53703

1:30pm CDT

Famous software failures and what we can learn from them
Death, injury, and physical harm. Loss of tens or hundreds of millions of dollars. World-wide and even galaxy-wide embarrassment. These are just a few of the consequences of some of the more famous software failures of the last couple of decades. These failures have received general-interest press attention, but they have rarely been analyzed to understand how a rigorous testing process could have changed the outcome.

Peter examines six publicized software failures and discusses how effective testing might have brought about a different outcome. He details the circumstances surrounding these failures and offers lessons to testers on the importance of certain aspects of testing and evaluating the quality of critical applications. By studying known failures and their causes, we can add value to our own quality programs and help ensure we don't become a character in a future "famous software failure."

Speakers
avatar for Peter Varhol

Peter Varhol

Peter Varhol is a well-known writer and speaker on software and technology topics, having authored dozens of articles and spoken at a number of industry conferences and webcasts. He has advanced degrees in computer science, applied mathematics, and psychology. His past roles include... Read More →


Wednesday August 28, 2013 1:30pm - 2:45pm CDT
Hall of Ideas G One John Nolen Drive Madison, WI 53703

3:00pm CDT

Testing when software must work
Jet Propulsion Laboratory (JPL) in Pasadena, California develops unmanned spacecraft to explore our solar system. Each spacecraft is unique, and so is its software. Spacecraft are limited systems with a finite amount of power, fuel, data storage, etc. Each spacecraft has its own flight software that commands it and ground software that evaluates the commands sent to the spacecraft to make sure they do not damage it. All of this software is extensively tested, because if it doesn’t work, the spacecraft could be destroyed. Not only would billions of dollars be lost, but the scientific discoveries the spacecraft would have made would be lost as well. Software testing is crucial, and the lesson learned is “Test as you fly; fly as you test.” What that slogan means varies depending on the type of software involved. This presentation will discuss the various types of software and the testing required to be able to assert that the software is tested as you fly and flown as you test. In addition, videos of the spacecraft and their missions will be shown to help with understanding the task, to demonstrate how the various types of software are used and tested, and to explain the goals and purposes of that software.

Speakers
avatar for Barbara Streiffert

Barbara Streiffert

System Engineer, Jet Propulsion Laboratory
Barbara Streiffert is a Senior Systems and Software Engineer at Jet Propulsion Laboratory (JPL) specializing in the development of software approaches for use in ground data systems for spacecraft missions. She has worked in all aspects of systems and software development for commercial... Read More →


Wednesday August 28, 2013 3:00pm - 4:15pm CDT
Hall of Ideas G One John Nolen Drive Madison, WI 53703
 