Are you good at thinking outside the box? What does that even mean? As a tester, it's your job to think of the things that others don't. The reality is that there are many ways that people, even very smart people, can make really big mistakes. These mistakes can lead to massive problems: problems that, once they appear in a live production environment, can seem obvious (how could you miss that one?!), yet were clearly not obvious before they happened. How do smart people make such big mistakes, and why do smart testers not spot them?
In this session, we will look at some of the biggest software bugs in history and ask how we, as software testers, could have detected or prevented them. Not just by testing: we can only find bugs if we run the right test cases. The question is, how can we discover the test cases that nobody else thought of? Hindsight is 20/20, as they say, and it can be great fun to look back and feel smug as we spot the obvious flaws that led to a catastrophe. But we have to remember that lots of smart people were involved in those projects, and they still made those mistakes. So what can you do to build up your resistance to making big dumb mistakes?
Systems thinking is an approach to understanding a system by examining the linkages and interactions between the elements that compose it as a whole. It can be a big help in understanding complex systems and in predicting how they might fail. In this session, we will look at some systems thinking tools and techniques that you can use to examine a system, a process, or a product in fresh new ways and see the complexities that can lead to dramatic failures. Systems thinking may not solve all your problems or turn you into a testing superstar, but it will certainly give you a very sharp edge.