Welcome everyone, this is the 3pm talk in the Common Ground track, Free Your Mind: Battling our Biases.

I’m dade, I’m a staff security engineer for a fintech startup and an independent security consultant on the side. I have a background in red team work at companies like Oracle and Intel, which will come in handy later on in this talk. If you’re interested in more about me, you can find basically all of my links at 0xda.de.

A quick disclaimer, this is not a technical talk. If you’re here for the latest buffer overflow in a poorly written C program, this is not the place for you. This is a talk about biases, the unexpected benefits of being a beginner, and changing the way we interact with our colleagues and peers.

Before we get started, I want to take a brief informal survey. I’m going to ask a couple of questions, and I just want you to raise your hand if the question applies to you.

Have you recently done something that you later felt was dumb?

Have you recently refrained from asking a question because you were afraid people would think you were dumb?

Have you recently been annoyed when someone asked a question that you thought they should know the answer to?

Have you recently refrained from sharing a piece of information because you assumed everyone already knew it?

(ad lib based on responses to survey)

I’d like to start off by talking to the beginners in the room. The people who are new to the industry, or even just new to their current job. We’re going to discuss some common feelings and how they can be associated with various cognitive biases, then we’ll chat with the more experienced folks, and we’ll end with a technique that I think can help us all battle our own biases more easily.

“Always be the dumbest person in the room” – I got this advice a lot when I was younger, and I think a lot of business guru type people will also give you this advice. An alternative might be “If you’re the smartest person in the room, you’re in the wrong room.” The idea here is that we should surround ourselves with people who are better than us at the things we want to be better at, that way we can get better by proximity to those people.

I speak from experience when I say that this can work very well if you want to rapidly level up your own abilities. It can also be exhausting, for sure, because you’re more likely to feel like you’re constantly behind your peers. But it can work well. At least, it works well in the beginning.

You see, as beginners, we’re not burdened by the curse of knowledge. We know what we know, we may know some of what we don’t know, but we definitely don’t know what we don’t know. We don’t have the years, or even decades, of historical context around any given decision or any given problem. We can look at it with fresh eyes and think of solutions completely unburdened from the shackles of reality.

This can be a super power if we’re in an environment that will let it flourish. But it can also be the source of a great sense of shame and disappointment if we’re in a toxic environment.

I mean, if we’re the dumbest person in the room, that would imply that we’re the least valuable person in that room, which means that asking a question might just be a waste of everyone’s time, right? I mean they have years of experience, surely they’ve already thought of whatever dumb thing I wanted to ask. Or at least, that’s how I felt as a beginner.

If we’re the dumbest person in the room, then when someone else says something, they must know what they are talking about. Even if we don’t understand it, it must be right, right? I mean, they are the authority, aren’t they?

But what if everyone feels this way? What if every one of us feels like we’re the dumbest person in the room? Then we’re all agreeing to whatever happens to be said, regardless of whether it’s right or not. We’ve created a bandwagon effect that just leads to worse decision making.

I think it’s important to remind yourself that you are not alone. If you have a question, there’s a good chance you’re not the only person who has that question. Or maybe someone else had that question a few weeks ago and they can help answer it for you, which helps you and it helps them reinforce what they’ve learned.

If we choose to not ask a question, or to not attempt that new project, or not commit to a project because we think we can’t do it, we’re engaging in a form of self-handicapping. If we stick to only doing the things we know we are good at and never attempting to do something that challenges us, that’s self-handicapping. Self-handicapping can help preserve our self-esteem by helping us to avoid perceived failure, but it can also hurt our confidence by preventing us from experiencing meaningful personal growth.

When our minds are free of assumptions about how a system works, how something should work, we are free to be curious and experiment. We are free to ask questions, we are free to try new things, we are free to experience growth and development. But we are also free to be wrong. In fact, we’re likely to be wrong. A lot, probably. But being wrong isn’t something we should fear. Being wrong helps us change the way we perceive the world, perceive the problems we’re facing, and helps us to overcome those problems.

When I was in third grade, I did a report on Thomas Edison. I didn’t know all of the things I know about him today, but to a third grade nerd, he seemed like a good subject for a report on a historical figure. One quote, however apocryphal and paraphrased it may be, has stood with me ever since.

“I have not failed ten thousand times. I have not failed once. I have succeeded in proving that those ten thousand ways will not work.”

This quote captures an essential re-framing of the concept of failure. A re-framing of the concept of being wrong. Being wrong does not mean we are not successful. Being wrong is but one stop on our journey to success. Being wrong is a great way to learn what is right.

There’s a concept in psychology called “shared information bias,” which suggests that a group of people will spend most of its time and energy talking about the things that everyone in the group already knows, and less time focusing on the things that only a few people might know. This has some meaningful business impact, if we think about it. I mean, if we’re having a meeting, we want to make sure the right people with the right context are in the meeting in order to reach some consensus on a decision and move forward. In business we don’t have the luxury of discussing the merits and shortcomings of every possible solution before we move forward.

But it also means that sometimes we neglect to make the best informed decision, because selecting the right people for a meeting is hard and relies on our understanding of what other people know. It leaves out people who might know a great solution but weren’t included in the meeting, or weren’t included on the email thread.

I don’t think there’s one clear solution to overcoming this tendency. It’s going to be a game of balance, because we can’t entertain every idea that everyone has before we make a decision. We can’t invite every person to every meeting. We can write documents and make them widely available, but we can’t ensure everyone is going to read them; in fact, most people probably won’t. So is there still value in writing them? If you’re an IC like me, you’re probably more inclined to scoff at the idea of having to write down every proposed decision, the context, the consequences, and so on. If you’re a project manager, an executive, or someone who just really loves formal process, you’re probably very excited by this idea and also very annoyed that people like me won’t follow your process.

But we should definitely be thinking about how to overcome, or rather, counteract shared information bias, and if you have tools you’ve used to help overcome this, I’d love to hear about them.

Amusing note: Shared information bias would suggest that most of you in this room are here because you already know a lot about these concepts. Kudos to those of you who ventured outside your realm of expertise to be here.

Switching gears, I want to talk to the experts in the room. Those of us who have put ten thousand, twenty thousand hours into our craft. Those of us who have forgotten much of what we know until we are randomly asked one day and it all comes flooding back to us.

As we grow in our field, we become saturated with various biases. Even if we think we aren’t biased, or that we experience bias less than our coworkers, that is a bias in itself called the Bias Blind Spot.

We accumulate knowledge over the years, and that knowledge helps us make informed decisions about our work. That accumulated knowledge is why we are so valuable. But it also represents a challenge for us.

If we’ve been in the same environment for most of our careers, whether that is the same job, the same company, or the same role within the industry, we are likely to face the status quo bias – or our tendency to prefer things stay the same because that’s what we know.

We become burdened by the curse of knowledge, finding it difficult to see the perspectives of people who haven’t been popping (or patching) shells for as long as we have. Even if those perspectives might be better than ours in some regards, we are probably going to have a hard time seeing it because of what our experiences have shown us.

We face confirmation bias, favoring the things that we’re familiar with, favoring the things that align with our pre-existing beliefs, and subconsciously leading us away from things that challenge our beliefs.

Several of these biases help steer us towards decision making that makes it difficult for beginners to be heard. They help steer us towards making the same decisions we’ve always made. They help steer us away from anything that challenges our status as an expert on a topic. But we have to make room for beginners. We have to actively encourage their participation, their confidence to ask questions, their sharing of ideas, their ability to approach problems in new and novel ways. Sometimes, we have to let them fail, because if we know their idea won’t work and we tell them as much, they may not feel comfortable sharing their ideas again.

We have to lead by example. Sometimes, even if we know the answer to a question, it can be valuable for us to ask the question anyways. By actively making this decision to ask the questions that we think others might have, we are encouraging a culture where asking those questions doesn’t feel so scary or overwhelming for others. We are helping to make sure everyone in the room knows the same information, and helping make sure that others are more comfortable speaking up when they have questions or concerns.

Our mental models of how systems work are often biased by our experiences and by the knowledge we already have. In any advanced system, whether that system is composed of computers, of people, or some combination thereof, it is surprisingly easy for our mental models to quickly become inaccurate.

By making a conscious active effort to free ourselves of the constraints of our own mental models, we can look at things in a new light and find interesting ways to improve them. We can think critically about things that we otherwise take for granted. But making this effort is difficult. It requires going against every impulse our brain is telling us. It requires challenging ourselves at fundamental levels. But there are exercises we can engage in that help these challenges get easier and encourage us to more easily slip into this divergent way of thinking.

In 2006, Sir Ken Robinson gave a TED talk that posed the question of whether or not schools kill creativity. In that talk, he brings up the idea of divergent thinking, the concept of seeing many ways to interpret a question, which opens up many possible answers to that question. He gives one particular example that I have found myself using as a reference a lot.

How many uses can you think of for a paperclip? Most people might come up with 10 or 15 uses. People in this room might even outperform that, on average, and come up with 40 or 50. But people who are really good at it might come up with 200 uses for a paperclip, because they’ll challenge the assumptions of the question. Who said that the paperclip was a conventional paperclip? What if it was 200 feet tall and made of rubber? Suddenly, the uses for a paperclip can expand dramatically, by simply suspending our pre-conceived notions around what “paperclip” means.

This is, in my opinion, the essence of red teaming. Red teaming has nothing to do with hacking computers, though that is the way the computer security industry has hijacked the term. The actual skill itself that makes someone a valuable red team member is their ability to think divergently. Their ability to look at systems and think “What if X wasn’t X? What if it was A?”

When I interviewed for my first red team job, one of my interviews revolved around a scenario in which I was an electrician. In front of me was a light hanging from the ceiling, and behind me was a light switch on the wall. The light was currently on. List ten ways to turn the light off, ten components of a functioning light, ten ways to tell if the light is off, and ten ways to prevent someone from being able to turn the light off.

This scenario originates from a document titled “Jack of All Trades” that dates back to 2001, created by Pete Herzog. Its stated purpose is to teach security professionals to think outside the box and learn to use their knowledge in different ways. It puts the trainee into scenarios they are not likely to have much experience in, and requires that they come up with answers based on the scenario.

This is a great example of an exercise that helps to develop our divergent thinking skills, and has formed the structure that I’ve used to construct dozens of scenario-based questions since then, tailoring scenarios to be more appealing to the target audience.

About six years ago, I was visiting home and was asked to come speak at the local career tech center about my career in security. I had purple hair, not quite as many tattoos as I have now, and showed up in an all-black outfit with an extra-long black hoodie. I looked like I got trapped somewhere between a Hot Topic and the Matrix. I told the kids about my experiences in school as well as my experiences doing red team work for a large tech company. I got to demonstrate the perils of plugging in random USB drives, both with the USB Rubber Ducky and with the USB Killer. To this day, I am grateful for the generosity of the class teacher, who was willing to sacrifice an old machine just so I could show some kids how to permanently damage a motherboard.

But I also used this opportunity to talk to the students about divergent thinking. I gave them a scenario not unlike the Jack of All Trades electrician scenario, but more tailored to things that might resonate with them.

You have a test next Friday, but the new Call of Duty also comes out that day. How do you get out of taking the test? How do you get your friends out of taking the test? How do you get the whole school out of taking the test?

I gave the students a few minutes to jot down some answers for themselves, and then asked for volunteers to share some of their solutions. They were hesitant, at first. The first few students to share picked more conventional solutions, like trying to convince the teacher to reschedule or calling in sick. But then one student broke the divergent thinking barrier, and proposed that he would break all of the printers in the school so that the teacher couldn’t print the test out.

Then the floodgates opened. The kids felt more comfortable sharing their more creative ideas. One student said he would put raw fish in the HVAC system, then crank the heat up and break the knob off. Brutal.

Another student said they would cause a car accident to take out a power pole near the school the morning of the test. If the school had no power, they probably wouldn’t have students come in that day.

Once the barrier of the conventional was broken, the students probably came up with four or five dozen ways to get out of taking that test. I’ve never been so proud.

Let’s actually take some ideas from the audience. Going back to the Electrician scenario, anyone who has an idea for how to turn off the light, please shout it out.

(Audience interaction section)

What about ten ways to prevent someone from turning off the light?

(Audience interaction section)

Before I wrap things up, I wanted to acknowledge a special thanks to Toby Kohlenberg, who taught me what it means to be on a red team and who routinely encourages me to challenge my own assumptions and my own beliefs. Toby is the one who introduced me to the Jack of All Trades scenarios, and who offered me my first red team job. I’m proud to consider him a mentor, even if these days we don’t talk as often as we once did.

I also wanted to give a special thanks to Kelly Shortridge, who inspires me and who encouraged me to explore this talk, as well as whose ideas have helped shape my own beliefs around security and challenging the status quo of the industry. Challenging our own mental models plays a key role in her book, Security Chaos Engineering, which I highly recommend reading.

So, to wrap things up, I hope this is what you take away.

As an expert, go out of your way to ask questions that you think others might need to know the answer to, even if you already know the answer. Ask to clarify acronyms. Ask to clarify assumptions that people are making. You can lead by example and pave the way for a much more productive and informed team.

As a beginner, be curious. Be inquisitive. And don’t be afraid to be wrong. If someone says something you think is wrong, ask them to clarify. Don’t assume that just because someone with 20 years of experience said it, it’s automatically right. Seek to understand why they believe what they believe.

Finally, and most crucially, engage in divergent thinking. Challenge assumptions. Challenge your own beliefs. Challenge your own mental models.

This is how we become better, not only at our jobs, but as people.

Thank you.