By Scott Herbst, PhD
bSci21 Contributing Writer
I recently found myself at Village Capital’s Education US 2016 Launch. If you have no idea what that is, you’re not alone; neither did I. About a month earlier, I had found this event on Eventbrite; it was free, and it had the words “capital” and “networking” in it, so I signed up. At least, that’s what I think happened, based on the detective work I did after a reminder to attend popped up in my calendar and I thought, what is this?! I had no idea what to expect. Was this going to teach me about venture capital? Was it a grand opening for a business? Was this some sort of network marketing scheme? I had no idea. I was reasonably sure there would be free food and drink. Armed with that knowledge, I stepped into the unknown.
I was beyond pleasantly surprised. It turns out that the education launch was the final evening and celebration for completing the first week of an educational tech-accelerator. If you don’t know what that is, a tech-accelerator is an intensive educational experience where executives from tech companies that are all looking to get to the next level come together to learn what it takes to do that. What was unique about this event is that all the companies represented were focused, in some way, on the technology of education. The featured event of this particular evening was for each participating founder or CEO to share their newly reworked value proposition. I was rapt as I watched 12 CEOs and founders from all over the country share the problem their product addresses, their unique solution, and the data indicating their solution works. There’s so much possibility here! I thought. Before and afterwards, I spoke to several of them and asked if they would like to share their products and ideas with a lot of behavior analysts. Michael Simpson, founder and CEO of Pairin, whose mission is to make education relevant for students and their future employers, was the first to agree to an email exchange. What follows is the transcript of our back and forth.
Scott: Thanks for agreeing to do this. Let’s start right off with your mission statement. You make education relevant for students and their future employers. That sounds intriguing. How do you do that?
Michael: Over 50% of college graduates are in jobs unrelated to their degrees. An equal number are woefully underemployed, and only 11% of employers strongly agree that the education system is preparing students well for the demands of the real world. More and more we hear employers lament that they would be happy to teach the job-specific skills, if they could just find applicants with the right raw materials to learn and perform well. When we survey high school students, we find that the inner-city, so-called at-risk students score 12-17% higher in resilience, persistence, and many leadership qualities, but about 20% lower in motivation than suburban white kids. It isn’t that they don’t have what it takes to succeed; they just don’t make the connection between what they are learning in school and what they need in real life. So, if students and employers both want the same thing out of our education system, and 99% of teachers (according to an Education Week survey) believe developing social-emotional skills improves classroom management, school climate, and students’ ability to learn, why isn’t it happening? That’s what we are focused on: helping employers identify what makes their people successful, so they can influence education, and helping educators, who have never been taught how to develop students beyond academics, measure and develop what everyone wants. It’s actually a bit crazy that this has had so little focus until recently.
Scott: I’ve got to warn you at this point: you’re veering into dangerous waters in terms of what a behavior analyst will see as acceptable explanations of behavior. We really like to deal with the observable – things that you and I could both look at, watch happen, and agree that they happened. We tend to have trouble with explaining behavior in terms of things we can’t directly observe. I can imagine that, when you use terms like “resilience” and “persistence,” the hair on our readers’ necks is starting to stand up. That said, when we met, you made it very clear that you weren’t talking about “personality traits,” but more like skill groups (and you can correct my terminology). That is, you mentioned that scores on things like “resilience” can be moved by specific interventions. I’m wondering if you can talk about some of the constructs you measure, and what relationships you’ve seen between those and academic and/or social outcomes.
Michael: Behavior analysts aren’t the only ones who like to deal with what is observable. Teachers also trust what they see more than what a psychometric assessment can tell them. Teachers, though, have to create the biggest impact they can in the shortest amount of time. Kids don’t get a do-over at being kids. If teachers only trusted what they saw, without the training of a behavioral psychologist, where would they begin with a student? Often, the year has passed before they “figured out” how to reach the kid who needed them the most. With regard to soft skills (I hate that term, BTW. They are VERY tangible.), teachers often don’t even have a clear understanding of their meanings. That’s why we advocate using our quantitative assessment to baseline a group of students, to identify strengths that students and teachers may be blind to, and the most critical development needs to address. Then we use our observational rubrics with exercises throughout the year to measure growth. The combination gives teachers, students, and parents a common language, especially since you share the rubrics with the students. It also helps identify barriers to development, which are the mindsets, like Self-Blame, that get in the way of, or enable, change. Personality measurements might help in understanding someone’s thinking style and approach to problems, but it is debatable how changeable personality is. It’s possible to create more harm than good by measuring things that put students in a box, instead of highlighting areas of development and giving them hope for change.
We have replicated every major non-cognitive skill and mindset psychometric with one survey that measures Emotional Intelligence, Character Strengths and Virtues, Personal and Professional Behaviors, and the Mindsets I mentioned before. With one group of 4,500 students in a blended learning program for dropouts, we were able to identify that their lowest aggregate score was Change (the drive to pursue variety and change in your life). It was about 20 points lower than the average among suburban high school students. Which makes sense: when your life is in turmoil, you seek control, not change. But if you need to break out of a destructive behavioral pattern, it is the very thing you must pursue. The students in the lowest quintile of Change made up 72% of the students with failing grades. Connection? We grouped them together to identify other patterns and found that a high percentage of them were high in Self-Blame and low in Emotional Self-Awareness, Self-Confidence, and Self-Assessment. We call those “imperatives,” because if those are in need of significant development, you have to start there. That school then implemented our lessons and exercises for developing Emotional Self-Awareness to help the students identify, name, and be able to defend or ask for help with their emotions. They also gave them the lesson on Self-Assessment to give them the skills to accurately evaluate their own performance. The results were astounding! Within one semester, students said they felt less susceptible to the opinions of others, were better communicators, and were more stable. Attendance and grades both improved measurably. The results were so impressive that we filed for an IES grant with the US Dept. of Education to validate our method of developing imperatives in a three-year study. We are working in conjunction with Marzano Research, and a key director with CASEL is on our advisory board.
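As an aside for data-minded readers, the pattern-finding Michael describes – rank students by an assessment score, split them into quintiles, and see how an outcome like failing grades distributes across those quintiles – can be sketched in a few lines. This is a minimal illustration, not Pairin’s actual method; the data, score range, and failure probabilities below are entirely made up.

```python
# Hypothetical sketch of a quintile analysis: rank students by a "Change"
# score, split into five equal groups, and see what share of all failing
# students falls in each group. All data here is fabricated for illustration.
import random

random.seed(0)

# Fake records: (change_score, is_failing). Low Change scores are made
# more likely to co-occur with failing grades, purely for illustration.
students = [
    (score := random.uniform(0, 100), random.random() < (0.5 if score < 30 else 0.1))
    for _ in range(4500)
]

# Sort by Change score and split into five equal-sized quintiles.
students.sort(key=lambda s: s[0])
n = len(students) // 5
quintiles = [students[i * n:(i + 1) * n] for i in range(5)]

total_failing = sum(1 for _, failing in students if failing)
for rank, group in enumerate(quintiles, start=1):
    failing_here = sum(1 for _, failing in group if failing)
    share = 100 * failing_here / total_failing
    print(f"Quintile {rank} (lowest Change first): {share:.0f}% of all failing students")
```

With synthetic data like this, the lowest-Change quintile ends up holding a disproportionate share of the failing students, which is the kind of concentration the 72% figure describes.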
Scott: Thanks, Michael. This is exactly what caught my attention when we met. I’m going to use this opportunity to share what I see about this from a behavioral perspective, and you can put in corrections for anything I got wrong or missed. When you say that a mindset like “self-blame” is a barrier to success, what I read as a forward-thinking behaviorist is a particular way of looking at the world. One thing we’ve been investigating in our field is how the language we speak (or think but don’t speak) about the world and about ourselves actually changes the way the world looks and, as a result, changes our behavior toward it. So, something like “self-blame” is probably just a general way of speaking about one’s self and the world. And it’s also a behavior. My mindset, from our perspective, is something that I’m actually doing. But it’s one of those things that I do so automatically that it doesn’t occur to me that I’m doing anything. It’s like if you put on tinted glasses. After a while, you’d stop noticing that the world had a tint to it. And yet, you are still wearing the glasses. It sounds like you’ve developed an assessment that is particularly useful for pinpointing general, linguistic behavior patterns. What I mean by that is that I may never actually say or think, specifically, “I blame myself,” but I might spend a lot of time thinking, “I screw everything up,” “It’s all my fault,” “I always ruin it,” or whatever, and that would all revolve around this general way of looking at the world that is, functionally, self-blame.
What excites me about this is that, if we look at this mindset as a set of behaviors, then it gives us something to work with in terms of training new mindsets, and it sounds like that’s what your programs are doing. As for the “soft skills,” I’m with you. I don’t like that term either. I bring it up now because I think we call them “soft” because they’re difficult to measure. How many times have we said about someone, “There’s something a little bit odd about him that I can’t put my finger on”? What I find compelling about what you’re doing is that you’re working to crack that nut, and then developing programs that validate your assessment by being useful in making a meaningful change. Now, fair warning: some behaviorists are going to have a problem with the paper-and-pencil nature of your assessment, but from my perspective, if it’s useful in recommending a course of treatment, then you’ve really hit on something.
I’ll quit fawning now and ask a question: can you give us some insight into some of the exercises you use to teach emotional intelligence? And of course, feel free to respond to anything I wrote.
Michael: Scott, you are spot on with your points, especially if you were referring to me with your “There’s something a little bit odd about him that I can’t put my finger on” statement. Certainly mindsets and behaviors overlap. We typically won’t know how someone thinks or feels until they act on what’s in their head, right? I’m not sure if this will jibe with behaviorist language, but as a coach, I always look to uncover the thoughts and feelings that precede an action. When helping someone create change, we always have better results when we get to the trigger for a behavior, and don’t over-focus on the action. In many cases the behavior is the clue to where we need to go, not where we need to focus as a coach. Whether we semantically lump what we call a mindset in with its associated behavior or not is, I think, less important than just knowing their relationship. I often explain mindsets when debriefing someone on their Pairin results by pointing out the actions that clue us in to them. For example, when speaking of self-blame, I often point out that when someone with low scores trips a little on the sidewalk, they look at the sidewalk and think, what the heck “made me” trip? Someone high in self-blame will look inward and think, “I am such a klutz!”
It really comes down to a question of granularity. Without a measurement that can isolate the nuance, it is easier to lump them together. That’s one of the benefits of a measure that calls each of them out. We think of a mindset as a perception that can be directly manifested as a number of related behaviors and that can impact many others. For example, self-blame has its own set of demonstrable actions, but it is also an inhibitor to other skills. When you can knock that domino over, a series of behaviors change that are not directly related. Confidence, self-assessment, openness to change, leadership qualities, etc., all move to another, observable, and usually lasting, level.
As I mentioned earlier, when you can identify those barriers to behavioral development and focus on the underlying causes, sometimes massive and rapid change can occur. “Everyone hates assessments.” We hear that all the time, until people take ours. Mostly because it is not paper and pencil, is highly graphical, and is designed to provide value first and foremost to the people who are actually burdened by taking the assessment. Even in high schools, we are known as the “only assessment our students want to take.” For those who have doubts about quantitatively measuring these things, you can look at the history of the science, but the proof is in the results. Take it for free. We give it away to individuals, because we think the whole assessment market is antiquated, complicated, and too costly. We believe the real value comes when you’re sold on the data being sound and want to use it for something other than mere understanding. That value lies in grouping behaviors and mindsets into meaningful, predictive patterns, then mapping individuals and groups to those patterns so you can make more intelligent decisions. For your audience, our data would be beneficial to kickstart the work you already do, to focus your efforts before you have historical behaviors to evaluate.
At the risk of sounding too blunt, I’m a much bigger fan of preventive medicine than autopsies. Doctors do tests to figure out what treatment to administer, and then they do additional tests, observe, and adjust along the way. Those latter elements are used to prove or disprove the initial test results. That’s what we do. You get a measure to focus your regular assessment, understanding, and observations. If it proves right, you’ve moved from looking at historical information to creating forward change quite a bit quicker.
The resources we provide to facilitate development come in three categories:
– Tips for coaches and coachees, teachers and students: short, 600-character actions tied to the insights that get you started moving in the right direction
– Curriculum: We are fortunate to have a teacher with a doctorate in curriculum design on staff. The emotional self-awareness curriculum and the others use proven teaching principles like backwards design and gradual release of responsibility.
– Resources: These include the observational rubrics that are tied to the quantitative measure, group and individual exercises, teacher and student guides, etc.
Altogether, with 56 of these, it is the most comprehensive set of soft-skills development resources ever compiled. Currently all but the tips are delivered via Word, PDF, and PPT, but next year they will be delivered directly to students and adults through an interactive video experience tailored to their Pairin scores. I attached a copy of the Emotional Self-Awareness set so you can look it over. Would love your thoughts.
Scott: Thanks, Michael. And thanks for sharing the materials with me. I like the point you made about observing that, if you change someone’s mindset, other behaviors tend to follow even though you didn’t target those. In our field, we call things like that “pivotal” behaviors. For example, if we were working on basketball skills, I might work on dribbling a ball harder and faster. If my passing and shooting then improved, we’d call dribbling a pivotal skill. From my view, what I am hearing is that you’re working on how someone relates to themselves, and when you change that, a whole host of other behaviors follow. I think the big difference between you and me is that I would call that way of seeing one’s self a behavior, and you would call it a mindset. I think it’s just a difference in vocabulary.
I’m going to wrap this up by encouraging my fellow behavior analysts to go check out your work. I know it will be something a little different from what they’re used to dealing with, but if we’re going to grow our field and expand where we’re relevant, I think these are things that are important for us to consider. We know from decades of research that we don’t see the world or ourselves objectively, and that how we see it (which is hard to observe) affects other behaviors (which are easier to observe). Again, what I find interesting about the work you’re doing is that it takes one type of behavior (responses to assessment questions), looks for relationships with other types, and then points to exercises that can influence the former, with implications for the latter.
So thank you, again, for taking the time to dialogue here. I enjoyed the back-and-forth and am leaving with some things to think about. I think once we get past our differences in talking about things, we’re really doing the same work. We’re looking for relationships, exploring them, and then, when we find them, using that to help people have fuller, more satisfying lives. Thanks for sharing the ones that you’ve discovered with us.
Scott Herbst, PhD is the founder and Lead Trainer at SixFlex Training and Consulting. After six years in academia, he left to pursue his passion of training leaders and managers to create, manage, and communicate in work environments where people are productive, excited, and vital. As a course designer, he grounds his curricula in cutting edge research in language and thinking as well as decades of research in operant performance. As a trainer, he is an engaging and powerful speaker who makes learning fun and exciting. You can visit his company site at www.SixFlexTraining.com, or email at email@example.com for more information.