Aligning Practice-Based Coaching Efforts and Effects
Joyce Escorcia: Hello, everyone. Welcome to our webinar today. Thank you for joining us for our Coaching Corner Webinar Series, where we're going to be talking today about aligning Practice-Based Coaching efforts and effects. Thanks for choosing to spend the hour with us. We are super excited to have Darbianne Shannon with us, our guest expert. I am just going to ask her to share a little bit about herself and introduce herself to everyone today. Darbi?
Darbianne Shannon: Hi. My name is Darbi Shannon. I'm an assistant research scientist at the Anita Zucker Center at the University of Florida. I'm excited to have the opportunity to share today a little bit about our work.
Joyce: We are excited to have you today. During our webinar with Darbi, we are going to be talking about how the collective efforts of program leaders, coaches and education staff can have an effect on child and family outcomes. We're going to also be talking about how to identify sources of effort and effect data within a program using a tool called the Effort and Effect Cascade. We're also going to be talking about how to use data-informed decision-making to strengthen Practice-Based Coaching, or PBC, and professional development support.
We've got a lot planned for our hour together today. The information and materials that we are sharing today have been developed by the Practice-Based Coaching Data-Informed Decision-Making Model Demonstration Project, which was funded out of the Office of Special Education Programs, or OSEP if you hear us use that acronym. We are super excited to share this information. I hope that you're able to walk away with some things that you can use and try out in your program. We want to draw your attention to our Viewer's Guide, which can be found in the Resource widget for today's webinar. It's going to have visuals and activities, and some take-back-and-use handouts that you can refer to.
If you haven't already done so, please just download that. We're going to get into things. We wanted to start the conversation today just by asking you how do you know if coaching is making a difference in your program. We know that you're joining us today, that many of you are coaches or you're supporting coaches or you're supporting coaching within your program in one way or the other. This may come up in conversation quite a bit. Using that Q&A widget, let us know how do you know that coaching is making a difference in your program. What does that look like in your program?
Sarah Basler: Joyce, while we wait for some of those responses to pop in, I just wanted to ask you from your work in the field, what are some things that you're hearing programs say about how they know if coaching is making a difference?
Joyce: There are different answers to that question. One thing we hear is that a lot of folks look at child outcomes data, and also at teacher-child interactions using observation tools like the CLASS or the QSIT, looking at that data to see any increases or growth there. There's also survey data; many programs are surveying staff to see how they're feeling about the impact of coaching, pre and post, across coaching years. Those are a few of the things. Let's look at the Q&A, Sarah. Do you see anything coming in?
Sarah: I'm seeing data collection. Just collecting data overall. Then, they notice that there are better child outcomes. Those are some of the ways that they can tell that coaching is making a difference. Changes in those observational measures. They see growth in things like the CLASS or the QSIT.
Darbianne: I'm seeing a ton of stuff come in. We are so fortunate in Head Start programs to have lots of different sources of data about what our educators are doing, what our coaches are doing, what our children are doing or learning. I hope today that one of the things that you'll walk away with is thinking about how do I know which source of data is going to help me to say coaching is making a difference, and how do I link those sources of data across coaches and educators and children and families, because we work so hard to collect that data, but then thinking about what do I do with that and how do I use it to make decisions. Just wanted to add that. I'm excited to see that you guys are loving the data as much as I am.
Sarah: We have this one response that is great for today because they were saying in the past that they've not had a great coaching structure. As this coach takes over in her position, they're ready to learn more and figure out how to see the effects in supporting coaches in a different way. I thought that was a great take and we hope that you take away something from this webinar. I'm going to move us along and share about the foundations of PBC, which is what DIDM is built on. When we are talking about PBC DIDM, which is Practice-Based Coaching Data-Informed Decision Making, some people think, “What is this DIDM?” It's built on the strong foundation of PBC. PBC, we know, is an evidence-based coaching framework.
We know that when it's implemented as intended, educators learn new skills and practices and that, in turn, children and families learn new skills and practices. It's a content-ready model, which means we can put anything in the middle. We can focus on any set of evidence-based effective practices. We have researched PBC extensively. Some of the research that has been conducted on PBC focuses on implementing practices related to embedded instruction for early learning and pyramid model practices.
Another study was conducted on Best in CLASS, which is a Tier 2 social and emotional intervention. There's a wide range of research about the effectiveness of PBC. In 2010, the Head Start National Center for Quality Teaching and Learning began digging in and sharing more information about PBC. Head Start continued to share how important coaching was and how much it makes a difference by updating the Head Start Program Performance Standards, which includes a research-based coordinated coaching approach.
There were the big training events, the big Practice-Based Coaching training institutes where we supported training coaches in PBC. Since then, we've supported lots of Head Start partners to implement PBC. Then, through all that work, there has been a need, a missing piece, and that missing piece is what the PBC DIDM model addresses: helping programs develop that coordinated, data-informed approach to coaching. PBC DIDM is that value-added component that builds on a strong foundation of PBC. You might be wondering why we are saying data-informed decision making. What is PBC DIDM? It's a model that supports coaches and program leaders to make data-informed decisions, decisions about PBC and other professional development supports within their program. "Data-informed" was chosen specifically instead of "data-based" or "data-driven." We hear those words a lot. That choice was intentional because DIDM is meant to build on the capacity of coaches and program leadership teams.
When we say leadership teams, we're referring to anyone who is a decision-maker within your program. You may be familiar with implementation teams, anyone who makes those decisions within your program. We would build the capacity to use the data to inform decisions rather than letting the data make decisions for you. "Data-based" and "data-driven" might imply that the data is making the decisions. We know that there's more contextual information that goes into informing and making meaning out of that data within your program.
Some of the parts of the PBC DIDM model are the Effort and Effect Cascade and the Prepare-Look-Think-Act framework. This model, as I mentioned previously, is content ready. It's a process for making those data-informed decisions about Practice-Based Coaching and related professional development. We're going to dig into these parts more deeply with Darbi. She's going to walk us through what each of these processes looks like.
Darbianne: I was going to add before you jump into those structures, these are some of the tools that we use within the model. What we hope is that through Practice-Based Coaching Data-Informed Decision Making both coaches and leaders within the program, as well as the educators, feel empowered to use their data, to talk about and reflect on their practice. Sometimes data being collected can feel like something that's being done to you.
We want it to be something that feels like a reflective tool and empowers those within the program to talk about what they're doing well and to work collaboratively around areas that they want to strengthen or improve. Data-informed is about recognizing you're bringing some professional knowledge and judgment to looking at that data and empowering those within the program to talk about what it is that they're doing. I just wanted to add that.
Sarah: Because there's so much more that goes into it. It's not just the data itself; we have so much more contextual information behind that data. We talked about how PBC DIDM builds upon that strong foundation of Practice-Based Coaching. It's meant to make a program's PBC implementation stronger. When we think about implementing the PBC DIDM model, programs can really think about strengthening those structures that hopefully they already have in place by making them more data-informed. Those structures are the leadership team or implementation team, a coach community, and a process for engaging in PBC cycles.
When each of these is implemented regularly, hopefully, we would have leadership teams or implementation teams, which would include administrators, coaches, and education staff. The idea is that they can collaboratively make decisions about coaching caseloads and decisions about the content or focus of coaching. We're going to use the term "leadership team" throughout this whole webinar. Know that when we refer to the leadership team, we're referring to the same people responsible for making those decisions about PBC within your program. A coach community would include coaches within your program or network, and it's facilitated by a lead PBC coach who is knowledgeable about PBC and the effective practices that have been selected as the focus of coaching in your program, and who is knowledgeable about the PBC DIDM model.
A lead PBC coach is someone who would support the other coaches within your program. For some of you, you might have smaller programs where there's only one coach, or you don't have support or anyone really supervising you. A coach community might be your regional or statewide training and technical assistance network. No matter what the makeup of your coach community looks like, the point that we're trying to emphasize here is that coaches need ongoing professional development, too. Then we've got PBC cycles. That's how coaches are going to support education staff to learn about and use those effective practices that are the focus of coaching.
PBC cycles are where the effort and effect data are often collected, usually in the form of a coaching log or practices checklist. As a reminder, because it sometimes comes up: what exactly is a PBC coaching cycle, and what is included? That's when a coach and a coachee engage in a focused observation that is based on a shared goal and action plan, along with reflection and feedback. All of this would occur in close proximity to that focused observation. I'm going to turn it over to Darbi. She's going to help us dig in a little bit more, learn about effort and effect, and walk us through the cascade.
Darbianne: Sarah's given us a strong foundation around what the tools are that are part of the model and some of the structures you might be using to implement this model. The last thing Sarah talked about was the PBC cycle. When we're thinking about that Practice-Based Coaching cycle, we can start to layer in these two maybe new vocabulary terms, which are efforts and effects. If we're thinking about a coaching cycle, the efforts refer to what the coach is doing to support that educator to implement effective practices as intended, or with fidelity.
When we're thinking about that coaching log, we have some key information we want on there around how much: how long are they spending together and how many coaching cycles are they implementing together, and then how well. That's getting at those key components or key parts of the coaching cycle. We want to be sure that when the coach and teacher or educator are working together, we have that focused observation happening every time, and we're engaging in reflection and performance feedback. By making sure that we're using Practice-Based Coaching as intended, we're going to see those positive effects.
Thinking in that coaching cycle context, efforts are what the coach is doing, and then the effects are the education staff using more practices with children and using them as intended. These could be embedded instruction practices, they could be pyramid model practices, they could be practices related to the CLASS or early literacy. There's a variety of different things that could go there. Efforts are professional development and coaching, and effects are what we see as an outcome of those supports being provided.
We're going to dig into an example. Keep in mind it's just an example. We don't have every detail, but just enough information to get us thinking. We're going to think about the Henderson Program. In the Henderson Program, they have four coaches. Those four coaches are supporting centers in the North, Central, and South locations. When we think about that, we can also see that the size of the caseload is a little bit different. Those two coaches in the North are supporting 22 education staff. The coach at Central is supporting 10 education staff. The coach in South is supporting 18 education staff. That's how it's structured overall. They have a variety of both Early Head Start and Head Start teachers.
Sarah, anything else I should mention about the program setup? No? We have a little bit of data. These data are from a tool that's aligned with the CLASS. What we see in the orange is that there are some educators who are not increasing their use of practices. They collected these data on two occasions. Some of them either stayed right where they were or didn't make progress or move forward. What we see in yellow are the educators who are increasing their use of effective practices. They're using them more often, with more fluency, maybe with different children across different activities.
When we look, we're starting to notice there might be some differences in the use of effective practices across those three different locations. If I were on an implementation or leadership team, or even a coach community, looking at these data, I'd probably want to talk with other members of the team about what they're noticing. What stands out to you guys? Share with us in the chat some things that you're noticing as you look at these data. Joyce and Sarah, I am having a little trouble seeing the chat. If you can see it, feel free to jump in and call responses out, and I'll try to get to them.
Sarah: I'm not seeing anything just yet as they're coming in. One thing I might want to know as a coach is that breakdown: what are those practices? For those teachers who maybe aren't making progress, what are their similarities across sites? We finally have some stuff coming in.
Joyce: Nancy said she noticed that the South isn't increasing, and someone mentioned higher caseloads.
Darbianne: That is a good observation that there are some differences. We said in the North there are those two coaches; in the South, there's 18 education staff but only one coach. Joyce, whoever mentioned that, that's a really good point. There are some differences in caseload. It makes us wonder why that is. You mentioned also that the South seems to not be increasing as much in their use of practices. We don't know why that is. Like Sarah mentioned, we don't know if it's around one particular area or domain.
We don't know if it's just generally across different areas or domains. As someone who's in that coaching role, we also want to ask what the strengths are. If you were on this implementation or leadership team, what are your celebrations? We never want the data to drive us into a place where we're not also recognizing what it is that we're doing really well. If something stands out as working well here, let's dig into that.
Sarah: Alyssa notices that Central is having a lot of success. That stands out as a strength.
Darbianne: That's important to know, because when we have centers or coaches that are finding success, we want to ask what lessons we can learn from them; maybe we can replicate something that's working really well and apply it to other locations or centers or coaches. I'm able to see mine now, and somebody sent us a chat about the Q&A. Yes, the Q&A and chat are the same thing today. Stephanie said that all three areas had some improvement. Let's celebrate the wins, even if they are small. We always want to think about what those strengths are.
You guys identified that there might be some needs around aligning caseloads. Those are important things to keep in mind. Would there be any other sources of data that we need to look at? When we asked you guys initially how do you know if coaching is making a difference, you named a lot of different kinds of data. Where might we go from here? Knowing that the South isn't making as much progress, but Central is doing great, what other data could we look at that might help us to better understand why we're seeing the numbers we're seeing?
Alyssa said, "Maybe a survey." You might want to know from the educators at that site what seems to be supportive or helping them. What other sources of data? Lisa said, "What's going on at Central, it seems successful overall." We could just dig into Central and say what happening there.
Sarah: What's making the good so good?
Darbianne: Allison is making a point around quality. I think that's an important point, too. It's not just about whether a coach went to that center, or even whether a coach and an educator met. We also have to think about what happened during that coaching cycle. Are they doing observations but not able to find time for reflection and feedback? Maybe they're sending emails as a follow-up, but not always finding time to meet and really think about the action plan goal they want to focus on. Allison, that point around quality is a really important one. It gets at that part that we had, let's see if I can make that work, around how well.
We just know what we're seeing as far as effects. We know that we have some data about our practices, but we have to back up and say, "Well, what do those efforts look like? What's actually happening when coaches and educators are together in those coaching cycles?" When we think about that, we mentioned that coaching cycles done really well, or done with fidelity, are always going to include a focused observation; an opportunity for reflection; supportive feedback around the practices the educators are using, and using really well, to support children and families; constructive feedback, which is not telling them what they did wrong, but really helping them to think about the opportunities to enhance their practice, to use a practice more often or in a different way; and then identifying materials and resources.
Are there things that I have in my classroom that I forgot I got at the beginning of the year at that great professional development workshop, that maybe I could pull off the shelf and use with children and families? Or maybe there is a great video that could demonstrate or model how to use a practice. We want to be sure that we have ways for coaches to reflect on and self-report: am I implementing all of those key components of the Practice-Based Coaching framework or cycle every time I go in? That's going to help me to answer that question around how well. Then we have some information about how much. As Sarah mentioned before, that's going to be on the coaching log: how many sessions, how many minutes of observation. We need to have both of those pieces.
The other thing that they're doing in this program, as we mentioned, is they have a lead implementation coach or a lead PBC coach, and that person's job is to support other coaches. One of the ways they provide that support is they actually observe the coaches implementing Practice-Based Coaching to give them support and feedback around what that coach is doing really well. Maybe they're doing an amazing job with reflection, or taking the time to find excellent resources and materials to support those educators. We want to be able to give coaches support around what they're doing really well and, if they need some support themselves, that constructive feedback. Anything to add, Sarah, around fidelity? About how much and how well?
Sarah: No. I think it's important to note that the structure of a lead coach is really going to vary program to program, and that if you don't have that support, this might be a good way for you to think about who might I look at to provide support to me as a coach. Or, if that's something you're already really good at, maybe how you could support others within your program to build on their fidelity.
Darbianne: I think that's an excellent point. We know that there are things like MyPeers, you might have a supervisor in your program, and you have regional TTA. All of those could be really excellent resources if you are the only coach in your program. I'm going to take us back one more time. So far, we've talked about efforts; that's whether coaches are implementing those key or essential parts of the Practice-Based Coaching framework. We have effects, and that's whether it's making a difference. When we looked at this first graph, we were thinking about effects. Are the educators enhancing or using more practices as a result of that coaching support? We wanted to dig deeper into what's happening in those coaching cycles. We have some data about that.
We see in the first column the site or location. In the second column, we see the name of the coach. We know we have four coaches. In the third column, we have the average number of sessions that each coachee has received. We can see that on average, across all sites, they're getting six sessions. In the South, they're only getting four, whereas in the North and Central, they're getting six or seven. Even in how much, we're starting to notice some differences. As we move to the right, we see the observation data and also the self-report data. Take a moment to just look through. What are some things that are standing out to you about these coaching data? You can put them in the Q&A, which we're using as a chat today.
Sarah: While we're waiting for the responses to come in, I want to let you know that I dropped in a resource that might help you dig a little deeper with the elements of fidelity. That should have come to you via the Q&A.
Darbianne: I see Kimberly asking, "Is the session representing a whole cycle?" When we're thinking about a Practice-Based Coaching session or cycle, you're always going to be looking at what's that shared goal, whether you're revisiting one you wrote in a previous cycle or session, or you're writing a new one together. You're always going to be talking about what's that shared goal and action plan. You're always going to engage in focused observation, reflection, and feedback. Those things will always happen in the context of a collaborative partnership. Whether it's a group or an individual coaching model, those components or parts are always there.
Sarah: We had a response come in noticing a correlation between the amount of coaching support, that number of sessions, and the progress and the coach's fidelity, or use of PBC. They were noticing that fewer sessions went along with less progress.
Darbianne: That's happening here. We're seeing fewer sessions in the South. We knew that those educators weren't making as much progress. There does seem to be some connections there. That might be because of the number of people. It might not be a feasible caseload.
Sarah: Katya tells us that they noticed the South coach wasn't observed using the essential strategies. When the lead coach came to observe, she wasn't seeing those strategies used.
Darbianne: Here's where I want to be sure that we remember that it's Practice-Based Coaching Data-Informed Decision Making. It could be really easy to look at this data and be like, "Tomika, what's up? Why aren't you using those essential strategies?" But then we have to remember that there might be more to the story. To really understand what's happening, I might need to gather some additional information from Tomika: how satisfied is she with what's happening related to coaching? Is there a reason she's not using the essential strategies? Is she not using them because she doesn't know what they are, or is she not using them because she's just burned out or doesn't have enough time?
When you have a lead PBC coach or a lead implementation coach whose job it is to support other coaches, that's where they could potentially reach out and say, "You know, I'm noticing some things in the data. I wanted to give you an opportunity to reflect with me about how I can better support you." This isn't calling her out or saying she's doing something wrong, but saying, "You know, let's have a conversation using these data about what it is that I can do in my job to better support you in your efforts to be a really amazing coach."
Here's what we hear from Tomika. She says, "I really love being a coach and helping them to learn new things and solving any problems that come up in the classroom. I feel like there isn't enough me to go around. I know I'm not using the essential strategies each time, but it's just so hard to focus on one practice." Hearing that, Sarah, are there any keywords that jump out to you?
Sarah: What jumps out to me is "isn't enough of me to go around." That's clearly saying, "I need help. I know I'm not using the essential strategies." She's very aware. She's saying, "I need help. I know I'm not doing it." That, to me, sounds like she needs more support on how to get there, maybe even on what those essential strategies are. She seems to love her job, so that's great.
Darbianne: That's where we want to be sure that we're keeping her. She loves this job. She wants to do it, and she knows what she should be doing, but she might be feeling a little bit burnt out. Maria's saying she wants to be more consistent. I see Michelle saying, "Focus on one practice." Michelle, that is a really important piece: when you go in as a coach and you want to problem-solve all the things that are happening in the classroom, that coaching cycle can go by in a snap. I see people saying goals might need to be more specific, to just pick one thing.
That's right. It's so hard to focus on one practice. I want to unpack that with her. I want to ask her about that. I just love these comments. You are right on target. We know that she knows what she should be doing, but her time is getting away from her. The way that I support Tomika would be really different if she didn't even know what the practices were, then we might get some videos out and say like, "This is what reflection looks like." That is not the support that Tomika needs. Tomika really needs some help around that focus on one practice.
That's where we start to think about this old theory of change, which said, "We want high-quality coaching to lead to educators' use of practices and positive child and family outcomes." One of the gaps here is who's supporting Tomika. What we're starting to see in the Henderson Program is how important that lead coach can be in providing that support to other coaches, and really saying, "We have to back up and make sure our coaches have the tools, resources, and support they need to do their job really well." I'm going to pause and see, Joyce and Sarah, if you have anything to add about our old way of thinking and why it's so important to have those coach supports.
Sarah: For me, when I was just starting out as a coach, it would have been really hard if I didn't have that coach support. I was lucky enough to have a lead coach supporting me, helping me learn the strategies and giving me feedback. Coaches often tend to be promoted from within. You might be a really great teacher or home visitor or family childcare provider, but coaching really is a different set of skills and practices. It takes time to learn. You need support, too. I think it's a common misconception that we come into this role of coach knowing it all. We need support, too.
Joyce: I was thinking the same thing, Sarah. It's like, "Woohoo, I'm a coach." And then it's like, "Okay, now what?" Because it's a different set of muscles. You're working with adults, and you're working, many times, with your peers. It's about honing your craft. You might be really great with little people, but working with and supporting adults around their use of effective practices is different, and oftentimes it takes a little work for it to translate over. Or it's like, "I can do it, but now I have to learn how to support someone else to do it."
Darbianne: I see some comments coming in about being in a small program where maybe the director isn't able to support me. That might be true. It might be that the folks within your immediate program don't know a lot about Practice-Based Coaching right now. There are, as we mentioned before, some really great resources through MyPeers, or even these webinars, or supervisors and regional TTA folks who want to help around these kinds of things. There are coaches within your region. It's just a matter of finding ways to connect with them and the supports that they provide. I don't know, Joyce, if you want to add anything else about that.
Joyce: No, I'll just say "amen" to everything that you just said. All of those things are super important. It is a struggle. Maybe you are the coach and the Ed manager; that's often the case as well. It's about finding support for yourself, identifying what those supports are, and finding your community. Whether it's through your region, whether there's a coaching network out there. MyPeers is a great place to get started, but it's really about finding that community for yourself and for the coaches that maybe you're supporting.
Darbianne: When we think about the Effort and Effect Cascade, we want to think about those efforts and effects not just in terms of coaches' efforts to support educators; we have to think about that whole collective system. That's where the cascade comes in. That's why we have program leaders and TTA staff who are supporting coaches, coaches supporting education staff, and education staff supporting children and families. The thing that we need to make sure we have is sources of data every step of the way.
What that might look like is the program leaders or implementation team are really thinking about how much professional development is available within our program or through that regional network, and how well are we supporting our coaches. We want to know is it making a difference in coaches' knowledge, skills and dispositions or beliefs about what makes an excellent coach. That's one tier of support that we think about.
The second tier that we think about, the one that we've been really digging into, is the coaches' efforts: how many cycles, how many sessions, how many action plans, how many practices are we talking about, and how well. Are we using those essential practices of focused observation, reflection, and supportive and constructive feedback on an action plan every time we meet? Is that helping our education staff to make progress, both in their observed use of practices and also in education staff's confidence? Confidence, I saw you guys mentioning that as a source of data.
Finally, if all of those collective supports are lined up, and we're doing everything we can do to support our coaches and education staff, and our education staff aren't burned out but are feeling really well supported to work with children and families across homes or centers using those effective practices that we know can really promote positive outcomes in development and learning and capacity building [inaudible]. We want to see those effects sort of across the board. At that point, I'm going to turn it back over to Sarah and Joyce.
Sarah: We wanted to throw this out there to the group. Now that you've had a chance to see the full cascade, let's think about how looking at our efforts can help us to better understand our outcomes or our effects. If you would, use the Q&A.
Darbianne: I do see some things coming in around whether coaches sub or get pulled into different things. They do. That's where, when we have data about our efforts from something like a coaching log, it can help to draw attention to and facilitate conversations with the leadership or implementation team around whether we are using our coaches in the ways that we intended.
We know that there are sometimes extenuating circumstances. Having that data can really empower coaches to have a conversation with their leadership team around "I'm a coach, but I haven't had an opportunity to coach in the last two weeks. You know, what can we do as a program to collectively align our efforts and to make sure that we have that coach who's able to provide those coaching supports?”
Joyce: Darbi, it seems like there were some issues with audio. I think a few folks didn't hear the question. I'm not sure, Sarah; I think something maybe happened with your audio for just a sec.
The question was, "How can looking at our efforts help us to better understand our outcomes or effects?" Thinking about what Darbi, what you just shared now for our audience. You can pop it into the Q&A just how do you think that really looking at those efforts can help us understand the outcomes or the effects of what we're doing. I just wanted to recap a bit because we had a little bit of a glitch there.
Sarah: I was seeing that too. Can you hear me?
Joyce: Yes.
Sarah: Good. I see that Joanne says that it can help us adjust our practices. That's an important way of knowing what our efforts are. If we know how much or how little or how well our effort is going in, it can help give us information about how to adjust.
Darbianne: I love that Eva put in here that it's reciprocal. I think she's absolutely right, that we have to know the effects that we want to see, like what's the outcome we're working toward, and then how we can apply strategies or efforts to support those effects. Eva, I love the way you're thinking about that. You're right, the arrow could go backwards and forwards, for sure. We have to know where we're trying to go so we can be planful and thoughtful about that.
Sarah: Oh, I love this. Oh, go ahead, Joyce.
Joyce: Oh, no, you first.
Sarah: I see a response that you can see where you're spending your time. That can be helpful to see. If we're not providing a lot of support to coaches, if we're looking at our efforts and effects at the various levels, we can see where there may be gaps. That can be really helpful.
Joyce: I was going to say that being intentional about looking at efforts and effects can help, because sometimes we just think things are a given. Like, if I do A, then I know B is going to happen. I think sometimes when we look at it this way, it forces you, or the collective us, to see: all right, did the effort we put in really get us the effect that we wanted, or do we just think that it did? Being intentional about it, using the tool, can put it out there visually for that PBC implementation team to say, "Okay, is what we're doing really taking us in the direction that we want to go? And if not, why not? What else can we do?"
Darbianne: I see Allison pointing out that this tool might help us to think about caseload. A good point is that we can't spread our coaches too thin and still expect coaching to have those really positive outcomes. That reminds me of the Henderson Program, because they had one coach, Tomika, who was spread super, super thin. Sarah, do you want to walk us through some of the things that they thought about in that Henderson Program related to caseload? I do think this model really helps teams to think about that.
Sarah: The Henderson Program considered that they had one coach who was really asking for help. Tomika was saying, "I need help. My fidelity scores are reflecting it. I need more support." They first thought about how coaches could work across locations to direct their efforts where they were needed the most. They also considered the different types of coaching formats.
Were there education staff who had similar needs? If so, they could utilize group coaching. Tomika wanted some more support, but the lead coach's schedule was full, and she didn't have time to provide it. Maybe there was another peer of Tomika's who could provide some extra support to her. Those were some of the things that — oh, go ahead.
Darbianne: I was just [inaudible] that coach at Central. We all said something really good is happening at Central. Maybe there's an opportunity there to kind of buddy folks up. If you have a coach who's really strong and ready to take on more of a leadership role, that's an option.
Sarah: I do want to briefly offer an opportunity. If there's something that you think the Henderson Program didn't think of, share it: are these pretty aligned with what you would think, or are there other things that they might think about? If there's anything you didn't see here that you think the Henderson Program should consider, pop that in the chat and we'll refer back to it. Here's what they decided. Because Tomika is spread so thin, they're not going to leave her hanging. They decided that they're going to use group coaching led by the South coach, Tomika, and the Central coach. They're going to pair Tomika up, and the two of them will run some group coaching.
Those coaches are going to work with a group of education staff who have similar strengths and needs. The leadership team also decided to give Tomika some more individualized support. The peer that she's working with in group coaching is going to review some additional coaching videos, sessions of Tomika coaching, and give her feedback about her use of those essential strategies and about the quality of her coaching. They were really thinking about that caseload and how they could thin it out or make it a little bit more manageable, while also providing her some more individualized supports.
Darbianne: I think Joanne's putting in there a survey for coaches where coaches have the opportunity to talk about where are they feeling really confident or what are the areas where coaches want to be supported. That's definitely something else that a program could think about doing to identify additional supports that maybe they didn't have in mind that coaches know that they want [inaudible].
Sarah: There's definitely probably coaches within your program or within Henderson Program that feel like, "Yeah, I do that really well, I could support someone with that.”
Darbianne: The other thing I was thinking about is that a lot of times at the beginning of the year, we'll assign coaches to a particular site and say, "Oh, Sarah's in charge of the South and Darbi's in charge of Central." But once our data starts to come in for the year and we start to see how our practices are looking in the classrooms, as Eva's saying, you need to take a more holistic approach and look across all of our different classrooms.
Maybe that means that it's not just one coach per location or center, but that we have to be a little bit more creative about using our data to make decisions about where our coaching supports are going to go. This model is a helpful tool for talking through those decisions because it should be fluid. It should be responsive to what the needs of your program are.
Sarah: Not all education staff need the same level of support. I think that's important to note. Sometimes we go into PBC thinking everybody needs intensive individual one-on-one coaching. That's not, one, feasible. Two, not everybody needs that much support. Getting creative. I'm going to turn it over to Joyce and let Joyce wrap us up and share a little activity, a back home activity for you.
Joyce: Thank you, Sarah. As you guys were talking, and Darbi was talking about some of the different points to consider, and even in the chat, it just made me think about different ways this tool could be used. It's a nice way to tie it all together with our back-home activity. What you see here is in your Viewer's Guide. We have this handout for you to take back and think about: one, what activities and events are taking place at each level, walking through this same conversation that we had today; thinking about how your program is measuring efforts and effects with data; and then what you want to talk about with other members of your program or of your PBC implementation team.
Two, this whole conversation can guide and maybe help shift some of your coaching efforts, thinking about that coordinated coaching strategy: what are we doing, and how is that going for us? How well is that really going for us? This is one tool that you could use to bring to that conversation. Even this webinar, because we know it's a lot of information. When we were sitting here talking, I was thinking you could take the last part, where we went through the tool, and have that facilitated conversation with your PBC implementation team as well. That's just another way you can use it.
But you do have that as a part of your Viewer's Guide. We just invite you to take that back and start having some of those conversations, or dig deeper into the conversations that you're already having. Just know that that's there for you as well. Again, thank you, everyone. Have a great rest of your day.
How do we know if coaching is making a difference in our program? Data-informed decisions about Practice-Based Coaching (PBC) efforts and effects are important. The Effort and Effect Cascade tool helps programs and coaches identify and use data to inform key decisions about program-wide and individual professional development and PBC support. Watch the video to learn how to identify key sources of data for measuring coaching efforts and effects.
Note: The evaluation, certificate, and engagement tools mentioned in the video were for the participants of the live webinar and are no longer available. For information about webinars that will soon be broadcast live, visit the Upcoming Events section.