This five-part series on Diversity, Equity, Inclusion, and Belonging in Government was written by Vince Vu. Read Chapter One and Chapter Two.
Chapter 3: Listening to Insights with an Equity Lens
Welcome back to the third chapter of our Diversity, Equity, Inclusion, and Belonging (DEIB) in Government series (“the Work”)! If you have no idea what I’m talking about, or want a refresher, please check out chapter 1 or chapter 2 of the series before engaging with the Work.
Last time, we took a deep dive into some core community engagement concepts, and how they can (and should!) be applied internally for DEIB Work. We discussed why authentic community engagement is so challenging, and why that’s not necessarily a bad thing. We looked at a comprehensive community engagement framework, and drew parallels between academic theory and concrete government actions. Finally, we touched on why these “external facing” concepts can be helpful when engaging in internal initiatives.
This week, I want to get into the weeds. Let’s talk through the how – how to collect feedback and understand insights in your DEIB program.
How to Listen to DEIB Feedback
It’s an essential fact: when you do DEIB Work, you will do a LOT of listening. In fact – listening is the root of creating change. After all, how do you know what to change to improve conditions for employees if you don’t listen to them directly?
Much like there’s a big difference between listening and hearing, there is a big difference between simply collecting data and listening to stakeholders’ feedback. For comparison’s sake, I like to think of it like this:
| Old way of collecting data | New way of listening to feedback |
| --- | --- |
| Conducting endless surveys with no clear direction | Establishing listening posts at critical moments of the employee lifecycle (a listening post is a broader moment of feedback than a “survey” – it can include interviews, focus groups, ambient listening, or surveys) |
| Asking demographic questions the “formal” and “correct” way | Asking demographic questions that resonate with, and are respectful to, the groups they are trying to represent |
| Hiding important differences in the experiences of employee communities to “preserve anonymity” | Thoughtfully disaggregating the right types of data to highlight disparities and the Work to be done |
This isn’t just marketing mumbo-jumbo, but a deliberate reframing of the essential activities that enable you to listen to DEIB feedback – with the ultimate goal of acting on that feedback. Let’s go through each of the mistakes we make when collecting data – and how we can correct those mistakes.
Conducting Endless Surveys with No Clear Direction
I get it – you’ve probably heard some version of the phrase “survey fatigue.” This often comes from well-meaning people who are trying to be responsive to their employees. After all, people are plagued with surveys everywhere – from the survey link on your two-foot-long grocery store receipt to every Amazon purchase. Everyone nowadays is asking for feedback – but is it actually helpful to employees if they are besieged in the same way at their workplace?
Well friends, I’m here to equip you with some firepower to respond to that critique:
People don’t suffer from survey fatigue. They suffer from lack-of-action fatigue.
I know, it doesn’t roll off the tongue easily. But, it’s true, I promise.
The truth is that people generally love giving feedback, IF AND ONLY IF there are clear outcomes and results based on their feedback. What people DON’T like is if they are asked to repeatedly provide feedback in a non-structured way, without a clear line of sight to how and why their feedback matters.
So how do we apply this to the Work? Simple:
- Have a concrete framework for why, when, and how you’ll be asking for feedback (the community engagement framework from chapter 2 can be helpful here!)
- Before you ask, make sure you have the infrastructure and capacity to respond and take action on the feedback.
Simply put – don’t just blast surveys out to your employee groups. Have a good reason why you’re doing it, when you’re doing it, and have a clear action plan for how you’ll respond!
Asking demographic questions the “formal” and “correct” way
Demographic questions often suffer from the worst case of being written in “government tone.” It’s okay to acknowledge that identifying terminology, and the descriptors communities prefer, change over time! The best thing we can do as government entities is to not force stale, established “categories” onto people, but to be responsive to the changing terminology and vocabulary that different communities are using.
Too often, surveys will solicit demographics without any explanation or context at all. At a minimum, here’s what you can do:
- Make sure your demographics are all optional, and at the end of your survey.
- Before you ask your demographic questions, be upfront about your point of view on demographic questions. This will help build trust with the respondent and clearly articulate your stance on why you are collecting this information. Here are some examples:
Before your race and ethnicity questions:
Before your questions on gender identity:
Before your demographic questions overall:
- Interrogate the terminology you use when asking demographic questions. As government professionals, we often feel the need to use the “formal” or “proper” words when describing demographics. It’s important to note that many of these terms have evolved over time, and continue to evolve. Many terms for demographic categories, such as Hispanic or Pan-Asian, were created to prioritize the act and ease of data collection over actually representing the complex communities themselves. When possible, try to understand the terminology being used today, and phrase questions in the ways different communities describe themselves (rather than defaulting to “government speak”). One way to address this is by being explicit about what you mean by different categories. For example:
- While we’re on the topic of categorization – remember that there are some smaller, functional things you can do to prioritize the Work during data collection. For one, make sure that you’re taking a neutral stance about the order of your options (is “man” listed before “woman”?). My personal rule is to ask race and ethnicity in alphabetical order, but to list gender identity in a way that highlights populations that have historically been marginalized the most. Yes, it’s a “small” thing, but taking all these steps together can have a huge impact on how many of your employees feel seen.
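The ordering rules above can be sketched in code. This is a minimal illustration, not a recommended taxonomy – the category labels and the particular gender-identity ordering are my own illustrative assumptions:

```python
# Sketch: applying a neutral ordering rule to one demographic question
# and a deliberate ordering rule to another. Labels are illustrative
# assumptions, not a recommended set of categories.

race_ethnicity = sorted([
    "Hispanic or Latino/a/e",
    "Asian",
    "Black or African American",
    "White",
    "American Indian or Alaska Native",
])  # alphabetical order reads as a neutral stance

# Gender identity listed so that historically marginalized groups
# appear first, rather than defaulting to "man" at the top.
gender_identity = [
    "Non-binary",
    "Transgender woman",
    "Transgender man",
    "Woman",
    "Man",
    "Prefer to self-describe",
    "Prefer not to answer",
]

print(race_ethnicity[0])  # → American Indian or Alaska Native
```

The point is that option order is a deliberate design decision, checked into your survey configuration, rather than whatever order the options happened to be typed in.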
All of this can make back-end analysis harder. You may have longitudinal datasets that still use antiquated demographic categories, and struggle to link them to newer datasets with more modern categories. Taking the time to fix all this trend data will be a huge endeavor. To this, I say – sorry my friends, but we need to roll up our sleeves and roll with it. It’s part of our calling to change with our communities.
Think of it like this: would you rather have a static and antiquated dataset, or a LIVING dataset that continues to be relevant over time? So kick back with some tea and get ready to make manual re-coding fun again!
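One way that manual re-coding often gets done in practice is a simple mapping from antiquated labels to their current equivalents, applied across every survey wave. A minimal sketch – the specific label mappings here are hypothetical examples, not recommendations:

```python
# Sketch: re-coding a longitudinal dataset's antiquated category labels
# onto current categories so old and new survey waves can be linked.
# The mappings below are hypothetical examples only.

RECODE = {
    "Oriental": "Asian",                      # antiquated label from an old wave
    "Hispanic": "Hispanic or Latino/a/e",     # updated community-preferred term
    "Other": "Prefer to self-describe",
}

def recode(value: str) -> str:
    """Map an old category label to its current equivalent.

    Labels that are already current pass through unchanged, so the
    same function can run over both old and new survey waves.
    """
    return RECODE.get(value, value)

old_wave = ["Hispanic", "Asian", "Other"]
print([recode(v) for v in old_wave])
# → ['Hispanic or Latino/a/e', 'Asian', 'Prefer to self-describe']
```

Keeping the mapping in one place (rather than scattered find-and-replace fixes) also gives you a living record of how your categories have evolved over time.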
Hiding important differences in the experiences of employee communities to “preserve anonymity”
I’ll be quick with this one: think long and hard about the best way of disaggregating data every chance you get. This means that rather than reporting out overall results, report results broken down by each demographic category you’re collecting. If you’re going to go through the trouble of collecting this information, you must be transparent when you report on the results.
There are two rules of thumb that I follow when disaggregating.
- Default to as much disaggregation as possible. This means that you should report out all the crosstabulations of data for every demographic option you have.
- If there are cases where disaggregation would compromise anonymity (for example, you only have 2–3 individuals of a certain group in a unit), consider collapsing the data, but in a way that is still helpful. For example, collapsing all the different race and ethnicity groups into “BIPOC” so that you still have some data to act on. If you do this, make sure to check in with employees to get their thoughts on the usefulness of this grouping.
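The two rules of thumb above can be sketched as a small reporting helper: count every group, then fold any group below an anonymity threshold into one combined category. The threshold value and the collapsed label are illustrative assumptions – your organization should set both in conversation with employees:

```python
from collections import Counter

# Hypothetical minimum group size for reporting a category on its own.
ANONYMITY_THRESHOLD = 4

def disaggregate(responses, collapsed_label="BIPOC (collapsed)"):
    """Count responses per group, folding groups smaller than the
    threshold into one combined category, so no individual is
    identifiable while the data stays actionable."""
    counts = Counter(responses)
    out = Counter()
    for group, n in counts.items():
        out[collapsed_label if n < ANONYMITY_THRESHOLD else group] += n
    return dict(out)

survey = ["Asian"] * 6 + ["Black"] * 2 + ["Pacific Islander"] + ["White"] * 8
print(disaggregate(survey))
# → {'Asian': 6, 'BIPOC (collapsed)': 3, 'White': 8}
```

Here the groups with fewer than four respondents are combined, while the larger groups are reported directly – full disaggregation by default, collapsing only where anonymity demands it.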
Even with these rules of thumb, this can often be a very difficult decision to make. However, it is critical to having data that is useful and actionable in your DEIB efforts. So ask yourself this: who is this data really for, and how can we make sure that it serves the communities we’re targeting?
If I can give you one piece of advice before we end – remember that you have experts within your own organizations that can help pressure test these strategies. The right strategy will differ for different organizations, and I highly encourage you to take what you’re learning in this series, modify it for your organization, and then ask employees and leaders about their reactions. While many of these concepts can be broadly applied, I guarantee that you will find some small (but very important) nuances in your own organizations.
Alright, that’s enough time playing in the tactical weeds. In the next entry, we’ll uplevel this conversation and talk about how we take all of this data and turn it into concrete action. Onwards and upwards, comrades!
Vince Vu is the Head of Government Strategy at Qualtrics, focusing on state and local government. He advises government agencies and organizations on effective experience management (XM) programming, including program design, survey assessment, resourcing, and change management. Prior to joining Qualtrics, Vince managed research and data analytics teams in multiple government settings at the city, county, and state levels. Vince earned his Masters in Public Policy, specializing in advanced policy analysis. Connect with Vince on LinkedIn.