How to Design a Better D&I Survey

During the past couple of months, many diversity practitioners have had the pleasure of attending a diversity and inclusion conference, award ceremony or other event. Following the return to the office, many are pumped and ready to implement the best practices and techniques learned during those events.

Coincidentally, we are also entering the survey phase of the year, and many are preparing to roll out a diversity and inclusion assessment or to add items to an existing employee survey in the near future.

Before launching the next survey, take into account the following guidelines to yield actionable data from your assessment efforts:

Shorter is better: When assessing attitudes, behaviors and beliefs on diversity and inclusion, I advise our clients to focus on priority areas connected to existing models, missions and strategies. Resist the temptation to measure everything, and assess only the areas you intend to change. Including long lists of survey items simply for the sake of curiosity breeds contempt and apathy among respondents. In response, they often adopt broad-brush answering strategies, such as acquiescence or selecting “neither agree nor disagree” for the majority of the items.

Context rules: Another common mistake when assessing diversity and inclusion in a self-report survey is including items that force employees to make judgments they are not positioned to make. The self-report rating scale items below demonstrate this point. While these are topics of concern to many organizations, imagine how difficult it would be for employees to answer them accurately, since most employees are far removed from business operations and human resources:

• “Hiring in this organization occurs without any bias.”

• “Policies, practices and procedures are applied consistently to all employees regardless of their age, citizenship, ethnicity, gender, marital status, national origin, race, religion and sexual orientation.”

Instead, write the survey items in first or second person and offer a variety of rating scale methods, such as frequency, ranking, forced choice, checklist, attitude strength, confidence and importance:

• “How often during the past 12 months have you experienced any of the following behaviors?”

• “I would enjoy the opportunity to take on stretch assignments.”

Actionable items required: Survey items must be written in ways that generate actionable data. For example, one of our clients implemented a new leadership model and subsequently needed to refresh its employee survey items. In collaboration with the client, we generated survey items aligned with the model and written as concretely as possible, so the results could support actionable recommendations.

Communicate parameters: The final guideline refers to setting a time frame and placing instructions within each survey item. Sometimes employees either cannot remember the relevant information or have not been employed long enough to answer the survey items accurately. To aid respondents’ recall and keep their attention focused on a defined period, identify a specific time period at the beginning of every question, such as “Over the past month, have you …?” or “Over the past 12 months, how would you rate …?”

Many employees skip the instructions or forget them by the time they need to apply them. They are more apt to follow instructions that are included in each question.

Implementing these suggestions will yield survey data that can guide actions aimed at improving the bottom-line outcomes that matter most to your organization.