Author’s note: it’s been a while since I had a chance to write a post completely “from scratch”, not having it based on a particular article or book. This one has been brewing in my head for some time now and I’m excited to share it with you all!
“You can’t manage what you can’t measure.” — Peter Drucker
“If you give a manager a numerical target, he’ll make it even if he has to destroy the company in the process.” — W. Edwards Deming
“Not everything that counts can be counted, and not everything that can be counted counts.” — Bruce Cameron
This trio of quotes beautifully captures the fundamental tension that we’re all trying to navigate when we work hard to make our organizations better. Without a clear measure, it’s hard to know whether we’re making progress and whether our current efforts advance us toward our goal or move us away from it (Drucker). However, not everything that we care about can be measured (Cameron), and sometimes trying to force the issue and measure something anyway can lead to pretty painful, unintended consequences (Deming).
The current state of DIB efforts
Nowhere is this tension felt more acutely today than in our collective efforts around diversity, inclusion, and belonging (DIB): making our organizations more diverse and our behaviors more inclusive, and fostering a deep sense of belonging among our teammates. Figuring out what to measure and what progress looks like remains a heavily debated topic.
Measuring diversity is becoming a more popular practice because it seems easy at first. But when we dig a little deeper and grapple with harder-to-measure aspects, such as socioeconomic status (see Aline Lerner’s response here), not to mention intersectionality, Deming’s observation starts to look closer to the truth.
Measuring inclusion is perhaps more critical, since it seems to have a more profound business impact. Not to mention that improving diversity without deliberate follow-up action will most likely decrease inclusion. However, inclusion turns out to be more difficult to measure and improve.
Often stumped by this challenge, many HR organizations turn to their “silver bullet” measurement tool, our all-purpose hammer: the survey. Yet, as the folks at Cultivate so eloquently point out, survey data suffers from a myriad of human biases: from recency bias, through acquiescence bias, to self-reporting and social desirability bias. I will further add some more “mechanical” challenges, such as selection bias (partial participation) and the difficulty of proper statistical analysis of the results.
Supporting inclusion also requires a different “type” of measurement. Since improving inclusion requires human behavior change, feedback (measurement) needs to be a lot more frequent and timely in order to make a difference. Learning today that there was something that I could have done differently two months ago is not so useful. Learning about it immediately, or even an hour later can be transformational, since the window for corrective action is still open.
To find a solution to this conundrum, we need to take a slight detour and familiarize ourselves with a much lesser known tool in our toolbox, that’s currently undergoing a profound revolution.
Organizational Network Analysis
Organizational Network Analysis (ONA for short) is the practice of studying the relational and communication patterns within an organization by building models (graphs) of those relationships and interactions and conducting analysis, often statistical in nature, to derive insights at both the group and individual levels. These models/graphs, often referred to as “sociograms”, can be used, for example, to evaluate the overall “closeness”/“density” of relationships inside the organization by measuring the average “distance” (number of connections) it takes to get from any one place in the network to any other. At the individual level, it is fairly easy to identify “outliers”: the people least connected to everyone else in the organization. A slightly more comprehensive overview of ONA can be found here.
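As a minimal sketch of those two metrics, a sociogram can be modeled as a plain adjacency map, with average distance computed by breadth-first search and outliers surfaced by connection count. The names and edges below are invented purely for illustration:

```python
from collections import deque

# Toy sociogram: each pair is an observed relationship (undirected).
edges = [("ana", "ben"), ("ana", "carla"), ("ben", "carla"),
         ("carla", "dev"), ("dev", "edie")]

# Build an adjacency map: person -> set of direct connections.
graph = {}
for a, b in edges:
    graph.setdefault(a, set()).add(b)
    graph.setdefault(b, set()).add(a)

def distances_from(graph, start):
    """BFS shortest-path distances from one person to all reachable others."""
    dist = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for neighbor in graph[node]:
            if neighbor not in dist:
                dist[neighbor] = dist[node] + 1
                queue.append(neighbor)
    return dist

def average_distance(graph):
    """Mean shortest-path length over all ordered, reachable pairs."""
    total, pairs = 0, 0
    for person in graph:
        for other, d in distances_from(graph, person).items():
            if other != person:
                total += d
                pairs += 1
    return total / pairs

# The least-connected person is a candidate "outlier".
least_connected = min(graph, key=lambda person: len(graph[person]))
```

In this toy network the average distance works out to 1.7 hops, and edie, with a single connection, surfaces as the least-connected outlier.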
While the roots of ONA can be traced all the way back to the work of Emile Durkheim in the early 1890s, real research began in earnest in the 1930s and made significant leaps forward in the 1970s and 1990s as more sophisticated technology unlocked more complex analysis of the data. Today, ONA is offered as a standard service by both top-4 consulting shops like Deloitte and boutique consulting firms specializing purely in ONA like Culture Optix and Tree Intelligence.
But ONA never achieved wide, mainstream adoption. Most HR organizations today don’t even know that the tool exists, let alone use it in their day-to-day practice. I believe there are two main reasons for this:
- The cost of ONA in terms of time, energy, and effort remained high. Even though technology helped with the analysis portion, the data collection required to construct and update the sociograms remained mostly analog, relying heavily on survey data with all the drawbacks covered above, which significantly constrained both the type of data that could be collected and the frequency with which it could be collected.
- The benefits of ONA remained fuzzy. Partly due to the data collection constraints, partly due to the relevant research still being in its adolescence, and partly due to not-so-great product management, the value proposition of ONA and the types of organizational challenges it can help address remained too broad and too shallow, never scratching a big enough itch to justify the complex execution and analysis.
But all of this is now changing.
The digital revolution
In the last two decades organizations have been undergoing a digital revolution in the way they collaborate and work together: from the pervasive use of email, through video conferencing, to instant messaging. Furthermore, many analog activities still generate some digital “footprint” — from calendar invites to digital work artifacts like documents, spreadsheets, and code.
This revolution opened the floodgates of data towards a new era of ONA in which not only can sociograms be constructed and maintained almost effortlessly, in real-time and with no human bias, but also the richness and granularity of the data that can be analyzed exceed by orders of magnitude what was possible a decade ago.
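To make that concrete, here is a minimal sketch of how such a digital footprint might be turned into a sociogram. The message events and names are invented, standing in for whatever (sender, recipient) metadata an email or chat system actually exposes:

```python
from collections import Counter

# Hypothetical message metadata: (sender, recipient) per message sent.
events = [
    ("ana", "ben"), ("ben", "ana"), ("ana", "carla"),
    ("ana", "carla"), ("carla", "ana"), ("dev", "ana"),
]

# Directed, weighted sociogram: edge weight = messages sent in that direction.
edge_weights = Counter(events)

# Collapse the two directions into one interaction strength per pair,
# keyed by the unordered pair of people.
pair_strength = Counter()
for (sender, recipient), count in edge_weights.items():
    pair_strength[frozenset((sender, recipient))] += count
```

Because each incoming message simply increments a counter, a graph like this can be kept up to date continuously as new events stream in, with no survey round-trips and no self-reporting bias.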
And companies like Humanyze continue to push the envelope even further, creating solutions that deliberately generate digital footprints for the analog interactions that currently lack organic ones.
DIB + ONA = ❤
By now you should probably be able to tell where I’m going with this:
I believe improving inclusion is the “killer app”, the “thin edge of the wedge” if you will, for the broad adoption of next-generation ONA.
ONA, with its analytical orientation toward identifying individual and group relational patterns and its ability to run seamlessly on an ongoing basis, is perfectly positioned to close the currently broken feedback loop and provide the close-to-real-time feedback needed to drive real behavior change.
ONA can help us identify the overall state and trend, as well as both the “bright spots” (to learn from) and the “hot zones” (to help), across many dimensions of inclusion, including but not limited to:
- Use of gendered language
- Communication silos and the people who connect them
- Outsiders and bridges
- Balance of communication frequency across teammates
- Balance of communication time/reciprocity
- Communication inside/outside working hours
- Communication sentiment (positive, negative, etc.)
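As one illustration of how a dimension like reciprocity might be quantified, here is a sketch of a per-pair balance check over message counts. The counts and the 0.5 threshold are invented for the example; a real deployment would tune these against its own data:

```python
def reciprocity(sent, received):
    """Ratio of the quieter direction to the louder one, in [0, 1]:
    1.0 means a perfectly balanced exchange, near 0 means one-sided."""
    if sent == 0 and received == 0:
        return 1.0  # no interaction at all, so nothing is imbalanced
    return min(sent, received) / max(sent, received)

# Hypothetical message counts per pair: (a sent to b, b sent to a).
pairs = {
    ("ana", "ben"): (12, 10),
    ("ana", "dev"): (15, 2),
}

# Flag pairs whose exchange is heavily one-sided (threshold is arbitrary).
flagged = {pair for pair, (sent, received) in pairs.items()
           if reciprocity(sent, received) < 0.5}
```

Here the ana/ben exchange scores a healthy 0.83, while ana/dev, at roughly 0.13, would be flagged as a “hot zone” worth a closer, human look.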
Marrying DIB and ONA presents an opportunity to leverage the heightened awareness around this critical, hot-button topic and gain an edge in a red-hot, super-competitive market for HR leaders and software vendors alike.