Deliberately Developmental Organizations

The Deliberately Developmental Organization (DDO) is a thought-provoking whitepaper by Robert Kegan et al. that offers a brief summary of their recent book, An Everyone Culture: Becoming a Deliberately Developmental Organization.

The whitepaper, and the book, cover the cases of several companies that chose to transform their approach to personal and professional development: from its standard role as one program in a broader portfolio of HR programs to part of the organizational DNA, something synonymous with long-term business success.

The authors identified 12 shared “discontinuous departures” – things that DDOs approach in a fundamentally different way than traditional organizations do – across three major organizational design features: principles, practices, and community. These departures are what make DDOs stand out:

[Image: the 12 “discontinuous departures”, grouped by principles, practices, and community]

Though I’ve been extremely curious to learn more about DDOs, I didn’t feel reading the book would be the best way to go about it, as its heavily case-based writing style doesn’t align with the way I digest this kind of content.

This whitepaper is the perfect fit for the kind of content I was looking for, and highlights the key areas that are worth further exploration.

 


Heuristics vs. Best Practices

In this post, I’d like to explore an important anti-pattern that I’ve seen emerging more and more recently, using Dave Snowden’s classic piece:

A Leader’s Framework for Decision Making

The anti-pattern that I’ve observed is fairly straightforward: in a genuine attempt to foster learning and help people avoid past mistakes in dealing with a complex challenge, practices rather than principles are prescribed, leading to sub-optimal outcomes.

Let’s start with a more concrete definition of the two terms in the title of this post:

  • Best Practice – a procedure that is prescribed as being correct or most effective
  • Heuristic – a loosely defined rule (“rule of thumb”)

As well as Snowden’s definition of three types of problems (there’s also a fourth that is less relevant to this post):

  1. Simple problems – clear cause-and-effect relationships are evident to everyone; a right answer exists; known knowns; Example: solving a simple arithmetic problem
  2. Complicated problems – expert diagnosis required; cause-and-effect relationships are discoverable but not immediately apparent to anyone; more than one right answer exists; known unknowns; Example: diagnosing a car malfunction
  3. Complex problems – no right answers; emergent instructive patterns; a need for creative and innovative approaches; unknown unknowns; Example: forecasting the weather

As Snowden points out, (best) practices make sense only in the first two types of problems: the simple and the complicated. They differ from one another in the level of expertise required to uncover the relationship between cause and effect, but once you have uncovered it, following the best-practice algorithm/checklist will lead you to the best possible outcome.

In complex problems, best practices don’t work. Solving complex problems is all about the unique context that you’re currently in, and what was the best solution in one context/situation may not be the best way to solve the problem in another. This is the realm of heuristics and principles, which point you in the general direction: ways to classify the problem, and general approaches that seem to work for a specific pattern. But they cannot prescribe a specific solution that’s guaranteed to work.
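To make the distinction concrete, here’s a minimal sketch in Python. The function names and rules are purely illustrative (mine, not Snowden’s and not from any real system): a best practice encodes a fixed procedure that yields the answer, while a heuristic only narrows the direction and leaves the judgment to the person applying it.

```python
# Illustrative sketch only -- the functions and rules are hypothetical.

def best_practice_checklist(symptom: str) -> str:
    """Complicated domain: a prescribed procedure maps symptom -> fix."""
    known_fixes = {
        "engine won't start": "check battery, then starter motor",
        "brakes squeal": "replace brake pads",
    }
    # Once an expert has uncovered the cause-and-effect link,
    # following the checklist reliably produces the right answer.
    return known_fixes.get(symptom, "escalate to a specialist")


def heuristic_guidance(team_conflict: dict) -> list[str]:
    """Complex domain: rules of thumb that point in a direction,
    but cannot guarantee a specific outcome."""
    suggestions = []
    if team_conflict.get("recurring"):
        suggestions.append("look for a systemic cause, not a single culprit")
    if team_conflict.get("cross_team"):
        suggestions.append("clarify ownership boundaries before assigning blame")
    # No guaranteed fix: the 'right' move depends on context you must probe.
    suggestions.append("run a small, safe-to-fail experiment and observe")
    return suggestions
```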

Almost any problem that involves humans tends to be a complex problem. As multiple research papers seem to suggest, when it comes to human behavior the relationship between cause and effect is often murky at best, putting it squarely in the “complex” category.

And this is what I believe most people (implicitly) fail to consider when they prescribe a “best practice” solution to a human-oriented problem…

 


2 Minutes on Relationships

I’ve written about the lectures in Sam Altman’s “How to Start a Startup” class before, and this particular one came to mind again this week.

How to build products users love – Kevin Hale

Kevin’s entire talk is great and well worth the watch, but the first two minutes of the section linked above are among my favorites.

Kevin uses John Gottman’s research on the reasons couples fight to draw parallels to the reasons users fight with their service providers. It is absolutely fantastic:

[Image: the reasons couples fight, mapped to the ways users fight with their service providers]


Make enterprise tools great (again?)

Throughout my career, I’ve sat on both sides of the enterprise tools table, as both a user and a provider. Despite the constant evolution and plethora of new tools coming into the market, I’m getting a sense that two fundamental problems that get in the way of unlocking real value (for both sides) are still getting very little attention.

The tool is only as good as the process it enables

In the past, enterprise tool providers hard-coded the way in which the tool should be used into the tool itself. They were extremely prescriptive about what is considered best practice, and to some extent forced their customers to follow it. Customers revolted.

Today, most enterprise tools are designed to be extremely free-form: you can model whichever process you’re currently using in the tool and there are 1,000 different ways to use it. There’s also no judgement: each one of those ways is valid.

Consider Asana as a good example: with a super-simple object model (just a “project” and a “task”), anything goes: a project can be a project, a team, a meeting, an event. There’s no right or wrong answer.
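As a rough illustration (my own sketch, not Asana’s actual data model or API), a free-form object model really does boil down to a couple of generic types, and the meaning of a “project” is left entirely to the customer:

```python
# A deliberately minimal, free-form object model -- illustrative only,
# not Asana's real schema.
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    done: bool = False

@dataclass
class Project:
    name: str
    tasks: list[Task] = field(default_factory=list)

# "Anything goes": the same two types can model very different things.
sprint  = Project("Mobile app sprint 12", [Task("Fix login crash")])
meeting = Project("Weekly staff meeting", [Task("Review OKRs")])
team    = Project("Growth team", [Task("Onboard new analyst")])
# The tool passes no judgement on which usage is "correct".
```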

Or consider Slack, where everything can be a channel. Granted, they nailed a couple of big innovations compared to other enterprise communication tools – namely designing with the user’s delight in mind, and building a platform rather than a tool, which puts them (through the players building on top of the platform) in a better position to address other core problems. But as Brad Feld also points out, the core problems that existed with Google Groups, like discoverability (finding which groups/channels to join/use) and staleness (defunct groups/channels adding noise), still exist with Slack. If you’re a 100+ person company and have more Slack channels than employees, have you really made any progress towards better enterprise communication?

I’m not picking on Asana and Slack; the same is true for almost any other enterprise tool: JIRA, Salesforce, Workday, you name it.

I believe the pendulum may have swung a little too far to the other end of the prescriptive<->free-form spectrum. As the examples above illustrate, the free-form end of the spectrum is just as bad as the prescriptive end. There are multiple shades of grey that get us closer to the optimal outcome, in which the responsibility for “effective usage” is shared between the provider and the customer.

Providers, on their end, can provide guidelines and best practices for the customers who seek them. They can play a more active role in disseminating lessons learned about tool usage throughout their customer community. And they can use smart (yet overridable) defaults inside the tool to nudge customers towards more thoughtful use. To use Slack as an example: when you try to send a notification-triggering communication to an entire channel (@channel), Slack confirms that you really want to trigger a notification for X people across Y time zones before letting you proceed. Can a similar approach be used when you name a channel? Or when a channel hasn’t been active for 60 days?
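A nudge like the staleness check could even be scripted by a customer today on top of Slack’s public API. Here’s a rough sketch (my own, not a feature Slack ships), assuming the official slack_sdk Python client and a suitably scoped bot token; pagination and error handling are omitted for brevity:

```python
# Sketch of a "staleness nudge": flag channels with no activity for 60 days.
# Assumes a bot token with channels:read / channels:history / chat:write scopes.
import os
import time
from slack_sdk import WebClient

client = WebClient(token=os.environ["SLACK_BOT_TOKEN"])
STALE_AFTER_SECONDS = 60 * 24 * 60 * 60  # 60 days

for channel in client.conversations_list(exclude_archived=True)["channels"]:
    history = client.conversations_history(channel=channel["id"], limit=1)
    messages = history["messages"]
    last_ts = float(messages[0]["ts"]) if messages else 0.0
    if time.time() - last_ts > STALE_AFTER_SECONDS:
        # Nudge, don't enforce: suggest archiving, leave the decision to owners.
        client.chat_postMessage(
            channel=channel["id"],
            text="This channel has been quiet for 60+ days. "
                 "Consider archiving it to reduce noise.",
        )
```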

Customers, on their end, need to acknowledge that some of the responsibility for getting value out of the tool rests on their shoulders as well. They’re likely to get very little value out of the tool without being thoughtful about the way in which it will be used, and without proactively managing and curating this usage on an ongoing basis. If that’s everybody’s job – it’s nobody’s job. As I’ve advocated for here, organizational systems (and the tools that reflect them) need an explicit and well-defined “system architect” role who can be held accountable for the value the tool unlocks.

No clarity on where the tool begins and ends  

Enterprise problems don’t exist in a vacuum. As Russell Ackoff pointed out, in complex systems problems tend to be intertwined in a “mess”, where each problem has some derivative impact on the others.

Enterprise tools don’t exist in a vacuum either. New tools get introduced into a pre-existing set of solutions aimed at solving related and partly-overlapping problems. There are far too many examples of tools that fail to take that into account, offering their own solution/functionality for problems that have very little surface area with the core problem they are solving and are already well addressed by a more generic tool.

There is one problem area where this message seems to have fully sunk in already, at least in most cases: authentication. Very few new enterprise tools try to reinvent the wheel there with yet-another-set-of-credentials; instead they assume a pre-existing credentials system (Google Apps, Facebook) or, at the bare minimum, support SAML integration. But that’s not the case for many other basic functionalities such as calendaring, file storage and messaging/notifications.
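To illustrate the contrast in the authentication case, here’s a simplified sketch (the identity-provider client below is a stand-in I made up, not a real SDK): the difference is between owning yet another credentials store and merely validating assertions from an identity system the customer already runs.

```python
# Illustrative contrast only -- the IdentityProvider class is a stand-in,
# not a real SAML/OAuth SDK; a real flow would verify signed assertions.
import hashlib

# Anti-pattern: the new tool stores yet another set of credentials itself
# (and with it the burden of resets, lockouts, and breach risk).
local_users = {"dana@example.com": hashlib.sha256(b"hunter2").hexdigest()}

def login_with_local_password(email: str, password: str) -> bool:
    stored = local_users.get(email)
    return stored == hashlib.sha256(password.encode()).hexdigest()

# Preferred: delegate identity to a pre-existing system (SSO / SAML).
class IdentityProvider:
    """Stand-in for a real identity provider client (Google Apps, Okta, etc.)."""
    def validate(self, assertion: str) -> dict:
        # A real implementation would verify the signed assertion here.
        return {"email": "dana@example.com"}

def login_with_sso(idp: IdentityProvider, assertion: str) -> str:
    # The identity provider has already authenticated the user;
    # the tool only validates the assertion and trusts the claims.
    return idp.validate(assertion)["email"]
```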

To give one example, consider Namely, which created its own custom shared “stream”/“wall” functionality for automatically posting work anniversaries and birthdays and for sharing out HR announcements. But does it really make sense to have a dedicated view just for that content? Or would it be consumed much more effectively as part of the core enterprise communication system, be it a daily email digest or a channel in Slack?
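To make the alternative concrete, here’s a rough sketch (mine, not Namely’s or Slack’s code) of routing that same content into an existing Slack channel via an incoming webhook instead of a dedicated wall; the webhook URL and the announcement list are placeholders.

```python
# Sketch: route HR announcements into the existing communication system
# (a Slack channel via an incoming webhook) instead of a dedicated "wall".
import os
import requests

SLACK_WEBHOOK_URL = os.environ["HR_ANNOUNCEMENTS_WEBHOOK"]  # placeholder

announcements = [
    "Happy 5th work anniversary, Alex!",          # placeholder content
    "Reminder: open enrollment closes Friday.",
]

# One daily digest message, posted where people already read messages.
digest = "\n".join(f"• {item}" for item in announcements)
requests.post(SLACK_WEBHOOK_URL, json={"text": digest}, timeout=10)
```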

Again, responsibility lies with both sides.

Providers, on their end, could be more thoughtful in deciding where their tool starts and ends, focusing on the core capability they bring to the table and “playing nicely” with other tools for everything else. Des Traynor of Intercom has a fantastic 10-minute talk on this matter.

Customers, on their end, would benefit from having a clearer picture of their own enterprise tools portfolio, identifying areas of overlap and gaps, and using this analysis to proactively push providers to de-duplicate the overlap and fill in the gaps.

 
