No Datum is an Island of Serendip

Continuing a series of blog posts inspired by the highly recommended book Where Good Ideas Come From by Steven Johnson, in this blog post I want to discuss the important role that serendipity plays in data — and, by extension, in business success.

Let’s start with a brief etymology lesson.  The origin of the word serendipity, which is commonly defined as a “happy accident” or “pleasant surprise,” can be traced to the Persian fairy tale The Three Princes of Serendip, whose heroes were always making discoveries, by accident or by sagacity (i.e., the ability to link together apparently innocuous facts to reach a valuable conclusion), of things they were not in quest of.  Serendip was an old name for the island nation now known as Sri Lanka.

“Serendipity,” Johnson explained, “is not just about embracing random encounters for the sheer exhilaration of it.  Serendipity is built out of happy accidents, to be sure, but what makes them happy is the fact that the discovery you’ve made is meaningful to you.  It completes a hunch, or opens up a door in the adjacent possible that you had overlooked.  Serendipitous discoveries often involve exchanges across traditional disciplines.  Serendipity needs unlikely collisions and discoveries, but it also needs something to anchor those discoveries.  The challenge, of course, is how to create environments that foster these serendipitous connections.”


No Datum is an Island of Serendip

“No man is an island, entire of itself; every man is a piece of the continent, a part of the main.”

These famous words were written by the poet John Donne, and they are generally taken to mean that human beings do not thrive when isolated from others.  Likewise, data does not thrive in isolation.  However, many organizations persist in data isolation, in data silos created when separate business units see power in hoarding data, not in sharing it.

But no business unit is an island, entire of itself; every business unit is a piece of the organization, a part of the enterprise.

Likewise, no datum is an Island of Serendip.  Data thrives through the connections, collisions, and combinations that collectively unleash serendipity.  When data is exchanged across organizational boundaries, and shared with the entire enterprise, it enables the interdisciplinary discoveries required for making business success more than just a happy accident or pleasant surprise.

Our organizations need to create collaborative environments that foster serendipitous connections, bringing all of our business units and people together around our shared data assets.  We need to transcend our organizational boundaries, reduce our data silos, and gather our enterprise’s heroes together on the Data Island of Serendip — our United Nation of Business Success.


Related Posts

Data Governance and the Adjacent Possible

The Three Most Important Letters in Data Governance

The Stakeholder’s Dilemma

The Data Cold War

Turning Data Silos into Glass Houses

The Good Data

DQ-BE: Single Version of the Time

My Own Private Data

Sharing Data

Are you Building Bridges or Digging Moats?

The Collaborative Culture of Data Governance

The Interconnected User Interface

The Big Data Collider

As I mentioned in a previous post, I am reading the book Where Good Ideas Come From by Steven Johnson, which examines recurring patterns in the history of innovation.  The current chapter that I am reading dispels the traditional notion of the eureka effect by explaining that the evolution of ideas, like all evolution, stumbles its way toward the next good idea, which eventually, though not immediately, leads to a significant breakthrough.

One example is how the encyclopedic book Enquire Within Upon Everything, the first edition of which was published in 1856, influenced a young British scientist who, as a child in the 1960s, was drawn to the “suggestion of magic in the book’s title,” and who spent hours exploring this portal to the world of information, with the wondrous feeling of exploring an immense trove of data.  His childhood fascination with data and information influenced a personal project that he started in 1980, which ten years later became a professional project while he was working at the Swiss particle physics lab CERN.

The scientist was Tim Berners-Lee and his now famous project created the World Wide Web.

“Journalists always ask me,” Berners-Lee explained, “what the crucial idea was, or what the singular event was, that allowed the Web to exist one day when it hadn’t the day before.  They are frustrated when I tell them there was no eureka moment.”

“Inventing the World Wide Web involved my growing realization that there was a power in arranging ideas in an unconstrained, web-like way.  And that awareness came to me through precisely that kind of process.”

CERN is famous for its Large Hadron Collider, which uses high-velocity particle collisions to explore some of the open questions in physics: the basic laws governing the interactions and forces among elementary particles, the deep structure of space and time, and, in particular, the intersection of quantum mechanics and general relativity.


The Big Data Collider

While reading this chapter, I stumbled toward an idea about Big Data.  As Gartner Research explains, although the term acknowledges the exponential growth, availability, and use of information in today’s data-rich landscape, Big Data is about more than just data volume.  Data variety (i.e., structured, semi-structured, and unstructured data, as well as other types of data such as sensor data) and data velocity (i.e., how fast data is being produced and how fast it must be processed to meet demand) are also key characteristics of Big Data.

David Loshin’s recent blog post about Hadoop and Big Data provides a straightforward explanation and simple example of using MapReduce for not only processing fast-moving large volumes of various data, but also deriving meaningful insights from it.
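Loshin’s example is his own, but the basic shape of MapReduce is easy to sketch.  The toy word count below is a minimal pure-Python illustration of the concept (not Loshin’s code, and not Hadoop’s actual API), showing the two phases: a map step that emits key-value pairs from each record, and a reduce step that aggregates the values for each key.

```python
from collections import defaultdict

def map_phase(records):
    """Map: emit a (word, 1) pair for every word in every record."""
    for record in records:
        for word in record.lower().split():
            yield word, 1

def reduce_phase(pairs):
    """Reduce: group the pairs by key and sum the counts for each word."""
    counts = defaultdict(int)
    for word, count in pairs:
        counts[word] += count
    return dict(counts)

# A toy stand-in for a large volume of various data: free-text records.
records = [
    "data quality drives business insight",
    "big data needs big analytics",
    "data governance enables data quality",
]

print(reduce_phase(map_phase(records)))  # e.g., {'data': 4, 'quality': 2, ...}
```

In Hadoop, these same two phases run distributed across a cluster, which is what makes the pattern suitable for fast-moving large volumes of various data.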

My idea was that Big Analytics uses the Big Data Collider to allow large volumes of various data particles to bounce off one another in high-velocity collisions.  Although a common criticism of Big Data is that it contains more noise than signal, smashing data particles together in the Big Data Collider may destroy most of the noise in the collision, allowing the signals that survive that creative destruction to potentially reduce into an elementary particle of business intelligence.
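To make the metaphor slightly more concrete, here is a minimal sketch, using entirely hypothetical sources, keys, and tolerance, of one way such a collision could work: bounce two noisy data sources off each other and keep only the readings that corroborate one another, treating everything else as noise destroyed in the collision.

```python
# Two hypothetical noisy sources reporting the same metric per customer.
source_a = {"cust-1": 102.0, "cust-2": 57.5, "cust-3": 880.0}
source_b = {"cust-1": 101.5, "cust-2": 99.9, "cust-4": 12.0}

TOLERANCE = 1.0  # how close two readings must be to count as corroboration

def collide(a, b, tolerance=TOLERANCE):
    """Keep only keys present in both sources whose values (nearly) agree;
    everything else is treated as noise destroyed in the collision."""
    return {
        key: (a[key] + b[key]) / 2  # average the corroborated readings
        for key in a.keys() & b.keys()
        if abs(a[key] - b[key]) <= tolerance
    }

print(collide(source_a, source_b))  # {'cust-1': 101.75}
```

The signal that survives (cust-1) is the elementary particle of the metaphor; the uncorroborated readings are the noise that the collision destroys.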

Admittedly not the greatest metaphor, but as we enquire within data about everything in the Information Age, I thought that it might be useful to share my idea so that it might stumble its way toward the next good idea by colliding with an idea of your own.


Related Posts

OCDQ Radio - Big Data and Big Analytics

OCDQ Radio - Good-Enough Data for Fast-Enough Decisions

OCDQ Radio - A Brave New Data World

Data, Information, and Knowledge Management

Thaler’s Apples and Data Quality Oranges

Data Confabulation in Business Intelligence

Data In, Decision Out

The Data-Decision Symphony

The Real Data Value is Business Insight

Is your data complete and accurate, but useless to your business?

Beyond a “Single Version of the Truth”

The General Theory of Data Quality

The Data-Information Continuum

Schrödinger’s Data Quality

Data Governance and the Buttered Cat Paradox

Data Governance and the Adjacent Possible

I am reading the book Where Good Ideas Come From by Steven Johnson, which examines recurring patterns in the history of innovation.  The first pattern Johnson writes about is called the Adjacent Possible, which is a term coined by Stuart Kauffman, and is described as “a kind of shadow future, hovering on the edges of the present state of things, a map of all the ways in which the present can reinvent itself.  Yet it is not an infinite space, or a totally open playing field.  The strange and beautiful truth about the adjacent possible is that its boundaries grow as you explore those boundaries.”

Exploring the adjacent possible is like exploring “a house that magically expands with each door you open.  You begin in a room with four doors, each leading to a new room that you haven’t visited yet.  Those four rooms are the adjacent possible.  But once you open any one of those doors and stroll into that room, three new doors appear, each leading to a brand-new room that you couldn’t have reached from your original starting point.  Keep opening new doors and eventually you’ll have built a palace.”

If it ain’t broke, bricolage it

“If it ain’t broke, don’t fix it” is a common defense of the status quo, which often encourages an environment that stifles innovation and the acceptance of new ideas.  The status quo is like staying in the same familiar and comfortable room and choosing to keep all four of its doors closed.

The change management efforts of data governance often don’t talk about opening one of those existing doors.  Instead, they often broadcast the counter-productive message that everything is so broken we can’t simply fix it, that we need to destroy our existing house and rebuild it from scratch with brand new rooms — and probably with one of those open floor plans without any doors.

Should it really be surprising when this approach to change management is so strongly resisted?

The term bricolage can be defined as making creative and resourceful use of whatever materials are at hand regardless of their original purpose, stringing old parts together to form something radically new, transforming the present into the near future.

“Good ideas are not conjured out of thin air,” explains Johnson, “they are built out of a collection of existing parts.”

The primary reason the change management efforts of data governance are resisted is that they rely almost exclusively on negative methods — they emphasize broken business and technical processes, as well as bad data-related employee behaviors.

Although these problems exist and are the root cause of some of the organization’s failures, there are also unheralded processes and employees that prevented other problems from happening, which are the root cause of some of the organization’s successes.

It’s important to demonstrate that some data governance policies reflect existing best practices, which helps reduce resistance to change, and so a far more productive change management mantra for data governance is: “If it ain’t broke, bricolage it.”

Data Governance and the Adjacent Possible

As Johnson explains, “in our work lives, in our creative pursuits, in the organizations that employ us, in the communities we inhabit—in all these different environments, we are surrounded by potential new ways of breaking out of our standard routines.”

“The trick is to figure out ways to explore the edges of possibility that surround you.”

Most data governance maturity models describe an organization’s evolution through a series of stages intended to measure its capability and maturity, its tendency toward being reactive or proactive, and its inclination to be project-oriented or program-oriented.

Johnson suggests that “one way to think about the path of evolution is as a continual exploration of the adjacent possible.”

Perhaps we need to think about the path of data governance evolution as a continual exploration of the adjacent possible, as a never-ending journey which begins by opening that first door, building a palatial data governance program one room at a time.


Related Posts