Welcome to the Agora: The Whys and Hows of Social Network Analysis (SNA) for Analytics Decision Audits

Social Network Analysis (SNA) offers a powerful, low-impact method for examining the quality of organizational decision processes. Such a tool is of interest to evidence-based managers, decision professionals, and analytics practitioners alike.

Continuing our journey through ancient Greek myth and philosophy, today’s missive introduces the Agora. The Agora was an ancient Greek assembly and gathering place where society would meet to engage in rigorous debate in order to make significant decisions for the city-state. Later this space also became a marketplace, such that roaming citizens would mix commercial activities with speeches and presentations, hence the two Greek verbs ἀγοράζω, agorázō, ‘I shop’, and ἀγορεύω, agoreúō, ‘I speak in public’. We propose here that in order to build in strategic decision-making fortitude, organizations, via decision and analytics managers and practitioners, must consider the composition and structure of their own ‘Agora’ in terms of how it drives decision quality.

The second part of a series on the role of organizational politics in decision making, this article goes into greater detail concerning the proposed application of Social Network Analysis (SNA) to organizational decision process analysis. The first article, ‘The Business Analytics Achilles Heel: Organizational Politics’ (http://sctr7.com/2013/02/20/achilles_heel/), proposed that unexamined organizational politics, in terms of structural interactions amongst key decision process stakeholders, can easily lead decision processes and analytics initiatives astray. This article goes into greater detail concerning why and how a decision or analytics practitioner might apply SNA to improve insight into organizational decision process quality.

As interactions between organizational decision stakeholders can heavily influence the quality of a decision process, examining the relations between decision actors can offer insight into potential organizational process problems and pitfalls. Throughout the entire chain of a decision process, from problem identification and framing through interpreting and communicating the results of an analytical inquiry, subtle biases and ‘agency interests’ (self-serving tendencies) can creep in and affect process robustness. Examining potential shortfalls and breakdowns in the network of decision stakeholders thus offers an intervention tool for ensuring high-quality deliberations.

Although the term often has a negative connotation, we cannot say that ‘organizational politics’, as a phenomenon in and of itself, is necessarily ‘good’ or ‘bad’. Indeed, organizational politics are the ‘operating system’ upon which organizations run, not something that should, or even can, be ‘programmed out’. As we do have an understanding of ‘good’ and ‘bad’ decision processes, based upon research and organizational best practices, we can, however, propose that there are ‘healthy’ and ‘unhealthy’ interactions between organizational decision stakeholders in terms of process ‘robustness’ or ‘thoroughness’.

Robust organizational politics indicate a conscious attempt to remove hidden bias and undue subjective influence from decision processes. Much as vigorous and healthy parliamentary debate, as opposed to a filibuster or premature closure of discussion, leads to a robust and multi-faceted consideration of the legislation at hand, an organizational political process which welcomes information-enhancing confrontation strengthens decision quality, while one which avoids such confrontation short-circuits it.

As opposed to the misguided notion that organizational politics are inherently bad and must be removed, an organization can, from a broad frame, be considered inherently “politics all the way down”. From this we can propose that, in terms of decision-making process quality (robustness of processes for problem identification, framing, analysis, validation, etc.), there are ‘more’ and ‘less’ healthy political processes. This is certainly visible in the news in the interactions of national governments (within nations and between nations): we see both constructive political decision dialogues and dysfunctional ones.

Some theories of the firm hold that an organization is essentially an assemblage of agents held together, in tension, by incentives (some shared, some selfish), assessment systems, and decision rights accompanied by access to knowledge, some of which is protected. In the HR discipline, this is the basis of ‘organizational architecture’, bound together by Management Control Systems.

In a 2010 Harvard Business Review article, ‘The Decision-Driven Organization’ (http://hbr.org/2010/06/the-decision-driven-organization/ar/1), it is proposed that organizations focus too heavily on formal top-down management structures when they should be more concerned with robust decision processes in terms of the interacting network of stakeholder roles and decision rights (Blenko et al., 2010). The claim is that the fashion for perpetual reorganizations and restructurings misses a fundamental point: it is not the ‘power’ of individuals which drives strategic fortitude, but rather the network of interactions and role-based rights which drives decision efficacy and agility. If we accept this ‘story’, even as a simple useful allegory, we can ‘map’ the ways in which actors interact in order to assess the robustness of decision-focused processes.

A simple yet powerful way to understand the more complex socio-organizational context of organizational actor interactions (as specifically associated with decision processes) is Social Network Analysis (SNA). SNA goes beyond linear understandings of decision processes (i.e. business process maps, information / data flows, governance structures) and results in distributed, network-based ‘maps’ of interacting roles.
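As an illustration, here is a minimal sketch (using the open-source Python library networkx, with a purely hypothetical set of stakeholders) of how a decision process can be captured as a network of interacting roles rather than a linear flow:

```python
# A minimal sketch: representing decision stakeholders and the basis of their
# interactions as a directed network (networkx assumed; names illustrative).
import networkx as nx

G = nx.DiGraph()

# Hypothetical interactions around a single analytics-backed decision:
# who engages whom, and on what basis.
interactions = [
    ("CFO", "Head of Analytics", "requests analysis"),
    ("Head of Analytics", "Data Scientist", "frames problem"),
    ("Data Scientist", "Domain Expert", "validates assumptions"),
    ("Head of Analytics", "CFO", "reports findings"),
    ("COO", "CFO", "challenges framing"),
]

for source, target, basis in interactions:
    G.add_edge(source, target, basis=basis)

print(G.number_of_nodes(), "stakeholders,", G.number_of_edges(), "interactions")
for u, v, data in G.edges(data=True):
    print(f"{u} -> {v}: {data['basis']}")
```

Each edge records not only who interacts with whom but on what basis, which is precisely the shift from a linear process map to a relational one.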

An organizational decision stakeholder map clarifies who interacts with whom, and on what basis. Hubs and spokes, chains of communication, and strong versus weak coalitions, both cooperating and competing, become visible. Coalition competition can be healthy when it builds resilience into decision analysis and analytical interpretation (i.e. a formal ‘shadow CEO’ function, much as the UK parliament has a shadow chancellor or the U.S. Congress has a Minority Whip to focus informed dissent and deliberation in debate). For an executive decision stakeholder, such maps can serve as an intervention guide for protecting against bias (implicit or explicit) and addressing the dangers of hidden agency influence (i.e. empire building, destructive competitiveness, manipulating decision processes or analytics framing).
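To make the notion of hubs and coalitions concrete, a brief sketch (again assuming networkx, with illustrative stakeholder names) shows how centrality measures flag communication hubs and a community-detection pass suggests candidate coalitions:

```python
# A minimal sketch: surfacing hubs and candidate coalitions from a
# stakeholder map (networkx assumed; stakeholders illustrative).
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

G = nx.Graph()
G.add_edges_from([
    ("CFO", "Head of Analytics"), ("Head of Analytics", "Data Scientist"),
    ("Data Scientist", "Domain Expert"), ("CFO", "COO"),
    ("COO", "Operations Lead"), ("Operations Lead", "Domain Expert"),
])

# Hubs: stakeholders who sit on the most communication paths between others.
betweenness = nx.betweenness_centrality(G)
hubs = sorted(betweenness, key=betweenness.get, reverse=True)[:2]
print("Likely hubs:", hubs)

# Coalitions: densely connected clusters of stakeholders.
for i, group in enumerate(greedy_modularity_communities(G), start=1):
    print(f"Coalition {i}: {sorted(group)}")
```

Whether a detected cluster represents a constructive ‘shadow’ function or an unhealthy faction is, of course, a matter for interpretation against the decision process itself.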

The notion is that the organization can be seen as a type of ‘decision-making machine’, albeit one which is quite slow and at times quite flawed (i.e. through the influence of both inherent decision biases and agency forces). To the degree that we consciously map potential process breakdowns at the level of the organizational communication network, we can introduce interventions to overcome decision process shortcomings (i.e. when a process is followed, but the participants are not interacting in a robust way).

Again, given that organizations are inherently ‘political’, decision professionals should recognize that there are ‘healthy’ and ‘unhealthy’ political processes. This is also reflected in business literature and research concerning strategy formation: healthy organizational dialogues and structured analytics are a foundation of high-quality strategic decision making. For a representative survey of recent related articles, see the references at the close of this article.

Now, to get down to ‘brass tacks’, as we say in English (http://en.wiktionary.org/wiki/brass_tacks): how does one get started with SNA for conducting organizational decision structure audits?

Attending a conference and/or training seminar on SNA analysis is a good start for those just beginning. One might consider joining INSNA (http://www.insna.org/index.html), for instance, and taking a training or practicum at an area conference (or networking with local members to locate training opportunities).  Otherwise, for self-starters there is a wealth of resources available on the internet.  Also, several introductory texts are included in the list of references at the close of this article.

For high-level analysis, an SNA analysis and visualization tool is a solid foundation for starting your inquiry. UCINET is regarded as a good all-around choice: https://sites.google.com/site/ucinetsoftware/home .

While one can quickly conduct an analysis and visualize it with impressive results, a good amount of forethought about the type of network being studied should inform the ‘snowball surveys’ that form the basis of data collection (considering the basis of the social connections and roles one wishes to ‘map’).

In the case of decision networks, surveying base sociological data is a start, capturing demographic factors such as gender, tenure, and organizational status. The core SNA data relates to identifying and categorizing ROLES in terms of decision processes: collaborative exchanges (functional, role-mandated) and communicative exchanges (information exchange), as well as purely social connections (affinities and rivalries).
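One simple way to structure such survey responses (a sketch only; the names, attributes, and tie categories below are illustrative) is to store node-level demographics alongside typed ties, so that each layer of the network can later be analyzed separately:

```python
# A minimal sketch: node-level demographic attributes plus typed ties
# (collaborative, communicative, social). Names and values are illustrative.
import networkx as nx

G = nx.MultiDiGraph()  # multiple tie types can coexist between two people

# Demographic / positional attributes gathered in the base survey.
G.add_node("A. Jones", gender="F", tenure_years=7, level="Director")
G.add_node("B. Smith", gender="M", tenure_years=2, level="Analyst")

# Typed ties gathered in the relational portion of the survey.
G.add_edge("A. Jones", "B. Smith", tie="collaborative")   # role-mandated work
G.add_edge("B. Smith", "A. Jones", tie="communicative")   # information exchange
G.add_edge("A. Jones", "B. Smith", tie="social")          # affinity / rivalry

# Filter to a single tie type when analyzing one layer of the network.
collab = [(u, v) for u, v, d in G.edges(data=True) if d["tie"] == "collaborative"]
print(collab)
```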

The RACI matrix method (charting who is Responsible, Accountable, Consulted, and Informed in a decision process) is perhaps a good way to categorize roles in relation to an organizational decision process (see: http://en.wikipedia.org/wiki/RACI_matrix). This forces a strict and focused accounting of who is involved in the decision process, how they relate and participate, and their relative power over the final decision. The relational data (data on the interactions between participants) is gathered via a simple survey of the participants. The resulting feedback, when collated, outlines the central decision agents, specifying the participant roles and their connections to others in a network format. This provides a foundation for identifying ‘social architectural’ strengths and weaknesses in the organizational decision-making network.
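A brief sketch of how RACI tags might be combined with the relational data (all names and assignments hypothetical) to run a simple structural audit check, such as whether every Consulted party is actually connected to the Accountable one:

```python
# A minimal sketch: tagging stakeholders with RACI roles and checking
# whether the network structure matches the intended decision rights.
import networkx as nx

raci = {
    "CFO": "A",                # Accountable
    "Head of Analytics": "R",  # Responsible
    "Domain Expert": "C",      # Consulted
    "Operations Lead": "I",    # Informed
}

G = nx.Graph()
G.add_edges_from([("CFO", "Head of Analytics"),
                  ("Head of Analytics", "Domain Expert")])
nx.set_node_attributes(G, raci, name="raci")

# Simple audit check: is every Consulted stakeholder reachable from the
# Accountable one? (Operations Lead is Informed but not yet in the network.)
accountable = [n for n, role in raci.items() if role == "A"][0]
for person, role in raci.items():
    if role == "C" and person in G:
        print(person, "reachable from", accountable, ":",
              nx.has_path(G, accountable, person))
```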

A snowball survey is the most common approach for active data gathering: a series of three or four surveys sent out within an organization (grounded in a particular analytics initiative or decision process). The first survey asks the core decision stakeholders who they interact with concerning the decision and on what basis (including people perceived as ‘foils’ or ‘rivals’). The second survey goes out to these newly identified people, who are asked the same set of questions. Likewise, third and fourth surveys are distributed to newly identified participants (perhaps more rounds in particularly large and complex organizations). The surveys themselves are low impact, taking perhaps 10–15 minutes per participant to complete. What comes back is a rich set of data showing the composition of the organizational decision network, which can be run through a program such as UCINET for deeper analysis and visualization.
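The snowball logic itself is simple enough to sketch in a few lines of Python (the survey function passed in below is a hypothetical placeholder for whatever survey instrument is actually used):

```python
# A minimal sketch of the snowball rounds: each wave of participants is asked
# who they interact with on the decision; anyone newly named joins the next wave.
def snowball(core_stakeholders, survey, max_rounds=4):
    surveyed = set()
    edges = []                      # (respondent, named contact, basis of tie)
    wave = set(core_stakeholders)   # round 1: the core decision stakeholders

    for _ in range(max_rounds):
        next_wave = set()
        for person in wave:
            surveyed.add(person)
            for contact, basis in survey(person):   # who, and on what basis
                edges.append((person, contact, basis))
                if contact not in surveyed:
                    next_wave.add(contact)
        if not next_wave:           # network exhausted before max_rounds
            break
        wave = next_wave
    return edges
```

The resulting edge list can then be loaded into UCINET or a comparable package for analysis and visualization.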

Passive data collection is also possible: tracking phone calls, email, SMS messages, etc. to build the relational network. The challenge here, however, is getting permission to access this information: not every corporation will simply allow one to scan its email server to build an SNA map. The proliferation of corporate communication tools and devices also makes this more complicated (people may be doing quite a lot of communication outside email alone, making a focus on email potentially self-limiting).
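Where permission has been granted, a sketch of passive collection from an exported mailbox might look as follows (Python standard library only; the file path is illustrative, and anonymizing or hashing addresses may be required in practice):

```python
# A minimal sketch: building a weighted sender -> recipient edge list from an
# exported mbox file, assuming access has been approved.
import mailbox
from collections import Counter
from email.utils import getaddresses

ties = Counter()
for msg in mailbox.mbox("decision_team_export.mbox"):  # illustrative path
    senders = [addr for _, addr in getaddresses(msg.get_all("From", []))]
    recipients = [addr for _, addr in
                  getaddresses(msg.get_all("To", []) + msg.get_all("Cc", []))]
    for s in senders:
        for r in recipients:
            if s and r and s != r:
                ties[(s, r)] += 1   # tie weight = number of messages sent

# The weighted counts can then be exported as an edge list for an SNA tool.
for (sender, recipient), weight in ties.most_common(10):
    print(sender, "->", recipient, weight)
```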

There are increasing uses of specialty tools to drive decision processes. For instance, Yammer (https://www.yammer.com/) offers a private, Facebook-style platform for corporations. Such platforms can offer an excellent foundation for passive SNA data collection.

In complex, multi-staged decisions, there may be several social networks involved (each grounded in a different ‘phase’ of a staged decision process). In this case, the series of social decision networks begins to resemble something quite familiar: a neural network. The degree to which staged social decision networks can be formally optimized to build a better organizational decision-making ‘brain’ will be examined in a future article. Again, if we think of an organization as a type of flawed decision-making computer, we can consider building redundancy and robustness into the ‘organizational brain’ via formal mathematical techniques for architecting neural network-based decision methods. This is not science fiction, merely a recasting of our perspective on the nature of an organization as a network of agents, rather than a fractious and flawed mob of argumentative and self-centered individuals. If we cast off the notion that ‘politics are inherently perverse’, we get closer to the ability to engineer organizational politics for strategic decision-making fortitude.

References:  Selected Works Related to Decision Management, Analytics, and SNA

  • Blenko, M. W., M. C. Mankins, et al. (2010). “The Decision-Driven Organization.” Harvard Business Review.
  • Carrington, P. J., J. Scott, et al., Eds. (2005). Models and Methods in Social Network Analysis. New York, Cambridge University Press.
  • Davenport, T. H. and J. G. Harris (2007). Competing on Analytics: The New Science of Winning. Boston, MA, USA, Harvard Business School Press.
  • Davenport, T. H., J. G. Harris, et al. (2010). Analytics at Work: Smarter Decisions, Better Results. Boston, MA, USA, Harvard Business Review Press.
  • Grant, R. M. (1997). “The Knowledge-based View of the Firm: Implications for Management Practice.” Long Range Planning 30(3): 4.
  • Kaner, M. and R. Karni (2004). “A Capability Maturity Model for Knowledge-Based Decision making.” Information Knowledge Systems Management 4: 27.
  • Kiron, D. and R. Shockley (2011). “Creating Business Value with Analytics.” MIT Sloan Management Review 53(1): 10.
  • Kiron, D., R. Shockley, et al. (2011). “Analytics: The Widening Divide.” MIT Sloan Management Review (Special Report): 21.
  • Knoke, D. and S. Yang (2008). Social Network Analysis. London, SAGE Publications.
  • LaValle, S., M. S. Hopkins, et al. (2010). “Analytics: The New Path to Value.” MIT Sloan Management Review: 22.
  • LaValle, S., E. Lesser, et al. (2011). “Big Data, Analytics and the Path From Insights to Value.” MIT Sloan Management Review 52(2): 13.
  • Nutt, P. C. (2002). Why Decisions Fail: Avoiding the Blunders and Traps that Lead to Debacles. San Francisco, CA, USA, Berrett-Koehler.
  • Tan, C.-S., Y.-W. Sim, et al. (2011). “A Maturity Model of Enterprise Business Intelligence.” Communications of the IBIMA 2011: 11.

