University Technology Commercialization

Summary and Keywords

For decades researchers have studied various aspects of the technology transfer and commercialization process in universities in hopes of discovering effective methods for enabling more research to leave the university as technologies that benefit society. However, this effort has fallen short, as only a very small percentage of applied research finds its way to the marketplace through licenses to large companies or to new ventures. Furthermore, the reasons for this failure have yet to be completely explained.

In some respects, this appears to be an ontological problem. In their effort to understand the phenomenon of university commercialization, researchers tend to reduce the process into its component parts and study each part in isolation. The result is conclusions that ignore a host of variables that interact with the part being studied and frameworks that describe a linear process from invention to market rather than a complex system. To understand how individuals in the technology commercialization system make strategic choices around outcomes, studies have been successful in identifying some units of analysis (the tech transfer office, the laboratory, the investment community, the entrepreneurship community); but they have been less effective at integrating the commercialization process, contexts, behaviors, and potential outcomes to explain the forces and reciprocal interactions that might alter those outcomes.

The technology commercialization process that leads to new technology products and entrepreneurial ventures needs to be viewed as a complex adaptive system that operates under conditions of risk and uncertainty with nonlinear inputs and outputs such that the system is in a constant state of change and reorganization. There is no overall project manager managing tasks and relationships; therefore, the individuals in the system act independently and codependently. No single individual is aware of what is going on in any other part of the system at any point in time, and each individual has a different agenda with different metrics on which their performance is judged. What this means is that a small number of decision makers in the university commercialization system can have a disproportionate impact on the effectiveness and success of the entire system and its research outcomes.

Critics of reductionist research propose that understanding complex adaptive systems, such as university technology commercialization, requires a different mode of thinking—systems thinking—which looks at the interrelationships and dependencies among all the parts of the system. Combined with real options reasoning, which enables resilience in the system to mitigate uncertainty and improve decision-making, it may hold the key to better understanding the complexity of the university technology commercialization process and why it has not been as effective as it could be.

Keywords: technology commercialization, complex adaptive systems, complexity theory, tech transfer, real options, complexity, uncertainty, university

Introduction

As the birthplace of some of the world’s greatest inventions and innovations, research universities have long been a rich source of supply for entrepreneurs seeking to exploit opportunities that might otherwise lie dormant in a research lab. At the same time, research universities have awakened to the possibilities for reputational and financial rewards from translating research into goods and services. As a result, they have invested in people and processes that can make that happen.

For decades researchers have studied various aspects of the commercialization process in the university environment in the hope of identifying effective methods for enabling more research to leave the university as products and services that benefit society. Sadly, that effort has fallen short. Only a relatively small percentage of applied research finds its way to the marketplace, and that is the case not only in the United States but around the world (Bercovitz & Feldman, 2008).

The Association of University Technology Managers (AUTM) reported on the gross licensing income generated by the top university earners from 2003–2012 (155 universities reported income). What they found was not encouraging:

  • The distribution of licensing income is a long-tail curve with the top 5% of earners—eight universities—earning 50% of the total licensing income.

  • The top 10%, or 16 universities, earn about 75% of total licensing income.

  • The top half of the 155 universities that reported licensing income control 90% of research dollars as well as licensing income (“AUTM Licensing Survey,” 2013).

With these results, it is not surprising that over a period of 20 years, about 87% of Technology Transfer Offices (TTOs) did not break even, causing significant problems for commercialization efforts. Although universities have in place the systems and people needed to produce effective commercialization of research, the results continue to be dismal at best.

For researchers, university technology commercialization became a new field of study with much in common with research on entrepreneurship. As a new field focused on startups, entrepreneurship had to justify its existence, and it did that in two ways: (1) conducting empirical research and (2) borrowing theory from other disciplines such as sociology and strategy. Some scholars have argued that the research produced in entrepreneurship appears to be nothing more than “a collection of researchers studying similar phenomena from disparate disciplinary origins” (Aldrich, 2015). Others have proposed that the fundamental issues in the field are often obscured by the fragmentation of perspectives and theories stemming from the failure to connect all the dots (Anderson, Dodd, & Jack, 2012). Still others have engaged in reductionism in their pursuit of the truth. As humans we tend to simplify phenomena in an effort to understand them; when faced with two hypotheses of equivalent predictive power about a given phenomenon, the tendency is to choose the more elegant, simpler one.

Fragmentation of research is inherently an ontological problem: it is the challenge of understanding the existence or reality of a phenomenon, whether that is entrepreneurship or technology commercialization and its various components. The easiest way to tackle the problem is to break it into its component parts and study each in the hope of arriving at meaningful conclusions that advance the field. This type of research typically produces frameworks that purport to explain how the commercialization process works. Over the years, a multitude of frameworks have been proposed, most of which view commercialization as a linear process from invention to market. These frameworks consider decisions, actions, and success metrics, though a few have also looked at drivers, capabilities, and implementation mechanisms in an effort to provide a new perspective (Manoukian, Hassab Elnaby, & Odabashian, 2015). While there is nothing inherently wrong with the existing compartmentalized body of research and the frameworks that have resulted, we lose the insights that might have emerged from cross-disciplinary collaboration and from studying more complex questions about what we shall call the university commercialization system (UCS).

Key research themes in the UCS literature have included (1) the embryonic nature of university technology; (2) university policies that help or hinder technology transfer; (3) the challenge of brain drain and the associated back-door commercialization; (4) technology transfer office productivity; (5) academic freedom to publish versus the need to protect intellectual property; (6) new firm creation through university spinoffs; and (7) research parks, incubators, and accelerators. In addition, some studies have recognized that the different phases of commercialization have different objectives (Parker & Mainelli, 2001). We commonly refer to three phases:

  • Scientific phase, where discovery of something novel occurs—an invention.

  • Development phase, where the discovery becomes an innovation with potential applications. This phase is often referred to as the “valley of death” to reflect the enormous challenges associated with product development.

  • Exploitation phase, where the value of the innovation is delivered to the customer. This is known as the go-to-market phase.

Within these phases, it is common to study elements or aspects in a vacuum, such as tech transfer office productivity or processes for project vetting. However, rarely do studies look at interactions among the various elements in the broader system that might yield new knowledge and different outcomes. For example, we might consider the effect of university policy on technology transfer effectiveness and draw conclusions from that research without ever looking at the potential interactional effects of reward systems for faculty or the availability of seed funding for prototype development. Much of what is studied rests on a “paradigm of simplification, compartmentalization, and boundaries, when we need a paradigm of diversity, complexity, relationships, and process” (Brockman, 2015).

In discussing an analogous problem in the field of entrepreneurship, Aldrich and Martinez argued that “… feedback from outcomes modifies entrepreneurs’ strategies, which, in turn, alter the likelihood of achieving a new outcome” (Aldrich & Martinez, 2001). In other words, complex systems are evolutionary, not static; they are teeming masses of reciprocal relationships that are influenced by endogenous and exogenous changes and adaptations, most of which cannot be predicted. Any change in any area of the system has the potential to alter the outcome of a commercialization effort. Given that uncertainty and change are constants, the system must adapt. Therefore, it may be that the reason for the lack of clear evidence to support the effectiveness of the UCS is an epistemological problem: we are studying technology commercialization the wrong way.

Complex Adaptive Systems

The UCS is an example of a complex adaptive system. Complexity theory (CT) is the study of nonlinear dynamic systems. Its origins lie in physics and mathematics, but its application has grown to encompass biological systems, networks, and economic and social systems such as the UCS. Applying CT to business, specifically the behavior of organizations and firms within networks, has been the subject of much interest (Lewin, 1999).

Complex adaptive systems (CAS) are dynamic systems capable of adapting and interacting with changing environments, especially in cases of uncertainty, to produce new and unpredictable patterns and outcomes. If a CAS were simply complicated, there would be rules by which we could assign probabilities and assess risk. We would be able to understand all the elements in the system and how they interact. However, a CAS is a dynamic, interconnected whole that defies simple explanation and cause and effect relationships. “From an evolutionary point of view, an adaptive organization is like a ship on the open sea that has to rebuild itself [while] staying afloat” (Fontana & Ballati, 1999).

CAS is a subset of the larger field of complexity theory. It is interesting to note that researchers cannot agree on one clear definition of complexity (Wallis, 2009); rather, a number of theories about complex systems come together under the broad classification of complexity research (Pollack, Adler, & Sankaran, 2014). The theoretical basis for CT derives from the natural sciences—such as physics, biology, chemistry, and mathematics—and generally refers to the study of phenomena emerging from a system with many interacting parts. Management theorists continue to search for ways to effectively apply complexity theory to organizations while at the same time recognizing that it is not applied in management research in quite the same way that it is applied in mathematical and computer science research (Pollack et al., 2014).

Because a CAS introduces dynamism and uncertainty into the mix, it has been referred to as “the ultimate interdisciplinary science” (McKelvey, 1999). The dynamism of a CAS creates the opportunity for new discoveries that enable the system to grow and evolve in ways that can’t be foreseen deterministically. Nor can these changes be orchestrated by a manager or project leader. Any new order at the macro level emerges from microprocesses at multiple levels that involve behaviors, interactions, and feedback loops. Because of the complexity of the interactions, the ability to distinguish between dependent and independent variables for research purposes is diminished, if not rendered irrelevant (Lewin & Volberda, 1999). As a dynamic approach, CAS theory overcomes the limitations of social network theory, which generally focuses on the static structure of the network (Dominici & Levanti, 2011). Behavioral psychology tells us that the multiple individuals within the system have their own objectives and aspirations, and because each is operating in a different context, it is nearly impossible to predict the outcome of any interactions among them. What exacerbates the complexity of the system is the fact that individuals make decisions about aspects of the system that are not only rational but also emotional (Foster, 2000). Therefore, for the system to work, the various individuals in it must find ways to organize the complexity. The logic of a CAS is both holistic and multilevel and, as such, offers a tremendous opportunity to increase the effectiveness and speed of the processes within the system if complexity can be managed (Dominici & Levanti, 2011).

Complex adaptive systems have several attributes that apply in the context of university technology transfer and commercialization under conditions of uncertainty, and they are presented in Table 1 (Allen, 2012).

Table 1. The CAS attributes of university technology transfer and commercialization (each complex adaptive systems attribute is paired with its manifestation in university technology transfer and commercialization)

  • Environmental Uncertainty: Inability to predict the external environment or the outcomes of technology transfer efforts in that environment.

  • Autonomous but Interdependent Components: Multiple organizations within and outside the university that perform specific functions within the process and that change and evolve without influence from the external environment.

  • Emergent: New institutional forms, behaviors, and outcomes develop as a result of stimuli from the internal and external environments as well as the collective behavior of individuals within the system.

  • Self-Organizing: Unplanned creation of activities resulting from internal stimuli such as learning, improvement, and process variation.

  • Complexity: Multiple, nonlinear inputs and outputs that coevolve over time.

  • Path Dependence: Legacy policies, behaviors, and other stimuli embedded in the process.

  • Operational Openness and Closure: The need for the university to engage in the mutual exchange of both open and closed resources, knowledge, and capabilities with the external environment to achieve its goals.

The following sections examine these attributes in more depth.

UCS Risk and Uncertainty

De Meyer, Loch, and Pich (2002) identified four types of uncertainty that typically are found in any project:

Variation

Variation can be seen in the deviations in such things as schedules and budgets that may affect the critical path of a project moving through the UCS. Generally, these variations are small in nature and are normally monitored for time and cost.

Foreseen Uncertainty

Uncertainty that is identifiable but not certain to occur has been called “known unknowns.” This type of uncertainty requires having alternatives that can be brought to bear quickly should the uncertainty materialize. For example, whether the United States Patent and Trademark Office (USPTO) issues a patent on a particular application is a foreseen uncertainty, one that requires having a Plan B in place for going to market without patent protection should the application be rejected.

Unforeseen Uncertainty

Uncertainty that cannot be identified a priori is called “unknown unknowns.” For this type of uncertainty there is no preexisting contingency plan, so both the event and its outcomes are uncertain. It can also occur when people and things that normally aren’t in contact are suddenly juxtaposed, producing an unexpected or surprising outcome with huge impact. For example, Botox was originally used to treat eye muscle disorders when it was discovered to be effective as a cosmetic treatment for wrinkles; years later it was also found to be effective in treating chronic migraine and incontinence.

Chaos

Chaos is a type of uncertainty that results in a system ending up in a completely different place from where it started. A defining attribute of chaotic systems is that a minuscule change in the initial system conditions can alter the behavior and outcomes of the entire system. Weather is an example of a chaotic system, although the reality is that most systems are chaotic to some degree.

If viewed as a complex adaptive system rather than as a set of linearly connected activities, the UCS has much in common with other large-scale efforts such as manufacturing or development programs in terms of the levels of uncertainty and risk. The decision makers of the UCS must make a number of strategic decisions around such things as research investments, patenting, prototype seed funding, business support, and incubation, most of which involve variation, foreseen uncertainty, and unforeseen uncertainty. The chaotic aspect of the UCS derives from the fact that it is people who are making all the decisions in the system. How those decisions are made has important implications for the success of the commercialization effort, an effort that entails both risk and uncertainty. The UCS faces a great deal of uncertainty from decisions made by individuals throughout the internal and external parts of the system. These individuals are difficult to identify and control, but we know that they have individual agendas that may impede the commercialization process, and their decisions can often exert a major impact on the outcomes of the entire UCS. For example, one of the common metrics of success for the TTO is the number of licenses granted to large companies and to entrepreneurs who start new ventures. While TTO organizational effectiveness is critical to UCS success, its efforts are frequently hampered by individual decision makers in other parts of the system. Some studies have found, for instance, that researchers are less likely to engage in commercialization activities if their department chair is not active in that effort and the effort is not rewarded in terms of promotion and reputation (Bercovitz & Feldman, 2008). Because of the nature of the reward system at universities, researchers typically regard sponsored research as a more significant motivator than licensing income (Thursby, Thursby, & Gupta-Mukherjee, 2007). Furthermore, those who desire to commercialize their research may attempt to avoid the bureaucracy of the TTO altogether, using “back-door” approaches to commercialization such as trade secrets or spin-off ventures with technology that doesn’t require patents (Shane, 2004).

In the UCS, there is no overall project manager managing tasks and relationships, because it would be impossible for any single person to oversee the entire system (McDaniel, 2007). The individuals in the UCS are scattered across the university as well as outside it. Each acts independently and codependently as the need for the individual’s efforts arises. No single individual is aware of what is going on in any other part of the system at any point in time; rather, individuals focus solely on what is happening in their local environment.

TTOs have many tools at their disposal to manage and predict risk within whatever level of risk tolerance they might have. Often TTOs are faced with making very early-stage bets on research developed at the university. A scientist submits a disclosure document on his or her research in the hope of eventually filing for patents. The TTO will make a relatively small, low-risk bet by filing for a provisional patent. However, when it comes time to file the nonprovisional patent one year later, a much riskier proposition, the decision is often made on relatively little market information and certainly without any assurances that a patent will be granted. In fact, the highest probability may be that the patent won’t be granted, based on USPTO statistics. Nevertheless, the TTO must make a judgment about the probability that the research will be granted a patent and thereby be available for commercialization through a license agreement. Then it must predict what the value of that license agreement will be. What the TTO has in its arsenal for determining these probabilities are the historical patterns of previously filed applications in the particular area under consideration. This is problematic, as the work of Tversky and Kahneman has demonstrated. It is likely the TTO will suffer from representativeness bias in attempting to calculate the probability of successfully bringing the research to the commercialization stage (Tversky & Kahneman, 1974). This means that licensing officers will exhibit great confidence in their predictions about the probability of securing a patent if the new application fits the stereotypical application for a technology that either succeeded or failed in the past. Finding a good fit between the input information and the predicted outcome, licensing officers move forward under “the illusion of validity” (Tversky & Kahneman, 1974). The reality is that the greater the number of redundant and correlated inputs they rely on, the greater the inaccuracy of the prediction, because they can’t control for all the dynamic variables in the system.
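A simple way to see the danger of judging by resemblance alone is to weigh the historical base rate of patent grants against the strength of the "fit" to past winners. The sketch below is a hypothetical illustration (the allowance rate and likelihood figures are invented, not USPTO data) of how Bayes' rule tempers the confidence that representativeness alone would suggest.

```python
# Hypothetical illustration of representativeness bias in patent-grant predictions.
# All figures are invented for the example; none come from USPTO data.

def posterior_grant_probability(base_rate, p_fit_given_grant, p_fit_given_reject):
    """P(grant | the new disclosure resembles past successful filings), via Bayes' rule."""
    p_fit = (p_fit_given_grant * base_rate
             + p_fit_given_reject * (1.0 - base_rate))
    return p_fit_given_grant * base_rate / p_fit

base_rate = 0.5           # assumed historical allowance rate in this technology area
p_fit_given_grant = 0.8   # share of granted applications that "looked like" winners
p_fit_given_reject = 0.4  # share of rejected applications that also looked like winners

posterior = posterior_grant_probability(base_rate, p_fit_given_grant, p_fit_given_reject)
print(f"Confidence based on resemblance alone: {p_fit_given_grant:.2f}")
print(f"Probability once the base rate is weighed in: {posterior:.2f}")  # ~0.67
```

The gap between the two numbers is the overconfidence that representativeness bias produces when the base rate is ignored.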

The UCS as Complex, Emergent, and Self-Organizing

The university environment is characterized by a loosely connected group of schools and other entities that are weakly integrated and not subject to hierarchical rigors and authoritarian decision making. In fact, there is no central authority that can impose decisions on the university as a whole; instead, decisions are negotiated or incentivized to encourage the diverse population of faculty, researchers, students, and staff—each group with a different agenda—to cooperate and collaborate. Consequently, the university suffers from asymmetric accountability—many people are responsible for the success and failure of multiple efforts within its system, so no one person can be held accountable for negative outcomes from the system as a whole. It is within this complex environment that the UCS is embedded.

As a CAS, the UCS is faced with a multiplicity of nonlinear inputs and outputs such that the system is in a constant state of change and reorganization. Consequently, a type of emergent order or self-organizing arises “naturally” out of those interactions (Tsoukas & Chia, 2002). In this emergent environment, it is difficult, if not impossible, to do strategic planning or even contingency planning, because any change in any part of the system precipitates new learning and insights that can ultimately transform the entire system. In this type of environment it is also difficult to know when to abandon a commercialization effort because the chance of success is too small, even though such abandonment decisions are essential for risk mitigation (De Meyer et al., 2002). Therefore, we can argue that the UCS’s primary job is to manage emergent behavior in a way that creates increased order.

Studies have found that emergent behavior displays power law outcomes. In a study that analyzed 46 variables related to entrepreneurial resources, cognition, action, and environments, researchers found that 95% of the variables distribute on a Pareto curve instead of the often-assumed normal or Gaussian curve (Crawford, Aguinis, Lichtenstein, Davidsson, & McKelvey, 2015). What this means is that a small number of components in the UCS, whether resources or activities, have a disproportionate impact on the entire system and its research outcomes. That is why lack of incentives to commercialize can diminish the potential pipeline of innovation and disrupt the flow of research to commercial markets. Similarly, a risk-avoidance culture in the TTO or legal department can stall or even disrupt the progress of an innovation and cause it to miss its optimal window of opportunity.
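The concentration implied by a power law can be illustrated with a small simulation. The sketch below uses invented distribution parameters (it does not reproduce the Crawford et al. data) to contrast a heavy-tailed Pareto variable with a thin-tailed normal one: under the power law, the top few percent of components account for a large share of the total effect.

```python
# Small simulation (hypothetical parameters, not the Crawford et al. data) contrasting
# a Pareto-distributed variable with a normally distributed one. Under the power law,
# a handful of components carry a disproportionate share of the total.

import numpy as np

rng = np.random.default_rng(42)
n = 1_000

pareto = rng.pareto(a=1.2, size=n) + 1                 # heavy-tailed, invented shape
normal = np.abs(rng.normal(loc=10, scale=2, size=n))   # thin-tailed comparison

def share_of_top(values, pct=0.05):
    """Fraction of the total contributed by the top `pct` of components."""
    k = max(1, int(len(values) * pct))
    top = np.sort(values)[-k:]
    return top.sum() / values.sum()

print(f"Pareto: top 5% contribute {share_of_top(pareto):.0%} of the total")
print(f"Normal: top 5% contribute {share_of_top(normal):.0%} of the total")
```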

Catastrophe, Chaos, and the UCS

One can argue that in a CAS the whole may be more than, or substantially different from, the sum of its parts. Therefore, studying one or more parts ignores the effects of interactions among the parts as well as interactions with variables outside the system that may cause the system to reach a tipping point and shift from one stable state or mode of behavior to another. Catastrophe theory has been used as a conceptual framework to define the impact that evolutionary and revolutionary change might have on a system such as a company, an industry, or, in this case, a technology commercialization ecosystem at a university (Baack & Cullen, 1992). Catastrophe theory argues that relatively small events can trigger a systemic impact, but it is difficult to determine which variable or variables are responsible for actually triggering the event. This is often the case with disruptive technology, which faces a difficult path to mass adoption because no market exists when it is conceived. Success in penetrating the early adopters is overshadowed by the challenge of getting beyond them, crossing the chasm to reach the mainstream adopters with practical applications of the technology. To achieve mass adoption, entrepreneurs often get their products into as many different markets and applications as possible in an effort to hedge their bets, because there is no way to know for certain which market or application will trigger a movement toward mass adoption, that is, the sudden shift described by catastrophe theory.

Chaos theory is a complement to catastrophe theory. First described by Lorenz (1963), chaos tends to occur in nonlinear models such as complex adaptive systems, where a very small change in the initial conditions can produce an outsized and unpredictable or seemingly random change in the outcomes of an event or process. Arguably, that is one of the most important achievements of chaos theory: the idea that a set of deterministic relationships or interactions has the ability to produce new patterns with unpredictable outcomes. In the case of the UCS, for example, the discovery of a competing research program at another university or a failure to secure funding to complete the killer experiment that will lead to investor interest can derail the predicted outcome of the commercialization effort.
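Sensitivity to initial conditions is easiest to see in a textbook toy model rather than in the UCS itself. The sketch below iterates the logistic map, a standard illustration of deterministic chaos, from two starting values that differ by one part in a million; within a few dozen steps the trajectories bear no resemblance to each other.

```python
# Textbook illustration of sensitivity to initial conditions (the logistic map).
# This is a generic demonstration of deterministic chaos, not a model of the UCS.

def logistic_trajectory(x0, r=3.9, steps=50):
    """Iterate x_{n+1} = r * x_n * (1 - x_n), a deterministic but chaotic rule."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.200000)
b = logistic_trajectory(0.200001)  # a minuscule change in the starting condition

for step in (0, 10, 30, 50):
    print(f"step {step:2d}: {a[step]:.6f} vs {b[step]:.6f}")
```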

Those who have applied CT and its subsets, chaos and catastrophe theory, to the study of organizations have undertaken two broad approaches. One approach is faithful to the original scientific methodology, employing advanced mathematics and computer simulations to capture patterns of behavior in a system. The second approach employs a more social-science-based methodology with longitudinal studies, ethnography, and analogies (Stacey, 1995). Both approaches seek to understand networks of causation in dynamic systems.

The UCS and Path Dependence, Openness, and Closure

The last two characteristics of a complex adaptive system, path dependence and openness and closure, represent the ongoing challenge of a CAS. The UCS is plagued with legacy subsystems in the form of policies, behaviors, and resources that are difficult to change, often for the reason that “if it’s not broken, don’t fix it.” The fallacy in this path-dependent viewpoint is that a legacy subsystem may appear to work if studied in a vacuum, but it can turn out to be a significant bottleneck for the larger system if examined in context. For example, at one university, the legal department saw itself as effective in terms of creating and executing strong contracts for technology licensing. However, the TTO was frustrated that it couldn’t complete its work with potential licensees in a timely fashion. Licensees were also irritated because the process always stalled in the legal department, which was usually backlogged.

The Theory of Constraints argues that at any moment, every organization has at least one constraint that prevents it from achieving its goal (Goldratt & Cox, 2004). In a typical UCS, there is a great deal of work in process in the various agencies within the system. Every step in the commercialization process requires a different amount of time, and each technology has different needs relative to those processes. University systems rarely have all the resources they require to be as efficient as possible, and there are no incentives for efficiency when it comes to the commercialization process. So when a problem occurs, it is solved only for the part of the process where it occurred, and that leaves the entire system unbalanced. Goldratt argues that every system has a weakest link, which he calls the constraint, and that weak link is the drum that beats the cadence for the entire system. That is why any system can do no better than its weakest link. It is common for the weak links to involve intangibles such as patents, contracts, and markets. But systems also have conflicts. In the case of the UCS, the conflict may appear as a tension between the demand that researchers publish to earn a degree or move up the tenure ladder and their desire to disclose inventions and patent them. The TTO may be measured on the number of disclosures it receives, so it puts pressure on researchers to disclose before valid claims can be identified and protected.
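Goldratt's point that a system can do no better than its constraint can be made concrete with a toy throughput calculation. The stage names and annual capacities below are hypothetical, not drawn from any actual TTO; the sketch simply shows that improving a non-constraint stage leaves overall throughput unchanged, while elevating the constraint raises it.

```python
# Toy model (hypothetical stage names and capacities) of the Theory of Constraints:
# system throughput is set by the slowest stage, the constraint.

stages = {                     # technologies each stage can process per year
    "disclosure_review": 120,
    "patent_filing": 60,
    "legal_contracting": 25,   # the constraint in this example
    "license_marketing": 80,
}

def throughput(capacities):
    """The system moves no more technologies per year than its slowest stage."""
    return min(capacities.values())

print("Baseline throughput:", throughput(stages))                        # 25

faster_marketing = {**stages, "license_marketing": 200}
print("After boosting a non-constraint:", throughput(faster_marketing))  # still 25

faster_legal = {**stages, "legal_contracting": 50}
print("After elevating the constraint:", throughput(faster_legal))       # 50
```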

The need to engage in a mutual exchange of knowledge, capabilities, and resources is critical to the work of a CAS and is referred to in the literature as openness and closure. Openness describes the system structure and how it engages with the external environment, while operational closure is simply how the system is organized within a set of boundaries that define it (Dominici & Levanti, 2011). Therefore, it is entirely possible that a system such as the UCS can be dynamically open and responsive to the external environment and, at the same time, operationally closed in that changes can occur that are not precipitated by the external environment. The success of a UCS is a function of its ability to adapt to changes in both the internal and external environments while maintaining its core identity as a system.

Systems Thinking and Real Options Reasoning in the UCS

Decision-making in complex adaptive systems operating under high degrees of risk and uncertainty requires a different mode of thinking, namely, systems thinking, which focuses on the interrelationships and dependencies between and among all the parts of the system. The UCS can thus be viewed as a sociocultural system with multiple interrelationships that create complexity. In any system there are internal constraints stemming from the system architecture and constraints imposed upon the system by external factors. In the case of the UCS, some of the internal constraints have already been discussed: TTO effectiveness, reward systems, and business support programs, to name a few. External factors that impact the effectiveness of the UCS include government policies, the needs and requirements of the investment community, the availability of licensees, and the economic climate. It has been suggested that the ability to avoid performance problems in a CAS, such as the UCS, is enhanced by introducing flexibility or options into the system (Madni & Jackson, 2011). This type of flexibility enables the UCS to stage investment and deployment decisions, as well as to change the process and its components as needed to reflect the influence of new information.

Real Options Reasoning (ROR) is a framework that enables the introduction of flexibility into a system such as the UCS. ROR creates the right but not the obligation to pursue a potential action at a predetermined cost (the risk) for a predetermined time (McGrath, Ferrier, & Mendelow, 2004). Therefore, a decision maker can look at both the technical side of the commercialization process and the business side and make informed decisions based on available structural and resource flexibility. Any part of the UCS can pause to wait for new information or make early investments to test the waters and move the research along the path. The UCS has the option to stage the commitment, change the inputs, alter the scale of the commitment, or abandon the option. ROR operates under the presumption that decision points in the UCS process have to deal with information asymmetries, legacy processes, and uncertainty; it is thus fundamentally an exploratory process with a sense-and-respond approach that can overcome the inherent behavioral biases of system decision makers (Miller & Arikan, 2004). Furthermore, studies have found that real options decision-making in complex systems produces higher-impact outcomes, largely because the interrelationships and interdependencies are taken into account (Kogut & Kulatilaka, 1994).

As a strategy, ROR is grounded in three broad research areas: (1) the resource-based view of the organization, (2) basic organizational theory, and (3) complex adaptive systems theory (Kogut & Kulatilaka, 2001). Organizations have long been viewed as complex, sociocultural systems able to self-organize to adapt to uncertainty and reorganize to achieve desired outcomes. Unlike financial options, real options typically do not involve contracts. Instead, there is an initial decision that opens up the opportunity to make another decision at a later point in time. That later decision is made based on the new information gathered in the interim. At both points there is a cost associated with the decision, whether that be a financial cost, a time cost, an opportunity cost, or the cost of risk.
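The basic logic of a real option can be shown with a back-of-the-envelope expected-value comparison. All of the figures below are hypothetical, and the sketch assumes, optimistically, that the small first step reveals enough to abandon the losers; it is meant only to show why paying a little for the right to decide later can beat committing everything up front.

```python
# Hypothetical expected-value comparison: full commitment now versus a staged option.
# All figures are invented; the option is assumed to reveal whether the technology
# is commercially viable before the larger commitment must be made.

P_SUCCESS = 0.2        # prior probability the technology reaches a paying license
PAYOFF = 1_000_000     # value of a successful license
FULL_COST = 300_000    # cost of full patenting and marketing, committed up front
OPTION_COST = 20_000   # cost of a small first step (e.g., provisional filing, market study)

# Strategy A: commit the full cost immediately, before anything is learned.
ev_full = P_SUCCESS * PAYOFF - FULL_COST

# Strategy B: buy the option, learn, then spend the full cost only on the winners.
ev_option = -OPTION_COST + P_SUCCESS * (PAYOFF - FULL_COST)

print(f"Expected value, full commitment now:  {ev_full:>10,.0f}")    # -100,000
print(f"Expected value, staged via an option: {ev_option:>10,.0f}")  #  120,000
```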

Using ROR Strategies for Resiliency

Optimizing for efficiency, the common approach in traditional management practices, is not suitable for a CAS in highly uncertain environments. While in theory it may be possible to optimize the operations of, say, the TTO or the legal department, there will likely be unintended consequences at the system level due to interdependencies among all the decision makers in the UCS (Sargut & McGrath, 2010). That is why an ROR approach using systems thinking holds such potential. ROR exploits the value of uncertainty in a system, while systems thinking ensures that all elements and agents in the system are considered, that all the loops are closed with respect to relevant variables, and that the real options are conceptualized correctly and introduced appropriately into the system architecture.

To achieve desired outcomes, a CAS requires resilience and flexibility, which can be accomplished using ROR. One method for assessing the current resilience and flexibility in a CAS such as the UCS is to consider the UCS’s policies around abandonment of an effort. Because ROR decisions and commitments are typically made very early in the life of a technology being considered for commercialization, it is impossible to determine ex ante which path will yield the best result at a particular moment in time. In other words, we don’t know the precise window of opportunity for any technology in play, so high levels of uncertainty are created around where to place bets and for how long. Moreover, it is not economically reasonable to hold all options open until sufficient information is gathered to reduce uncertainty. Therefore, it is important to consider how a UCS makes the decision to abandon a commercialization effort, whether that be early on in the laboratory, at the point of disclosure to the TTO, during market research, or whenever critical information is acquired that suggests the probability of a good outcome is too low to justify continuing.

ROR can be deployed in a number of ways to account for environmental influences on decision-making relative to technologies going through the university commercialization process. Four approaches serve as possibilities, although other combinations can be created: immediate entry, immediate exit, delayed entry, and delayed exit (Janney & Dess, 2004).

Immediate Entry

This strategy is relatively low risk and entails spending a small amount of money to secure the right to enter the market and pursue a greater commitment at a later time. For the UCS, an example would be the TTO filing a provisional patent to secure first-to-file status. This approach is often used to prevent others from going down the same path or securing valuable resources. Immediate entry can enable the UCS to influence the development of standards in an industry or to secure a market niche ahead of potential competitors. Carrying relatively little risk, this approach offers an opportunity to gather more information within a protected time period before making a more substantial investment.

Immediate Exit

In this all-in approach, a UCS decision maker would use the first decision on a commercialization strategy to make a full commitment with the right to reverse that decision in the future. It is typically used in situations where a full complement of resources is required to gain any usable information; in other words, there is no way to make a more limited investment in the decision. UCSs that have pools of seed funding for prototyping often have to make a fairly costly investment in early-stage research to learn whether commercialization is even a possibility. The UCS could stipulate that if the results of this early investment are not promising against some defined metrics, the UCS would not be obligated to make any additional investment of its limited resources in this technology. Clearly, the negative aspect of immediate exit options is that they are more expensive than immediate entry options, although often carrying similar levels of risk.

Delayed Entry

Delayed entry is a way to secure the right but not the obligation to enter a market at a later date and is used to deal with markets where the decision is to go all the way into the market or all the way out. It is essentially a “major and largely irreversible commitment” (Janney & Dess, 2004). Typically a UCS will not have to make such a decision, but its partners, the licensees, may be faced with just such a decision. For example, suppose a pharmaceutical company has licensed a drug candidate developed in university laboratories after successful preclinical testing in large animals. We know that later-stage clinical trials in humans are very costly with the total cost to bring a new pharmaceutical drug to market exceeding $2.5 billion (Mullin, 2014). Using the defensive maneuver of delayed entry, the pharma might license the technology and invest in next-stage trials, which are expensive, but not nearly as expensive as completing all the clinical trials required to bring the drug to market. After the next stage, the pharma has the option to go forward with additional investment and another option agreement or lose what has been invested to date, which avoids a much greater loss that could be incurred by completing the project if it does not appear promising.

Delayed Exit

Here the UCS is able to buy time before having to make an irreversible commitment to abandon an effort. Unlike the immediate exit strategy, delayed exit is much more costly, and the decision is made only when the relevant parties are absolutely certain they will not change their minds. Returning to the example of the pharmaceutical company that acquires a license to a university drug technology, suppose the pharma gets through all the clinical trials and then discovers once the drug is on the market that it carries side effects that never appeared during clinical trials. Given that the FDA will likely pull the drug from the market, the pharma needs to figure out if there is a way to salvage any of the investment or whether it can operate at a loss while retesting the drug. Meanwhile, depending on how the UCS structured the license agreement, it will likely not be able to benefit from its projected income stream or recoup all its original investment in the researchers and their laboratory.

The biggest drawback of delayed exit decisions is that they are prone to the problem of escalating commitment to a failing course of action. Escalation of commitment occurs when a decision maker fails to abandon a course of action that cannot succeed, often despite negative feedback from the current course of action and particularly when the decision maker bears personal responsibility for it (Ross & Staw, 1986). Research has found that where the decision maker has been able to effectively analyze information prior to undertaking a course of action, the probability that the decision maker will escalate the commitment declines (Sleesman, Conlon, McNamara, & Miles, 2012). Unfortunately, decision makers are often provided with optimistic forecasts of such things as the benefit of a technology, the costs associated with commercialization, and the probability of obtaining intellectual property, to name a few (Flyvbjerg, Garbuio, & Lovallo, 2009). That optimism can yield a level of confidence on the part of the decision maker that is unwarranted.
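One way to see why escalation is a trap is to write the continue-or-abandon choice as a forward-looking calculation in which the money already spent appears under both choices and therefore cancels out. The figures below are hypothetical; the point is only that the incremental cost and the updated probability of success, not the sunk investment, should drive the decision.

```python
# Hypothetical continue-or-abandon calculation illustrating the escalation trap:
# the prior spending is sunk and identical under both choices, so only the
# incremental cost and the (updated, pessimistic) success probability matter.

SUNK = 500_000            # already spent; unrecoverable either way
REMAINING_COST = 400_000  # additional spend needed to finish commercialization
PAYOFF = 1_200_000        # value if the effort ultimately succeeds
P_SUCCESS = 0.15          # probability of success after the negative feedback

ev_continue = P_SUCCESS * PAYOFF - REMAINING_COST   # forward-looking view
ev_abandon = 0                                      # walk away, spend nothing more

print(f"Incremental EV of continuing: {ev_continue:,.0f}")   # -220,000
print(f"Incremental EV of abandoning: {ev_abandon:,.0f}")
print(f"The sunk {SUNK:,} is the same under both choices and should not drive the decision.")
```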

A body of research has studied the role of escalation of commitment in large infrastructure projects such as the Eurotunnel and the Denver Airport, but the same issues of optimism and self-interest can be found in complex systems such as the UCS. For example, where the TTO has requested information from a researcher’s team to inform decisions about intellectual property and licensing, it subjects itself to the problem of asymmetric information—the TTO doesn’t know what it doesn’t know. The overoptimism in the information it receives often stems from two sources: (1) self-interest on the part of the research team that wants to initiate or continue a commercialization effort or (2) self-interest on the part of the TTO, which needs to meet the success metrics the university has placed on it.

ROR Is Not a Cure for Uncertainty

While ROR can introduce flexibility into the UCS and increase the chances of responding effectively to uncertainty, it is not a cure for uncertainty. If it is not employed properly, with an understanding of its limitations, ROR can lead to decisions that incorporate an unwarranted level of confidence. UCS decision makers may believe that they have control over and can influence the outcomes of interactions when in reality that is not possible. This overconfidence, called managerial adventurism in the literature, can lead to a bias for action and to the escalation of commitment to a failing course of action discussed in the previous section. With real options, when the rewards are great, there is more value in waiting to make a decision. Conversely, if the rewards are of less value, decision makers will be more likely to take more risk and less likely to abandon a poor decision. Engaging in multiple small options in an effort to manage risk can also result in an illusion of confidence to the extent that decision makers don’t do sufficient due diligence on each option (Janney & Dess, 2004). Because the cost to write the first option may be small, TTOs tend to suffer from the illusion of control and, therefore, may not respond to change appropriately and in a timely fashion. In other words, they might undertake a real options decision with a lesser degree of care than they would a full commitment.

There is another situation where irrational escalation of commitment can often be found in the UCS process: where one unit in the system bears the bulk of the cost of bringing research to market while the other units share very little in the cost but enjoy the benefits of a successful outcome. Generally, that asymmetric cost is borne by the TTO; but a research lab that has been operating on time-based federal grants, for example, is much less inclined to abandon an effort even when doing so would benefit the entire UCS in terms of time and expense (Janney & Dess, 2004).

Perhaps the biggest problem in effectively deploying ROR relates to the portfolio approach that most UCSs take. This “plant a thousand seeds” strategy is largely undertaken to spread the risk and uncertainty associated with predicting which scientific and engineering discoveries and research efforts will produce an outcome worth more than the investment in them. Because of their low cost, these initial options appear to give the UCS a riskless portfolio. The reality may be very different, however. While an individual option may be small in size, the aggregate of all the options that a UCS has in its portfolio can represent a substantial investment of time and resources. It is easy to see how the multiple-small-bets approach, even as it spreads risk, could waste limited resources and the time of the TTO licensing officers.
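A rough aggregation shows how quickly many individually cheap options add up. The counts, costs, and hours below are hypothetical rather than survey data; the sketch simply totals the direct cost and staff time a "thousand seeds" portfolio can consume in a year.

```python
# Back-of-the-envelope arithmetic (hypothetical figures) for the "plant a thousand
# seeds" portfolio: each option is cheap in isolation, but the aggregate claim on
# budget and licensing-officer time is substantial.

N_DISCLOSURES = 300            # new disclosures taken on in a year
PROVISIONAL_COST = 3_000       # filing and attorney cost per provisional application
OFFICER_HOURS_EACH = 15        # triage, inventor meetings, market scan per disclosure
OFFICER_HOURS_PER_FTE = 1_800  # productive hours per licensing officer per year

total_cost = N_DISCLOSURES * PROVISIONAL_COST
total_hours = N_DISCLOSURES * OFFICER_HOURS_EACH

print(f"Aggregate option cost: ${total_cost:,}")      # $900,000
print(f"Staff time consumed: {total_hours:,} hours "
      f"(~{total_hours / OFFICER_HOURS_PER_FTE:.1f} full-time officers)")
```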

Conclusion

Current research on technology commercialization in universities focuses primarily on explaining a posteriori events that are neither static nor deterministic, but rather constantly changing and evolving. Applying a complex adaptive system approach to the dynamic nature of a university technology transfer and commercialization system offers a wealth of opportunity to understand this system more completely. There are a number of advantages to taking a CAS approach to research on the university commercialization process. Chief among them are:

  1. CAS overcomes the limitations of the reductionist approach and looks at the UCS more holistically.

  2. Research topics that have been explored previously using the reductionist approach may yield new and valuable insights that can only be revealed when considered in the context of all the elements of the system.

  3. A CAS approach requires the researcher to take a more dynamic look at a system that is emergent, self-organizing, and in constant evolution. Much of the research on UCSs is static rather than dynamic.

  4. Real options reasoning holds the potential to provide structural resilience in the UCS, to mitigate uncertainty, and to improve decision-making.

It is clear that the UCS’s inherent complexity and its relationship to uncertainty in the environment will be a rich source of study for years to come. A systems approach using real options might hold the key to better understanding the UCS and explaining why it is not consistently effective in translating new research into products and services that benefit society. However, if the goal of applying CAS theory to commercialization research is to be able to predict outcomes in the system, it may be a fool’s errand due to the practical impossibility of determining the complete landscape of relationships and interactions in any system data set. Organizations such as the UCS are social systems, so it is challenging to define and measure the conceptual constructs found in the system. To add to the problem, there is no way to control for unpredictable behavior at the individual level. Nevertheless, if the aim of researchers is to understand and explain by applying methods appropriate to complex adaptive systems, the effort can move our current understanding of the UCS to a new level.

References

Aldrich, H. (2015). Dimly through the fog: Institutional forces affecting the multidisciplinary nature of entrepreneurship. Research Gate. Retrieved from https://www.researchgate.net/publication/280297400.

Aldrich, H. E., & Martinez, M. A. (2001). Many are called, but few are chosen: An evolutionary perspective for the study of entrepreneurship. Entrepreneurship: Theory and Practice, 25(4), 41.

Allen, K. R. (2012). Technology commercialization: Have we learned anything? The Journal of Engineering Entrepreneurship, 3(1).

Anderson, A. R., Dodd, S. D., & Jack, S. L. (2012). Entrepreneurship as connecting: Some implications for theorising and practice. Management Decision, 50(5), 958–971.

AUTM Licensing Survey. (2013). Association of University Technology Managers. Retrieved from http://www.autm.net/resources-surveys/research-reports-databases/licensing-surveys/fy-2013-licensing-survey/.

Baack, D., & Cullen, J. B. (1992). A catastrophe theory model of technological and structural change. Journal of High Technology Management Research, 3(1), 125–145.

Bercovitz, J., & Feldman, M. (2008). Academic entrepreneurs: Organizational change at the individual level. Organization Science, 19(1), 69–89, 184–185.

Brockman, J. (2015). This idea must die: Scientific theories that are blocking progress. New York: Harper Perennial.

Crawford, G. C., Aguinis, H., Lichtenstein, B., Davidsson, P., & McKelvey, B. (2015). Power law distributions in entrepreneurship: Implications for theory and research. Journal of Business Venturing, 30(5), 696.

De Meyer, A., Loch, C. H., & Pich, M. T. (2002). Managing project uncertainty: From variation to chaos. MIT Sloan Management Review, 43(2), 60–67.

Dominici, G., & Levanti, G. (2011). The complex system theory for the analysis of inter-firm networks: A literature overview and theoretic framework. International Business Research, 4(2), 31–37.

Flyvbjerg, B., Garbuio, M., & Lovallo, D. (2009). Delusion and deception in large infrastructure projects: Two models for explaining and preventing executive disaster. California Management Review, 51(2), 170–193.

Fontana, W., & Ballati, S. (1999). Complexity. Complexity, 4(3), 14–16.

Foster, J. (2000). Is there a role for transaction cost economics if we view firms as complex adaptive systems? Contemporary Economic Policy, 18(4), 369–385.

Goldratt, E. M., & Cox, J. (2004). The goal: A process of ongoing improvement (3d rev., 20th anniversary ed.). Great Barrington, MA: North River Press.

Janney, J. J., & Dess, G. G. (2004). Can real-options analysis improve decision-making? Promises and pitfalls. The Academy of Management Executive, 18(4), 60.

Kogut, B., & Kulatilaka, N. (1994). Operating flexibility, global manufacturing, and the option value of a multinational network. Management Science, 40(1), 123–139.

Kogut, B., & Kulatilaka, N. (2001). Capabilities as real options. Organization Science, 12(6), 744–758.

Lewin, A. Y. (1999). Application of complexity theory to organization science. Organization Science, 10(3), 215.

Lewin, A. Y., & Volberda, H. W. (1999). Prolegomena on coevolution: A framework for research on strategy and new organizational forms. Organization Science, 10(5), 519–534.

Lorenz, E. N. (1963). Deterministic non-periodic flow. Journal of the Atmospheric Sciences, 20, 130–141.

Madni, A. M., & Jackson, S. (2011). Towards a conceptual framework for resilience engineering. IEEE Engineering Management Review, 39(4), 85–102.

Manoukian, A., Hassab Elnaby, H. R., & Odabashian, V. (2015). Technology commercialization review: Aiming at a fresher perspective based on partnership synergy. International Journal of Management Research and Reviews, 5(7), 488–520.

McDaniel, R. R., Jr. (2007). Management strategies for complex adaptive systems: Sense making, learning, and improvisation. Performance Improvement Quarterly, 20(2), 21–41.

McGrath, R. G., Ferrier, W. J., & Mendelow, A. L. (2004). Real options as engines of choice and heterogeneity. Academy of Management Review, 29(1), 88.

McKelvey, B. (1999). Complexity theory in organization science: Seizing the promise or becoming a fad? Emergence, 1(1), 5–32.

Miller, K. D., & Arikan, A. T. (2004). Technology search investments: Evolutionary, option reasoning, and option pricing approaches. Strategic Management Journal, 25(5), 473–485.

Mullin, R. (2014). Cost to develop new pharmaceutical drug now exceeds $2.5b. Chemical and Engineering News, November 20, 2014. Retrieved from http://cen.acs.org/articles/92/web/2014/11/Tufts-Study-Finds-Big-Rise.html.

Parker, K., & Mainelli, M. (2001). Great mistakes in technology commercialization. Strategic Change, 10(7), 383.

Pollack, J., Adler, D., & Sankaran, S. (2014). Mapping the field of complexity theory: A computational approach to understanding changes in the field. Emergence: Complexity and Organization, 16(2), 74–92.

Ross, J., & Staw, B. M. (1986). Expo 86: An escalation prototype. Administrative Science Quarterly, 31(2), 274–297.

Sargut, G., & McGrath, R. G. (2010). Managing under complexity: Where is Einstein when you really need him? Ivey Business Journal Online, 1. Retrieved from http://iveybusinessjournal.com/publication/managing-under-complexity-where-is-einstein-when-you-really-need-him/.

Shane, S. (2004). Encouraging university entrepreneurship? The effect of the Bayh-Dole act on university patenting in the United States. Journal of Business Venturing, 19(1), 127.

Sleesman, D. J., Conlon, D. E., McNamara, G., & Miles, J. E. (2012). Cleaning up the Big Muddy: A meta-analytic review of the determinants of escalation of commitment. Academy of Management Journal, 55(3), 541–562.

Stacey, R. D. (1995). The science of complexity: An alternative perspective for strategic change processes. Strategic Management Journal, 16(6), 477–495.

Thursby, M., Thursby, J., & Gupta-Mukherjee, S. (2007). Are there real effects of licensing on academic research? A life cycle view. Journal of Economic Behavior & Organization, 63(4), 577.

Tsoukas, H., & Chia, R. (2002). On organizational becoming: Rethinking organizational change. Organization Science, 13(5), 567–582.

Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131.

Wallis, S. E. (2009). The complexity of complexity theory: An innovative analysis. Emergence: Complexity and Organization, 11(4), 26–38.