Coevolving Innovations

… in Business Organizations and Information Technologies

Robust Yet Fragile Complexity, or Scale Free Network?

A mention of the “Robust-Yet-Fragile” label by resilience author @andrew_zolli led me to John Doyle’s research at Caltech. Andrew Zolli writes:

We rightfully add safety systems to things like planes and oil rigs, and hedge the bets of major banks, in an effort to encourage them to run safely yet ever-more efficiently. Each of these safety features, however, also increases the complexity of the whole.  Add enough of them, and soon these otherwise beneficial features become potential sources of risk themselves, as the number of possible interactions — both anticipated and unanticipated — between various components becomes incomprehensibly large.

This, in turn, amplifies uncertainty when things go wrong, making crises harder to correct: Is that flashing alert signaling a genuine emergency? Is it a false alarm? Or is it the result of some complex interaction nobody has ever seen before? […]

Caltech system scientist John Doyle has coined a term for such systems: he calls them Robust-Yet-Fragile — and one of their hallmark features is that they are good at dealing with anticipated threats, but terrible at dealing with unanticipated ones. As the complexity of these systems grows, both the sources and severity of possible disruptions increase, even as the size required for potential ‘triggering events’ decreases — it can take only a tiny event, at the wrong place or at the wrong time, to spark a calamity.

In a 2007 Discover Magazine article, Carl Zimmer provides a simplified description that contrasts research built on a theoretical foundation (Scale-Free Networks) with research built on the empirical practicalities of control engineering.

In the 1990s, studying complex systems of all sorts became something of a fad following the emergence of “chaos theory.” Competing versions of this theory were emerging left and right; chaos was being touted as the science of the future. Doyle was unimpressed by most of the new ideas. “It was clear to me that they were just so far off the mark,” he says. Doyle made up a name that combined all the trendy buzzwords he came across: “emergilent chaoplexity.”

One reason Doyle loathes emergilent chaoplexity is that it relies on superficial patterns. Doyle, by contrast, insists that his analyses draw from the gritty details of how things actually work.

As an example, Doyle points to what are known as scale-free networks. Many of these networks—interlinked sets of airports, friends, nerves in the body, and so on—have the same basic structure. A few nodes are highly connected hubs, while most other nodes have only a few connections. Any given small city airport probably connects to just a few others. Passengers rely on being able to transfer at a hub to reach most other places. But if you live in Chicago, you can take a direct flight from O’Hare Airport to hundreds of destinations.
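
As an aside, this hub-dominated structure is easy to reproduce with a preferential-attachment model. The following is a minimal sketch, assuming the Python networkx library; the sizes are arbitrary illustrative choices of mine, not figures from the article:

```python
# Minimal sketch of a scale-free (preferential-attachment) network:
# a few highly connected hubs, while most nodes have few connections.
# n and m are arbitrary illustrative choices.
import networkx as nx

G = nx.barabasi_albert_graph(n=1000, m=2, seed=42)
degrees = sorted((d for _, d in G.degree()), reverse=True)

print("Top 5 hub degrees:", degrees[:5])             # a handful of big hubs
print("Median degree:", degrees[len(degrees) // 2])  # most nodes are small
```

The largest degrees come out far above the median, matching the airport analogy: a few O’Hares, many small-city airports.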

Some researchers, like Albert-László Barabási at the University of Notre Dame, have argued (pdf) that the Internet shares a similar structure and that this accounts for why the Internet keeps humming even when some of its systems fail. Since hubs are rare, failures involving them are even rarer. But should a hub fail, researchers warned, it would lead to catastrophe. Their warning made headlines, with CNN reporting in 2000: “Scientists Spot Achilles’ Heel of Internet.”
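
The “Achilles’ heel” claim can be simulated directly: remove nodes at random versus removing the highest-degree hubs, and watch what happens to the largest connected component. A hedged sketch under the same assumptions as above (networkx, arbitrary sizes):

```python
# Compare random node failures vs. a targeted attack on hubs in a
# scale-free graph, measuring the surviving giant component.
import random
import networkx as nx

def giant_fraction(G, original_n):
    """Fraction of the original nodes still in the largest component."""
    largest = max(nx.connected_components(G), key=len)
    return len(largest) / original_n

n = 1000
G = nx.barabasi_albert_graph(n=n, m=2, seed=0)
random.seed(0)

G_random = G.copy()   # random failures: 50 nodes chosen uniformly
G_random.remove_nodes_from(random.sample(list(G_random.nodes()), 50))

G_attack = G.copy()   # targeted attack: the 50 highest-degree hubs
hubs = sorted(G_attack.degree(), key=lambda nd: nd[1], reverse=True)[:50]
G_attack.remove_nodes_from([node for node, _ in hubs])

print("giant component after random failures:", giant_fraction(G_random, n))
print("giant component after targeted attack:", giant_fraction(G_attack, n))
```

In the SF model, the targeted attack fragments the graph far more severely than random failure — exactly the vulnerability the headlines trumpeted. Doyle’s point, below, is that the real Internet is not built this way.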

Doyle was not impressed. “Everybody who knew how the Internet worked was puzzled by all this,” he says. He decided to test the Achilles’ heel theory by joining up with a group of collaborators and mapping a section of the Internet in unprecedented detail.

In that map, they found no Achilles’ heel. The Internet does have a few large servers at its core, but those servers are actually not very well connected. Each one has only a few links, mainly to other large servers through high-bandwidth connections. Much of the activity that occurs on the Internet actually lies out on its edges, where computers are linked by relatively low-bandwidth connections to small servers; think about how many e-mails office workers send to people in their building compared with how many they send overseas. If one of the big links at the core of the Internet crashed, Doyle and his colleagues discovered, it would not take the Internet down with it. Traffic could simply be rerouted through other big links.
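
The rerouting Doyle describes is ordinary shortest-path recomputation over a redundant mesh. A toy illustration, where the five-router topology is made up for the example rather than taken from the paper’s measured map:

```python
# When a core link fails, routing simply finds another path.
import networkx as nx

core = nx.cycle_graph(5)   # routers 0..4 connected in a ring
core.add_edge(0, 2)        # plus one extra high-bandwidth "big link"

print(nx.shortest_path(core, 0, 2))   # [0, 2] -- uses the direct link
core.remove_edge(0, 2)                # the big link crashes...
print(nx.shortest_path(core, 0, 2))   # [0, 1, 2] -- traffic reroutes
```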

The Internet works spectacularly well, despite the fact that over the past 30 years it has expanded a million-fold, absorbing new technology from BlackBerries to the iTunes music store with hardly any major changes to the basic rules it uses to move data. Doyle now knows why. It’s not just the physical arrangement of cables and servers that makes the Net so robust. Doyle and his colleagues showed that the software that runs the Internet uses feedback, in much the same way a jetliner’s computer does. The Internet can sense changing conditions and adjust itself.

The Internet has two kinds of feedback. It maintains a constantly updated picture of the entire network so that messages can be directed along the fastest routes. It also breaks down those messages and encapsulates them inside standardized packets of data, a little like using the standardized waybills and boxes provided by FedEx. Each packet can take its own path through the Internet. As packets arrive at the recipient’s computer, the message fragments in each packet are extracted and reassembled. Critically, as each packet arrives, it sends back a receipt to the sender’s computer. In heavy traffic, some packets get lost. In response to lost packets, computers slow down the rate at which they send their data, reducing congestion.
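
The second feedback loop described here is, in essence, TCP’s additive-increase/multiplicative-decrease rule: grow the sending rate while receipts come back, and cut it when a lost packet signals congestion. A minimal sketch of that dynamic, with a made-up loss pattern rather than real measurements:

```python
# Additive-increase / multiplicative-decrease (AIMD): the congestion
# window grows by one unit per successful round and is halved on loss.
def aimd(loss_events, window=1.0, increase=1.0, decrease=0.5):
    """Yield the congestion window after each round of feedback."""
    for lost in loss_events:
        if lost:
            window = max(1.0, window * decrease)  # back off on congestion
        else:
            window += increase                    # probe for more capacity
        yield window

rounds = [False] * 8 + [True] + [False] * 5 + [True]  # two loss events
for i, w in enumerate(aimd(rounds), start=1):
    print(f"round {i:2d}: window = {w:4.1f}")
```

The sawtooth this produces is the signature of Internet congestion control — a feedback controller in exactly Doyle’s sense.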

Together, these two types of feedback give the Internet a robustness more powerful than anyone anticipated. “These Internet engineers weren’t control theorists, but they built this incredibly robust network,” Doyle says. “Man, that’s awesome.” Then again, the engineers were doing something that evolution figured out long ago.

Turning to the original 2005 PNAS paper: Doyle and his coauthors construct two models of networks, and then compare them to the real Internet.

One line of research portrays the Internet as “scale-free” (SF) with a “hub-like” core structure that makes the network simultaneously robust to random losses of nodes yet fragile to targeted attacks on the highly connected nodes or “hubs” (1–3). The resulting error tolerance with attack vulnerability has been proposed as a previously overlooked “Achilles’ heel” of the Internet. The appeal of such a surprising discovery is understandable, because SF methods are quite general and do not depend on any details of Internet technology, economics, or engineering (4, 5).
[…]
Highly optimized/organized tolerance/tradeoffs (HOT) has been proposed as a conceptual framework for capturing the highly organized, optimized, and RYF structure of complex highly evolved systems.

The researchers summarize the findings in a table.

Table 1. SFnet vs. HOTnet and the real Internet

Feature                SFnet              HOTnet             Real Internet
High-degree vertices   Core               Periphery          Periphery
Degree distributions   Power law          Power law          Highly variable
Generated by           Random             Design             Design
Core vertices          High degree        Low degree         Low degree
Throughput             Low                High               High
Attack tolerance       Fragile            Robust             Robust
Fragility              High-degree hubs   Low-degree core    Hijack network

The research concludes that the real Internet is more like a highly optimized/organized tolerance/tradeoffs (HOT) network than a scale-free network.

It is certainly appealing that SF network models can avoid all Internet-specific structures, such as protocol stacks, technological or economic constraints, and user heterogeneity, yet make interesting and testable predictions. Unfortunately, this fact yields results that collapse when tested with real data or when examined by domain experts. Here, we have shown that there exist technological, economic, and graph theoretic reasons why the most important SF claim (i.e., that the Internet has “hubs” that form an Achilles’ heel through which most traffic flows and the loss of which would fragment the Internet and constitute its attack vulnerability) cannot be (and is not) true for the current router-level Internet. More generally, Table 1 shows that SFnet and HOTnet are opposite in essentially every meaningful sense, and the real Internet network is much more like HOTnet.

In a 2012 article, Karen Heyman wrote critically of scientific knowledge in the popular press as being more concerned with internal consistency than with external validity.

In 2000, Nature ran a cover story that warned the Internet was vulnerable to targeted attacks on central routers. It built its conclusions on a paper published by the same authors the year before in Science that famously defined “scale-free networks” (SFN). According to the Science paper, large-scale networks self-organize into SFN, rather than being randomly disorganized, as had been previously assumed. These networks follow a power law distribution (popularly known as the “80/20 rule”), in which the majority of traffic goes through only a few central hubs.

The follow-up Nature paper, widely covered in the popular press, claimed to have demonstrated that due to the 80/20 rule of SFN, the majority of Internet traffic went through only a few central, critical routers. While the Internet could survive random failures, a targeted attack against these hubs could leave the entire Internet as useless as a frayed phone cord.
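
As an aside on the “80/20 rule”: for a Pareto distribution, the share of the total carried by the top fraction $p$ of nodes is $p^{(\alpha-1)/\alpha}$, so the classic 80/20 split corresponds to a specific tail exponent. This is a standard textbook identity, not a figure from the Science paper:

$$
p^{\frac{\alpha-1}{\alpha}} = 0.8 \ \text{ at } \ p = 0.2
\;\Longrightarrow\;
\frac{\alpha-1}{\alpha} = \frac{\ln 0.8}{\ln 0.2} \approx 0.139
\;\Longrightarrow\;
\alpha \approx 1.16 .
$$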

John Doyle, Professor of Control and Dynamical Systems at Caltech, has maintained for years that this analysis was not just wrong, but so “comically wrong” to anyone who knew anything at all about Internet architecture that “most engineers thought it was a hoax.” As Doyle explains, “The irony of the claim is that the Internet is incredibly robust to physical challenges—but it’s terribly fragile to more sophisticated, software attacks that leave the routers intact but send unwanted packets and malware to the users.”

The author of the Nature paper is the highly respected physicist Albert-László Barabási, who heads the Northeastern University Center for Complex Network Research. Doyle praises him as “a maestro, he is brilliant.” And that’s the part that scares Doyle the most.

What rattles Doyle is that this was not a one-off mistake by a well-intended theorist ignorant of how the Internet actually works, but a symptom of how some of the smartest members of the research community have fallen in love with, and misapplied, what has been dubbed the “Science of Complexity.”

In this new approach, theorists (often physicists) apply statistical physics to problems in fields like engineering and biology. After reading through many such papers, Doyle says it’s a recipe: Authors combine the same physics theories in the same definable steps to reach sometimes far-fetched conclusions. “What they’ve done is standard operating procedure in the physics literature, but now they’ve stretched it into a regime where we can check the answers,” says Doyle, “If you’re not a particle physicist, it’s hard to check a measurement in quantum mechanics; but if you’re an engineer or a biologist, you know they make claims about how things work in engineering and biology that are obviously absurd.”

“The real stories of how technology and biology work are very complicated and hard to tell,” says Doyle, who admits most of his own papers are so mathematically dense, they’re all but unreadable except by a handful of experts. By contrast, Doyle says, the complexity papers tell an appealingly simple narrative about how systems work: “Take Domain X, then add some version of ‘random plus minimal tuning’ such as a scale free network.”

Such papers entice journal editors because they appear to have novel insights, and from a purely theoretical viewpoint, their internal logic is consistent. If you’re still wondering why something so self-evidently wrong as the Internet failure paper could make the cover of a peer-reviewed journal, the problem is how “peer” is defined. More than likely, Nature’s editors turned to other physicists, or possibly graph theorists, to vet the paper, because its research topic was scale-free networks; the Internet was merely a real-world example. In other words, if a reviewer’s concern is the physics of an acoustical theory that describes how air travels through a French horn, nobody’s very likely to check if a sound actually came out.

Doyle has been researching Robust-Yet-Fragile complexity in biological domains, and wrote a brief description in a 2010 Caltech booklet.

Biological systems are robust and evolvable in the face of even large changes in environment and system components, yet can simultaneously be extremely fragile to small perturbations. Such universally robust yet fragile (RYF) complexity is found wherever we look. The amazing evolution of microbes into humans (robustness of lineages on long timescales) is punctuated by mass extinctions (extreme fragility). Diabetes, obesity, cancer, and autoimmune diseases are side-effects of biological control and compensatory mechanisms so robust as to normally go unnoticed. RYF complexity is not confined to biology. The complexity of technology is exploding around us, but in ways that remain largely hidden. Modern institutions and technologies facilitate robustness and accelerate evolution, but enable catastrophes on a scale unimaginable without them (from network and market crashes to war, epidemics, and global warming). Understanding RYF means understanding architecture—the most universal, high-level, persistent elements of organization—and protocols. Protocols define how diverse modules interact, and architecture defines how sets of protocols are organized.

Insights into the architectural and organizational principles of networked systems can be drawn from three converging research themes.

(1) With molecular biology’s description of components and growing attention to systems biology, the organizational principles of biological networks are becoming increasingly apparent. Biologists are articulating richly detailed explanations of biological complexity, robustness, and evolvability that point to universal principles.

(2) Advanced technology’s complexity is now approaching biology’s. While the components differ, there is striking convergence at the network level of architecture and the role of layering, protocols, and feedback control in structuring complex multiscale modularity. New theories of the Internet and related networking technologies have led to test and deployment of new protocols for high performance networking.

(3) A new mathematical framework for the study of complex networks suggests that this apparent network-level evolutionary convergence within/between biology/technology is not accidental, but follows necessarily from the universal system requirements to be efficient, adaptive, evolvable, and robust to perturbations in their environment and component parts.

This crossover between biology and engineering was first outlined in a 2002 paper by Carlson and Doyle.

Highly optimized tolerance (HOT) (1–6) is one recent attempt, in a long history of efforts, to develop a general framework for studying complexity. The HOT view is motivated by examples from biology and engineering. Theoretically, it builds on mathematics and abstractions from control, communications, and computing. In this paper, we retain the motivating examples but avoid theories and mathematics that may be unfamiliar to a nonengineering audience. Instead, we aim to make contact with the models, concepts, and abstractions that have been loosely collected under the rubric of a “new science of complexity” (NSOC) (7) or “complex adaptive systems” (CAS), and particularly the concept of self-organized criticality (SOC) (8, 9). SOC is only one element of NSOC/CAS but is a useful representative, because it has a well-developed theory and broad range of claimed applications.

Table 1. Characteristics of SOC, HOT, and Data

     Property                    SOC                                   HOT and Data
 1   Internal configuration      Generic, homogeneous, self-similar    Structured, heterogeneous, self-dissimilar
 2   Robustness                  Generic                               Robust, yet fragile
 3   Density and yield           Low                                   High
 4   Max event size              Infinitesimal                         Large
 5   Large event shape           Fractal                               Compact
 6   Mechanism for power laws    Critical internal fluctuations        Robust performance
 7   Exponent α                  Small                                 Large
 8   α vs. dimension d           α ≈ (d − 1)/10                        α ≈ 1/d
 9   DDOFs                       Small (1)                             Large (∞)
10   Increase model resolution   No change                             New structures, new sensitivities
11   Response to forcing         Homogeneous                           Variable
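
To make row 8 concrete (my arithmetic, not the paper’s): in two dimensions the two frameworks predict power-law exponents that differ by a factor of five,

$$
d = 2:\qquad
\alpha_{\mathrm{SOC}} \approx \frac{d-1}{10} = 0.1,
\qquad
\alpha_{\mathrm{HOT}} \approx \frac{1}{d} = 0.5 ,
$$

so data exhibiting large exponents favor HOT, consistent with row 7.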

In Table 1, we contrast HOT’s emphasis on design and rare configurations with the perspective provided by NSOC/CAS/SOC, which emphasizes structural complexity as “emerging between order and disorder,” (i) at a bifurcation or phase transition in an interconnection of components that is (ii) otherwise largely random. Advocates of NSOC/CAS/SOC are inspired by critical phenomena, fractals, self-similarity, pattern formation, and self-organization in statistical physics, and bifurcations and deterministic chaos from dynamical systems. Motivating examples vary from equilibrium statistical mechanics of interacting spins on a lattice to the spontaneous formation of spatial patterns in systems far from equilibrium. This approach suggests a unity from apparently wildly different examples, because details of component behavior and their interconnection are seen as largely irrelevant to system-wide behavior.

Table 1 shows that SOC and HOT predict not just different but exactly opposite features of complex systems. HOT suggests that random interconnections of components say little about the complexity of real systems, that the details can matter enormously, and that generic (e.g., low codimension) bifurcations and phase transitions play a peripheral role. In principle, Table 1 could have a separate column for Data, by which we mean the observable features of real systems. Because HOT and Data turn out to be identical for these features, we can collapse the table as shown. This is a strong claim, and the remainder of this paper is devoted to justifying it in as much detail as space permits.

In the search for isomorphisms, this crossing of disciplines between biology and engineering follows the spirit of the systems sciences.

References

Carlson, Jean M., and John C. Doyle. 2002. “Complexity and Robustness.” Proceedings of the National Academy of Sciences 99 (Suppl 1): 2538–2545. doi:10.1073/pnas.012582499.

Doyle, John C. 2010. “The Architecture of Robust, Evolvable Networks: Organization, Layering, Protocols, Optimization, and Control.” In The Lee Center for Advanced Networking, 12–13. Booklet. Pasadena, California: California Institute of Technology. http://leecenter.caltech.edu/booklet.html.

Doyle, John C., David L. Alderson, Lun Li, Steven Low, Matthew Roughan, Stanislav Shalunov, Reiko Tanaka, and Walter Willinger. 2005. “The ‘Robust Yet Fragile’ Nature of the Internet.” Proceedings of the National Academy of Sciences of the United States of America 102 (41): 14497–14502. doi:10.1073/pnas.0501426102.

Heyman, Karen. 2012. “A Contrarian Worth Listening To: Caltech Professor John Doyle Thinks ‘Complexity’ Is Over-Simplified.” Input Output. http://h30565.www3.hp.com/t5/Feature-Articles/A-Contrarian-Worth-Listening-To-Caltech-Professor-John-Doyle/ba-p/1864.

Zimmer, Carl. 2007. “This Man Wants to Control the Internet (and You Should Let Him).” Discover. http://discovermagazine.com/2007/nov/this-man-wants-to-control-the-internet.

Zolli, Andrew. 2012. “Want to Build Resilience? Kill the Complexity.” Harvard Business Review. http://blogs.hbr.org/cs/2012/09/want_to_build_resilience_kill_the_complexity.html.

