Some organisations in consequence-heavy fields have cracked the code on creating methods that work reliably across different contexts. Others? They’re stuck in cycles where everything depends on individual heroes and outcomes vary wildly.
The difference isn’t whether to standardise. It’s how: specific design principles that let protocols preserve expert-level judgment while enabling reproducible execution.
This difference shows up across three levels: Dr Timothy Steel’s surgical pathway at St Vincent’s Private Hospital in Sydney, Robert Gambrill’s institutional infrastructure at the Institute of Nuclear Power Operations (INPO) in Atlanta, and Carsten Spohr’s integration of ITA Airways into Lufthansa Group. Organisations that master method portability gain something powerful – they can scale expertise beyond individual practitioners, reduce personnel dependence and maintain quality during expansion.
The Portability Problem in High-Stakes Fields
Method portability – the capacity of a systematic approach to function reliably when contexts change – is the defining challenge separating fields that achieve consistent excellence from those dependent on individual skill. While written procedures and standardised protocols exist across most professions, few actually travel reliably between contexts: maintaining method integrity when practitioners change, equipment varies, regulatory environments shift, or patient populations differ.
Technical infrastructure varies significantly between contexts. You’ll find different equipment configurations, facility layouts, and available resources. Human factors such as team composition, experience levels, organisational culture, and communication patterns also vary. Additionally, regulatory frameworks differ with varying compliance requirements, approval processes, and accountability structures. Environmental conditions like patient populations with different characteristics and operational tempos further complicate method portability.
A method proves portable when it delivers comparable outcomes despite these contextual variations. The test isn’t whether the method can be executed elsewhere but whether it maintains quality and consistency when executed elsewhere. This requires more than training people to follow the same steps – it requires designing the method itself to accommodate variation without degrading.
Successful portability requires three structural elements: verification mechanisms embedded in execution rather than relegated to retrospective assessment; institutional infrastructure for systematically capturing how contextual variables affect performance; and explicit architectural separation of invariant elements from adaptive elements. These aren’t enhancements to standardisation – they’re prerequisites for methods that travel. Most organisations confuse writing procedures with creating portable methods. These structural elements show up repeatedly in fields where individual expertise has long been considered irreplaceable, yet some practitioners have found ways to encode that expertise into reproducible systems.
When Surgical Expertise Becomes Reproducible Protocol
Surgery has always fought against standardisation. Anatomical variation throws curveballs. Intraoperative findings demand split-second decisions. There’s this massive gap between what an expert surgeon knows and what you can actually write down in a procedure manual. That gap has defined the limits of surgical systematisation for decades. But some surgical innovations prove that complex clinical expertise can be transformed into methods that deliver expert-level results when different surgical teams execute them.
Dr Timothy Steel works on cervical reconstruction for atlantoaxial osteoarthritis at St Vincent’s Private Hospital in Sydney. He’s developed a systematic approach that standardises image-guided posterior C1-C2 fixation using transarticular screws and Harms constructs. The process starts with preoperative computed tomography (CT) and magnetic resonance imaging (MRI) planning. This defines anatomical constraints and technique selection before anyone enters the operating theatre. During surgery, intraoperative Brainlab navigation provides real-time verification as actual anatomy is encountered. Defined postoperative imaging confirms fusion according to protocol specifications.
Codifying what an experienced surgeon does instinctively into steps that others can follow reliably sounds straightforward. It’s not.
This systematic approach transforms intuitive surgical judgment into reproducible protocol. A study of 23 patients treated between 2005 and 2015 showed a 95.5% radiographic fusion rate. Visual Analogue Scale pain scores dropped from 9.4 to 2.9. Neck Disability Index fell from 72.2 to 18.9 (P<0.005). Additionally, 91% of patients expressed willingness to repeat the surgery. These metrics show that the systematised approach delivers expert-level results across the patient cohort, not just when Steel performs it himself.
Brainlab navigation functions as embedded quality control during execution. Imaging identifies specific anatomical variation before the procedure starts. Navigation confirms the planned approach remains within safe parameters as actual anatomy is encountered. The protocol proceeds or is modified according to defined decision rules rather than tacit judgment.
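The pattern here – a verification gate inside each step, with deviation handled by a predefined fallback rather than improvisation – can be sketched abstractly. The names, checks, and fallbacks below are illustrative only, not drawn from Steel’s actual protocol:

```python
from dataclasses import dataclass

@dataclass
class StepCheck:
    """A verification gate embedded in a protocol step."""
    name: str
    within_parameters: bool  # real-time measurement vs the planned envelope

def execute_step(check: StepCheck, fallback: str) -> str:
    """Proceed, or switch to a predefined fallback - never improvise."""
    if check.within_parameters:
        return f"proceed:{check.name}"
    # Deviation triggers a defined decision rule, not tacit judgment
    return f"modify:{fallback}"

# Navigation confirms the planned trajectory is still within safe bounds
print(execute_step(StepCheck("screw_trajectory", True), "alternate_fixation"))
# Deviation detected: the protocol itself names the fallback
print(execute_step(StepCheck("screw_trajectory", False), "alternate_fixation"))
```

The design point is that the fallback is part of the method, so a deviation produces a defined branch rather than an exit into individual judgment.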

Institutional Infrastructure Enabling Cross-Context Method Transfer
Steel’s protocol shows how individual expertise becomes reproducible method. But here’s the real challenge: how do such methods travel beyond their original setting? A surgical pathway developed at one hospital can’t just be dropped into another facility and expected to work. The same problem hits any high-stakes protocol that must function reliably when reactor technology differs or legal jurisdictions shift.
Cross-context validation needs institutional infrastructure. You need systematic evaluation capabilities, mechanisms for capturing operational lessons, and frameworks for method refinement based on performance across diverse settings. This infrastructure must assess how protocols perform under varying conditions. It systematically documents why methods succeed in some contexts but encounter difficulties in others.
Robert Gambrill has served as Vice President of Nuclear Operations at INPO since 2003. He’s responsible for Nuclear Operations and INPO’s New Nuclear Strategy. INPO operates as one example of this institutional platform approach, translating expert judgment in reactor operations into standardised practices.
INPO’s plant evaluation program systematically assesses how standardised protocols perform across diverse reactor types, utility structures, and regulatory environments. These evaluations identify where protocols break down between contexts and why. They capture specific failure modes that reveal hidden assumptions in original method design.
Most organisations discover their protocols’ hidden assumptions only after expensive failures.
Systematic evaluation sounds bureaucratic until you realise it’s the only way to avoid reinventing solutions at every new site. This approach enables continuous method refinement based on documented performance patterns.
Events analysis systems distribute operational lessons by documenting incidents, near-misses, and protocol deviations. When a procedure works at one plant but fails at another, events analysis captures the contextual variable that caused the divergence – control room layout differences, say, or the technical constraints of a particular reactor generation.
This institutional infrastructure would allow a surgical pathway like Steel’s to be validated at other hospitals with different equipment configurations or patient populations. Without mechanisms to capture why the protocol succeeds in one setting but encounters difficulty in another – and to systematically refine the method based on those findings – standardisation remains brittle.
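The capture mechanism amounts to recording, for each deviation, which contextual variables were in play, so that recurring drivers surface across sites. A minimal sketch, with every site, step, and variable name hypothetical rather than taken from INPO’s actual systems:

```python
from collections import Counter

# Hypothetical deviation reports: (site, deviated_step, context_variable)
reports = [
    ("plant_a", "valve_lineup", "control_room_layout"),
    ("plant_b", "valve_lineup", "control_room_layout"),
    ("plant_c", "valve_lineup", "reactor_generation"),
]

def divergence_drivers(reports, step):
    """Count which contextual variables recur behind deviations of a step."""
    return Counter(ctx for _, s, ctx in reports if s == step)

# A variable recurring across independent sites points at a hidden
# assumption baked into the original procedure design.
print(divergence_drivers(reports, "valve_lineup").most_common())
```

The value is in the aggregation: a single deviation looks like local noise, but the same contextual variable appearing at multiple sites marks a refinement target in the method itself.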
Method Integrity During Geographic and Regulatory Expansion
What happens when proven methods must cross not just different plants but different countries and regulatory systems entirely? Moving from evaluating protocols across similar technical environments to maintaining method integrity across international borders introduces new layers of complexity. It tests whether standardisation can truly travel.
Maintaining method integrity across jurisdictions requires systematic integration frameworks that explicitly separate invariant elements from adaptive components. This allows core protocols to remain stable while accommodating local regulatory requirements and market variations. This organisational capability must balance procedural consistency with contextual flexibility.
Carsten Spohr has been CEO of Deutsche Lufthansa AG since 2014, leading over 100,000 employees across Network Airlines, Eurowings, Logistics, and MRO operations. His dual perspective – licensed Lufthansa captain on the Airbus A320 family and trained industrial engineer – positions him to understand the technical protocols that must remain invariant for safety and operational consistency. Lufthansa Group provides one example of this systematic integration approach.
Lufthansa Group’s acquisition of a 41% stake in ITA Airways shows method extension across borders. Rome Fiumicino is integrated as the Group’s sixth hub. This demonstrates deliberate replication of hub concept architecture while accommodating local specifics like Milan Linate’s economic role.
Integration attempts often fail because companies either clone everything rigidly or let each market drift into its own version.
This systematic separation of invariant and adaptive elements enables expansion without compromising core operational integrity. Invariant elements include hub concept architecture defining how connecting flights are coordinated and fleet maintenance standards ensuring consistent aircraft safety across subsidiaries. Adaptive elements include network routing responding to local demand patterns and pricing strategies reflecting competitive environments.
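One way to make that boundary explicit is to encode it: the invariant core is frozen and defended, while adaptive elements are supplied per context and rejected if they touch an invariant. A sketch under stated assumptions – the keys and values are illustrative, not Lufthansa’s actual parameters:

```python
# Invariant core: identical across every subsidiary, never overridden
INVARIANTS = {
    "min_connection_minutes": 40,
    "maintenance_standard": "group_wide",
}

def build_hub_config(local: dict) -> dict:
    """Merge local adaptations over the invariant core."""
    overlap = INVARIANTS.keys() & local.keys()
    if overlap:
        # An adaptation that touches an invariant is rejected outright
        raise ValueError(f"invariant overridden: {sorted(overlap)}")
    return {**INVARIANTS, **local}

# Adaptive elements: routing and pricing respond to the local market
fco = build_hub_config({"routes": ["FCO-JFK"], "pricing": "local_yield"})
print(fco["min_connection_minutes"])  # the invariant survives adaptation
```

Making the boundary executable means a local adaptation that violates the core fails loudly at configuration time, rather than eroding the standard silently in operation.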
Spohr’s ITA Airways integration shows that method portability across borders requires both individual-level protocol clarity and institutional-level verification mechanisms. Geographic expansion succeeds when organisations design methods with explicit boundaries between stable and adaptive elements.
What Separates Methods That Work From Those That Fail
Across distinct professional domains – surgical protocols in Sydney, nuclear operations infrastructure spanning reactor sites in the United States, aviation group integration across European regulatory frameworks – a consistent mechanism underlies successful method portability.
Verification mechanisms must function during execution, not just in retrospective analysis. Steel’s intraoperative Brainlab navigation provides real-time confirmation as anatomy is encountered; INPO’s plant evaluations assess protocols while they’re being executed under actual operating conditions.

Institutional infrastructure must capture the contextual variables that affect method performance. Steel’s protocol would document anatomical variations encountered; INPO’s events analysis records the specific plant conditions that led to protocol deviations.

Method architecture must explicitly distinguish invariant from adaptive elements. Steel’s CT/MRI planning requirements remain fixed while the surgical approach adapts to individual anatomy within defined parameters. This design challenge appears across professional frameworks everywhere: the 2016 Council for Accreditation of Counselling and Related Educational Programs Standards define Section 2 foundational core curriculum requirements (invariant) while Section 5 specifies specialty content for eight distinct areas like addiction and school counselling (adaptive). Of course, elaborate frameworks for standardisation proliferate across fields while the real challenge remains getting people to use them consistently. This same separation principle – codifying what must remain fixed and what may adapt – determines whether Steel’s surgical protocols, INPO’s operational standards, or Lufthansa’s hub integration concepts function reliably when contexts shift.

These three structural elements function as an interdependent system: verification confirms method integrity during execution, institutional capture identifies which contextual variables matter, and architectural separation enables refinement without losing core logic. When one mechanism is missing, the others can’t compensate. Verification without capture can’t improve methods across contexts; separation without verification can’t detect when adaptation violates invariants.
Why Standardisation Attempts Fail Under Pressure
Understanding failure modes reveals that the difference between methods that function reliably and those that collapse under operational pressure lies in structural design choices. Written procedures often lack embedded verification mechanisms; they’ve got no way to detect when execution diverges from the prescribed path until outcomes reveal the deviation. In contrast, Steel’s intraoperative Brainlab navigation provides real-time confirmation during surgery, INPO’s plant evaluations assess protocols under actual operating conditions, and Lufthansa maintains continuous quality monitoring during integration.
Protocols developed in one context without infrastructure to capture performance in other contexts can’t refine themselves to accommodate variation. This leads organisations to a binary choice: accept failures when context varies or accept uncontrolled variation that defeats standardisation. Usually, they respond by creating more procedures that also get ignored. Rigid failure occurs when protocols don’t work in new contexts and break down; ad-hoc workarounds arise when practitioners modify protocols informally, undermining the standard.
Methods that don’t distinguish between invariant and adaptive elements force implementers to choose between unsafe rigidity and uncontrolled variation. When everything’s treated as standard, methods break against legitimate contextual differences; when everything’s treated as flexible, integrity erodes through accumulated modifications. The difference between Steel’s consistently high fusion rates and protocols that ‘exist on paper, not in practice’ lies in these structural mechanisms.
Building Institutional Capability Through Deliberate Design
Return to the opening question: How do some professions achieve consistent excellence through method while others remain dependent on individual variation? The examination of Steel’s surgical pathway at St Vincent’s in Sydney, Gambrill’s institutional infrastructure at INPO in Atlanta, and Spohr’s ITA Airways integration into Lufthansa Group reveals that successful standardisation isn’t about suppressing expertise – it’s about encoding the conditional logic experts use.
Organisations facing similar challenges might examine their own standardisation attempts through this lens: Are verification mechanisms embedded in execution or relegated to post-incident analysis? Does institutional infrastructure systematically capture why methods perform differently in different contexts? Have invariant and adaptive elements been explicitly separated? These questions apply across domains where consequences matter and consistency is valued.
The capacity to create methods that travel reliably across contexts isn’t a natural byproduct of operating in high-stakes fields – it emerges from deliberate architectural choices about how methods are designed, verified, and refined. Organisations not mastering these choices remain perpetually dependent on finding, retaining, and replacing irreplaceable experts – which works brilliantly right up until the moment it doesn’t.






