Hi SIKM,
I am hoping to draw on your hivemind to see if there's a good
term out there for a very particular phenomenon that I am
observing.
Most of us would be familiar with the "sunk cost fallacy": the
idea that past costs (whether time or money) should be ignored
when making future decisions. Nonetheless, it is common to stick with
initiatives long past any rational reason to do so, typically
because of commitment bias and loss aversion.
The phenomenon I am seeking to explain is one rooted in a
knowledge failure. It occurs when an organisation implements
solutions in response to a problem, but then sustains
those solutions long past their useful life. I suspect that this is
especially common after an extended period of process optimisation
built on base knowledge that has since become outdated.
After some reflection, I have reminded myself that the "double
loop learning" process proposed by Argyris can be a solution
to this problem. But I don't think this helpfully describes the failure.
"Failure to engage in double loop learning" is gobbledygook to
anyone outside of KM. "Retaining bad assumptions" is too vague for
the situation.
The scenario I am particularly thinking of is:
- The solution made sense and worked when it was devised
- The environment changes, making some prior knowledge invalid
and the previous solution ineffective or an outright failure
(generally the failure must be partial or subtle, excusable as
an "outlier" or "temporary" aberration)
- The organisation is biased towards keeping the practice in
place despite rising evidence to the contrary since everyone
"knows it works"
A high-profile example of this failure was the music industry's
response to the shift to digital downloads at the turn of the
millennium. The industry lost nearly half its revenue during a
consumer-led revolt against the traditional model of album-based,
physical CD sales.
The problem is that while in a competitive marketplace such
flawed reasoning gets exposed relatively quickly, in a
monopolistic situation (particularly in government) there is less
pressure to fix these issues. It is generally only after a
significant number of patently absurd outcomes get publicised that
serious reform is considered -- and until then, lots of
unnecessary human suffering can occur.
So: I need a snappy name to describe this knowledge failure. Got
any good ideas?
Cheers,
Stephen.
====================================
Stephen Bounds
Executive, Information Management
Cordelta
E: stephen.bounds@...
M: 0401 829 096
====================================
Andy Farnsworth
Hi Stephen,
This is a bit abstract, but I think you're describing the adaptive cycle, commonly used to model complex adaptive systems like forest ecosystems. I'm working on an article now that relates it to organizational process management.

Resources accumulate through the growth and conservation phases until growth slows and stability/equilibrium is reached. System resilience approaches a minimum as resources are committed to maintaining stability. However, equilibrium is only ever a temporary condition in a complex adaptive system, with agents constantly adjusting to find advantage, and chaos / cycles of large-scale change interacting with "stable" systems. The result of these interactions is an eventual disturbance in the environment, triggering a rapid breakdown in the system (think forest fires). Resources are released, connections are broken, and the system begins (is free) to reorganize.
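To make the cycle concrete, here is a minimal sketch of the four phases and the way resilience dips as a system settles into conservation. The phase names follow the adaptive-cycle literature; the resilience numbers and transition table are invented purely for illustration:

from enum import Enum

class Phase(Enum):
    GROWTH = "growth (r)"                       # rapid exploitation of resources
    CONSERVATION = "conservation (K)"           # accumulation, efficiency, rigidity
    RELEASE = "release (omega)"                 # disturbance, creative destruction
    REORGANIZATION = "reorganization (alpha)"   # renewal, experimentation

# Illustrative (made-up) resilience levels: resilience is lowest in the
# conservation phase, when resources are committed to maintaining stability.
RESILIENCE = {
    Phase.GROWTH: 0.7,
    Phase.CONSERVATION: 0.2,
    Phase.RELEASE: 0.5,
    Phase.REORGANIZATION: 0.9,
}

NEXT_PHASE = {
    Phase.GROWTH: Phase.CONSERVATION,
    Phase.CONSERVATION: Phase.RELEASE,          # triggered by a disturbance
    Phase.RELEASE: Phase.REORGANIZATION,
    Phase.REORGANIZATION: Phase.GROWTH,
}

def step(phase: Phase) -> Phase:
    """Advance the system one phase around the adaptive cycle."""
    return NEXT_PHASE[phase]

if __name__ == "__main__":
    phase = Phase.GROWTH
    for _ in range(4):
        print(f"{phase.value:26s} resilience={RESILIENCE[phase]}")
        phase = step(phase)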
From an organizational perspective, often a crisis reveals critical weaknesses in the system (think COVID). If leadership has the presence and foresight to recognize a precarious state, it can initiate a transformation or change process. The change team can highlight the urgency and push the system into an artificial release stage.
Happy (so happy) to discuss further if it's of interest! Andrew
Alex Zichettello
Hi Stephen,
How about "Obsolescence Aversion"? Not the snappiest but I thought I'd share since I relate to this situation and found myself wondering the same thing. The idea here is that by rational assessment, the solution should be obsolesced but (due to any number of biases) there is an aversion to do so.
-Alex Zichettello
Hi Stephen,
I call it “epistemic latency” – when knowledge lags “what is” and what currently works. I call its opposite “knowledge dynamics” or “knowledge kinetics” – the desired state in managing knowledge.
As you point out, this is a huge vulnerability for "best practices" and institutional knowledge in general – which, in practice, is typically stored in the form of (inherently static) information. Knowledge
– instead of being a buffer against environmental change, as it should be – then becomes a self-contained meta-reality that no longer accurately represents "real" reality.
At worst, knowledge (as practiced) can be a drag on forward progress! To wit, "We already know this, so there's no need to re-assess the situation – we'll just plug and play our canned solution."
Naming this is important, I agree – but, more importantly, how do we fix it -- or
prevent it in the first place?
In my opinion,
Tim
Hi Stephen,
I would simply call this 'culture' :). Cultures are not that fast to change, as illustrated by the five monkeys story: long after the spraying stopped, they are still scared.
Cheers, Joitske
-- Joitske Hulsebosch, Ennuonline 06-44730093 Tip: Our new book 'Blended leren ontwerpen' is out. You can order it here or read the preview at managementboek.
Dan Ranta
Dear Stephen, when I do a strategy for any organization (KM or otherwise), I like to share options. For me, a strategy is largely about options, and when you make a selection among options, you are prioritizing and setting a direction or trajectory. As the graphic below shows (this is from a strategy that I did in 2020), I provide some basic characteristics of each option. I like to have some fun with a slogan column. The first option below (which this company tragically selected) is one I call "plow ahead." In the UK one might say "plough ahead." Either is OK with me. This company might as well have said "blindly plow ahead" or "let's play the 'hope' game." As in, let's do more of the same (which has not worked) and hope it works.
I am not too snappy - sorry about that! Dan
-- Daniel Ranta Mobile: 603 384 3308
If I understand your issue correctly, Dorothy Leonard-Barton described this rather usefully as a “core rigidity” - i.e. a core capability that is no longer useful; a
deeply embedded knowledge set that actively inhibits our ability to innovate or
respond to competitive needs: “the very same values, norms and attitudes that
support a core capability and thus enable development can also constrain it”
(Leonard-Barton 1992: 118-119).
I think it is useful to think of it in capability terms, because capabilities are bundles of structures, people, skills, routines, habits, which means this is not just a psychological issue, but one of unwinding the structural elements that reinforce the previous practices. There is a whole literature on deliberate forgetting/unlearning approaches in relation to these rigidities.
Leonard-Barton, D. (1992) ‘Core capabilities and core rigidities: a paradox in managing new product development’ Strategic Management Journal, 13 (Summer): 111-25.
P
To be snappy, an alliteration often works best. First, find a good description, then focus on making it an alliteration.
Dennis Pearce
This seems like the organizational version of anchoring bias in cognitive psychology, or more specifically conservatism bias (see https://en.wikipedia.org/wiki/Conservatism_(belief_revision)). (One of the items on my list of things to explore someday is to look across all the known cognitive biases in human individual learning and see how many might have analogs in organizational learning.)
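As a toy illustration of conservatism in belief revision, compare a fully Bayesian update with a "conservative" one that only moves part of the way toward the posterior. The scenario, numbers, and damping factor are invented for the example and are not from the Wikipedia article; this is a sketch of under-weighting evidence, not a formal model:

def bayes_update(prior: float, likelihood_true: float, likelihood_false: float) -> float:
    """Posterior probability that a hypothesis is true, given one observation."""
    numerator = prior * likelihood_true
    return numerator / (numerator + (1 - prior) * likelihood_false)

def conservative_update(prior: float, likelihood_true: float,
                        likelihood_false: float, damping: float = 0.3) -> float:
    """Under-weighted update: move only part of the way toward the Bayesian posterior.

    damping=1.0 reproduces the Bayesian update; damping=0.0 ignores evidence entirely.
    """
    posterior = bayes_update(prior, likelihood_true, likelihood_false)
    return prior + damping * (posterior - prior)

if __name__ == "__main__":
    # Hypothesis: "our established practice still works".
    # Evidence: repeated partial failures, which (in this made-up example) are
    # twice as likely if the practice no longer works than if it still does.
    belief_bayes = belief_conservative = 0.9
    for failure in range(1, 6):
        belief_bayes = bayes_update(belief_bayes, 0.3, 0.6)
        belief_conservative = conservative_update(belief_conservative, 0.3, 0.6)
        print(f"after failure {failure}: bayes={belief_bayes:.2f} "
              f"conservative={belief_conservative:.2f}")

Running it shows the conservative belief staying near 0.9 while the Bayesian belief drops quickly: the organisational analogue of "everyone knows it works" surviving a string of contrary results.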
Hi all,
Thank you so much for your suggestions! There was a lot of
variety in your answers, which is perhaps to be expected. Some
consolidated comments in response:
- Rigidity trap (Andrew Farnsworth) - The adaptive cycle
is an excellent concept, and I think the "peak rigidity" point
captures what I'm looking for. I particularly like the
phrase that "system resilience approaches a minimum as resources
are committed to maintaining stability".
- Obsolescence Aversion (Alex Zichettello) - Nice, but it
has the slight drawback that it sounds like the
organisation is seeking to avoid obsolescence, whereas the intent
is to describe an aversion to recognising that a solution has become obsolete
- Epistemic latency (Tim Wood) - Another good term, but
quite technical for a lay person, and doesn't fully capture the
sense of an evolving environment
- Culture (Joitske Hulsebosch) - Correct in the broadest
sense! But not specific enough to support diagnosis and action
IMO.
- Plow ahead (Dan Ranta) - I like that this captures the
problem of unreflective practices. It needs some nuance to tie
it to the idea of changed assumptions / environments
- Core rigidity (Patrick Lambe) - Quite similar
to the rigidity trap, but as Patrick rightly points out, the
problem is "one of unwinding the structural elements that
reinforce the previous practices ... deliberate
forgetting/unlearning" (and not just psychology)
- Conservatism bias (Dennis Pearce) - I think this
touches on another important aspect of the problem, which is an
unwillingness to believe evidence indicating change is required
- Dangle berries (Matt Moore) - Aside from the origin of
the name, my problem with the term is that it doesn't capture
the sense of something that "used to be" useful. But thanks for
putting me off my dinner!
After some more research into the adaptive cycle, I found this
useful quote from Daniel Wahl:

As any system begins to mature, there is an accompanying
increase in fixed and ordered patterns of interactions and
resource flows. The system becomes over-connected, or
better, the existing qualities and quantities of connections are
such that they inhibit the formation of new pathways needed
for the system’s overall adaptation to outside changes and
its continued evolution. Eventually this leads to rigidity
within the system, and it becomes brittle, less resilient, and
more susceptible to disturbances from the outside.
At this point, the effects of detrimental run-away feedback
loops inside the system can further challenge viability. The
often resulting gradual or sudden breakdown of the old order and
structures moves the system closer to ‘the edge of chaos’ — the
edge of its current stability (dynamic equilibrium) domain. The
reorganization of resource flows and changes in the quality
and quantity of interconnections within the system at this
point create a crisis that can be turned into an opportunity
for transformation and innovation. (my emphasis)
Curious, I looked further back in Google Scholar and found what
appears to be one of the foundational
papers of this concept by C.S. Holling (2008), who himself
credits Herbert
A. Simon with the concept of dynamic hierarchies that
underpins it.
It's actually a fascinating paper and worth a read in its own
right, with Holling pointing out both the rigidity trap (high
connectedness and potential) and the poverty trap at the opposite
end of the cycle (low connectedness and potential).
(As an aside, I am sure the idea of a "poverty trap" organisation
neatly describes somewhere most of us have worked in our past!)
Holling (per Simon) represents the hierarchies of a typical
social system as nested adaptive cycles operating at different speeds and scales.

The excerpts I particularly like are:
As long as the transfer from one level to
the other [of a dynamic hierarchy] is maintained, the
interactions within the levels themselves can be transformed, or
the variables changed, without the whole system losing its
integrity ... this structure allows wide latitude for
experimentation ... each of the levels of a dynamic
hierarchy serves two functions ... One is to conserve and
stabilize conditions for the faster and smaller levels; the
other is to generate and test innovations by experiments
occurring within a level ...
It is as if two separate objectives are functioning, but in
sequence. The first maximizes production and accumulation;
the second maximizes invention and reassortment. The two
objectives cannot be maximized simultaneously but only occur
sequentially. And the success in achieving one
inexorably sets the stage for its opposite. The adaptive cycle
therefore embraces two opposites: growth and stability on the
one hand, change and variety on the other...
[Systems] have a minimal complexity we call the "Rule of Hand"
whose features make linear policies more likely to produce
temporary solutions and a greater number of escalating
problems. Only an actively adaptive approach can
minimize the consequences [of the inevitable release].
When a level in [a dynamic hierarchy] enters its [release]
phase of creative destruction, the collapse can cascade to
[a higher] level by triggering a crisis ["revolt"].
Such an event is most likely if the slower level is at its
[conservation] phase, because at this point the ... level is
particularly vulnerable ... [Similarly once] a catastrophe is
triggered at one level, the opportunities for, or
constraints against, the renewal of the cycle are strongly
influenced by the [conservation] phase of the next slower
and larger level ["remember"] ...
Extremely large events can overwhelm the sustaining
properties of panarchies, destroying levels, and triggering
destructive cascades down the successive levels of a
[dynamic hierarchy] ... Modern democracies ... diffuse large
episodes of creative destruction by creating smaller cycles of
renewal and change through periodic political elections. (my
emphasis)
This all helps me to describe my original question more precisely.
There may in fact be several diagnoses relevant to the situation:
- The organisation or social system may be stuck in a rigidity
trap, where active attempts are made to inhibit the
adaptive cycle, which eventually and inevitably leads to
catastrophic collapse
- It may be that leaders have conservatism
bias, i.e. they are not actively preventing change but are
cognitively not predisposed to recognise that renewal is already
occurring, or needs to commence. In these scenarios, it
is normally a crisis that precipitates the change imperative, but
it would be better to commence in a planned way
- A system may have accumulated insufficient capital
for change even though a release of potential for the next
phase of adaptation is needed. When thinking about staff, this
is often talked about as "change fatigue". A temporary holding
pattern may be a short-term solution, if and only if external
pressures for change - both above and below - are low.
- However, if a lower level release is causing a "revolt" at a
higher level, or a higher level is at risk of catastrophic
collapse, systems may be forced into premature change
to avert a full collapse and rely upon the higher levels of the
hierarchy to re-establish norms and productivity after a chaotic
period. (The alternative is a rigidity trap as described above,
which is likely to lead to worse long-term outcomes.)
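A rough way to express these diagnoses as a checklist is sketched below. The signal names, thresholds, and mapping are my own illustration (not from Holling), so treat it as a thinking aid rather than a method:

from dataclasses import dataclass

@dataclass
class SystemState:
    """Hypothetical observable signals about an organisation or social system."""
    change_actively_suppressed: bool   # leadership blocks attempts at renewal
    contrary_evidence_mounting: bool   # repeated "outlier" failures are accumulating
    capital_for_change: float          # 0..1, capacity/energy available for renewal
    lower_level_revolt: bool           # a faster, smaller level is already collapsing
    higher_level_at_risk: bool         # collapse threatens the slower, larger level

def diagnose(s: SystemState) -> str:
    """Map the observed state to one of the diagnoses discussed above (sketch only)."""
    if s.lower_level_revolt or s.higher_level_at_risk:
        return "forced premature change: act now, rely on higher levels to re-establish norms"
    if s.change_actively_suppressed and s.contrary_evidence_mounting:
        return "rigidity trap: renewal is being inhibited; expect eventual collapse"
    if s.contrary_evidence_mounting:
        return "conservatism bias: evidence is being discounted rather than blocked"
    if s.capital_for_change < 0.3:
        return "insufficient capital for change ('change fatigue'): hold only if external pressure is low"
    return "no acute diagnosis: keep testing assumptions against the environment"

# Example: evidence is mounting but nobody is actively blocking change.
print(diagnose(SystemState(False, True, 0.6, False, False)))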
A very interesting exercise to work through. Many thanks to
everyone for helping me out!
Cheers,
Stephen.
====================================
Stephen Bounds
Executive, Information Management
Cordelta
E: stephen.bounds@...
M: 0401 829 096
====================================
Andy Farnsworth
Thanks for the summary, Stephen! It's really helpful to see how a post author - as the local expert on the nature of the problem space - synthesizes the responses they get.
I wanted to follow up with a link to this article from Sundstrom and Allen, which I also found really interesting: https://digitalcommons.unl.edu/natrespapers/1063/
Gets me thinking that stability is scaled to the size or level of the system in the hierarchy. Maybe revolts should be encouraged at lower levels, like innovation cycles, with the results fed upwards for deeper synthesis and strategic decision making.

Andrew
Great job, Stephen -- in asking a challenging and provocative question, then fielding, summarizing, evaluating the various results -- and doing this all rapidly, while we’re still thinking about it. Thanks
too for all this relevant bonus material!
And thanks, all, for your interesting and diverging contributions – each one of which adds insight. I am struck that (1)
so many of us have experienced and/or observed this phenomenon, while (2) we have
little common language (yet) to even describe the problem. I'm not totally surprised, though, since I view this as a "dirty little secret" of KM – seldom discussed openly, but challenging to the credibility and practical
usefulness of some of KM's core principles and practices.
Am I the only one here who feels this "thing" – however one labels it – is a
profound, even existential, challenge to the knowledge discipline(s)? Have any of you addressed it in a rigorous way? How? What resulted – and, in particular, what progress have you been able to make?
Stephen, your bringing this to light and focusing our “hive mind” represents a major leap forward, in my view.
Thanks,
Tim
PS: Fair call that you assess the term I use as “technical,” which frankly I’m pleased about. I am often uncomfortable that much of KM – and management theory in general – is anecdotal and non-falsifiable,
whereas my unwavering belief is that (as Peter Drucker said decades ago) we need a
science of knowledge. That’s my own goal – though, perhaps, an impossible dream…
Two thoughts:
1. My first work in management consulting was in org development. Kathleen Dannemiller was my mentor and the founder of the boutique I would eventually start working for. She used to talk about the inflexibility that large corporations experience as they get complacent and become rigid. Her turn of phrase for this was organizational arthritis. Kathie died many years ago, but her legacy is carried on by my former colleague and housemate, Bob "Jake" Jacobs. He wrote about Kathie's approaches and this concept in particular in his first book, Real Time Strategic Change. You can read an excerpt from it on Google Books here.
2. The book Military Misfortunes by Cohen and Gooch may provide some clues. The authors analyze failures in war, mining the historical record for material and then analyzing an actual failure that took place on the battlefield. The book is broken up into chapters for each of the different types of failures they identified:
- Failure to Learn
- Failure to Anticipate
- Failure to Adapt
- Aggregate Failure
- Catastrophic Failure
Although the scope and scale of these types of failures in a war setting are vastly different from what goes on in companies, I think it's pretty clear just based on those chapter titles that there are a lot of similarities at the causal level. Might be another source of insight for your work.
--
-Tom
--
Tom Short Consulting
TSC
+1 415 300 7457
Stephen,
Thanks for the summary - some great ideas here.
Finally: there was a reason I posted my suggestion to you off-list, but seeing as you reposted it here: the 'berry used to be something very useful (indeed vital). But all the vital value has been extracted, and the remains are hard and painful to dislodge. I don't expect my suggestion to be included in any management textbooks any time soon.
Regards, Matt Moore +61 423 784 504
Thanks, Tom, this book looks interesting.
You remind me that there’s a significant body of literature on failures in general, and specifically intelligence/knowledge failures, in both war and business. (The discipline of competitive intelligence,
where I have worked extensively, largely consists of applying lessons from the former to problems of the latter.)
Most of my books are in my warehouse following a recent office downsize – so, sadly, I can no longer look over and see them. But one that I keep near is The Knowing-Doing Gap:
https://www.amazon.com/Knowing-Doing-Gap-Companies-Knowledge-Action/dp/1578511240/.
tp
Hi Matt, I completely failed to notice that you had emailed me
privately! Apologies for that.
For what it's worth, you made me laugh, and humour can be
important in recontextualising problems anyway :)
A podcast with Wardley is exciting indeed! I am not familiar with
Castlin's work but plan to become better acquainted with it in the near
future. Do we have a date for the podcast yet?
Cheers,
Stephen.
====================================
Stephen Bounds
Executive, Information Management
Cordelta
E: stephen.bounds@...
M: 0401 829 096
====================================
On 15/01/2022 5:30 am, Matt Moore
wrote:
toggle quoted message
Show quoted text
Stephen,
Thanks for the summary - some great ideas here.
Finally: There was a reason I posted my suggestion to you
off-list but seeing as you reposted it here: the ‘berry used to
be something very useful (indeed vital). But all the vital value
has been extracted and the remains are hard and painful to
dislodge. I don’t expect my suggestion to be included in any
management textbooks any time soon.
Regards,
Matt Moore
+61 423 784 504
On Jan 14, 2022, at 5:48 PM, Stephen
Bounds <km@...> wrote:
Hi all,
Thank you so much for your suggestions! There was a lot
of variety in your answers, which is perhaps to be
expected. Some consolidated comments in response:
- Rigidity trap (Andrew Farnsworth) - The
adaptive cycle is an excellent concept and I think that
"peak rigidity" point which captures what I'm looking
for. I particularly like the phrase that "system
resilience approaches a minimum as resources are
committed to maintaining stability".
- Obsolescence Aversion (Alex Zichettello) -
Nice, but it has the slight drawback that it sounds like
it is the organisation seeking to avoid obsolescence,
whereas the intent is to describe
- Epistemic latency (Tim Wood) - Another good
term, but quite technical for a lay person, and doesn't
fully capture the sense of an evolving environment
- Culture (Joitske Hulsebosch) - Correct in the
broadest sense! But not specific enough to support
diagnosis and action IMO.
- Plow ahead (Dan Ranta) - I like that this
captures the problem of unreflective practices. It needs
some nuance to tie it to the idea of changed assumptions
/ environments
- Core rigidity (Patrick Lambe) - Quite
similar to the rigidity trap, but as Patrick rightly
points out, the problem is "one of unwinding the
structural elements that reinforce the previous
practices ... deliberate forgetting/unlearning" (and not
just psychology)
- Conservatism bias (Dennis Pearce) - I think
this touches on another important aspect of the problem,
which is an unwillingness to believe evidence indicating
change is required
- Dangle berries (Matt Moore) - Aside from the
origin of the name, my problem with the term is that it
doesn't capture the sense of something that "used to be"
useful. But thanks for putting me off my dinner!
After some more research into the adaptive cycle, I found
this
useful quote from Daniel Wahl:
<E1lYSf7HIMWcbLHW.png>
As any system begins to mature, there is an
accompanying increase in fixed and ordered patterns of
interactions and resource flows. The system becomes
over-connected, or better, the existing qualities
and quantities of connections are such that they inhibit
the formation of new pathways needed for the system’s
overall adaptation to outside changes and its
continued evolution. Eventually this leads to
rigidity within the system, and it becomes brittle,
less resilient, and more susceptible to disturbances
from the outside.
At this point, the effects of detrimental run-away
feedback loops inside the system can further challenge
viability. The often resulting gradual or sudden
breakdown of the old order and structures moves the
system closer to ‘the edge of chaos’ — the edge of its
current stability (dynamic equilibrium) domain. The
reorganization of resource flows and changes in the
quality and quantity of interconnections within the
system at this point create a crisis that can be
turned into an opportunity for transformation and
innovation. (my emphasis)
Curious, I looked further back in Google Scholar and
found what appears to be one of the foundational papers of this
concept by CS Hollings (2008), who himself credits Herbert A Simon with the
concept of dynamic hierarchies that underpins it.
It's actually a fascinating paper and worth a read in its
own right, with Hollings pointing out both the rigidity
trap (high connectedness and potential) and the poverty
trap at the opposite end of the cycle (low connectedness
and potential).
(As an aside, I am sure the idea of a "poverty trap"
organisation neatly describes somewhere most of us have
worked in our past!)
Hollings (per Simon) represents the hierarchies of a
typical social system in this way:
<3mEVlXM8Ma2B0Smt.png>
The excerpts I particularly like are:
As long as the transfer from one
level to the other [of a dynamic hierarchy] is maintained,
the interactions within the levels themselves can be
transformed, or the variables changed, without the whole
system losing its integrity ... this structure allows wide
latitude for experimentation ... each of the levels of
a dynamic hierarchy serves two functions ... One is to
conserve and stabilize conditions for the faster and
smaller levels; the other is to generate and test
innovations by experiments occurring within a level ...
It is as if two separate objectives are functioning,
but in sequence. The first maximizes production and
accumulation; the second maximizes invention and
reassortment. The two objectives cannot be maximized
simultaneously but only occur sequentially. And
the success in achieving one inexorably sets the stage
for its opposite. The adaptive cycle therefore embraces
two opposites: growth and stability on the one hand,
change and variety on the other...
[Systems] have a minimal complexity we call the "Rule of
Hand" whose features make linear polices more likely
to produce temporary solutions and a greater number of
escalating problems. Only an actively adaptive
approach can minimize the consequences [of the
inevitable release].
When a level in [a dynamic hierarchy] enters its
[release] phase of creative destruction, the collapse
can cascade to [a higher] level by triggering a crisis
["revolt")]. Such an event is most likely if the
slower level is at its [conservation] phase, because at
this point the ... level is particularly vulnerable ...
[Similarly once] a catastrophe is triggered at one
level, the opportunities for, or constraints
against, the renewal of the cycle are strongly
influenced by the [conservation] phase of the next
slower and larger level [("remember"]) ...
Extremely large events can overwhelm the sustaining
properties of panarchies, destroying levels, and
triggering destructive cascades down the
successive levels of a [dynamic hierarchy] ... Modern
democracies ... diffuse large episodes of creative
destruction by creating smaller cycles of renewal and
change through periodic political elections. (my
emphasis)
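Because the "revolt" and "remember" connections are the part I find hardest to hold in my head, here is a second purely illustrative sketch (again my own toy framing in Python, not Holling's actual model) of two nested adaptive cycles and how a release at the fast level can cascade upwards:

from dataclasses import dataclass
from typing import Optional

PHASES = ["growth", "conservation", "release", "reorganization"]

@dataclass
class Level:
    name: str
    phase: str = "growth"

    def step(self):
        # advance one phase around the adaptive cycle
        self.phase = PHASES[(PHASES.index(self.phase) + 1) % len(PHASES)]

def interact(fast: Level, slow: Level) -> Optional[str]:
    # "Revolt": a release at the fast level cascades upward, especially
    # when the slower level is sitting in its vulnerable conservation phase.
    if fast.phase == "release" and slow.phase == "conservation":
        slow.phase = "release"
        return f"revolt: {fast.name} collapse cascades up into {slow.name}"
    # "Remember": a slower level still in conservation shapes and constrains
    # how the faster level reorganizes after its own release.
    if fast.phase == "reorganization" and slow.phase == "conservation":
        return f"remember: {slow.name} shapes how {fast.name} is rebuilt"
    return None

team, org = Level("team"), Level("organisation", phase="conservation")
for _ in range(4):
    team.step()
    event = interact(team, org)
    if event:
        print(event)

Run as written it prints a single "revolt" event; give the organisation a different starting phase and the cascade disappears, which is exactly Holling's point about the conservation phase being the vulnerable one.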
This all helps me to describe my original question more precisely. There may in fact be several distinct diagnoses relevant to the situation (I have attempted a rough decision sketch after this list):
- The organisation or social system may be stuck in a rigidity trap, where active attempts are made to inhibit the adaptive cycle, eventually and inevitably leading to catastrophic collapse
- It may be that leaders have conservatism bias, i.e. they are not actively preventing change but are cognitively not predisposed to recognise that renewal is already occurring, or needs to commence. In these scenarios it is normally a crisis that precipitates the change imperative, but it would be better to commence change in a planned way
- A system may have accumulated insufficient capital for change even though a release of potential for the next phase of adaptation is needed. When thinking about staff, this is often talked about as "change fatigue". A temporary holding pattern may be a short-term solution, if and only if external pressures for change - both above and below - are low.
- However, if a lower-level release is causing a "revolt" at a higher level, or a higher level is at risk of catastrophic collapse, systems may be forced into premature change to avert a full collapse, relying upon the higher levels of the hierarchy to re-establish norms and productivity after a chaotic period. (The alternative is a rigidity trap as described above, which is likely to lead to worse long-term outcomes.)
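And to show what I mean by treating these as distinct diagnoses, here is the rough decision sketch I mentioned above, in Python (the flag names are my own invention for illustration; this is not a validated diagnostic instrument):

def diagnose(actively_blocking_change, leaders_dismiss_evidence,
             capital_for_change, cross_level_pressure):
    # Rigidity trap: change is actively resisted by reinforcing structures.
    if actively_blocking_change:
        return "rigidity trap: unwind reinforcing structures, plan deliberate unlearning"
    # Conservatism bias: nobody blocks change, but the evidence is discounted.
    if leaders_dismiss_evidence:
        return "conservatism bias: surface the evidence and start renewal in a planned way"
    # Change fatigue: renewal is needed but the capital for it has not accumulated.
    if not capital_for_change and not cross_level_pressure:
        return "change fatigue: temporary holding pattern while capital for change rebuilds"
    # Forced premature change: pressure from other levels leaves no time to prepare.
    if cross_level_pressure:
        return "forced premature change: rely on higher levels to re-establish norms afterwards"
    return "no acute failure detected: keep watching for environmental drift"

print(diagnose(False, True, True, False))
# -> conservatism bias: surface the evidence and start renewal in a planned way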
A very interesting exercise to work through. Many thanks
to everyone for helping me out!
Cheers,
Stephen.
====================================
Stephen Bounds
Executive, Information Management
Cordelta
E: stephen.bounds@...
M: 0401 829 096
====================================
On 14/01/2022 4:04 am, Dennis Pearce wrote:
This seems like the organizational version of anchoring
bias in cognitive psychology, or more specifically
conservatism bias (see https://en.wikipedia.org/wiki/Conservatism_(belief_revision)).
(One of the items on my list of things to explore someday
is to look across all the known cognitive biases in human
individual learning and see how many might have analogs in
organizational learning.)
No worries.
I am also excited about the podcast.
The date will be late Feb (TBC).
And thank you for posing some questions!
Matt Moore
+61 423 784 504
On Jan 15, 2022, at 11:45 AM, Stephen Bounds <km@...> wrote:
Hi Matt, I completely failed to notice that you had emailed me
privately! Apologies for that.
For what it's worth, you made me laugh, and humour can be important in recontextualising problems anyway :)
A podcast with Wardley is exciting indeed! I am not familiar with
Castlin's work but plan to become more familiar in the near
future. Do we have a date for the podcast yet?
Cheers,
Stephen.
====================================
Stephen Bounds
Executive, Information Management
Cordelta
E: stephen.bounds@...
M: 0401 829 096
====================================
On 15/01/2022 5:30 am, Matt Moore
wrote:
Stephen,
Thanks for the summary - some great ideas here.
Finally, there was a reason I posted my suggestion to you off-list, but seeing as you reposted it here: the ‘berry used to
be something very useful (indeed vital). But all the vital value
has been extracted and the remains are hard and painful to
dislodge. I don’t expect my suggestion to be included in any
management textbooks any time soon.
Regards,
Matt Moore
+61 423 784 504
Ninez Piezas-Jerbi
Hi Stephen,
Just reading your description made me think of what we’ve been advocating to my colleagues about getting our act together on innovation, particularly given the current demand from our Member States for WTO Reform. I called the attitude we currently have the « What got us here will get us there » syndrome. Of course, what we really wanted to say was to replace the « WILL » with « WON’T ».
Hope you find the right name you’re looking for!
Best wishes for 2022 to all! Ninez
Ginetta Gueli
Hi Stephen, I am not sure that mine is a snappy name for this type of knowledge failure, but my suggestion is "Erase and Rewind", which is also the name of a famous song that could fit this situation in a certain way... as you mentioned the "music industry" :-).
https://www.youtube.com/watch?v=6WOYnv59Bi8&list=RD6WOYnv59Bi8&start_radio=1
Good luck and all the best,
Ginetta
--
Ginetta Gueli
Information & Knowledge Manager | Project Manager
To add to what was mentioned: when I started reading your description, it simply reminded me of the product development lifecycle. The last phase is called “Maintain or kill”, which could be too dramatic if you are trying to find the appropriate term ;) or, more gently, “sunsetting”.
Here’s just an example of an article explaining all the phases.
Product Development Process or Product Development Lifecycle - PM Vidya
You could find a product example that went through these phases and use it to show how management made an executive decision to sunset a product even though it had previously been successful on the market. However, just like in your example, market conditions changed and it cost more to keep the product than to simply sunset it.
I use a lot of product development and service design approaches in my KM practice.