Following on from Dave's comments, there's definitely something to scratch at about this four-stage model...
In level 1, knowledge is random and the objective is unattainable - so I'm not sure it can entirely be described as resilient. (What exactly resiles under such conditions?).
And level 5 is characterised by embedded processes and active use of knowledge to innovate and improve - which doesn't preclude flexibility in the face of uncertainty...
Maybe it's about flexibility more than formality? The problem with Stage 4 is the standardisation and the predictability (which implies that the context remains sufficiently stable that rigid standards maintain value, and that tomorrow's conditions can be reliably anticipated based on past experience...).
Well, to be a contrary voice: I think the APQC scales might be better termed a scale of ossification.
It’s also worth remembering that Nonaka failed to recognise Polanyi’s key point that no explicit knowledge can exist without a tacit component.
To give a simple illustration: informal networks are far better than formal systems at knowledge discovery, especially under conditions of uncertainty. So a key measure of KM program success is the density and cross-silo reach of informal networks.
Stage one is more resilient than stage four.
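A minimal sketch of how that measure might be computed, assuming the informal network can be captured as a who-talks-to-whom graph with a silo (department) label per person; the library choice (networkx) and all names and data below are purely illustrative:

# Illustrative sketch only: quantifying "density and cross-silo reach" of an
# informal network, assuming a who-talks-to-whom graph and a silo label per person.
import networkx as nx

def informal_network_measures(edges, silo_of):
    """edges: iterable of (person_a, person_b); silo_of: dict mapping person -> silo."""
    g = nx.Graph()
    g.add_edges_from(edges)
    density = nx.density(g)  # fraction of possible ties that actually exist
    cross = sum(1 for a, b in g.edges() if silo_of.get(a) != silo_of.get(b))
    cross_silo_share = cross / g.number_of_edges() if g.number_of_edges() else 0.0
    return {"density": density, "cross_silo_share": cross_silo_share}

# Hypothetical example: three people across two silos.
print(informal_network_measures(
    [("ana", "ben"), ("ben", "chen")],
    {"ana": "engineering", "ben": "engineering", "chen": "sales"},
))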
Prof Dave Snowden
Cynefin Centre & Cognitive Edge
Please excuse predictive text errors and typos
Hi all,
Thanks again for all your valuable inputs to help me solve this. I ended up adapting the APQC model and using the end state of value you suggested as the key to assessing the knowledge.
What this means practically is that we identified key areas that the knowledge would be used for. In this particular case, we decided on 3 Objectives:
* Project Onboarding
* Problem Resolution (when problems arise in the live environment and troubleshooting is required)
* Further Enhancements
We will then assess the knowledge available to achieve each noted objective using the maturity model. (APQC, I hope my slight adaptation is allowed.)
(I was thinking of also categorising the knowledge as tacit and explicit so that targeted KM activities can be further enhanced to improve the maturity of the knowledge.)
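A minimal sketch of how such an assessment could be recorded, assuming a simple 1-5 maturity score per objective split into explicit and tacit dimensions; the scoring scale, field names, and numbers are illustrative assumptions, not part of the APQC model itself:

# Illustrative sketch only: one maturity score per knowledge objective,
# split by explicit vs tacit knowledge. Scale and structure are assumptions.
from dataclasses import dataclass

@dataclass
class KnowledgeAssessment:
    objective: str
    explicit_maturity: int  # e.g. 1 (ad hoc) .. 5 (embedded and improving)
    tacit_maturity: int
    notes: str = ""

    def gap(self, target: int = 4) -> int:
        """How far the weaker dimension sits below the target level."""
        return max(0, target - min(self.explicit_maturity, self.tacit_maturity))

assessments = [
    KnowledgeAssessment("Project Onboarding", explicit_maturity=3, tacit_maturity=2),
    KnowledgeAssessment("Problem Resolution", explicit_maturity=2, tacit_maturity=4),
    KnowledgeAssessment("Further Enhancements", explicit_maturity=1, tacit_maturity=3),
]
for a in assessments:
    print(f"{a.objective}: gap to target level = {a.gap()}")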
<Knowledge Maturity.png>
I think it will work for us but welcome your thoughts and further input.
OK, look forward to it - keep us posted.
I’m also sceptical about maturity models for a number of reasons, some of them posted here:
We were once persuaded by a client to develop a maturity model, but we developed it more as a means of collective self-evaluation of capabilities. And in general I'm much more comfortable thinking about capabilities than maturity.
P
I'm going to write a blog post on this and Nancy's four stages this week. "Aspects, not additive" would be my initial response. But overall I think maturity models are flawed as a concept.
Prof Dave Snowden
Cynefin Centre & Cognitive Edge
Please excuse predictive text errors and typos
On 14 Nov 2022, at 02:50, Patrick Lambe <plambe@...> wrote:
<<Stage one is more resilient than stage four.>>
That’s a very interesting observation Dave.
Would this be better represented as an additive framework rather than a progression framework, as in a traditional maturity model? I.e., all layers need to be in place for the knowledge ecosystem to function effectively, with the balance of investment across the layers dependent on the contextual needs.
P
I’d ‘like’ Tim’s reply 10 or 100 times more if I could.
KM shouldn’t be about getting better at KM. It should be about improving business outcomes, which Drucker described as either increasing productivity or increasing innovation.
That all said, the idea of a maturity model is not a new one - it’s been explored at length by many, including several people in this august group.
I took the liberty of googling it for your inquirer.
https://www.google.com/search?q=knowledge+management+maturity+model&ie=UTF-8&oe=UTF-8&hl=en-us&client=safari
(Ok, I admit to some snark-irony there). -- -Tom --
Tom Short Consulting TSC +1 415 300 7457
In fact, since this appears to be born out of a concern for project knowledge, it may need to be interpreted specifically in that context. This article from PMI looks at the connections between project management maturity models and knowledge management maturity models, eventually developing a model that combines them: Management of project knowledge at various maturity levels in PMO, a theoretical framework, https://www.pmi.org/learning/library/management-project-knowledge-maturity-levels-8928
Best, Barbara Fillip
On Wed, Nov 2, 2022 at 8:10 PM Tom Olney <tolney@...> wrote:
Hi,
I'm currently assisting an organisation with managing their project knowledge. They are looking to me to define an end-state, something they can work towards. They keep on throwing Knowledge Maturity into the mix... What would mature knowledge look like?
I know of APQC's Knowledge Maturity framework but that looks more at KM as a capability. Any ideas on where to look or what to use to define "mature knowledge"? I'm kind of leaning towards - it depends on what you want, but maybe you have some ideas.
Appreciate the input
Hi Madeleine,
I am happy to see the amount of conversation and engagement your question generated. Thanks for asking!
I'll try to contribute without being redundant to what others have previously written.
In my experience, when organizations use terms like "mature" to describe their end state for knowledge, they are generally referring to several things:
- Knowledge is consistently captured and stored. As opposed to knowledge only existing within people's heads, key information and helpful facts are recorded somewhere in a deliberate, systemic way. It doesn't have to be a single, central location -- it can be several places, as long as each of these places are fit for purpose.
- Knowledge is categorized and organized in a predictable way. Once knowledge is captured, it should be organized in such a way that it is easy for people to make sense of what they are looking at. This also ties in to the next point.
- Knowledge is easy to find. People want to reference knowledge in order to do something else. They generally want to get the knowledge quickly so that they can move on with their task.
- Knowledge is reliable. Knowledge is curated and updated consistently to ensure that it is trustworthy. Folks don't want to spend time validating what they've found.
- Knowledge is complete and interconnected. Knowledge gives a full picture of the situation; people don't want to spend time piecing together bits and pieces from multiple sources to understand what is going on.
- Knowledge is secure and protected. This may be more relevant in some settings than others, but generally you want to make sure that only the right people can access sensitive information.
Sadly, I don't have a single resource for a definition of mature knowledge that I can point you to; however, the following blog post may come close: "NERDy Content for the Enterprise". [Full disclosure: my boss wrote that.]
Beyond helping your organization define an end goal for their knowledge, I would encourage you to help them define the outcomes that they want to get out of it. From the perspective of project management this could be a variety of things:
- Improving future project performance
- Reducing costs / increasing profitability
- Preserving/enhancing stakeholder relationships
- Refining and promoting emerging approaches/practices
I hope you find this helpful! Feel free to reach out if you have any questions.
Best,
Guillermo
Thank you all so much for your thoughtful input. It has certainly given me many angles to consider.
Hello Madeleine Du Toit,
We are a technology company focused on the KM industry. The more advanced queries we get are from companies that want to grow. Their concerns are transferring organizational knowledge. They have ERP (Enterprise Resource Planning) system infrastructures, but the end-to-end business processes (the business intelligence) of those systems are hidden in data structures which, in general, are unavailable to their workforce for knowledge transfer purposes.
If I were to define an "end-state," I would say it is the human-accessible, understandable, and usable how-to, why, and what-if knowledge that is represented within their existing end-to-end business procedures, tasks and processes, along with any dependent, contingent, or ad hoc relationships required to represent the overall breadth and depth of those end-to-end knowledge structures.
An organization may have several end-to-end business processes, such as: Hire to Retire; Acquire to Retire; Plan to Inventory; Quote to Cash; Market to Order; Idea to Offering; Prospect to Customer; Customer to Retention; etc. These end-to-end procedures are core to every business. It's where real-world business intelligence resides. The stuff that people have in their heads.
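A minimal sketch of one way such end-to-end knowledge might be represented, assuming each step carries the how-to, why, and what-if knowledge described above; the structure, field names, and example content are illustrative assumptions only, not any particular product's model:

# Illustrative sketch only: an end-to-end process as a chain of steps, each
# carrying how-to / why / what-if knowledge plus dependencies. All names and
# content below are hypothetical.
from dataclasses import dataclass, field

@dataclass
class ProcessStep:
    name: str
    how_to: str                                    # procedural knowledge
    why: str                                       # rationale / intent
    what_if: dict = field(default_factory=dict)    # contingencies
    depends_on: list = field(default_factory=list) # upstream steps

quote_to_cash = [
    ProcessStep("Prepare quote", how_to="Apply standard pricing sheet", why="Consistent margins"),
    ProcessStep("Confirm order", how_to="Check credit terms", why="Reduce payment risk",
                what_if={"credit check fails": "Escalate to finance"},
                depends_on=["Prepare quote"]),
    ProcessStep("Invoice and collect", how_to="Issue invoice within 48 hours", why="Protect cash flow",
                depends_on=["Confirm order"]),
]
for step in quote_to_cash:
    print(step.name, "<-", step.depends_on)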
We use this very language to help clarify the KM issue and to direct the conversation. In most cases, referring to these terms generates a “blinding flash of the obvious” response from potential customers sitting on the opposite side of the table. It's what practical business people understand because it’s what business owners and operators have spent their careers learning and perfecting.
Once the end-to-end discussion point has been made, accepted, and acknowledged, the next step is to correlate the end-to-end modeling process to their goals and objectives. The beauty of this approach, for us, is that each end-to-end KM process can be defined in advance, achieved in milestone fashion, and successfully concluded with an end-state result. Then, on to priority #2, #3, #4, etc.
Of course, it doesn’t hurt to have a cognitive technology that has the capacity to model the complexity of end-to-end business procedures and processes. I know there are a few out there.
Also, keep in mind that Conversational AI is here now. It is also called NLU (Natural Language Understanding). So if the discussion centers on Tier One or Tier Two Customer Support, Digital Assistants, Smart Agents, Smart Assistants, Smart Chatbots, etc., know that non-hype versions are coming out on the market, but they are few and far between.
Good luck!
Dennis L Thomas
(810) 662-5199
dlthomas@...
IQStrategix.com
Leveraging Organizational Knowledge
This is also where I'm interested in exploring how scenario work can feed into KM - those key questions
- What’s the “End game” for KM?
- What does KM “maturity” look like?
- How should KM be implemented to move the organization forward?
depend so much on extrinsic contextual factors, both within the business environment and beyond, that it might be fascinating as well as useful to manufacture contrasting and challenging plausible visions of the futures which await.
What does "the end game"/maturity/successful implementation look like, depending on the future context that the organization finds itself moving into - and especially if it's not a context that was anticipated or desired?
Matt
Madeleine,
I've been reading this exchange with interest and there are a lot of good insights, resources, etc. that members have already shared, so I'm not sure I'm adding value ... but here goes! Hopefully this adds some beneficial perspective for you.
I've been in my current KM role for a little over 3 years. I was brought into this particular global business unit (GBU) essentially to establish KM (e.g., define scope, create strategy, execute) as a value-add to the organization. From this exchange, the things that have resonated most with me are statements regarding:
- What’s the “End game” for KM?
- What does KM “maturity” look like?
- How should KM be implemented to move the organization forward?
End Game? For me, my organization, and in particular my KM Team, we established a very high-level, long-term vision (e.g., 5 years) of what we would be doing differently/what things would look like if we were doing KM better in 5 years than we are today. In the same breath, also saying "There is no end-point, we will always be moving the goal line as we evolve, improve, increase our understanding of business needs, etc." But our focus is "There's a better organizational KM out there somewhere, let's move in that direction."
Maturity? Our GBU consists of more than a dozen Centers of Excellence (CoEs) and Functional Areas that each have different focuses, scope, expertise, approaches, etc. We created a very simplistic maturity self-assessment asking about their level of maturity/comfort with their efforts around things like the following (see the illustrative sketch after this list):
- Providing a clear and current mission/vision;
- Identifying expertise within their area;
- Encouraging input (e.g., what, if any, vehicles do they have in place for contact – input, feedback and questions);
- Establishing and capturing standards and reference resources;
- Capturing, sharing, leveraging lessons learned;
- Helping their audiences (direct and indirect) navigate (e.g., find, understand and apply) their resources, expertise, etc.;
- And so on.
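A minimal sketch of how such a self-assessment might be tallied, assuming each CoE/Functional Area rates itself 1-5 on each area; the scale, aggregation, and names are illustrative assumptions, not the actual instrument described here:

# Illustrative sketch only: rolling up a 1-5 self-assessment per CoE/Function
# across assessment areas like those listed above. Scale and roll-up are assumptions.
ASSESSMENT_AREAS = [
    "Mission/vision",
    "Identifying expertise",
    "Encouraging input",
    "Standards and reference resources",
    "Lessons learned",
    "Helping audiences navigate",
]

def summarize(responses):
    """responses: CoE name -> {area: score from 1 to 5}."""
    for coe, scores in responses.items():
        avg = sum(scores.values()) / len(scores)
        weakest = min(scores, key=scores.get)
        print(f"{coe}: average {avg:.1f}, weakest area: {weakest}")

# Hypothetical responses from two areas.
summarize({
    "CoE Alpha": dict(zip(ASSESSMENT_AREAS, [4, 3, 2, 4, 3, 2])),
    "Function Beta": dict(zip(ASSESSMENT_AREAS, [3, 5, 3, 2, 4, 3])),
})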
For us this accomplishes several things:
- Gives the KM Team a view into each area (what they have and what they don't), highlights areas where KM might help them, and provides an opportunity to evaluate new and enhanced approaches;
- Allows each area to 'self-assess' and get a clear picture for themselves of where their focus is, what gaps exist, areas of strength as well as opportunities for improvement, etc.;
- Provides leadership with insight into what's going on in different areas, and lets them compare areas and prioritize KM-related efforts to increase the value coming out of each CoE/Function and the GBU as a whole.
Implementation? For us, implementation of KM efforts and initiatives has always been about small, incremental wins that build on each other. That way we bring along our audiences without losing them by moving too quickly, and it allows us to:
- Demonstrate and display value,
- Build credibility for KM activities,
- Establish a foothold for partnering with CoEs,
- Grow our presence within the GBU.
Best regards,
David B. Graffagna
Absolutely Murray, in fact my thought on reading the original question was that the PM mindset is to manage from start to end, and that may be where the particular phrasing of the question came from.
However, I'm not sure I agree with accommodating this mindset too much without giving some pushback, especially with regard to double-loop learning and managing critical knowledge across multiple projects. I think your distinction between a project and a program is an excellent way to do that.
P
On 3 Nov 2022, at 8:55 PM, Murray Jennex <murphjen@...> wrote:
I agree with you Patrick, as I would look at learning how to do the analysis and how to improve that process as a reflection of KM maturity (you would be surprised how many organizations don't think of learning to do the process better and just focus on the outcome). I was just speculating that in the PM world they are probably looking at knowledge processes a little differently. PM is about managing short-term activities to a conclusion; it is all about the end state when it comes to achieving project success and realizing project benefits. In that world, asking for the end state is a very valid request, and I think we need to work within their culture on this.
Yes, we know that KM is an evolving set of capabilities and goals and continuous improvement. PM focuses on continuous improvement, but with respect to very measurable goals. I don't think we will win many PMs over by not recognizing that they work in a world of short-term activities with measurable goals and end states. Currently, PMs think of project knowledge as lessons learned; we can help them expand that concept to be more inclusive of process and capability improvement. However, I think PMs will only shake their heads at us if we refuse to work our concepts into the way their world works. PMs are more concerned with knowledge (singular) than broad knowledge (plural), and they are very focused on measuring benefits.
When PMs say end state, they mean after the project completes. Our answer that there is no end state means to PMs that we aren't talking projects and instead have moved into programs. Perhaps we should state very clearly that PMs are not doing stand-alone KM projects, but that they need to look at it as a KM program with a series of KM projects. The KM projects can and must have an end state with measurable benefits and goals. Most of our discussion has focused on the KM program (and rightfully so). My read of the original question leads me to see this difference in concepts, and perhaps that is the first step to answering this question: emphasizing that KM is not a project but a program....murray
-----Original Message-----
From: Patrick Lambe <plambe@...>
To: main@SIKM.groups.io
Sent: Thu, Nov 3, 2022 2:04 am
Subject: Re: [SIKM] Knowledge Maturity #maturity
Hi Murray
You are quite right to make a distinction between specific knowledge artefacts/ resources and knowledge in general or “knowledges” pertaining to a domain, and I accept that specific knowledge resources (usually explicit) may reach an end state.
However, I find it more useful to think of the broader knowledge ground out of which those resources are produced, which guides how they are applied, and which determines when they need to be updated, discarded or replaced. And against that ground (as you point out), different knowledge resources change at different paces - "knowledge pace layering" if you like. Managing that environment is the real point, I think.
For example, in your case of the engineering solution, yes, the solution is interesting and important, but the "ground" of knowledge out of which that solution was produced and in which it is used is the more important resource, I think, because it is that which tells us how and when to renew it.
Why more useful? Because none of what we do makes sense if we don’t look beyond the resource to the purpose and context of using the resource.
P
Patrick, I understand when you use the term knowledge you are using it in the plural, no problem; I just didn't want you to think I was criticizing something I wasn't. My question and point is that as knowledge (in a singular state) matures, it may reach an end state where it is only historical/archival in nature and will no longer mature. It may be used again, but it may not.
For example, I do an engineering analysis of a problem and reach a solution. It is very useful knowledge as long as I have that component in service, but after a while the component becomes obsolete and is replaced, and the knowledge of fixing the problem becomes obsolete in that, unless I can relate it to another component, it may not be useful directly. I still may retain it for training purposes or historical purposes, but for all intents and purposes it has reached its maturity and end of life. I could also apply this logic to many social situations, as there are many things we believed as knowledge in the past that would no longer be considered useful or appropriate; that knowledge has reached end of life and is useful only for historical purposes.
Frankly, I believe all knowledge has a life cycle, with some life cycles being very long and others fairly short. So in this context, I would suggest there is an end state for knowledge....murray jennex
-----Original Message-----
From: Patrick Lambe <plambe@...>
To: main@SIKM.groups.io
Sent: Wed, Nov 2, 2022 11:17 am
Subject: Re: [SIKM] Knowledge Maturity #maturity
Hi Madeleine
I don’t think I would ever use the concept of “end state” with reference to knowledge, because all knowledge has to adapt continuously to changing demands, needs and opportunities. Projects may end, but the knowledge does not.
I do think capabilities are the right way to frame the question, because that covers the base of ensuring that knowledge is kept relevant. Similarly, I think one could define what a desirable knowledge environment/ infrastructure should look like to maintain different classes of knowledge to the necessary levels of relevance, accuracy, completeness, timely production, accessibility, etc.
For these attributes, not all knowledge classes are equal; "it depends", as you say, what requirements they might want to set for different classes of knowledge. Some areas of knowledge are more slow moving or fast moving than others, and some forms of knowledge have very high dependencies and risk factors associated with them (e.g. when the technology changes quickly, or there are supply chain disruptions, or new regulatory requirements, or key lessons learned from a major incident).
I hope this is helpful
P
Hi Madeleine,
TLDR: They’re asking you the wrong question. I’d advise you to dig further to determine their intention behind it.
==
I'm always guided by Peter Drucker's visionary aphorism, "Knowledge is the business." And by Heraclitus's observation that "You cannot step in the same river twice, because it's never the same river."
Like the business itself, Knowledge must be dynamic. Is there an "end state" envisioned for the enterprise — a sale of the business, for example, or declaration of bankruptcy? Assuming not, the construct of an "end state" for knowledge exists in theory only.
In practice, knowledge must continually evolve to meet the ever-changing needs of the enterprise — and changing conditions in the business ecosystem.
I’m seconding similar comments made by Patrick, Nancy and others earlier. Even the name “end state” implies that Knowledge can/should be static — far from the case, in my experience.
This is not just my opinion. The ISO 9001:2015 specification states clearly (in section 7.1.6) that the continual refreshing of organizational knowledge is a core knowledge role. In my experience, though, it's too often overlooked in the day-to-day practice of KM.
How to best achieve such “knowledge dynamism” is of great interest to me. In fact, I’m hosting a workshop Monday afternoon at KMWorld on avoiding “zombie knowledge” — the walking dead of knowledge.
Thanks for this question that has already garnered some interesting and useful responses!
tp
As I've been watching this I've been considering the original question and wondering if we're talking about an end state – or if we're talking about a vision. The distinction I'm making is that an end state is more prescriptive. A vision is less specific and more inspiring.
That leads me to the question – how would we know if we have an organization that's mature in its KM initiatives? Is the evidence found in a set of policies and procedures that make knowledge management an explicit part of the functioning of the organization? (And at what level, since 100% isn't reasonable.) Is it the way that people behave in terms of capturing, codifying, and sharing their own knowledge?
Maturity models and the ISO standard are fine but not always so great for knowing when the destination has been reached (if that's possible).
Rob
Hi Madeleine,
I’d echo a lot of what Nick says.
Rather than importing someone else's maturity model, I find it more empowering to work with a cross-section of staff to envision what KM could be for your organisation. Appreciative Inquiry is a helpful tool for getting to "what good could look like".
Then you can build your own model, tailored for the right granularity (Team? Department? Division?). That becomes scaffolding for the KM capability which you are trying to build – but it's *your* scaffolding – not someone else's which just happens to lead you to their own set of courses and consulting offers! That's important for ownership – if people have good reason to pick holes in a model, they may end up intellectualizing KM rather than putting it to work. (More on the methodology for doing that in No More Consultants.)
On ISO 30401, you could take a look at the KM Canvas which Paul Corney, Patricia Eng and I included with the KM Cookbook. It contains a set of practical questions which map to the standard, and can be used to test readiness/resilience – and prompt the right conversations about priorities.
All the best for your journey...
Chris
From: main@SIKM.groups.io <main@SIKM.groups.io> on behalf of Nick Milton <nick.milton@...>
Date: Thursday, 3 November 2022 at 10:23
To: main@SIKM.groups.io <main@SIKM.groups.io>
Subject: Re: [SIKM] Knowledge Maturity #maturity
Madeleine, if you are looking for a "goal line" – i.e. something that can be measured against – you might consider ISO 30401, the Management Systems Standard for KM, with the following caveats:
- ISO 30401 is a standard for the management system behind KM, ISO being a management-system organisation which has a model for how management systems work. The standard therefore has much to say about governance, and little to say about which technologies, processes, roles etc. to use. This is deliberate, as the standard must fit organisations of all types and sizes, and any framework of tools, processes etc. could be sufficient provided the management system meets the objectives and outcomes of the KM management system and the needs of the stakeholders.
- It is a standard for organisations, so not for personal KM nor tribal knowledge. Specifically it is for organisations with a layer of top management or leadership who then delegate authorities downward in the organisation. There will be some styles of organisation which do not meet this description. Madeleine – you talk about supporting "an organisation" so you will know if this organisation fits this description.
Given these caveats, ISO 30401 can still be useful. I am not personally a huge fan of maturity models, as I see KM more as a cultural phase-shift rather than a gradual maturing. But if your organisation wants to know what an end-state might be for KM, one answer is "the end-state is a fully embedded, operating and continually improving KM management system", and if they ask "how will we measure if we have got there", one answer could be "you can measure against the criteria within ISO 30401".
Nick Milton
Have a look at the ISO 30401 Global Knowledge Index by K4All, Dubai.