Lessons Learned - Metadata #lessons-learned #metadata
Hi TJ,
You've asked a pretty generic question, so my insights are also necessarily pretty generic:
- You are likely to need less classification than you think.
- If you increase your cost of capture, make sure you can clearly describe the increased value you will get during later discovery and reuse.
- Metadata works best when it reflects and reinforces a common language already in use. If you don't have a common language, spend some effort establishing it in your relevant community first.
- Don't discount the effort required to manage the lessons you have already captured. Continuing to refer people to bad information is the quickest way to make them stop using a system.
Cheers,
Stephen.
====================================
Stephen Bounds
Executive, Information Management
Cordelta
E: stephen.bounds@...
M: 0401 829 096
====================================
Hi everyone,
Looking for anyone who might be able to share examples or insights for metadata (tagging / classification) for Lessons Learned - to make it easy to find/browse/filter for them. Examples specific to project delivery in pharma would be great.
Thank you,
TJ
Hi TJ,
Metadata for classifying lessons learned (LL) works best when the fields:
1. Reproduce the context in which the lesson occurred and describe the conditions of the situation. For example: project type, customer region, risk severity…
2. Act as search criteria and filters for later findability. For example: product type, assembly part, function, process…
3. Refer to expertise and areas of knowledge within the relevant communities. For example: LL validator, engineering expert, quality owner, related competency…
If the LL platform and process are cross-functional and cover different business disciplines, every team will try to describe lessons using its own vocabulary. You may need to actively maintain the cohesiveness of the metadata set and moderate the number of terms.
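A minimal sketch of how these three metadata groups might be modelled as a record (the field names here are illustrative assumptions, not a prescribed schema):

```python
from dataclasses import dataclass

@dataclass
class Lesson:
    """One lessons-learned record with the three metadata groups above."""
    title: str
    # 1. Context: conditions under which the lesson occurred
    project_type: str = ""
    customer_region: str = ""
    risk_severity: str = ""
    # 2. Findability: search criteria and filters
    product_type: str = ""
    process: str = ""
    # 3. Expertise: who validated it, which competency it touches
    validator: str = ""
    related_competency: str = ""

lesson = Lesson(
    title="Late supplier qualification delayed batch release",
    project_type="tech transfer",
    customer_region="EMEA",
    risk_severity="high",
    process="supplier qualification",
    validator="QA lead",
    related_competency="quality assurance",
)
print(lesson.risk_severity)  # high
```

Keeping the groups explicit in the schema makes it easier to moderate each one separately and to spot which group teams ignore in practice.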
Thank you
Rachad
A couple of things to consider TJ –
Firstly, speak to your users to understand the sorts of terms they would be searching or browsing for, and make sure the metadata fits their search patterns and needs.
Secondly, the ultimate destination for a lesson is to become embedded within updated processes and procedures, or product design components. Therefore the metadata applied to the lessons must match your existing process/procedure taxonomy and/or product and component taxonomy.
Nick Milton
Sent: 08 April 2022 21:28
To: main@SIKM.groups.io
Subject: [SIKM] Lessons Learned - Metadata
Stephen - while generic, your points are definitely good to keep in mind. Keep it simple, knowing that every additional field comes with a cost (to the user, the solution, etc). Ensure there is a management process to look at the lessons captured. Think about common taxonomy.
Rachad - thanks for these examples, which make a lot of sense - and your note on aligning language.
Nick - agreed; we are planning to conduct some design thinking workshops to understand this from the end users' perspectives.
#1 We find 'doing the minimum' is the norm. If only one tag is required but you offer the ability to apply multiple, then one tag is what you'll get. Don't expect more than 3 fields to be completed...
#2 'Virtual Darwinism' applies to tags: a small number will become popular, and usually the ones at the start of pick lists are the ones that get used. Tags have to resonate with the actual language staff use. For example, we asked 60 KM managers to each define 10 tags for their area of industry, aiming for no more than 600 tags matching the key themes and terms in that vertical. Apart from the fact that we received over 1,000 tags in response and had to deduplicate them, we found in practice that many of the tags they expected to be common were not. We've been able to use Microsoft Viva Topics to prove that point.
#3 Use automation to apply organisational tags, e.g. project number, division, etc. You just want people to provide the unique human classification that rules or automation cannot provide.
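A minimal sketch combining the two ideas above: normalising free-text tags so near-duplicates collapse (and popularity can be measured honestly), and auto-applying organisational tags so authors only supply the human classification. Field names and values here are hypothetical:

```python
from collections import Counter

def normalise(tag: str) -> str:
    """Fold case and whitespace so near-duplicate tags collapse to one."""
    return " ".join(tag.lower().split())

def auto_tag(lesson: dict, context: dict) -> dict:
    """Apply organisational tags from system context; authors only supply
    the human-judgement fields that rules cannot infer."""
    tagged = dict(lesson)
    tagged["tags"] = sorted({normalise(t) for t in lesson.get("tags", [])})
    tagged.setdefault("project_number", context["project_number"])
    tagged.setdefault("division", context["division"])
    return tagged

# Popularity check across submitted tags ('virtual Darwinism' in action):
submitted = ["Tech Transfer", "tech transfer", "Validation",
             "tech  transfer", "Batch Release", "validation"]
usage = Counter(normalise(t) for t in submitted)
print(usage.most_common(2))  # [('tech transfer', 3), ('validation', 2)]

lesson = {"title": "Unclear handover", "tags": ["Validation", "validation"]}
context = {"project_number": "P-1042", "division": "Manufacturing"}
print(auto_tag(lesson, context)["division"])  # Manufacturing
```

Using `setdefault` means the system fills in organisational context only when the author hasn't already supplied it.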
But my take is don't let the lessons sit still long enough to be tagged!
Embed them so they are actioned - that's the learning.
I can think of no case of anyone ever thinking "oh, let's go search the lessons learned"
No, we just expect the most current, up-to-date services, content, processes, products, quality, assistance - and we only get that by continuously embedding the lessons.
That improvement is the learning.
But I loved reading all of this!
Lessons were owned by the Army Capability Directorates (Training, Equipment, Personnel, etc.) and we added sub-categories for each of these. Then we added a code indicating whether the issue at the heart of the lesson was:
1. New requirement
2. Quantity issue of existing capability
3. Performance issue of existing capability
Etc.
With over 2000 lessons under management and about 200 added every 12 months, we could identify themes and trends and underlying issues of which each individual lesson was merely a symptom.
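At that volume, the theme-and-trend analysis described above can be as simple as counting (directorate, issue-code) pairs. A minimal sketch with made-up data:

```python
from collections import Counter

# (directorate, issue_code) pairs; codes as in the list above:
# 1 = new requirement, 2 = quantity issue, 3 = performance issue
lessons = [("Equipment", 3), ("Training", 1), ("Equipment", 3),
           ("Personnel", 2), ("Equipment", 1), ("Equipment", 3)]

themes = Counter(lessons)
print(themes.most_common(1))  # [(('Equipment', 3), 3)]
```

A cluster like this suggests the individual lessons are symptoms of one underlying performance issue in that capability area.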
Happy to discuss offline anytime.
Hi TJ,
Not sure what KM platform you are working on but...
In our system, content managers use a separate, dedicated application server where they can edit articles or create new ones from existing templates. There they can also pull up statistics on search queries, response relevance, referrals, etc. Articles can be not only modified but also optimised, for example by adding meta tags to improve search across Lessons Learned. Search can also be improved by pinning certain articles to certain queries. This is called the "Editor's Pick": when searching, the user sees such materials in a separate column.
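The "Editor's Pick" idea can be sketched as a thin layer over any search backend: pinned articles for a query are returned alongside the ranked results. The `index` and `picks` dictionaries below are stand-ins for a real search engine and its curation table:

```python
def search(query: str, index: dict, picks: dict) -> dict:
    """Return ranked results plus any editor-pinned articles for the query."""
    key = query.lower()
    return {
        "editors_pick": picks.get(key, []),  # curated, shown separately
        "results": index.get(key, []),       # regular ranked results
    }

index = {"deviation": ["LL-204", "LL-187"]}
picks = {"deviation": ["LL-042: Handling unplanned deviations"]}
out = search("Deviation", index, picks)
print(out["editors_pick"][0])  # LL-042: Handling unplanned deviations
```

Keeping picks in a separate column (rather than merging them into the ranking) makes the curation visible and easy to audit.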
I hope this helps, let me know if I'm missing the mark.
Cheers!
Patrick
817-449-5272