Define an engagement goal and success for a Knowledge Base #metrics #engagement
Hi all,
I'm working on a project where a KB has been offered as a solution to a team. While I see some engagement and usage, it's too early to see a trend. However, the leader of that team has asked me to create an engagement goal that his team could aspire towards. Any insights on how to plan and define an initial engagement goal? Please advise and share insights on what the right approach could be. The metrics are limited to users (i.e. views), uploads and downloads of knowledge articles. -- Regards, VW
Stephen Bounds
Hi Vandana,

If you can monitor uploads by user, think about tracking the number of users who have contributed any articles; that is a better measure of commitment to use than raw upload numbers. You probably don't want one user supplying 50 uploads per week while no one else adds anything to the system!

However, I suggest that the main issue with the metrics you have described as available is that they all measure proximate goals (i.e. goals in support of a business outcome) rather than an ultimate business goal (such as reduced average case time, fewer errors, higher customer satisfaction, etc.). You need to find a plausible link between one of your proximate metrics and a valued business outcome to make your metrics meaningful. This link could be established empirically or anecdotally, quantitatively or qualitatively. For example, you could survey the team and ask something simple like:

If you track your usage stats and repeat this survey every few months, you should see increasing usage tracking with increased business benefits (and if not, well ... that's a whole other conversation). Once you've demonstrated the link between usage and benefits to the satisfaction of your manager, you can decrease the frequency of user surveys or cease them altogether, since usage will have been established as a proxy for the thing you really want to measure, i.e. business benefit.

Cheers,
Stephen Bounds
Executive, Information Management
Cordelta
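A minimal sketch of that "distinct contributors" metric, assuming the KB can export an upload log as CSV with one row per upload containing a user ID and an ISO-format timestamp (the file name and column names below are hypothetical, not an actual KB export format):

# Hypothetical sketch: count distinct contributing users per ISO week from a
# KB upload log exported as CSV. "kb_uploads.csv", "user_id" and "uploaded_at"
# are assumed names for illustration only.
import csv
from collections import defaultdict
from datetime import datetime

def distinct_contributors_by_week(path):
    """Return {(year, iso_week): number of distinct users who uploaded that week}."""
    contributors = defaultdict(set)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            year, week, _ = datetime.fromisoformat(row["uploaded_at"]).isocalendar()
            contributors[(year, week)].add(row["user_id"])
    return {week: len(users) for week, users in contributors.items()}

if __name__ == "__main__":
    for (year, week), n in sorted(distinct_contributors_by_week("kb_uploads.csv").items()):
        print(f"{year}-W{week:02d}: {n} contributing users")

Tracking this weekly alongside the survey results would show whether breadth of contribution (rather than raw upload volume) moves together with reported business benefit.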
Hi Stephen,
You are spot on in understanding the challenge and limitations of the KB. Thank you for sharing approaches that I can consider. The idea behind this exercise is to aspire towards a goal (engagement-specific), and then revisit this change (the KB) and see what else can be done to ensure the change (adopting this KB) is internalized in the day-to-day behaviors of the team members. Is there a standard benchmark I should measure engagement against? I mean the percentage of adoption that is defined and generally observed by KM practitioners during such projects. -- Vandana W
Stephen Bounds
Hi Vandana,

What kind of work does the team do (broadly described, if you can't be specific)? Customer service work will have quite different engagement frequencies and metrics from an engineering team, for example.

Cheers,
Stephen Bounds
Hi Stephen,
The aforementioned team is research-heavy. They engage in data modelling activities, perform drug-specific research, and are qualified data scientists. The KB is intended as a storefront for all of their knowledge assets. While I'm able to find out their monthly engagement (it's been only a month), I'm also supposed to propose an engagement goal to the team (a realistic percentage) that they could aspire to achieve in 2, 4 or 6 months. Is there a standard benchmark that anyone has come across? I could work around that benchmark and come up with something. -- VW
Stephen Bounds
Hi Vandana,

Thanks for that detail, that's really helpful. I know it would be lovely to give you a standard benchmark, but you really will do better if we can find a metric that drives the kinds of behaviours you want. For any system or process, there are six basic things you can try to optimise:

I'm guessing that cost isn't the primary driver, so let's focus on the other aspects. Will your Knowledge Base be successful merely if there are knowledge assets published? What if those assets were really poor quality? Or they weren't of interest to anyone? Here are some ways to think about metrics within the constraints of the available data. To support optimisation for:

I'll be honest though; none of these are great metrics. They are pretty easily game-able, so you'll need to trust that your teams genuinely want to succeed on their merits rather than by seeking to artificially inflate numbers. Of these, I'm most inclined to:

You might also want to periodically examine the quality of the various uploads to determine whether any products being created aren't adding value and, if so, start a conversation about whether they should be stopped or adapted to be more relevant. This would be more of a diagnostic than a metric, though.

Cheers,
Stephen Bounds
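A rough sketch of how a few candidate indicators could be derived from the limited data described earlier (views, uploads, downloads per period); the function, its inputs and the example numbers are hypothetical and only illustrate the kind of ratios being discussed, not a recommended benchmark:

# Hypothetical sketch: derive candidate engagement indicators for one period
# from the available KB data (views, uploads, downloads). Field names and the
# example figures are assumptions for illustration only.

def engagement_indicators(team_size, uploads_by_user, views, downloads):
    """uploads_by_user: {user_id: number of articles uploaded this period}."""
    articles_published = sum(uploads_by_user.values())
    contributors = sum(1 for n in uploads_by_user.values() if n > 0)
    return {
        # Breadth of contribution: harder for one prolific uploader to
        # carry the whole number on their own.
        "share_of_team_contributing": contributors / team_size if team_size else 0.0,
        # Rough demand-side signals: are published assets actually being used?
        "views_per_article": views / articles_published if articles_published else 0.0,
        "downloads_per_article": downloads / articles_published if articles_published else 0.0,
    }

if __name__ == "__main__":
    # Example month (made-up numbers): a 12-person team where 5 people uploaded.
    stats = engagement_indicators(
        team_size=12,
        uploads_by_user={"u1": 6, "u2": 2, "u3": 1, "u4": 1, "u5": 1},
        views=340,
        downloads=95,
    )
    for name, value in stats.items():
        print(f"{name}: {value:.2f}")

As Stephen notes, ratios like these are easily gamed, so they work best as trend indicators reviewed alongside a periodic quality check of the uploaded assets.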