Tuesday, 8 July 2014

Managing Knowledge Transfer Sessions

On a recent engagement, I was required to manage the transition of manual testing for a large set of applications over to another vendor. As part of this, the incumbent vendor needed to run Knowledge Transfer (KT) sessions with the incoming vendor. A straightforward enough premise, but the process wasn't as straightforward as everyone would have liked.

After several conversations it became apparent there were differing views on how the KT sessions should be prepared for and run. One view was that the incoming team members could simply attend KT sessions with the current team and afterwards review any documents and artefacts that were available. A second view assumed the KT session was in fact multiple sessions: an initial meeting, followed by cycles of questions, feedback and further KT sessions if needed. Naturally, this didn't sit well with already busy SMEs and BAs. Feedback from the incumbent vendor on the quality of the KT sessions was initially good, but then we started to get very critical feedback. Clearly, something had changed, or we were simply seeing a weakness in the process that hadn't been exposed before.

To address these concerns and, hopefully, identify issues more easily, it was apparent that a consistent KT process needed to be put in place. A main failing I identified was the lack of up-front preparation by the incoming test team. In previous engagements I've always thought the KT session was perhaps 30% of the discovery, with the rest happening through the team doing their own research, with a little nudging and assistance.

To get away from this 'passive' approach to KT, the following steps were agreed:

* App team / SMEs / BAs to provide the team with any and all artefacts about the application.
        SharePoint/Wiki links, any documents on the system, previous test plans, details of test cases and defects

* Environment and system owners to ensure the testers had access to any systems and tools needed
        Access to the application, tools such as Toad and PuTTY installed, accounts for ALM/Jira, etc. set up

* Test team to review all artefacts and capture any queries or questions in a KT Prep Sheet, sent to the SME in advance of the KT session

* KT Session to take place
        Face to face session where possible

By the time the KT session runs, most of the analysis and review should already be complete. The test team should analyse and review anything that directly relates to the area they will be responsible for. Where possible, a peer review of an area should be conducted: the owning tester walks through their understanding with the other testers in the team. This way the other testers, who need to have also read the material, reviewed the application, run some test cases, etc., can raise their own queries and questions. This is a great way to ensure misunderstandings are identified and a common understanding is arrived at.

Once a KT session is complete, the team should play back their understanding to the SME. This can take a number of formats. It could be talking through a test plan and then executing test cases, perhaps in parallel with the incumbent test team at first. Another way is to write up their understanding and share it back with the SMEs. In my view a combination is required: some form of write-up, to ensure knowledge is captured and easily disseminated through the team, combined with hands-on testing.
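Across a large set of applications, each one moves through the same stages: prep, session, play-back, sign-off. A small sketch of tracking that per application (the stage names are mine, not from the KT Session Tracker template):

```python
from enum import Enum, auto

class KTStatus(Enum):
    # Illustrative stages, loosely following the process described above.
    PREP = auto()          # artefacts shared, access sorted, prep sheet sent
    SESSION_HELD = auto()  # KT session has taken place
    PLAYED_BACK = auto()   # understanding played back to the SME
    SIGNED_OFF = auto()    # SME happy; KT complete for this application

# One entry per application under transition; names are made up.
tracker = {
    "Billing Portal": KTStatus.SESSION_HELD,
    "Order Entry": KTStatus.PREP,
}

def outstanding(tracker):
    """Applications still short of SME sign-off."""
    return sorted(app for app, status in tracker.items()
                  if status is not KTStatus.SIGNED_OFF)

print(outstanding(tracker))  # ['Billing Portal', 'Order Entry']
```

The useful property is that "KT done" is defined by the SME sign-off at the end, not by the session having taken place, which is exactly where the passive approach fell down.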

Have a look at the supporting templates on the main site:

KT Prep Sheet: http://cyreath.co.uk/template-knowledge-transfer-sheet.html
KT Session Tracker: http://cyreath.co.uk/template-kt-session-tracker.html


Mark


