New PM Articles for the Week of December 15 – 21

New project management articles published on the web during the week of December 15 – 21. We give you a high-level view so you can read what interests you. Recommended:

PM Best Practices

  • Elizabeth Harrin describes Project Management as a Service. Not outsourcing, but a change in approach.
  • Johanna Rothman debunks the notion that competition among teams produces better products.
  • Glen Alleman debunks a debunking of myths and half-truths about estimating.
  • John Goodpasture explores the idea of cascading risks: where one damned thing leads to another.
  • Ron Rosenhead reflects on what he’s learned over the past year.
  • Harry Hall shares the lessons learned from this year’s Christmas tree disaster. Yes, even the Nativity Celebration needs a risk management plan …
  • Gary Booker illustrates a model of accountability, as a governance and operating practice.
  • Ryan Ogilvie considers whether communication is more effective when more structured or more personalized.
  • Ulf Eriksson gives us his recommendations for writing more effective test cases.

Agile Methods

  • Mike Cohn recommends that product owners expect the development team to make a few adjustments to the sequence in which they work the backlog.
  • Joanne Wortman argues that the key to success in an Agile initiative is taking the time to get the architecture right.
  • Michiko Diby is noticing that Agile values and methods are creeping into her off-duty life.
  • Kam Zaman reports on his success in implementing the elusive “dual-track Scrum.”

Looking Ahead

  • Carleton Chinner outlines three critical trends that will directly impact the practice of project management.
  • Michel Dion reflects on the evolution of project management, as the wall between operations and projects melts away.
  • Jennifer Zaino projects the future of cognitive computing, for 2015 and beyond, in health care, retail, and other industries.
  • Kent Schneider traces four critical trends related to data breaches and security that will affect our projects in 2015.
  • Seth Godin contributes his “annual plan construction set” of meaning-free buzzwords and phrases, to help you prepare for the coming year [face palm].

Podcasts and Videos

  • Cornelius Fichtner interviews The Risk Doctor, David Hillson, on the risks you didn’t even know you were taking. Just 21 minutes, safe for work.
  • Craig Smith and Tony Ponton interview Rachel Tempest Wood on why project management is still useful. Just 25 minutes, safe for work.
  • Here’s a YouTube video explaining the origins and principles of Kanban, as developed and practiced at Toyota. Just 3 minutes, safe for work.

Potpourri

  • Tony Adams notes the viral nature of cranky behavior at work: we are “emotional conductors” who bring our emotions to work every day.
  • Lynda Bourne describes a recent scientific study of idiotic risk, i.e. that class of risks where the payoff is negligible and the downside is extreme. Key finding: elect women.
  • Kerry Wills gives us the key bullet points from the 2014 Standish Report. If I thought it was a statistically sound survey, I’d look for other work.
  • Alex LuPon identifies the underlying project management methodology followed by The Hobbit Trilogy. Take THAT, Joseph Campbell!

Enjoy!

New PM Articles for the Week of October 27 – November 2

New project management articles published on the web during the week of October 27 – November 2. We give you a high-level view so you can read what interests you. Recommended:

PM Best Practices

  • The second edition of Women Testers Magazine is now available. Not just for women or testers – this is some truly excellent content. Highly recommended!
  • John Goodpasture considers two views of “architecture.”
  • A Business Cloud News survey found that IT isn’t really driving SaaS adoption, and cloud-based applications still pose data security challenges.
  • Andy Jordan concludes his long series on organizational risk management.
  • Johanna Rothman lays out an approach for tactical management.
  • Bruce Benson makes the case for getting into the weeds – researching the history, understanding past performance, and scheduling based on demonstrated capabilities.
  • Rebecca Mayville uses the butterfly as a metaphor for driving positive change.
  • Michelle Stronach recounts a sad story of how she took over a project in progress, from a well-liked, competent project manager who passed away.

Agile Methods

  • Kailash Awati describes how to apply the principles of emergent design to enterprise IT.
  • Glen Alleman shares his article, “Agile Program Management,” published in Cutter Journal. A long but excellent read.
  • Mike Cohn continues his series on sprint planning with the commitment-driven approach.
  • David Anderson notes that, as soon as organizations get used to time-boxing, they shrink the size of the boxes. Kanban (naturally) avoids this trap!
  • Don Kim believes that the Scrum team will only succeed if the Product Owner truly understands what is needed and can communicate it effectively.
  • Ravi Nihesh Srivastava proposes using Scrum to produce a high-quality technical proposal.

Leadership

  • Bob Tarne summarizes key points from a recent presentation by Tom Peters.
  • Elizabeth Harrin interviews Oana Krogh-Nielsen, Head of the PMO for the National Electrification Program for the Danish rail system, Banedanmark.
  • Bruce Harpham interviews Terry Schmidt, whose resume begins with his internship at NASA during the Apollo Moon landing program, on strategic project management.

Podcasts and Videos

  • Cornelius Fichtner interviews Joseph Flahiff at the PMI Global Congress, on his new book, “Being Agile in a Waterfall World.” Just 30 minutes, safe for work.
  • Dave Prior rounds up fellow Agilistocrats Richard Cheng and Dhaval Panchal to discuss Agile misconceptions they see in training classes. Just 15 minutes, safe for work.
  • Margaret Meloni shares an article by Roxi Hewertson, “Lead Like it Matters.” Just 3 minutes, safe for work.
  • Craig, Tony, and Renee interview Em Campbell-Pretty on the Scaled Agile Framework. Just 35 minutes, SAFe for work. Oh, stop rolling your eyes …

Potpourri

  • Linky van der Merwe tells us about the African Storybook Project, which aims to translate children’s stories into African languages and publish them on the internet.
  • Pat Weaver celebrates the 30th anniversary of the PMP examination with a brief history of PMI, the PMBOK, and the PMP credential.
  • Ralf Finchett shows the Project De-Motivational posters he’s been working on, and asks if we have any ideas. Wait until I take my medication, Ralf …
  • Kerry Wills finds the humor in Reply to All when “All” is the entire company.

Enjoy!

Validating Data Conversion

Continuing the series on the data conversion cycle: subsequent to any bulk load of data from another system, the accuracy and completeness of the data in the target system must be assessed.  This process is generally referred to as validation, and it has three objectives:

  1. Identify errors in mapping data elements between systems
  2. Identify errors in the source system extraction processes
  3. Identify errors in the source system records

As you can see from the diagram below, the validation process is the key feedback point in the data conversion cycle.

[Diagram: The Data Conversion Cycle]

Validation typically uses several techniques, selected based on the nature of the data being inspected.  They include:

  • Application level comparison – A user logs in to each system and compares the information in the source system with the information in the target system.  This is the most labor-intensive technique, but invaluable for confirming that the data elements were mapped to the correct fields in the target system (Objective 1).
  • Report level comparison – A user or analyst compares a report from the source system with a report containing the same records, prepared from the target system.  The reviewer looks for differences in the number of records, and incorrect or truncated values.  Once differences are identified, a member of the conversion team should investigate whether each was caused by the extraction process (Objective 2) or resulted from issues in the source system (Objective 3).  This technique doesn’t require access to either system, or even knowledge of the data or subject matter.  However, it can be fairly labor intensive.  It is best used when reviewing data with a limited number of values, such as assignment to an organization, since arbitrary values are difficult to compare visually.
  • Excel vlookup automated comparison – An analyst prepares an Excel workbook using the data extracted from the source system in one worksheet, and the corresponding data extracted from the target system in another worksheet.  An automated comparison in a third worksheet is then possible using vlookup and other analytical functions within Excel.  This approach requires more preparation, but is usually fastest in execution, especially when inspecting very large numbers of records containing arbitrary values, such as strings (names and addresses), dates, and numerical values.  As with report level comparison, differences are investigated to determine whether the root cause was the extraction process (Objective 2) or an error in the source system (Objective 3).  A scripted sketch of this kind of comparison appears just after this list.
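
For teams that prefer a scripted equivalent of the vlookup workbook, here is a minimal sketch in Python using pandas.  The file names, the key column (employee_id), and the list of compared fields are assumptions for illustration, not part of any particular conversion.

```python
# Minimal sketch of an automated source-vs-target comparison, the scripted
# equivalent of the Excel vlookup approach described above.  File names,
# the key column, and the compared fields are illustrative assumptions.
import pandas as pd

KEY = "employee_id"                                    # hypothetical record key
FIELDS = ["last_name", "hire_date", "annual_salary"]   # hypothetical fields

source = pd.read_csv("source_extract.csv", dtype=str)
target = pd.read_csv("target_extract.csv", dtype=str)

# Record counts should match before comparing individual values.
print(f"Source records: {len(source)}, target records: {len(target)}")

# Outer join on the key, like a two-way vlookup; the indicator column
# flags records present in only one system.
merged = source.merge(target, on=KEY, how="outer",
                      suffixes=("_src", "_tgt"), indicator=True)
missing = merged[merged["_merge"] != "both"]
print(f"Records present in only one system: {len(missing)}")

# Field-by-field comparison for records present in both systems.
both = merged[merged["_merge"] == "both"]
for field in FIELDS:
    diffs = both[both[f"{field}_src"].fillna("") != both[f"{field}_tgt"].fillna("")]
    print(f"{field}: {len(diffs)} differing records")
    # Differences found here go to the conversion team to classify as
    # extraction errors (Objective 2) or source system errors (Objective 3).
```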

Much like Chinese cooking, most of the labor in a data validation exercise is in the preparation.  To that end, validation planning should include an analysis of the data being loaded into the target system, to determine the following:

  • What user-accessible field mappings have been made?  It may be possible to identify one or two users with the access rights to check all of the mappings, using application level comparison.  Note that it is not necessary to check multiple records; the purpose is to ensure that the data is “landing on the correct runway.”
  • Based on the records types being loaded, are there delivered audit reports built into the target system that can facilitate a report level comparison with the source?  If there isn’t a complete correlation between the two systems, generally a custom report can be derived from an existing report, either on the source or target system, to facilitate the comparison.
  • Which data elements would be difficult to compare visually, in a report level comparison?  It is useful to identify the specific fields requiring an automated comparison, so that custom reports can be created in advance to extract the records for loading into Excel.  It is also common practice to load the source records into the worksheets and prepare the comparison worksheet while the data is being loaded into the target system, to save time.  (A brief sketch of this field triage follows this list.)
  • What dependencies exist?  Usually, there are no dependencies in the inspection process, because the data, once loaded, is static.  However, if there are issues that will impact the availability of conversion resources, they should be identified in advance.
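
To illustrate the field triage mentioned above, here is a small sketch that profiles an extract and flags which columns hold a limited set of values (reasonable candidates for report level comparison) versus arbitrary values such as names, dates, and amounts (candidates for automated comparison).  The file name and the distinct-value threshold are assumptions for illustration.

```python
# Illustrative sketch: classify extract columns by how many distinct values
# they hold, to decide which fields warrant an automated comparison.
# The file name and the 25-distinct-value threshold are assumptions.
import pandas as pd

LIMITED_VALUE_THRESHOLD = 25   # hypothetical cutoff for "limited number of values"

extract = pd.read_csv("source_extract.csv", dtype=str)

for column in extract.columns:
    distinct = extract[column].nunique(dropna=True)
    plan = ("visual / report level comparison"
            if distinct <= LIMITED_VALUE_THRESHOLD
            else "automated comparison (Excel or script)")
    print(f"{column}: {distinct} distinct values -> {plan}")
```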

A well-planned and coordinated validation process can proceed on a broad front, with a number of workers inspecting specific record types, using specified tools and techniques.  The team should have the goal of minimizing the time required to conduct validation, as it is the final step before a move to production.  This is critical because, between the final extract of data from the legacy source system and the move of the new system to production, transactions will continue to be processed in the legacy system.  These transactions must then be re-entered into the new system.

The validation process falls between loading the data to the target system and actually using it, in every cycle.  Whether that initial use is development, testing, or production, the process needs to be followed.  Note that validation findings from the final load before the move to production may have to be triaged, since any corrections will have to be made in the production system.  Consequently, the principal measure of success of validation in earlier cycles should be the elimination of sources of error.  A solid validation and correction process should reduce the number of corrections in production to nearly zero.

Next week, I’ll address creating a data conversion plan, based on the data conversion cycle, and integrating it into the overall project plan.