I’m Presenting at the CAMP IT Portfolio and Program Management Conference


I’ve been invited to present at the CAMP IT Portfolio / Program Management Conference, to be held in Las Vegas, June 18 – 19, 2015. I’ll be presenting on our portfolio management approach and lessons learned from my time at MGM Resorts International. When I joined, we had 23 properties, each with its own HR, payroll, and timekeeping solution. My assignment was to consolidate them so we could eventually implement a new ERP.

If you plan to attend, let me know. I’ll load up my Starbucks card and treat whoever shows up and mentions that they read about the event on my blog.


Of Normalization, Transformation, and Soup Spoons

Over the last few months, Nick Pisano and I have been exchanging observations on the technical challenges of extracting actionable information from the massive volumes of project historical records. Nick’s latest post at his Life, Project Management, and Everything blog continues his ruminations on finding ways to make this mountain of data more accessible to data warehousing approaches, via normalization. Here are my thoughts on the matter:

Normalization, as defined by Edgar F. Codd and popularized by Chris Date, applies to relational databases. It is a design approach that seeks to reduce the occurrence of duplicate data. Relational databases managed by a single application, or by a group of applications that respect the referential integrity rules of the system of record, benefit from this rigorous approach – it reduces the risk of errors creeping into the data as records are added or maintained, and it makes management reporting queries more efficient.
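As a toy illustration (the employee and department data here are entirely hypothetical, not from any system mentioned above), normalization replaces repeated facts with a single stored copy plus a key:

```python
# Denormalized records: the department name is repeated on every row,
# so a typo in one row can silently corrupt reporting.
flat = [
    {"emp": "Ana",  "dept_id": 10, "dept_name": "Payroll"},
    {"emp": "Ben",  "dept_id": 10, "dept_name": "Payroll"},
    {"emp": "Carl", "dept_id": 20, "dept_name": "Timekeeping"},
]

# Normalized form: each department name is stored exactly once,
# keyed by dept_id; employee rows carry only the key.
departments = {row["dept_id"]: row["dept_name"] for row in flat}
employees = [{"emp": row["emp"], "dept_id": row["dept_id"]} for row in flat]

# A report joins the two relations back together on the key.
report = [(e["emp"], departments[e["dept_id"]]) for e in employees]
```

Changing a department name now means updating one value, not hunting down every row that repeats it – which is exactly the error-reduction benefit described above.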

We Don’t Maintain History

Historical records, on the other hand, are not maintained; they are queried. Careful controls are used to ensure that historical records are not updated, corrected, tweaked, or otherwise dinked with. That said, not all historical records are consumable in their native state. Many require some form of rationalization before they can be used. For example, two databases containing shoe sizes may have incompatible values, if the data was collected in different countries. But there is a more difficult conundrum: unaccounted-for states.

In an earlier time, a field called “Sex” would have two mutually exclusive values. Then someone noted that the transgendered might not self-identify with either value. In other jurisdictions, where privacy matters are given more weight, and the entity collecting the data has to be able to justify the request, a “None of your damned business” value is required. And as more records are prepared by software actors, we’ve found the need for “We don’t know, exactly.” Consequently, commingling history with new-age records requires more than simple translation – it requires an understanding of the original request that resulted in the recorded response, and the nuances implied in the recorded values. That level of understanding can usually be provided only through transformation at the source, rather than at the destination.
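A minimal sketch of what “transformation at the source” might look like – the code values, field name, and target vocabulary below are all hypothetical, invented for illustration. The point is that the mapping lives with the source system, which is the only place that knows what the original question actually asked:

```python
# Hypothetical legacy codes from one source system. Only the source
# knows that this field came from a two-value form, so only the source
# can say what an absent or unrecognized value really means.
LEGACY_SEX_CODES = {
    "M": "male",
    "F": "female",
}

def transform_at_source(code):
    """Translate a legacy value into a shared vocabulary, surfacing
    states the original schema never modeled."""
    if code is None or code == "":
        # Privacy-weighted jurisdictions: the respondent declined.
        return "declined-to-answer"
    # Records prepared by software actors, or simple bad data:
    # "we don't know, exactly."
    return LEGACY_SEX_CODES.get(code, "unknown")
```

Each source ships already-transformed responses downstream; the warehouse aggregates them without re-deriving (and possibly mangling) each system’s rules.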

Knowledge Merits Design

All that said: if knowledge is an artifact, then it merits design. But just as there is more than one valid, effective design for a soup spoon, so will there be multiple valid designs for the knowledge accumulated during a project. I would suggest that, while normalization focuses on standardizing design, it does not address history, nor does it account for data nuances. I would also point out that current query technology and natural-language processing have made the analysis of history much less dependent on the relational models used to collect and maintain it. To support management reporting across many databases, you need to be able to make queries that are meaningful in the context of the question at hand, without regard to the underlying values in the database or to the primary and foreign keys. Aggregating the transformed responses requires a certain amount of faith in the sources, but just as relational designs make records less subject to error by eliminating redundant values, so does localized transformation reduce errors by eliminating redundant rules.

One day, our brilliant designs will be seen as quaint, or at least insufficient. With luck, we’ll live long enough to be embarrassed by them. Peace be with you.

New PM Articles for the Week of March 9 – 15

New project management articles published on the web during the week of March 9 – 15. We give you a high-level view so you can read what interests you. Recommended:

Must read!

  • Suzanne Lucas interprets recent research by a developmental psychologist, which identified seven critical skills that are necessary for you to become a successful boss.
  • Elizabeth Harrin summarizes the four primary styles used in giving feedback, as detailed in Anna Carroll’s book, “The Feedback Imperative.”
  • Elizabeth Booker gives us a tutorial on procurement administrative lead time. Ever had a project start delayed because a lawyer was reviewing terms and conditions? Yup, that stuff.

PM Best Practices

  • Stephen Brobst says the interesting thing about Big Data isn’t Bigness, but the way structure and demand continuously evolve.
  • Glen Alleman observes that using a Fibonacci series for estimating adds no more certainty to the process than you’d get from using a geometric series.
  • Paul Ritchie explains what is required for an R&D-centered organization to get the most value from their PMO.
  • Ronald Bisaccia reviews the evidence: why women tend to be better at assessing and managing risks than men. Ummm … testosterone rots the brain?
  • Nick Pisano reports on efforts to standardize representations of historical data from past projects, in support of management reporting and better estimates.
  • David Cotgreave points out that some of the project management predictions for 2015 have already materialized.
  • Toby Elwin finds project management lessons in the work of Led Zeppelin. “There’s a sponsor who’s sure all that glitters is gold, and she thinks she’s bought a stairway to Heaven.”
  • Ryan Ogilvie presents an example of how to apply problem management principles to IT service delivery.
  • Cornelius Fichtner interviews Jonathan Herbert, who inspired him to create his podcast, on lessons learned in preparing for the PMP exam. Just 51 minutes, safe for work.

Agile Methods

  • Mike Cohn notes that we need to account for three types of time when planning a Sprint.
  • John Goodpasture gives us a quick excerpt from the upcoming 2nd edition of his classic, “Project Management the Agile Way.”
  • William Nocolich says that indecision is responsible for much of the high failure rate of software development projects.
  • Andrew Lin pulls together some rules of thumb, rubrics, and generalized principles that pertain to Agile and Scrum.
  • Derek Huether takes a personality assessment, and his wife confirms the diagnosis. We’re not as unique as our fingerprints would lead us to believe …

Soft Skills

  • Bruce Harpham gives us a history lesson on George Washington – who knew he was a life-hacker?
  • Kevin Coleman articulates the long-term effects of the loss of intellectual capital and experience, as the Boomers retire.
  • Hendrie Weisinger recommends creating attainable goals and celebrating small wins – call them micro-successes.
  • Mario Trentim looks at conducting a stakeholder analysis from the perspective of the stakeholder.
  • Ron Rosenhead recounts a PM student’s tale of failing to identify a key stakeholder, and the $200 million fine that eventually resulted.