I’m Presenting at the CAMP IT Portfolio and Program Management Conference


I’ve been invited to present at the CAMP IT Portfolio / Program Management Conference, to be held in Las Vegas, June 18 – 19, 2015. I’ll be presenting on our portfolio management approach and lessons learned from my time at MGM Resorts International. When I joined, we had 23 properties, each with its own HR, payroll, and timekeeping solution. My assignment was to consolidate them so we could eventually implement a new ERP.

If you plan to attend, let me know. I’ll load up my Starbucks card and treat whoever shows up and mentions that they read about the event on my blog.


New PM Articles for the Week of March 16 – 22

New project management articles published on the web during the week of March 16 – 22. We give you a high-level view so you can read what interests you. Recommended:

Must read!

  • Mike Griffiths rounds up a number of high-profile projects from the last few decades, and considers whether or not they were successful.
  • Jason Bloomberg gives us an overview of cognitive computing, from the perspective of how it can deliver value.
  • Tony Adams quotes Friedrich Nietzsche on why people don’t want to hear the truth, and explains how to deliver bad news, anyway.

PM Best Practices

  • Elizabeth Harrin interviews Michel Dion on his new book, “The Leadership Toolbox for Project Managers.”
  • Rich Maltzman notes the growing interest in sustainability as a project success metric.
  • Bruce Harpham shares a few thoughts (and some research) on improving quality.
  • Nick Pisano continues our dialog on mining information from mountains of project management data, with a look at the influence of software packages on opinions.
  • Brad Egeland has some thoughts on what you need to be successful as a remote project manager.
  • Jerry Johns takes an applied physics approach to keeping his project in balance.
  • Joel Bancroft-Connors and his invisible gorilla, Hogarth, expound on the virtues of looking in the direction we want to go.
  • Steven Levy uses a picture of a man on a bicycle pursued by bear as a jumping-off point for a rumination on project management as a team sport.
  • Michel Dion provides a tutorial on how to talk with senior management.
  • Henny Portman reviews a new book by Hannan, Müller, and Robinson, “The CIO’s Guide to Breakthrough Project Portfolio Performance.”
  • Ryan Ogilvie gets practical, with customer service survey questions that need fine-tuning.

Agile Methods

  • Glen Alleman analyzes Jim Benson’s five estimating pathologies and suggests some corrective actions.
  • Neil Killick recaps the five estimating pathologies listed by Jim Benson, and adds a sixth – accepting the request without asking any questions.
  • Mahfoud Amiour proposes a new Agile metric: SPOC, or story point cost.

Soft Skills

  • Hendrie Weisinger continues his series of articles based on his new book, “Performing Under Pressure,” with a look at the positive effects of enthusiasm.
  • Michael Smith on hiring coders: “85% of a programmer’s success is due to human factors rather than pure technical skills.”
  • Suzanne Lucas notes that leaders set the pace with their example, and by communicating clear expectations.
  • Dan Furlong explores the elusive notion of “presence,” and shows why it’s important.
  • Sandy Geroux explains the difference between being accountable and taking ownership.

Podcasts and Videos

  • Cornelius Fichtner interviews Doug Hong on his seven free tutorials for managing projects with Microsoft Excel. Just 27 minutes, safe for work, and highly recommended.
  • Jacob Morgan interviews Rich Carpenter on the intersection of the industrial internet, data science, and the future of work. One hour, safe for work.
  • Renee and Craig interview Henrik Kniberg at Scrum Australia, where he delivered the keynote. Just 40 minutes, safe for work.


Of Normalization, Transformation, and Soup Spoons

Over the last few months, Nick Pisano and I have been exchanging observations on the technical challenges of extracting actionable information from the massive volumes of project historical records. Nick’s latest post at his Life, Project Management, and Everything blog continues his ruminations on finding ways to make this mountain of data more accessible to data warehousing approaches, via normalization. Here are my thoughts on the matter:

Normalization, as defined by Edgar F. Codd and popularized by Chris Date, refers to relational databases. It is a design approach that seeks to reduce the occurrence of duplicate data. Relational databases managed using a single application, or a group of applications that respect the referential integrity rules of the system of record, benefit from this rigorous approach – it reduces the risk of errors creeping into the data as records are added or maintained, and makes management reporting queries more efficient.
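To make the duplicate-reduction idea concrete, here is a minimal sketch using an in-memory SQLite database; the table and column names are hypothetical, invented purely for illustration:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Denormalized: the customer's city is repeated on every order row,
# so a typo in one row silently corrupts the data.
cur.execute("CREATE TABLE orders_flat (order_id, customer, city)")

# Normalized: the city is stored once per customer; orders reference
# the customer by key, so the value cannot drift between rows.
cur.execute("CREATE TABLE customers (id PRIMARY KEY, name, city)")
cur.execute(
    "CREATE TABLE orders (order_id PRIMARY KEY, "
    "customer_id REFERENCES customers(id))"
)

cur.execute("INSERT INTO customers VALUES (1, 'Acme', 'Las Vegas')")
cur.executemany("INSERT INTO orders VALUES (?, ?)", [(100, 1), (101, 1)])

# One join answers reporting queries without duplicated city values.
rows = cur.execute(
    "SELECT o.order_id, c.city FROM orders o "
    "JOIN customers c ON o.customer_id = c.id ORDER BY o.order_id"
).fetchall()
```

The design choice is the one Codd described: update a customer’s city in one place, and every reporting query sees the change.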

We Don’t Maintain History

Historical records, on the other hand, are not maintained; they are queried. Careful controls are used to ensure that historical records are not updated, corrected, tweaked, or otherwise dinked with. That said, not all historical records are consumable in their native state. Many require some form of rationalization before they can be used. For example, two databases containing shoe sizes may hold incompatible values if the data was collected in different countries. But there is a more difficult conundrum: unaccounted-for states.
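The shoe-size case can be sketched in a few lines. Everything here is illustrative: the offset is not a real conversion table, and the region codes are invented for the example.

```python
# Illustrative rationalization of shoe sizes recorded in two regions.
US_TO_EU_OFFSET = 33  # rough, hypothetical offset for adult sizes

def to_common_size(value: float, region: str) -> float:
    """Translate a recorded shoe size onto a common (EU-style) scale."""
    if region == "EU":
        return value
    if region == "US":
        return value + US_TO_EU_OFFSET
    # An unrecognized region is an error, not something to guess at.
    raise ValueError(f"No rationalization rule for region {region!r}")

records = [(9.0, "US"), (42.0, "EU")]
common = [to_common_size(size, region) for size, region in records]
# The two records now compare meaningfully on one scale.
```

The point is not the arithmetic but the rule: the translation must be applied before the records are commingled, or every downstream query inherits the incompatibility.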

In an earlier time, a field called “Sex” would have two mutually exclusive values. Then someone noted that transgender people might not self-identify with either value. In other jurisdictions, where privacy matters are given more weight, and the entity collecting the data has to be able to justify the request, a “None of your damned business” value is required. And as more records are prepared by software actors, we’ve found the need for “We don’t know, exactly.” Consequently, commingling history with new-age records requires more than simple translation – it requires an understanding of the original request that resulted in the recorded response, and the nuances implied in the recorded values. That level of understanding can usually be provided only through transformation at the source, rather than at the destination.
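A minimal sketch of what source-side transformation looks like, assuming a hypothetical legacy two-value code set and an invented extended value set:

```python
from typing import Optional

# Hypothetical legacy codes from the original two-value field.
LEGACY_MAP = {"M": "male", "F": "female"}

def transform_at_source(legacy_value: Optional[str]) -> str:
    """Translate a legacy recorded value, making unaccounted-for states
    explicit instead of leaving the destination to guess."""
    if legacy_value is None:
        return "unknown"  # the original request was never answered
    mapped = LEGACY_MAP.get(legacy_value.strip().upper())
    # A value outside the original request's range is flagged, not coerced.
    return mapped if mapped is not None else "unknown"

results = [transform_at_source(v) for v in ("M", "f", None, "X")]
```

Because the mapping lives with the source, the people who understood the original question own the one copy of the translation rule; the destination only aggregates.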

Knowledge Merits Design

All that said: if knowledge is an artifact, then it merits design. But just as there is more than one valid, effective design for a soup spoon, so will there be multiple valid designs for the knowledge accumulated during a project. I would suggest that, while normalization focuses on standardizing design, it does not address history, nor does it account for data nuances. I would also point out that current query technology and natural-language processing have made the analysis of history much less dependent on the relational models used to collect and maintain it. In order to support management reporting across many databases, you need to be able to make queries that are meaningful in the context of the question at hand, without regard to the underlying values in the database, or the primary and foreign keys. Aggregating the transformed responses requires a certain amount of faith in the sources, but just as relational designs make records less subject to error by eliminating redundant values, so does localized transformation reduce errors by eliminating redundant rules.

One day, our brilliant designs will be seen as quaint, or at least insufficient. With luck, we’ll live long enough to be embarrassed by them. Peace be with you.