The role of the XP Customer in a product company: Toward a Customer Team

Kay Johansen

In this paper I describe my experiences implementing XP at a small company, focusing on the role of our XP customer (our product manager). We were fortunate to obtain a product manager well suited to the role. I discuss how this arrangement worked well for us, and offer ideas on how to extend the role to support XP more fully, to support XP for a wider class of projects, and particularly to support XP in a larger company.


In 1999 and 2000 I worked at a small company that produced a client/server shrink-wrap information system for small businesses. Because of the small size of the company and its limited market position, the product's feature set was largely dictated by individual customers. The process of accepting feature requests from customers was not well controlled, and work had become badly backlogged, with a turnaround time measured in years instead of weeks. The backlog hurt developer morale as well as customers': the developers felt increasing pressure to produce work they did not consider quality work, and felt that their estimates were largely ignored.

A new VP of development who championed development's right to make and keep their own estimates relieved a lot of this pressure. We began to pay more attention to customer requests as a whole instead of the wishes of the individual customer who yelled the loudest. The company decided to spend one release on a complete rewrite of the system's user interface, incorporating as many customer change requests as was reasonable but deferring all other requests for future releases.

Selecting the right product manager

At this point we selected two people to define our features and user interface: a product manager and a UI designer. The product manager had been hired by our company from one of our main customers.  He had spent over a year in the field doing on-site product training for our customers. Therefore, he had an excellent perspective for the role of XP Customer. He was in tune with the reactions of all of our significant customers and understood their business processes as well as their front-line employees' data entry styles.

The product manager candidate was very willing to take on the role and responsibility of coordinating this product rewrite. He had a background in Organizational Behavior and proved an excellent meeting facilitator and cross-functional team lead. He did not have product management experience, but was willing to devote all his energy to developing this release, including spending significant time with the developers and the UI designer.

Development, XP style

To get a quick estimate of the entire schedule, we itemized all of the two-hundred-odd user interface screens in the product and assigned a complexity to each one. We created a bare-bones workflow app using VB and SQL Server. The entire effort of itemizing the screens and their complexities, entering this information into the app, and building the app itself took one person one day; over the course of the project, this turned out to be great value for the time spent.
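A minimal sketch of the kind of inventory the app maintained, in Python rather than the VB/SQL Server we actually used; the screen names, complexities, and statuses here are invented for illustration:

```python
# Hypothetical sketch of the screen-inventory tracker described above.
# The real app was a VB/SQL Server workflow tool; this data is invented.

screens = [
    # (screen name, complexity in "difficulty units", upgraded yet?)
    ("Customer List", 2, True),
    ("Invoice Entry", 5, False),
    ("Inventory Adjustment", 3, False),
]

total = sum(units for name, units, done in screens)
remaining = sum(units for name, units, done in screens if not done)
print(f"Total: {total} units, remaining: {remaining} units")
```

Even a table this simple gives a rough whole-project estimate on day one and, as screens are marked done, a running measure of how much work remains.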

The product manager used this app to remind himself what screens needed upgrading. There were other tasks besides straight screen upgrades that he prioritized along with the UI changes. He attacked the work in order of priority. This was very important, as it enabled us to complete the highest priority screens first, and more usefully show the product to Marketing and other interested parties inside the company, as well as provide a valuable trade show demo early on to spur customer interest.

We developed on two-week iterations. At the beginning of each iteration, the product manager brought to the Planning Game his highest priority screen upgrades and other tasks, on 3x5 cards. Each card had an initial estimate based on our up-front complexity review of the screens. The team usually had provided rough estimates for non-screen-upgrade tasks in previous "special sessions" separate from the iteration planning meetings.

During the first part of the meeting, we quickly reviewed what had been accomplished in the last iteration. Developers returned all cards to the product manager and we checked them off as completed or not completed. We added up the "difficulty units" (our ideal-day estimates) of the completed tasks to get our velocity for the previous iteration. The product manager knew that the estimates for the tasks selected for this iteration could not exceed the velocity we measured for our previous iteration. In addition, he had to consider how to handle any non-completed tasks. Each developer also reported how many units he had completed and then, with the input of the team and the team lead, determined how many units he would be allowed to sign up for in this iteration.
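The velocity rule above is simple arithmetic, and can be sketched as follows (the card data is invented for illustration):

```python
# Hypothetical sketch of the velocity budgeting described above.
# Each card carries an estimate in "difficulty units" (ideal days).

last_iteration = [
    # (task, estimate, completed?)
    ("Upgrade order entry screen", 3, True),
    ("Upgrade report viewer", 2, True),
    ("Rework printing", 4, False),  # not completed: must be reconsidered
]

# Velocity counts only the estimates of *completed* cards.
velocity = sum(est for task, est, done in last_iteration if done)

# The product manager may not select more work than last
# iteration's velocity allows.
candidates = [
    ("Upgrade invoice screen", 3),
    ("Upgrade payroll screen", 2),
    ("Upgrade inventory screen", 2),
]
selected, budget = [], velocity
for task, est in candidates:
    if est <= budget:  # defer any card that would bust the budget
        selected.append(task)
        budget -= est
```

Note that the incomplete "Rework printing" card contributes nothing to velocity; its units count only in the iteration in which it actually finishes.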

At this point the product manager had a hand of cards to be implemented in this iteration. As he laid each card out, one at a time, we discussed as a team the requirements for that card to be implemented. We revised the initial estimates as necessary, though interestingly this occurred rather rarely. If the estimates began to exceed the "budget" of last iteration's velocity, the product manager chose cards to defer.

The last phase of the meeting was sign-up time. As team lead, I called "ready, set, GO" and the developers pounced on their favorite cards, which they had been eyeing over the course of the meeting. This activity was reminiscent of rugby or American football. After a few iterations, prompted by fear of bodily harm to the developers, I instituted a round-robin approach where each developer chose one card, proceeding around the circle and repeating until all cards were gone. (This slightly reduced the physical negotiating volume, but the verbal negotiating volume increased to compensate.) Developers were forbidden from signing up for a card they had held in the previous iteration but not completed; enforcement sometimes involved prying the card from a developer's grasping fingers. At sign-up time, developers were given the option of once again revising the estimate; as far as I recall, this option was never exercised.
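The round-robin sign-up amounts to a simple rotation, sketched below; the developer names and card labels are invented, and in reality each developer chose their favorite remaining card rather than the next one in line:

```python
# Hypothetical sketch of the round-robin card sign-up described above.
from collections import defaultdict

developers = ["Ann", "Bob", "Cal"]
cards = ["card-1", "card-2", "card-3", "card-4", "card-5"]

signups = defaultdict(list)
turn = 0
while cards:
    dev = developers[turn % len(developers)]
    # Simplification: take the next card in line; the real team
    # picked (and loudly negotiated over) whichever card they wanted.
    signups[dev].append(cards.pop(0))
    turn += 1
```

The rotation guarantees that no developer can grab a disproportionate share of the cards before others have had a turn.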

Iteration planning meetings typically lasted from one and a half to two hours. Developer morale increased significantly as a result of being able to sign up for their own tasks. The development team also got to watch the customer deliberating, prioritizing, sometimes agonizing, over features that had to be split, simplified, or dropped altogether. This provided the evidence the team needed to realize that they were no longer going to be treated as if they were a machine that could be turned up another notch, but rather as a group of humans with a fixed capacity for productive, professional work. One developer who had experienced much of the earlier ad hoc, "real customer" driven work at the company later expressed his appreciation for "keeping the floodgates closed."

A clear definition of the work, far clearer than could have been accomplished through written specifications, was appreciated by all. Two developers who went on to work on another project later said they wished they had something like "cards" on the new project, so they would know what to work on, when they were starting, and when they were done.

Toward a better XP customer

The product manager in the role of Customer was a great success for us. Through his willingness to prioritize and make hard decisions about which scope to reduce, and our ability to predict how much work we could get done, we were able to meet our projected code-complete date exactly, without working any overtime. The senior management team knew the state of the project at all times, and exercised its option to include additional features by adding to the schedule the clearly defined amount called for by our estimates.

In more general circumstances, however, I would consider building a team to play the role of XP customer. We did not implement automated acceptance testing; this lack forced us out of XP mode into the traditional "death march" after code complete and before shipping, and caused unpredictability in our schedule. Even without writing acceptance tests, our product manager was probably stretched to the limit of his time capacity. Additional resources would have been needed to develop automated acceptance tests.

Although we did add and upgrade features, our project was primarily a user interface rewrite, so its functionality may have been more clearly defined than in the general case, especially more so than if it had been the first release of a product. Defining the initial release of a product would likely take much more of the product manager's time for research, especially if he did not already have a strong relationship with a small and clearly defined customer base, as our product manager had.

We were in a small company, so although we did perform a full product launch with involvement from all the usual groups -- documentation, QA, support, sales, marketing, and (key for a business information system application) implementation and training -- the effort involved in coordinating the product launch was less than it would be at a large company, especially if the large company were distributed across multiple sites. In my new job, I am in close communication with product managers at a 3,000-employee, worldwide company, and can appreciate the magnitude of the workload involved in coordinating and launching a product in that environment. I do not think it is possible for a product manager in such a setting to spend the necessary time collaborating with an XP team.

My proposal for a general XP customer role, based on the experiences of this project and on many discussions with an experienced and successful traditional product manager, is twofold: First, I propose forming a product management team, including at least two people performing the following roles: marketing, project management, and technical product management. The technical product manager would be the primary XP customer. Second, I propose forming a requirements team, combining the QA team and the product management team, to be responsible for capturing, defining and automating the product requirements in the form of acceptance tests.