My first example is related to a business plan for one of the largest Italian companies, in which the McKinsey team was involved in many different parallel work streams for the client. Realizing how important it was for all of the teams to be working on high-impact areas, I was constantly involved with the coordination and prioritization efforts among the teams.
In fact, we developed an overarching model that described the key questions, hypotheses, and potential recommendations in each of the areas while synthesizing the impact and coordinating knowledge creation across the teams. One director exerted considerable effort leading the engagement managers, helping them stay focused and keep the teamwork organized. This approach of central alignment, framing, and organizing was extremely helpful, even though it sounds simple.
My second story was not as positive. The success of any engagement is determined by how clearly the proposal is defined at the outset of the project. The agreement with the client should be very clear in terms of the following: the result (what the engagement will achieve), the involvement of the client, and how to align both the team and the client. Based on my experience, this agreement is well defined in about 99 percent of engagements, but occasionally one slips through the cracks.
In a merger project I worked on, two separate partners were not on the same page during the organization effort. The two McKinsey teams went down different paths, did not coordinate, and at times seemed to be producing contradictory results. Much of the problem stemmed from two facts: one team did the work plans for both teams, and the players on the project teams never really aligned their goals. In the end, I believe it turned out to be one of those rare cases where our efforts did not translate into significant added value for the client.
STORY FROM THE FIELD—BUSINESS SCHOOL EXAMPLE—1
Topic: Division of responsibilities (to avoid redundant effort) and frequent communication lead to an efficient and thorough engagement. Ben Kennedy of the Fuqua School of Business at Duke University describes some takeaways after working as a summer associate at a top strategy-consulting firm.
I learned quite a bit during my internship. My project involved creating a short-term growth strategy for a private equity firm's client that was hoping to go public. One complication was that the company served a number of different industries, each with a different growth trajectory.
The project was organized around three different work streams, each with a different manager. I worked on several aspects of the project and found it quite motivating, as I was able to see how the pieces fit together. I generated insights as I analyzed the data, and I found that the insights became more and more developed as others in the different streams looked at the results. I learned the power of checking in with the team on a regular basis, as well as just how helpful a nice conference table and whiteboard can be.
When it came to organizing our data collection, we were very careful about splitting up the collection/generation effort—it just doesn't make sense to have everyone looking at the same raw data, articles, databases, spreadsheets, and other such material, because it is so inefficient.
One other takeaway for me was related to gathering information from the client. You want to get as much firsthand information as possible, but you also have to separate fact from opinion. You have to seek out the true experts in the firm and realize that some folks have particular agendas that explain why they tell you things (e.g., concern about losing one's job). I found it helpful to build in process steps for verifying what I heard, using data and interviews with people at different levels within the company.
STORY FROM THE FIELD—BUSINESS SCHOOL EXAMPLE—2
Topic: Definition of team goals and awareness of others' progress help a busy team win a case competition. Our last Story from the Field comes from Juan Pulido, an MBA student at the Darden School of Business at the University of Virginia. Juan describes three critical success factors that helped his team win a major business school case competition.
The first step was to get everyone on the same page. Our team had a high level of focus, and we were all very enthusiastic. We had the same goals and guiding principles: "learn, have fun, and aim for first prize." In terms of organization, the first step was to have all five group members brainstorm about how to approach the project and what they thought the end result should include. We then narrowed the list and drew a tentative storyboard to help us Frame our research and define its scope. From this storyboard, we were able to incorporate hypothesis-driven analysis; by thinking backward, we were able to discern what intermediate steps were necessary to achieve our desired end result.
Our division of work and assignment of roles were democratic and informal, as each person volunteered to own a particular work stream based on past experience or interest. We then made a Gantt chart that included due dates and intermediate goals (though this sounds professional, it was a relatively informal process). This helped to keep everybody on the same page and aware of what the other team members were doing.
The case was four to five days long, and because it took place in the middle of interview season, everybody was extremely busy, and the situation was hectic. There were definitely some conflicts during the project, but through daily, open communication, we were able to avoid major misunderstandings and confrontations. Frequent communication was necessary to keep everybody on the same page and on task, as our schedules didn't always allow for lengthy team meetings.
There was no clear leader in this group, which was randomly chosen (we all sat next to each other in class). But by working carefully together and organizing strategically, we were able to achieve our goal and win first prize!
CASE STUDY
Not a whole lot of issues here. Dr. Friga and Chris Cannon played an active role in organizing our efforts. We also benefited from all of our hard framing work as described in the previous chapter.
WHAT WE DID
We had spent a great deal of time and energy framing the problem and defining and assigning MECE buckets, with the result that the project's organization really fell into place quite easily. In the initial stages of the project, we focused on creating a high-level process map that served to guide us throughout our research. We revised this map as our research closed off some avenues and opened up some others. This helped us to focus on important areas and not to boil the ocean.
As the project developed, we shifted our focus increasingly to the story, with our key question serving as the starting point. We knew that we could have all the data in the world and very convincing arguments, but if we didn't have a coherent story that was easily followed, our presentation would be ineffective. In order to have the most powerful presentation, Shalini and Rachita spent a long time putting all our slides together and organizing them into a logical sequence. Afterward, Dr. Friga and Chris Cannon edited the deck, rearranging and tweaking the slides so that our story would build and finish with force.
WHAT I LEARNED
At the end of the day, the story is what matters. A lot of work needs to be done behind the scenes in order to craft the story well, but ultimately the most important thing is to come up with a story that the audience can follow and that produces the desired impact. You can have great ideas and great research to back them up, but if you don't present the story well, it falls flat. We found that thinking about the story earlier in the process, especially during the Organize phase, made our work both more effective and more efficient. Many times we were tempted to gather data or do some analysis on areas or topics that ultimately were not that important; whether such work is worthwhile becomes clear once you consider where it would fit in the ultimate story, and especially in the final presentation.
DELIVERABLES
Figure 6-2 Organize: Process Map
Figure 6-3 Organize: Content Map
Figure 6-4 Organize: Story Line
7
COLLECT
Figure 7-1 TEAM FOCUS Model—Collect
CONCEPT
This is the shortest chapter in the book, and also the most straightforward. Though collecting data is mundane, it is still an important element of the team problem-solving process. Why are data important? First and foremost, data are the tools that are used to prove or disprove the hypotheses developed during the analytical process. They enable the problem solvers to arrive at conclusions that are (hopefully) correct, and therefore effective. Finally, the data become the basis for reports and presentations when the final recommendations are presented (this will be covered much more thoroughly in Chapter 9).
So what is the key challenge in the data-collection process? The most common problem may be too much information. Given the outstanding search tools, electronic databases, and codified knowledge that are available, teams generally collect more information than they use; as a result, they are inefficient in terms of finding the important information for the issues at hand.
RULES OF ENGAGEMENT
The Rules of Engagement given here all focus on increasing efficiency and effectiveness during the data-gathering process, with a primary goal of weeding out excess information.
RULE 1: DESIGN "GHOST CHARTS" TO EXHIBIT NECESSARY DATA
This Rule of Engagement may have caught your attention because it includes what may be a new concept for your teams. When I teach the TEAM FOCUS model, this is one of the most challenging tools to introduce to teams, especially if they are not accustomed to working with draft deliverables.
What is a ghost chart (some refer to this as a "ghost slide," although the terms are used interchangeably herein)? It is basically a draft slide used to capture ideas at an early stage in the problem-solving process. It comprises a title (usually at the top of the slide), a data label, and the data (or a sketch of what the data will be once collected). The most important part of the ghost chart, and the only part that should reflect any substantial amount of "analytical energy," is the title of the slide. The title is written in sentence format, and it states the insight—the "so what"—of the slide. An example of a title would be, "The revenue from widget sales is on a steady decline." The data label, by contrast, is a specific identification of the anticipated data that will be shown on the slide. At this stage in the process, remember, we are not exactly sure what the data will be, so the label is an educated guess as to what we expect to find to test our hypotheses. An example of a graph label would be "Widget Revenue, 2003–2008."
I find that many times, teams will create the label and wait until the data are gathered to take a stab at the title. It is important to be thinking in terms of insights related to the story all the way through the process, not just at the end! Recognize that the slide title and even the data label will probably change over the course of the project, and that there is nothing wrong with that—in fact, it is expected!
The final part of a ghost chart is a sketch of the data. Do not take the time to format a sophisticated chart with representative data (in other words, the data you expect to gather). All that is necessary is a very rough sketch, by hand at first, and then with a template chart that represents how the data may be displayed. Basic chart formats taught at McKinsey (and in most consulting firms) are listed here and are also becoming standard fare in Microsoft PowerPoint. Note that we will cover this much more thoroughly in Chapter 9.
Bar charts (vertical or horizontal)
Pie charts (components)
Waterfall charts (composition, building to a total)
Era charts (from–to)
Flowcharts (steps)
Gantt charts (activities and timeline)
The biggest problem consultants have with creating ghost charts is their reluctance to document their ideas without thorough data collection and analysis. This was a struggle for me early in my career. However, it is important to realize that the problem-solving process is iterative and evolves over time. The formal creation of draft deliverables is a great way to make the process more efficient. This has been especially apparent to me when working with MBA students in team projects or case competitions. Rather than "wasting" time creating ghost charts, the teams just gather data and analyze them, and they continue to gather and analyze until the very end of the process, when they finalize the story and create charts. One problem with this approach is that the data are not as convincing as they might be when displayed in chart format; additionally, at the end of the project, these teams often find that they are missing key data related to their story. These problems can be eliminated by crafting and reviewing ghost charts throughout the process. Around the halls of McKinsey, there used to be an expression, "Create a chart a day," which highlighted how important it is to document your observations and insights in the form of a powerful chart that you share with your team. In reality, though, you will probably need to create many charts per day.
RULE 2: CONDUCT MEANINGFUL INTERVIEWS
Interviews are a critical part of the data-collection process. In most consulting projects, interviews have more impact on the problem-solving effort than the secondary data. Why? First and foremost, the interviewees can provide direct and interactive feedback about the hypotheses you are testing. Many times, especially if you are interviewing client personnel or subject-matter experts, your interviewee can provide original thoughts related to past experiences, issues, and potential outcomes. One of my main research interests is knowledge management, and I have found that the vast majority of knowledge is stored in people's heads, not codified in documents (despite the huge investments companies continue to make in codifying such knowledge). Interviewees can also save you time in your search for secondary data, as they are often able to direct you to the most valuable codified knowledge in their field. This is particularly valuable when your team has limited familiarity with the topic under study. So, make sure that strategic interviews are a key part of your data-collection effort.
When it comes to the interview itself, one of the most common problems is that the interview is poorly conducted, and as a result is ineffective. The following three tips can help ensure that your team conducts meaningful interviews:
Before the interview. The quality of the interview is likely to be determined well before the interview even takes place. The two key steps are (1) identify the right people to interview (who has unique knowledge about this topic, who can respond to the hypotheses, and who will be involved with implementation or subsequent efforts?) and (2) develop and share an interview guide (what are the three key topics to cover?).
During the interview. A common mistake in interviews is to get carried away with trying to gather as much information as possible. A better strategy is to spend the time very carefully—for example, on insights and reactions to a hypothesis—and to build a positive relationship that would allow for comfortable follow-up conversations.
After the interview. While a "thank-you" e-mail, letter, or card is certainly a good idea, the real recommendation I have for you is to document, document, and document. Within 24 hours, the consultant must document the key takeaways from the interview, including quotes and references to additional material. Also, share your interview notes with the other members of your team to keep them in the loop on your research. McKinsey utilizes a template form and provides training in interview notes documentation.
RULE 3: GATHER RELEVANT SECONDARY DATA
Gathering the most relevant and powerful data is the backbone of good consulting and an opportunity for junior consultants and business school students to shine. Remember, since our goal is to be as efficient and effective as possible, we have to address ways to minimize the gathering of data that are not important to our story.
The starting point in strategic data gathering is to keep the context of the key question and the hypotheses in mind as you gather the data. If you start gathering a ton of data before you have really fleshed out your issue tree and internalized the key question, you will ultimately gather information that is perhaps only tangentially related to the core analysis. Have you ever worked on a project where the team gathered extra data and created charts that were not used in the final deck or even the supporting appendix? If you continue to ask the relevant questions as you gather data, you will be more efficient. This topic will be covered more thoroughly in the next chapter, but teams must question the potential impact of the data during the collection process as well.
Of course, the foundation of good data collection is familiarity with electronic resource tools, especially for consultants working in smaller firms or business school students. The largest consulting firms of the world, such as McKinsey, employ research specialists who assist with and often run the data-gathering process. For the rest of us, electronic databases and search tools can be our best friends or our worst enemies. The only way to learn how to make them your best friends is to get to know them and spend a lot of time with them. Most of the top business schools have access to the best data-gathering databases in the world—and you will be astounded at the amount of information that is available to you with only a few keystrokes.
When I work with students (and executives), I start by having them do an inventory of the tools available to them in their school or company. For example, some of my favorite research databases for typical business problems include the following (note that these sources are always in flux):