Corporate strategy is not like business unit strategy, where the corporate objectives of the parent guide the business unit or division.
Understanding business unit strategy is much easier if you think of a lighthouse analogy.
The direction and objectives of the parent company are the lighthouse, and the purpose of a business unit strategy engagement is to watch the lighthouse and build a plan to get there. The lighthouse may, for example, insist on no less than 30% market share and an 18% return on all investments.
In business unit strategy, the consultants need to build a path to these objectives.
In corporate strategy, there is no lighthouse. The job of the corporate strategy engagement team is to develop that lighthouse. What should be the purpose, direction and objective of the parent? The canvas of options is blank, and it is scary. The longer the team takes, the longer the life rafts, a.k.a. the operating divisions, flounder around without direction.
This is why corporate strategy is so complex. There are so few rules and boundaries that an inexperienced consultant can end up causing a lot of damage. A consultant who is too conservative will stick to the core business, while one who is too adventurous could recommend a leap in a direction that is simply too far to make.
The key is, of course, to follow the data. But it is naïve to think that is enough. It is also about judgment and an ability to extract a different understanding from the data.
An outstanding corporate strategy partner will need to think about where the market is going, what competitors would do and whether the client has the resources to execute the recommendations.
I was a corporate strategy partner and I have actually done very little business unit strategy work. The jury is, of course, still out on my abilities as a corporate strategy partner because we need to wait about 10 to 15 years to see how my clients will do.
Why so long? I worked in slow sectors like resources, utilities and state-owned-enterprises. They do not work as fast as Apple, and even if the advice I provided shows some results in the short-term, we need to see if it is sustainable in the long-term.
To imagine a state-owned-enterprise doing a victory lap, close your eyes and picture a snail jogging through molasses.
I could have written any piece about corporate strategy, but I want to focus on one area.
It will be very easy for me to talk about the data analyses and fairly creative techniques we used in this study. I am going to do that a little, because it is important to understand them for context. However, I am going to focus this piece on the method we used to help the executive board of a power utility in Asia understand their strategy choices.
In all my time in management consulting, this is the most elegant piece of work I have ever had the privilege to lead and help a client understand. When I die, I will think back to this study and consider it a crowning moment in my career.
I really consider all the mentoring I received as an associate, consultant, project leader and so on to have been building up to this moment.
“Utility” is the name given to a business that typically does not see radical shifts in its business model over short periods of time. Given the steep barriers to entry, and the massive costs to change, once the utility is operational it will generate a predictable flow of cash.
Think of your local water provider. That business has probably not changed much in the last 20 years. It is usually profitable but has very low margins.
Although this is true on average, there are pockets of rapid change. Deregulated markets where private companies can operate, customers can pick service providers and prices are set in the market tend to see lots of competition.
Parts of the US and Canada have experimented with allowing renewable power which allows consumers like you and me to put a solar panel on our roof and sell the power to the local utility. So there is movement in some parts of the world.
But most markets are dominated by one or two giants, and in most places just one giant. Since it takes so many years to build a power station or the lines connecting power stations to cities, the opportunity for change is limited.
If you have one dominant player, regulated or controlled by the government, then you can imagine how many corporate strategy studies will be done. Very few, actually, since the dominant utility in a market does not need a corporate strategy every year. Maybe a review every two to three years, but never a full study.
As a firm, we had done a lot of work on power utility corporate strategy out of the German offices. The German offices have always been the flagships for corporate strategy work.
Particularly in the period from 1992 to 2000, the collapse of the USSR saw many Eastern European countries require help restructuring and privatizing their power assets. That is why the German offices had so much experience in this field.
Outside of Germany, we were not doing as well. McKinsey in particular was doing a lot of good work but there were not so many opportunities beyond the emerging markets.
We saw a shift begin around 2000. Massive urbanization around the world, consumerism and industrialization led to the start of huge power plant and transmission line construction projects around the world.
A good strategy partner needs to identify a trend that will impact a client and help a client understand that trend.
One of the things we noticed is that all corporate strategy work on power plant construction, including that of our German offices and our competitors, built its recommendations around least-cost analyses. I do not want to get into the details, but it basically meant working out which power plants cost the least to build and maintain over their lifetime.
Power plant construction analyses had been done that way for years. It was the gold standard.
When something is conventional wisdom it is ripe for disruption.
At a utilities conference in London, I witnessed an academic by the name of Shimon Awerbuch present a very crude and theoretical way to advise power utilities on their construction projects.
When I saw what he presented, it was a little like seeing the answer to the meaning of life. While this poor man was being obliterated by the energy planning executives in the audience, I could see the brilliant value of what he was presenting.
I think I saw more in his work than he saw in his own work.
We spent a lot of time talking and he shared much of his work with us. It was very crude and there was much more to be done for this to work. He had not even discovered the balance sheet constraints or how to fix them. Yet, the principle was brilliant.
Basically, it takes a well-established principle in financial economics and applies it to corporate strategy for power plants. Harry Markowitz and William Sharpe won the Nobel Prize in Economics for their work on mean-variance analysis and its extensions.
Basically, imagine a graph with stock returns on the y-axis and risk (the volatility of the stocks) on the x-axis. Markowitz showed that if you had a portfolio of 100 stocks and changed the mix of stocks 10,000 times, you would get 10,000 different stock portfolios.
Each portfolio would generate a different risk and different return. If you plotted the risk and returns for each of the 10,000 portfolios you would get this beautiful efficient frontier graph. On a single page, you could visually see the impact of adding more tech shares to your portfolio.
The mechanics and math of this are not important.
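Still, for the curious, the whole idea fits in a few lines of code. Here is a minimal sketch of how 10,000 randomly weighted portfolios trace out a frontier; all the returns and volatilities below are invented for illustration, and correlations are ignored to keep it short.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented annual returns and volatilities for four assets.
mean_ret = np.array([0.06, 0.08, 0.10, 0.14])
vol = np.array([0.05, 0.10, 0.15, 0.30])

portfolios = []
for _ in range(10_000):
    w = rng.random(4)
    w /= w.sum()                      # weights sum to 1
    ret = w @ mean_ret                # portfolio return
    risk = np.sqrt(w**2 @ vol**2)     # volatility, assuming no correlation
    portfolios.append((risk, ret))

# Plot risk (x-axis) against return (y-axis) for all 10,000 points and the
# upper-left edge of the cloud is the efficient frontier: shifting weight
# toward the high-return asset moves a portfolio up and to the right.
```

On one page, you can see what adding more of any asset does to the whole mix, which is the visual punch the rest of this piece relies on.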
Basically, Awerbuch was saying you could do the same thing with a construction project. If you wanted to generate 100 gigawatts of power, you could have 10 nuclear stations do it, or 1 nuclear station and 9 coal stations, or 2 nuclear stations, 3 wind plants and 4 coal stations.
Each of these combinations produces the same output but carries a different return and risk profile.
When I looked at that, I realized something important. A consulting partner does not want the executive board in a strategy session to be inundated with data points. Basically, I do not want the board to get involved in vetting painfully specific data. That is not their job.
I would want the board to see the big picture and make decisions on the big picture. I have led several corporate strategy sessions with utilities and the board always struggles to understand the trade-off between different data points using different techniques and from different sources.
For example, the board will typically receive a pack of about 60 slides in a presentation, and not all are from the same consulting firm.
Some slides will be from the internal planning department showing the costs of building each plant. Another team will have a slide showing the risks to the transmission system. Another set of slides will show cash-flow projections. Yet, another may come from bankers showing the impact on the bond ratings if more debt was used. You get the point.
I have always seen board members struggle to understand what would happen if they took on more debt. How would it impact all the other issues? A meaningful discussion could not take place because those answers could not be provided in the meeting.
What happens is the board will speculate and ask for further analyses, which merely paralyzes discussions. The trick is to give the board enough to make a decision, but not so much that the data is confusing or stalls decision-making.
All you end up seeing is a bunch of frustrated executives trying to make some decisions blindly.
What I saw in this new approach from this professor was an elegant way to get the board to make decisions. Rather than giving them 60 slides, we could give them 1 slide. Each of the options available to the company plotted on one simple graph, and you could intuitively see which option created more risk or return.
Granted, he had not figured out how to do it beyond very crude calculations, but the principle was sound.
For that matter neither had we.
Whenever you are introducing a new way of thinking to a client there is a level of education involved. The concept was and is unusual. It would take time for clients to understand its value.
Therefore, we decided to start discussing the idea with an Asian utility planning a large construction program. We had been retained to evaluate the different nuclear design options that the client could use. It was a basic feasibility study.
As that study was coming to an end, we began talking to the head of the planning committee to determine how his team could analyze the trade-offs and how he would get the board to analyze the trade-offs.
Conducting the kind of analyses required to produce this simple chart was extremely difficult. I had not seen it done anywhere else. Given its complexity, the head of the committee was intrigued but not really qualified to judge whether it had the rigor of their current planning process.
Therefore, before he would commit to anything, he wanted us to meet the head of the cost-planning department to see if our approach would pass the rigorous standards of that department. Basically, did the new approach take into consideration all the factors currently taken into consideration? If not, why?
Proud engineers ran the cost-planning department. These were not guys simply waiting out the clock for a promotion to a management role. Some of them had used the same slide-rule desk for their entire career. If you entered the department, all you saw was a sea of the slide-rule desks used by architects.
They check everything, in painful detail and with painfully long explanations.
That meeting morphed into 8 meetings over 4 months. They would raise questions and concerns, and we would need to go back to the office and think through the approach. Clearly, this was taking up a lot of my team's time, and my engagement team of consultants was pretty busy on the main study.
So, we proposed the idea of doing a higher-level version of the study, just to test the idea. If that worked, we could consider the more detailed study. In many ways, this was a paid proposal to the client that allowed our team to begin working on the corporate strategy before the client issued a request for proposals.
As a state-entity, they had to tender everything.
I do not want to turn this piece into a discussion on the intricacies of the power sector because the lessons are useful to readers in all sectors.
Suffice it to say, we did an outstanding job thinking through the approach we would use for the full study. We worked with the cost-planning team for 5 weeks to pin down the issues they wanted addressed in our analyses. We had still not figured out how to actually do the work, but the cost-planning engineers believed we knew enough to proceed to the full study.
All this time, we did not care much about the fees or even the smaller high-level study we were conducting. It was nice to have it but we had two other more important objectives.
First, to ensure the cost-planning engineers signed off on our approach. That was crucial for credibility.
Second, we were following a Trojan-horse strategy to plant ourselves in the department, which would be writing the request for proposals.
As we were showing the cost-planning engineers this simple and elegant way to get the board to understand the construction options, the engineers became excited. They believed that for the very first time, the board could receive a document that crisply articulated the trade-offs.
In a normal board meeting, the engineering opinion would be in one document while the business issues would be in another. The board typically spent most of its time on the business issues. With everything on one page, the board could no longer ignore the technical impact.
It frustrated the engineers, for example, that they were never allowed in board meetings. Yet crucial decisions were made which they had to implement and which the board had failed to analyze fully.
In one case that became the most cited tale of failure, the board had voted to reduce costs by changing the engineering standards applied. The board had failed to realize that the new standards did not apply to all the power plants and one plant had to be scrapped when it could not be run safely based on those standards.
The cost-planning engineers felt that although our approach did not discuss any engineering issues, the impact of those decisions could be plainly seen, and that excited them.
Seeing this, the planning engineers basically wrote out the requirements for the analyses portion of the corporate planning request for proposals, asking for something no other consulting firm had heard of.
So imagine you are Accenture or Booz receiving this document and you see very detailed requirements on a technique to measure and integrate all the risk and return probabilities into one simple metric. What do you do?
You assume, as they did, that the client was just listing everything it could think of and was not really sure what it wanted. So Accenture and Booz presented their usual approaches and did not meet the requirements.
Therefore, while we never influenced or wrote the request, our presence led to the authors of the request having very specific ideas of what the study should look like.
Requests for proposals are typically written in this manner. They are a list of items that must be completed, and it is assumed that if the listed work is done, the strategy can be generated. It is not uncommon to see things like "scenario planning" and "economic model of the options" listed in these requests.
This is not what strategy is about, especially corporate strategy.
If the request is written this way, it is conceivable that a very weak consulting firm can secure the work if its proposal perfectly matches the listed requirements. These proposals typically work on a scoring system, and the highest score can win.
We faced a bigger problem. Through our extensive efforts to educate the client about our new approach to present strategy options, we were implicitly telling the client our old approach was slightly inferior.
In addition, the client knew we had never tried the new approach at a single client, beyond the mini-study we had done for them.
Our problem was that to show the value of the new approach, we had to show the flaws of our old approach. So we had basically destroyed any chance of getting a meaningful portion of the 25% of scoring points allocated to prior work.
Therefore, the committee reviewing our proposal could simply conclude our past experience was not relevant. It is possible the committee would be mature about this and not do so. However, we did not know for sure.
In a state-entity request for proposals, a major requirement is to dedicate about 15% to 25% of the scoring to previous work of a similar nature. The weighting is so high because the request for proposals is written as a scandal hedge.
Should something go wrong in the study or the strategy fail, the entity can say it hired someone who had done this before, and ask how it could have done anything more to vet the consultants.
Therefore, part of our effort was to make the team writing the request understand that prior experience should not count so heavily, and that a greater proportion of the scoring should be allocated to how the study would be uniquely tailored to this client.
That was a fair way to do it, because a management consulting firm should not rely solely on its past work. It should demonstrate a unique approach for every client.
Therefore, this did not give us an advantage, but eliminated an unfair disadvantage we had at this point in time.
We won the right to advise the client. I believe we won it because we had a superior way to allow the board to make decisions.
When you are building any financial model, the bigger the model, the greater the trade-offs in accuracy. Our approach was certainly less accurate in each individual piece, but it provided a more useful output overall.
Other firms were going to partner with specialists and amalgamate all the different data. However, it still meant the board would need to pull out their reading glasses, squint and try to understand what was being done.
Make no mistake, this was a tough modeling study where we would have nothing at the end of the day unless the model worked. However, I did not want the client to see that side of the study. We did not want them to think of this as a modeling assignment, so we kept that part out of sight while we were finishing the model.
Therefore, for the first four to five weeks of the study we were in the awkward position of presenting theoretical updates to the committee.
Rather than showing them actual results in those updates, we had to show them dummy numbers and use that moment as an opportunity to teach them how to have the main discussion when the real output started arriving.
That was really tough to plan and manage. We had to resort to discussing case studies of competitors to help the board think about how to analyze the issues in the industry. This was hard to do because the board also included CEOs of other utilities, who felt they did not need the exercise.
I think there was a lot of frustration on the client's part because the output was taking so long to come out.
What we did was very challenging but I think it is important to discuss this a little further. I want to talk about one of many challenges, which made this work difficult to do. It is called the balance sheet constraint.
Anyone with basic math skills and exposure to Monte Carlo simulations can put together a model in one night that uses different fuel mixes for the power plants. You can have 100% coal; 80% coal and 20% nuclear; 20% coal, 20% hydro and 60% nuclear; and so on, and generate the graph.
Yet life is not that simple. In fact, this is what most people do when applying this technique to power plants. They therefore think it is easy to do, but they get meaningless numbers.
Remember, to plot this graph you need about 10,000 different portfolios of power stations generating the same output.
If you allow the model to plot a portfolio consisting of 100% nuclear power stations, that may very well be the portfolio which generates the highest return with the lowest risk.
Obviously, every sane board of directors in the world would want to pick that portfolio. They would want to build a portfolio of power stations consisting of 100% nuclear stations.
Ha, but there is a problem.
You already have 70% of power generated by coal stations. So clearly, the model must distinguish between new and existing stations.
There is another problem.
The power stations are not built at the same time. They are built over time and the model must simulate this. It needs to determine when capacity is being reached, how long it will take to build, and then begin the building at the appropriate time.
There is still another major problem.
Assuming it does all of this, some nuclear station designs need to be built near specific locations for cooling. So if the model started building that type of nuclear power station, it would need to do so near the basin of a river, and it would also need to build transmission lines if they did not already exist there.
Therefore, we needed to understand the transmission costs and constraints.
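The first two constraints, a fixed existing fleet and staggered construction, can be sketched as follows. Every figure here is invented: the capacities, the lead times, and the target year are placeholders, and real siting and transmission checks would bolt on after this.

```python
import numpy as np

rng = np.random.default_rng(2)

EXISTING = {"coal": 70.0}    # GW already on the system; held fixed, never re-optimized
NEW_CAPACITY = 30.0          # GW of new build required (invented)
LEAD_TIME = {"coal": 4, "gas": 2, "nuclear": 8, "hydro": 6}   # years to build
NEED_BY = 2035               # year the new capacity must be online (invented)

def random_build_plan():
    """One candidate portfolio: shares of NEW capacity plus a build schedule."""
    shares = rng.random(len(LEAD_TIME))
    shares /= shares.sum()
    plan = {}
    for fuel, share in zip(LEAD_TIME, shares):
        plan[fuel] = {
            "gw": round(share * NEW_CAPACITY, 2),
            # Work backwards from the year the capacity is needed.
            "start_year": NEED_BY - LEAD_TIME[fuel],
        }
    return plan

plan = random_build_plan()
```

The key point is that the model only randomizes the new build, and it back-schedules each construction start from its lead time, so a nuclear plant needed in 2035 has to break ground in 2027.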
Moreover, the model calculates the return from the power stations. It works out the return and risk of each portfolio of power stations.
Here is the problem. Let's assume the perfect portfolio is 20% coal and 80% nuclear. However, the company may not have sufficient cash to build that portfolio, and if it tried to do so, its credit rating could drop and its cost of borrowing would increase. So, as you can see, the volatility of the returns is a different risk from a credit rating change caused by a deteriorating balance sheet.
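One way to picture this constraint is as a financing screen applied to every candidate portfolio. The equity figure, the downgrade trigger and the rate step below are all invented for illustration, not the client's numbers.

```python
def financing_cost(capex, equity_available=10.0, base_rate=0.05):
    """Cost of debt for a candidate build program (illustrative figures, in billions).

    If the capex pushes leverage past an assumed ratings trigger, the
    portfolio is charged a higher borrowing rate, so a mix that looks
    optimal on returns alone can lose out once it has to be financed.
    """
    debt_needed = max(capex - equity_available, 0.0)
    leverage = debt_needed / (debt_needed + equity_available)
    if leverage > 0.60:            # assumed downgrade threshold
        return base_rate + 0.02    # downgrade: borrowing gets more expensive
    return base_rate

print(financing_cost(capex=8.0))   # funded from equity: base rate
print(financing_cost(capex=40.0))  # heavy debt: downgraded, pricier rate
```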
The solution was to model the entire balance sheet. Now, when you run 10,000 permutations of portfolios on a laptop or PC in the early 2000s, it is going to take a basic Dell laptop 8 hours to run one permutation. We had to run 10,000 permutations.
That is a problem we honestly did not predict. This was before cloud computing, Amazon S3 or big data. There was no easy way to do this work.
We initially rented 50 Dell laptops and had to manually set the models to run and collect the results in the morning. We wanted to do the simulation on a small scale to see if the data at least worked.
Those first few nights, we had consultants working in shifts to watch the laptops and make sure they did not crash or freeze. If one did, we needed to reset the machine and rerun the analyses on it. So if a machine crashed at 6am and the other results were coming out at 8am, we had to work with one less data point or wait for the reset laptop to complete its run.
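In modern terms, that laptop farm was a crude work queue with manual retries. A minimal sketch of the same pattern, where the run function and the 2% crash rate are invented stand-ins for an 8-hour model run and a frozen Dell:

```python
import random
from concurrent.futures import ThreadPoolExecutor, as_completed

def run_portfolio(seed):
    """Stand-in for one overnight model run (the real runs took ~8 hours)."""
    if random.random() < 0.02:                   # simulated crash or freeze
        raise RuntimeError(f"machine running portfolio {seed} froze")
    return seed, random.Random(seed).gauss(0.08, 0.02)

def run_batch(n_portfolios, workers=50, max_retries=3):
    """Fan runs out across `workers` machines; re-queue any that crash."""
    results, pending = {}, list(range(n_portfolios))
    for _ in range(max_retries):
        if not pending:
            break
        failed = []
        with ThreadPoolExecutor(max_workers=workers) as pool:
            futures = {pool.submit(run_portfolio, s): s for s in pending}
            for fut in as_completed(futures):
                try:
                    pid, value = fut.result()
                    results[pid] = value
                except RuntimeError:
                    failed.append(futures[fut])  # like resetting a frozen laptop
        pending = failed
    return results

results = run_batch(1_000)
```

What took a consultant on night shift in 2002 is a ten-line retry loop today, which is rather the point of the paragraphs that follow.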
The reason I mention this is that, in some ways, we were well ahead of our time in using big data and technology. Management consultants today throw those words around as if it were easy to just plug in and play with big data technology.
I agree that the technology is exponentially much better, but the problems have become even more complex. So, the net impact is that it is still confusing if you are working at the frontier of using technology to solve consulting problems.
Unfortunately, we had to rely completely on our fabulous office IT team for help. They were very nice about it, but this was well out of their league. We needed 10,000 portfolios a night at the bare minimum, and 100,000 portfolio runs to generate meaningful results.
But they really tried. They were up most nights trying to figure this out.
Eventually the cost-planning engineers came to the rescue. They were already working with Accenture, using a university mainframe to run their infinitely more complex models. They worked with Accenture to write the code to string everything together and run our models at night.
Six weeks into the study, we had our very first results. There were some kinks and bugs in the analyses, but we quickly ironed them out and developed this simple, elegant and insightful single view of the client's choices as a business.
While we were waiting for the results, we could easily have shown the data from parts of the model. For example, we had calculated labor benchmarks for each power station to use in the model, and we had these ready. However, when we tried to present them, the client pushed back and said, "But you said the relationships between the data were more important, so we will wait for the first cut of the numbers."
Yet, it was well worth the wait.
The discussion went as we expected. Though I took a lot of stress in the initial update sessions, because we had little to present and used the sessions for education, it paid off in the end: the board and project committee could easily understand and discuss the findings.
When I say "I", note that the associates and consultants typically do not attend the board meetings. So it was the senior partners, the project leaders and me. It was the senior partners who were putting gentle pressure on me. It was stress nonetheless.
Let me give you an example of the counter-intuitive findings this approach presented.
There was an off-take agreement the utility was negotiating with a neighboring country to the south, and it was generally assumed the deal was not beneficial. In the agreement, the country agreed to buy power from the client. The rates were not fixed and fluctuated with demand. Moreover, to meet the agreement, the utility could only build gas plants, since the agreement was to provide peak power.
Peak power is a generic phrase for power demand that materializes very quickly and is expensive to supply, because the plants producing it are expensive to run.
We call it the peak because if you visualize a graph of power supply over time across the country, the graph is typically drawn by horizontally layering the sources of the power. So think of the cross-section of a lasagna dish. At the bottom you have very cheap power, like coal and nuclear plants.
The layer on top of this cheap layer is slightly more expensive. The layer at the top is the most expensive. That is usually gas. When power demand peaks it happens really quickly and you need to produce the power needed fast and for a short period. Gas stations are best for this.
They are called peaking plants since they occupy the peaks of the graph.
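That lasagna layering is what power planners call merit-order dispatch: demand is served from the cheapest layer up. A toy sketch, with invented capacities (GW) and running costs ($/MWh):

```python
# Invented capacities (GW) and running costs ($/MWh) for the three layers.
plants = [
    ("nuclear", 8.0, 15.0),
    ("coal",    10.0, 30.0),
    ("gas",     6.0, 70.0),
]

def dispatch(demand_gw):
    """Fill demand from the cheapest layer up; gas only runs at the peak."""
    supplied, remaining = {}, demand_gw
    for fuel, capacity, _cost in sorted(plants, key=lambda p: p[2]):
        take = min(capacity, remaining)
        supplied[fuel] = take
        remaining -= take
    return supplied

print(dispatch(12.0))   # off-peak: nuclear and some coal, gas idle
print(dispatch(21.0))   # peak: the gas layer switches on
```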
The study showed that while this contract by itself destroyed value, its impact on the overall portfolio of power generation was very beneficial. For one, the behavior of gas prices hedged out the behavior of nuclear fuel prices, so they were cancelling each other out. Roughly speaking, when the price of nuclear fuel went up 7%, the price of gas tended to drop 5%, so the portfolio only saw a 2% increase in costs.
Typically, least-cost analyses could not show this type of relationship.
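The reason is that a least-cost ranking looks at each fuel on its own, while the hedge lives in the covariance between fuels. A sketch of that effect, with invented volatilities and an invented negative gas-nuclear correlation standing in for the relationship the study observed:

```python
import numpy as np

# Invented fuel-cost volatilities for nuclear and gas, plus an invented
# negative correlation between them.
vol = np.array([0.10, 0.12])               # nuclear, gas
corr = np.array([[1.0, -0.7],
                 [-0.7, 1.0]])
cov = corr * np.outer(vol, vol)

w = np.array([0.5, 0.5])                   # equal cost exposure to each fuel
portfolio_vol = float(np.sqrt(w @ cov @ w))
standalone = float(w @ vol)                # as if the two fuels moved together

# The negative correlation makes the blended fuel cost far steadier than
# either fuel alone, which a fuel-by-fuel least-cost ranking cannot see.
print(f"portfolio: {portfolio_vol:.3f}, no-hedge view: {standalone:.3f}")
```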
So we could have this type of discussion with the board, the kind of discussion they had never had before. For the final board meeting, they even requested that we run five special strategic options they had considered, so we could show them the impact.
The biggest finding of the study was that the client was not running its optimal portfolio. In other words, it was generating a much lower return for the risk it was taking on, and it could increase that return, while keeping the risk the same, with a few operational adjustments.
The approach does not work in every sector, but it is an effective way to present options on a single piece of paper.