Leaked MoD SDSR “lessons learned” document

PROGRAMME BOARD, 3 NOVEMBER
SDSR: LESSONS IDENTIFIED

1. This note captures some key lessons identified in the SDSR process, from a Defence perspective. It is intended to:

– provide an MOD contribution to a wider government exercise to be led by the National Security Secretariat;

– support, where necessary, implementation of the SDSR outcomes; and

– assist those preparing for future SDSRs.

PREPARATION

2. The Department did significant preparatory work on:

– policy development, in particular Global Strategic Trends and the Future Character of Conflict exercise led by DCDC and the Green Paper led by the then Secretary of State;

– management information, in particular the Cost of Defence, development of the FAST tool, which enabled us to turn postures and ambition into a costed force structure, and the ICAT tool, which modelled the industrial implications of capability decisions; and

– project planning. A detailed proposal on sequencing and timing was developed, endorsed by PUS and CDS, and socialised within the Department. (Summary graphic at Annex [ X ])

3. This work generally proved useful. The Green Paper and the Future Character of Conflict work provided a firm basis for policy discussions and a broadly shared understanding within the Department of the challenges facing Defence. The management information underpinned each stage of discussions during the Review, allowing the Department rapidly to cost policy options. The project plan broadly guided our work.

4. However, further preparatory work might usefully have been undertaken:

– on high-level military judgements on future capabilities. The Force Testing process delivered significant insights into future requirements and potential trade-offs. But the sequencing of the SDSR made it difficult to fully exploit these insights in the decision-making process. There is a strong case for running a similar exercise in the six to nine months preceding a future review.

– on engaging international partners. A rapid consultation exercise was carried out during the Review. But the responses were received only as decisions were being taken (and collated only as they were being confirmed). It would have been preferable to undertake this exercise in advance of the Review, especially with close allies.

– on the local and regional implications of potential decisions. We were able to advise the Secretary of State (and other Cabinet colleagues) of the broad implications of our final decisions. But work to identify these began very late in the process and proved extremely challenging.

– on coordinating internal project planning with Cabinet Office planning and ensuring we and they developed a shared understanding of realistic timelines.

5. And, in general, the preparatory work would have benefited from:

– a clearer and more hard-nosed description of the financial challenges faced by the Department. There was some evidence that the wider Department did not fully understand – or accept – the affordability challenge until late in the process. It was clear that none of the three services had developed meaningful internal thinking on how to deliver a 10-20% reduction in their resource baseline. An earlier understanding might have generated more radical alternative ideas.

– more external challenge. This would have gone some way towards balancing the inevitable reduction in external activity once the Review was underway. It may also have contributed to the generation of more radical options for change.

GOVERNANCE, DECISION-MAKING AND RECORD KEEPING

6. The Review was overseen by the National Security Council. Meetings were prepared by the SDSR Steering Board, chaired by the National Security Adviser, on which PUS and CDS represented the Department; and the process was facilitated by a Coordination Group, chaired by the Deputy National Security Adviser, on which DG Strategy represented the Department.

7. Within the Department, the Review was overseen by the Defence Strategy Group chaired by the Secretary of State (Membership at Annex [ X ]), and facilitated by an SDSR Programme Board chaired by DG Strategy. Additionally, in the later stages of the Review, the Secretary of State convened smaller groups involving the PUS, CDS and DG Strategy to provide advice on key decisions. The papers for the DSG, including where appropriate recommendations, were circulated by the Boards Secretariat and records of the meetings, including decisions, were recorded by the Secretariat.

8. Overall, the governance arrangements worked well. In particular, the DSG provided a forum for wide-ranging and substantive discussions; and was effective in ensuring Ministerial control of the overall process within the Department. The NSC proved effective in addressing the outstanding challenges in the later stages of the Review.

9. However,

– within the Department, some (in particular in the single services) believed that DSG discussions would have benefited from greater preparation by a subordinate group. The Programme Board made some contribution to this. But it functioned primarily as an advisory body for DG Strategy: it was not asked to clear papers which went to the DSG.

– Some have also argued that engaging all Chiefs of Staff in discussions with the Secretary of State on single-service issues was counter-productive in the longer term. Although it contributed to robust debate within the DSG, it has potentially created longer-term tensions.

– Some of the DSG decisions were potentially ambiguous and remain under dispute. It might have been valuable to have a more formal record of decisions, with greater clarity and with that record formally agreed at subsequent meetings. This would have assisted with implementation and also provided a long-term audit trail.

– The earlier meetings of the NSC did not provide the guidance the Department required to take forward detailed studies on force structure and capabilities. (See Policy Baseline below.) The discussions on threats and interests, in particular, though important way-points on our internal plan, provided little useful guidance on next steps. The complexity of Defence issues inevitably meant that the NSC took some time to become familiar with and begin to address the most pressing matters. It would have been preferable, from the Department’s perspective, to have had a more regular and substantive sequence of meetings early in the process.

POLICY BASELINE

10. The National Security Strategy effectively provided the policy baseline for the SDSR. It was prepared through three set-piece NSC discussions: on the UK’s national interests; on prioritised threats and risks; and on key policy choices. The final NSC discussion had also been prepared by detailed DSG discussion of the “Policy Terrain” and the Government’s preferred approach to managing threats and risks.

11. The original intention was to put the key conclusions into the public domain through a speech mid-way through the Review. In the event, the Strategy was published the day before the SDSR and finalised in the preceding fortnight. The policy baseline for much of the Defence work was therefore limited to guidance that the NSC favoured an adaptable posture, and to summary minutes of the NSC discussion of risks.

12. From a Defence perspective,

– it would have been helpful – at least in process terms – to have a clearer policy baseline early in the review. (The original plan envisaged agreement of that baseline within six weeks.)

– it would also have been helpful to have a centrally-endorsed baseline which integrated: strategic approach; military tasks; planning assumptions on our level of ambition; geographical priorities; and areas on which risk should or should not be taken. (A developed draft of such a document had been prepared by the FCO in advance of the decision that the National Security Secretariat should lead the Review.)

POLICY AND CAPABILITY STUDIES

13. The Department undertook 36 separate studies and contributed to 12 others under Cabinet Office leadership. A full list is at Annex [ X ]. The majority were led by individuals who had some knowledge of the issues but were not currently responsible for them. The Terms of Reference were deliberately broad in order to allow a wide-ranging approach to the issues. The length of the studies varied in order to ensure that reporting was sequenced, with the shortest studies given 3 weeks and the longest 8 weeks.

14. The studies proved useful in creating a body of information and providing some challenge to subject matter experts. The most effective studies provided a point of reference right through to the final days of the Review.

15. However,

– there were probably too many areas of study. It was not possible to address the full range of issues raised in the time available. Some studies were discarded relatively quickly because they failed to generate new ideas or identify areas for change. But others were simply put to one side because there was limited time to engage on the full range of issues. A more effective alternative approach might have been to limit the areas of enquiry to the principal force structure and capability issues, plus two or more other areas – such as medical support, training or defence engagement. Regular reviews may allow us to manage the scope of the Review in this way.

– The relationship between the study teams and subject matter experts was not sufficiently clear. In some cases, they worked consensually. In others, the study team provided a clear (and, on occasion, uncomfortable) challenge function. In the latter case, the subject matter experts felt disenfranchised. This variety of approach was particularly noticeable in the three key studies on the sea, land and air environments – which yielded quite different reports. There may be value in setting a single model in future reviews.

– The status of the studies also lacked clarity. The studies reported to DG Strategy, and were provided as background to his subsequent recommendations to the Secretary of State and the DSG. The wider Department was, therefore, not being asked to endorse the studies’ conclusions. But this distinction may not have been made sufficiently clear.

– Some felt that the brevity of the papers provided for the DSG, in précising the outcome of the studies and the recommendations proposed, lost necessary detail. (Though others, including Ministers, felt that the approach enabled a high-level discussion of the issues.)

FORCE TESTING

16. VCDS led a process to investigate the key drivers within the force structure, based on 7 scenarios covering an illustrative range of future operations. This included a scenario in common with the US QDR. The work also looked at the inter-relationships between elements of the force structure (eg logistics support vs enablers vs front-line), and issues relating to concurrency requirements. As the Future Force 2020 structure was developed, it was iteratively compared with the VCDS-led Senior Judgement Panel conclusions, highlighting areas of risk, shortfall and affluence. This Force Testing work was useful and has been warmly welcomed by all stakeholders. However,

– it should have begun earlier (see above);

– it would have been valuable to involve Ministers to ensure the process fully recognised the political choices underpinning force generation decisions and to ensure Ministers understood the risk trade-offs associated with capability decisions.

– It would also be valuable to involve external experts (including from other government departments) in both the design of the scenarios and the conduct of the Judgement Panels.

– There is a strong case for separating out two functions of the process. The Senior Judgement Panels both generated insights which informed the decision-making process (force development); and putatively provided an assessment of the risks associated with the final decisions (force testing). The former should precede the main review. The latter should be formalised, be undertaken towards the end of the Review and provide advice to the Secretary of State on the risks associated with the final force structure.

SYNTHESISING SDSR, SR10, STRATEGY FOR DEFENCE AND PR11

17. The Department’s work on the SDSR, Spending Review and Planning Round was taken forward in parallel, drawing on the same management information. SDSR options were generated and costed by the PR teams in Strategy Management and Defence Resources. The results of SR negotiations were factored directly into SDSR baselines. This ensured that the SR negotiations reflected the SDSR process and that the PR process was prepared to implement the SDSR outcomes. It enabled DG Strategy to put synthesised advice to the DSG. This proved an effective way of working and should be repeated in future reviews.

18. However, beyond the Strategy area, there was on occasion insufficient clarity on the lead responsibilities for advice to the DSG and to the Secretary of State. A disadvantage of asking non-subject-matter experts to lead on studies is that there can be confusion over who should lead on follow-up questions. This proved a particular problem on capability issues. For example, advice on carrier strike or amphibious shipping could be generated by single-service staffs, Navy RP, the Capability area or individuals commissioned by DG Strategy.

WORKING WITH OTHER GOVERNMENT DEPARTMENTS

19. The SDSR was consciously an attempt at coherent pan-Government decision-making overseen from the centre. But this raised a number of issues that should be thought through before the next Review:

a. Paradoxically, the firm management of individual departments by the Cabinet Office, drawing all the inputs together in the centre, led to a progressive reduction in productive engagement between experts across departments as the Review proceeded;

b. The process for producing briefing material for Ministers, press offices and the chain of command in the final stages was complicated by the separate Cabinet Office and MOD systems, and it was often unclear where final responsibility and accountability for this material lay. The severe constraints on internal communication in the final stages also made the timely production of accurate and informed material significantly more difficult, increasing the risk of inaccuracy or incompleteness, including in material produced for constituency MPs.

INTERNAL COMMUNICATION

20. The Strategy area held eleven townhall meetings during the course of the Review. All TLB HQs had a townhall meeting, with a further three townhalls in London and one in Bath. DG Strategy issued six updates to all staff. CDS and PUS wrote to all staff at the outset, in September and on the day of the announcement. The single services and other TLBs also provided updates for their personnel. Feedback suggests that all of this activity was received positively.

21. However, by general consensus, internal communication of the final decisions was badly handled. At Cabinet Office direction, there was no pre-briefing of the chain of command and no pre-warning of units affected by the changes. The combination of well-sourced media stories on final decisions (eg on Nimrod or Ark Royal) and these restrictions on internal communications has badly damaged the confidence and morale of our personnel and created a poor baseline for implementation.

22. In retrospect – and as advised by the Chain of Command at the time – the restrictions on internal briefing were clearly a mistake.

EXTERNAL CONSULTATION AND COMMUNICATION

23. The Department undertook internal and external consultation exercises during the Review. An internal exercise aimed at members of the armed forces, their families and defence civilians, and an invitation for Members of Parliament and other interested parties to offer contributions to the SDSR, were announced by the Secretary of State in Parliament. The Cabinet Office and Treasury also ran parallel consultation exercises on the Coalition Programme and on public expenditure, which produced a significant number of defence-related inputs. In total, well over 6,000 responses were received through these routes. These had little substantive impact on the Review. The limited timescale available and the specific format of the online consultations produced inputs in a format difficult to analyse and to relate to the issues being considered in the SDSR, and most substantive external inputs were submitted too late in the process. In retrospect, the online consultation approach was poorly suited to generating useful information, and other means should be considered in future. Both exercises would more usefully have been run in advance of, or in the early stages of, the Review.

24. The Department engaged NATO and EU partners through discussions with international staffs and a conference for all member states in Brussels; and a wider group of international partners through Defence Attachés. All partners appreciated the exercise, and there were a variety of useful responses, but it proved difficult to integrate the outcomes into our SDSR decisions as the exercise proceeded in parallel with early decisions. The exercise would more usefully have been run in advance of the Review.

25. Two areas might have benefited from greater engagement.

– the Secretary of State briefed the HCDC on the approach to the Review in July 2010. But there would have been value in engaging Committee staff or members throughout the Review to ensure they had a greater understanding of the process the Review was following and the issues under discussion. This could, inter alia, have reduced the misinterpretations in the HCDC report – for example, its characterisation of a rushed and chaotic process – which in turn led much of the media comment and some internal reactions.

– the Secretary of State also engaged some key academics during the Review and gave speeches at RUSI and Chatham House. But, in general, Departmental engagement with external experts was much reduced for the period of the Review. This reduced the extent to which our ideas were challenged. It also limited our ability to shape expert and media reactions to the outcomes and lost an opportunity to enhance our reputation as an open organisation.

