One step forward, two steps back: Govt cloud policy displays classic ignorance


full opinion/analysis by Renai LeMay
16 July 2013
Image: Attorney-General’s Department

opinion/analysis The Federal Government has taken many positive steps forward in the past year with respect to freeing up its departments and agencies to adopt the new class of cloud computing technologies. But the release of an overly bureaucratic policy this month on offshore data storage has the potential to set that progress back substantially, relying as it does on several outdated concepts of risk management in IT projects.

The adoption of the new class of cloud computing technologies has had a very troubled start in Australia.

Although the use of cloud technologies from giant, predominantly US-based vendors such as Google, Amazon, Microsoft, Apple, Salesforce.com, NetSuite and others has found strong adoption in the nation’s massive small business community, which has absolutely no hesitation about adopting whatever useful technology it can get its hands on (often ignorant of the risks associated with such enthusiasm), take-up has been much slower in large enterprise and government circles.

The problem to date has stemmed primarily from structural challenges, rather than a lack of appetite per se for the new class of services.

In a macro sense, Australia’s large organisations tend to be concentrated in several distinct sectors, most of which have historically faced strong regulatory constraints on how they store and access the sensitive consumer data held in their datacentres. In the nation’s large financial services sector, regulators such as the Australian Prudential Regulation Authority have been very reluctant to allow our banks and insurers to send data offshore at all, and have issued a constant string of warnings about the practice.

In Australia’s Federal and State Governments, the situation has been much the same. Up until several years ago, the term ‘cloud computing’ was virtually anathema within the nation’s public sector. Part of this lack of enthusiasm for the new technology has come from the public sector’s own watchdogs — privacy commissioners and central government IT strategy groups such as the Australian Government Information Management Office (AGIMO), which have quietly warned departmental CIOs away from the cloud.

Other sectors have their own challenges. The nation’s large mining sector usually likes to keep most of its IT in-house or in the hands of a limited set of trusted outsourcers such as CSC; the telco sector faces regulatory constraints similar to those on the financial services sector; and the retail sector is largely known for being conservative. If you discount one or two examples, such as the innovation found in the IT-positive operations of supermarket giant Woolworths, the nation’s retail sector has been very unwilling to invest in IT at all. You need only look at the dot matrix (yes, dot matrix!) receipt printers still found in Harvey Norman stores or the archaic point of sale systems used by major chains such as David Jones to realise just how out of date most of our large retailers are.

The other problem has been one of infrastructure. Up until 2012 and 2013, most of the major global cloud computing giants simply did not have local datacentre facilities in Australia. US Ambassador to Australia Jeffrey Bleich might have famously claimed in December last year that the preference of major Australian organisations for hosting their data in Australian datacentres, instead of the facilities available in the US, was a form of “cloud protectionism”, akin to keeping cash “hidden under the mattress”, but the truth is, as many in the local IT industry said at the time, that the preference was more about “consumer choice” than anything else — and a certain level of distrust of the US Government and its penchant for gaining clandestine access to datacentres through legislation such as the Patriot Act.

As it turns out, that distrust was certainly legitimate, as the recent revelation of the US National Security Agency’s hooks into US datacentres has shown. And many of the major IT vendors have responded to the local infrastructure issue. Companies like IBM, Rackspace, Amazon, Oracle, SAP and Microsoft have finally bent to Australian customer pressure and are building local cloud computing facilities to meet local demand for cloud computing services. Some of the top-tier players — especially Salesforce.com and Google — haven’t come to the party yet. But it’s still possible they will.

This changing dynamic has resulted in some rapid changes in some areas of Australia’s economy. Cloud computing projects have proliferated in the highly regulated financial services sector, as banking technologists have been able to work out clear guidelines for which types of IT services can safely be shipped off to the cloud, and what types of data can be stored where. And cloud computing pilots and even bigger deployments have even made their way into other conservative sectors such as retail and even resources.

The last holdout has been Australia’s public sector.

Over the past year especially, this has begun to change. The sheer inability of Australia’s state governments to successfully deliver fundamental IT services through traditional means has driven most of them towards the cloud. Politicians in Queensland, New South Wales and Victoria now openly espouse ‘cloud-first’ as the paradigm for ICT project and service delivery going forward. In state government, the shift to the cloud hasn’t been so much a choice as an escape clause.

While we still haven’t seen any major cloud computing projects in the Federal Government, things have been on the move there as well. A succession of cloud computing policy releases by AGIMO has opened the door for further discussion of the new paradigm in the Federal public sector, and one of the last acts of Communications Minister Stephen Conroy in the portfolio was to release a new government cloud computing policy that requires Federal Government agencies to consider cloud services for new IT projects.

Let’s be clear about this: This is a great thing. Over the next decade, the old model of bloated in-house enterprise software, delivered through customised versions of major software suites from companies such as Oracle, SAP and Microsoft and refreshed once a decade, is going to give way to a much more flexible model: standardised, cheaper, financially predictable and delivered through a browser. The sooner Australia’s governments get on board with this paradigm the better, because it’s where everything is eventually going.

However, unfortunately, much of this momentum in the Federal public sector looks set to be undone by the low-key release earlier this month of a separate cloud computing policy by Attorney-General Mark Dreyfus and Minister Assisting for the Digital Economy Kate Lundy. Entitled Australian Government Policy and Risk management guidelines for the storage and processing of Australian Government information in outsourced or offshore ICT arrangements, the document is available online (PDF).

To understand the background to this new policy, and its likely impact on cloud computing adoption in the Federal Government, it’s important to understand a little about how the machinery of government works in Canberra.

Most overarching IT policy, especially with respect to standards and purchasing guidelines, is set in the Federal Government by either the office of chief information officer Glenn Archer or that of chief technology officer John Sheridan. Both divisions sit within the Department of Finance and Deregulation and were carved out of AGIMO in January this year.

AGIMO’s primary reporting line had in the past been to Special Minister of State Gary Gray. However, earlier this year Gray exited the portfolio, and those responsibilities were added to the portfolio of Attorney-General Mark Dreyfus. It also appears that, with the split of AGIMO, CIO Archer and CTO Sheridan now primarily report into the Department of Finance and Deregulation, rather than to a separate Minister.

AGIMO’s role, and now the roles of the new CIO and CTO, has primarily been to encourage positive change in the Federal Government by setting standards and working with departmental CIOs. In this sense, the previous cloud computing policies issued by the Federal Government have had an expansionary flavour, as AGIMO attempted to slowly remove barriers to the adoption of cloud services.

In contrast, the cloud policy released by Dreyfus last week appears to come from a completely separate area of government. It appears to be a document created by the Attorney-General’s Department as part of the Protective Security Policy Framework, which it administers to help ensure the physical and information security of the Federal Government as a whole. In this sense, the document attempts to exert a constraining influence on Federal Government adoption of cloud computing — rather than an expansionary one.

This fundamental difference between the new cloud policy and the earlier batch pushed out by the Federal Government is evident right from the start of the document. Up front is the policy’s concern about the “risk” involved when departments and agencies make choices about where their data is stored. It states:

“Most Australian Government information is unclassified and has been provided to the Government by citizens and businesses. In many instances the provision of such information is required by law. As such, the community expects the Government to protect information from unauthorised access or inadvertent public release. Where there are risks to personal information that may lead to widespread loss of public confidence and trust in Government, Ministers should be made aware and agree to the controls.”

The document then goes on to outline what it considers to be the steps involved in a “suggested risk assessment framework”.

The section entitled ‘Identifying, assessing and evaluating the risks’ will be of particular interest to public servants seeking to make decisions about cloud computing. It asks them to consider questions such as the following when weighing the risks involved in storing data in outsourced or offshored arrangements:

“What would be the impact of loss of confidence in the integrity of your information? How could an unintended disclosure of Government information occur in an outsourced or offshore arrangement? What would the impact of an unintended disclosure be for the various classes of information? Why could an unintended disclosure occur? What is the cause (actions, incidents or factors) behind the source of risk? Are there any measures in place that limit or encourage sources of risk? What would an unintended disclosure look like? What would an event or incident look like? Where would this happen? Based on arrangements, would this happen in Australia? If offshore, what countries could the information be stored or processed through?”

The document also raises a number of threats that could arise through data being offshored — such as the ability of foreign powers to access the data, the lack of transparency around access to the data, the prevailing culture of some countries (the document mentions examples such as “acceptance of corruption and white collar crime”), and complications arising from data being simultaneously subject to multiple legal jurisdictions.

For these reasons, the policy’s strictures around the offshoring of data are proscriptive, setting clear limits on the use of offshore facilities where data contains personal information about Australians.

It states that information that doesn’t require privacy protection can be stored and processed in outsourced and offshore arrangements after an agency-level risk assessment. However, privacy-protected information (that is, information about individual Australians) can now only be stored and processed in outsourced and offshore arrangements with suitable approvals in place. “The relevant portfolio Minister, and the Minister responsible for privacy and the security of Government information, currently the Attorney-General, will also need to agree to the arrangements,” said a statement issued by Dreyfus and Lundy.

On the face of it, the new policy makes sense. It explicitly enables formerly cautious departments and agencies in the Federal Government to make use of popular offshore cloud computing facilities (for example, Amazon Web Services or Microsoft’s Windows Azure platform) where personal information is not involved. This type of policy would appear to mirror the approach taken by Australia’s financial services industry. Where exceptions are sought, the approval of the pertinent ministers would be required.
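
To make the approval path concrete, here is a minimal sketch in Python of the decision logic as the policy and the ministers’ statement describe it. It is purely illustrative; the names and structure are mine, not drawn from the policy document. The point is simply that non-personal information stops at an agency-level risk assessment, while anything containing personal information about Australians also needs sign-off from the portfolio minister and the Attorney-General before it can go offshore.

# Purely illustrative sketch (not from the policy document): the approval path
# for offshore or outsourced storage as described in the July 2013 policy and
# the Dreyfus/Lundy statement. All names here are hypothetical.

from dataclasses import dataclass

@dataclass
class Dataset:
    name: str
    contains_personal_information: bool   # does it identify individual Australians?
    agency_risk_assessment_done: bool     # agency-level risk assessment complete?

def approvals_required_for_offshore(data: Dataset) -> list:
    """Return the approvals still needed before this data can go offshore."""
    approvals = []
    if not data.agency_risk_assessment_done:
        approvals.append("agency-level risk assessment")
    if data.contains_personal_information:
        # Personal information also needs sign-off from two ministers.
        approvals.append("portfolio Minister")
        approvals.append("Attorney-General (Minister responsible for privacy)")
    return approvals

# Example: a hypothetical citizen-facing dataset containing personal information.
print(approvals_required_for_offshore(Dataset("welfare payment records", True, True)))
# -> ['portfolio Minister', 'Attorney-General (Minister responsible for privacy)']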

However, if you delve a little deeper into the implications of the policy, some disturbing facts rapidly become evident.

Commenting on the policy in a post on Delimiter shortly after it was released, Ovum research director of public sector technology Steve Hodgkinson — a former deputy CIO in the Victorian Government — pointed out that the “unspoken premise” behind the policy was that in-house and traditional dedicated outsourced ICT infrastructure was “safe, trustworthy, affordable and sustainable”.

Hodgkinson questioned what would eventuate if agencies were required to complete the full risk assessment methodology outlined in the policy for their existing ICT arrangements — which would then require the approval of two ministers simply to certify the agency to continue its daily operations.

“Some agencies, of course, are fully ship-shape … but many are not due to under-investment, ageing assets, skills shortages, sub-scale operations etc. … and budget realities prevent necessary remedial investment,” wrote Hodgkinson, implying that cloud computing infrastructure, operated by expert providers at massive scale, would in many cases be more secure, and offer more certainty in terms of risk, than the often poorly maintained internal IT systems which departments and agencies use now.

Indeed, audits have repeatedly shown that government departments right around Australia have terrible IT security. In June 2011, for example, Western Australia’s auditor-general handed down a landmark report which found that none of a wide range of government departments and agencies in the state were then able to prevent basic cyber-attacks against their IT infrastructure — or even detect that they had taken place. Similar reports have heavily criticised IT security practices in most government jurisdictions around Australia as falling nowhere near industry standard.

The Queensland Government’s IT systems, for example, have been found to be home to botnets that are actively involved in attacks against other systems. It seems clear that in many cases, departments and agencies would drastically decrease their risk by outsourcing basic IT infrastructure to the cloud.

As Hodgkinson added, the policy is “well-intended”, but its net effect would likely be to perpetuate the inefficiencies and risks of the status quo. “Sign-off from TWO ministers? Sounds like more hassle than its worth … safer just to carry on with customized development hosted in the existing ageing data center with un-patched infrastructure software and no tested backup or DR facilities. Better to stay under the radar and just cruise along mate! I know, lets do it as a common application shared between 5 agencies and run it on a multi-agency shared service center … that’d be excellent … keep us all busy for years,” the analyst wrote.

Another criticism of the document is that it contains only one definition of risk — the risk of information loss.

However, major vendors such as Microsoft have highlighted the fact that focusing on the security of information stored on IT infrastructure has, in many cases, become a proxy for a more comprehensive and realistic IT risk management strategy. The company has published a vendor-neutral cloud risk decision framework (PDF), which it uses to show customers that evaluating a cloud computing migration involves many forms of risk, of which information loss is only one.

The document states: “A good number of organisations do not operate an enterprise Risk Management program. Risk associated with ICT deployment and operations is largely controlled through the rubric of ‘information security’. When environments operate within a contained enterprise and are slow-moving, security provides a logical proxy for IT risk management. However, when direct organisational control of IT assets is diminished and shared along an elongated supply chain that extends beyond the enterprise, a more holistic and formalised means of managing risk must be employed. This is particularly true when dealing with project, or decision-related risk.”

The document mentions many risks which are completely unrelated to IT security, yet might impact the decision to move to a cloud computing environment. Some examples include vendor lock-in that might constrain the ability of the platform to interoperate with other business applications, the risk of degradation of the platform’s performance, the difficulty of moving legacy data into a cloud-based environment, the risk of poor business continuity management and so on.

One important risk that is not considered by the Federal Government policy is the risk of service failure when a department does not move a key legacy IT system onto more modern infrastructure. Indeed, this is a real risk for Australian government departments. A comprehensive audit of the Queensland Government’s ICT systems recently found that 90 percent of them were outdated and would require replacement within five years at a total cost of $7.4 billion, as the state continues to grapple with the catastrophic outcome of years of “chronic underfunding” of its dilapidated ICT infrastructure. In that case, the risk of leaving in-house agency systems untouched is very likely substantially larger than the risk of migrating onto modern cloud computing infrastructure (as Queensland plans to do).

Another example where other facets of risk management might trump security is the case where best practice exists. To take one example, throughout most of the past decade, the use of BlackBerry smartphones represented best practice in virtually every major organisation in Australia. However, BlackBerry throughout that period operated a system whereby all email delivered to its devices passed through a central facility in Canada. Clearly the use of such a system would have been flagged as a risk under the government’s new policy and would have required high-level ministerial approval. And yet, throughout that period, there was no real alternative to BlackBerry, and not using the system would have resulted in significant productivity losses for the Federal Government.

Lastly, another key factor which the new cloud policy does not adequately address is the complex nature of data sovereignty issues with respect to cloud computing.

Earlier this month, the University of New South Wales’ Cyberspace Law and Policy Centre published a whitepaper on this specific issue. It noted that data sovereignty issues had led government organisations to generally prefer to use datacentres within Australia to store data, in order to maintain physical jurisdiction over their most sensitive data.

However, it also noted that even this practice could not wholly safeguard data from being legally accessible to authorities in other jurisdictions. For example, many of the major IT services companies which provide cloud computing facilities in Australia — including Microsoft, IBM, Amazon, Oracle and others, many of which have existing major traditional contracts with government departments — are headquartered in the US, and may be subject to US legislation such as the Patriot Act, which is specifically discussed in the whitepaper. The UNSW whitepaper states:

“Some cloud providers in Australia will commit to host services within national boundaries to alleviate these data sovereignty concerns. However, even if the data is hosted domestically, it is nonetheless conceivable that some service providing access to the data could be hosted in a foreign jurisdiction, or under the control of another jurisdiction.”

In this sense, as Hodgkinson noted earlier, the policy’s focus on the risks of offshored cloud computing facilities may mean that the same risks, already inherent in the current contracts and platforms used by Federal Government agencies, escape the scrutiny they require. For example, it is possible that many Federal Government agencies are already putting their data at a certain amount of risk of access by foreign powers by housing it in Australian datacentres operated by US companies such as IBM.

Look, I don’t want to criticise the Federal Government’s new cloud computing risk management policy too harshly in this article. From a certain point of view, it is tremendously useful that the Federal Government is discussing the issue at all, instead of taking the ‘head in the sand’ approach that so many departments and agencies have taken in the past when faced with new technology paradigms. In addition, the explicit guidelines allowing non-sensitive data to be stored in offshore cloud computing facilities will certainly open up use of those platforms by departments and agencies. I suspect we’ll see a lot of agency websites transferred to Amazon Web Services within the next year, as we’ve seen in other sectors.

However, it is also incumbent upon central IT strategists and decision-makers in the Federal Government to think in a nuanced way about the adoption of new technology, and not simply apply a blanket ban on its use that can only be overcome through exceptions stamped into approval by no fewer than two ministers (a feat which many in the public sector will agree is virtually impossible to accomplish). This is particularly important when it comes to cloud computing, which is not a single technology, nor even a single class of technology, but a plethora of wildly varying technologies that need to be evaluated separately rather than as a whole.

It’s only when we start thinking intelligently about government IT procurement that the best options will come to the fore. Putting artificial limitations on the adoption of new technology has never been a recipe for success.

5 COMMENTS

  1. I agree that the work is directionally sound, but that there are detail-level execution problems. I hadn’t picked up on the issue of two ministers, for example, for approval to go beyond its pre-approved scope.

    I’m also enjoying Delimiter 2, nice work mate.

    • Cheers, thanks for your kind words! :)

      The issue of having two ministers is a key one. I can’t even imagine how hard it would be to get offshore cloud projects with the wrong sort of data approved in that scenario. Even punting something up to one minister’s office in the first place is a big deal in the public service. Two ministers, when one isn’t even your own minister, and is as busy and as security-conscious as the Attorney-General? You might as well forget about it ;)

  2. In ’98 the transition to “outsourcing” IT and the bodies to manage it was a massive cultural & operational change shift for state and federal agencies, esp. deals like the CSC “Cluster 3” ( DIMA, AEC, DOFA, IPAU, AGAL & AusLIG ) leading into Y2K required massive changes in how the agencies “did business” even though for most part it was really just a change of guard bodies wise.. The adoption of “cloud” per se is in effect the same outcome for government, the biggest variance though is that with outsourcing you could go for a tour of the outsource providers data centre.. with Cloud that’s not so easy to do as the “boundaries” of “where” the cloud “is” are often very blurry..

    • Very true — I would agree that the adoption of cloud computing is as significant as those massive outsourcing deals back in the 1990s. It requires a fundamental change in mindset for government departments and agencies. The main thing, as far as I can see, is business process standardisation. When every department has a heavily customised HR/payroll/finance system, your business rules don’t have to conform to anything in particular. But when you’re using a standardised solution from a cloud computing vendor, by and large you really do have to fit in with their system, as it’s much harder to modify, and they have less incentive to do so, because they can often see the best solution from a 10,000-foot view of the operations of multiple customers.

      At the same time, the jurisdictional issues you mention are also present, and have only gotten more serious with the NSA/PRISM allegations coming out of the US.

      I really wonder what government IT will actually look like in half a decade’s time, and then again in a decade’s time. Everything’s clearly changing for the better right now, but there is definitely going to be turbulence along the way ;)

  3. Just found out about Delimiter 2.0 from the guys down at the York Butter Factory, nice job.

    One of the biggest things I tell people when they are thinking of moving infrastructure to the cloud is if their security is woeful on-premise, it will be woeful in AWS. AWS doesn’t actually give you any extra security tools at all, if anything it puts the onus back on the user to provide some of the more advanced security services.

    At least the actual exercise of moving stuff to the cloud is usually enough to prompt a complete review of their security policies, which has the collateral effect of making both their cloud presence and their on-premise systems more secure overall.
