  • News, Telecommunications - Written by on Wednesday, August 1, 2012 17:17 - 18 Comments

    Pacific Fibre cancels Tasman/US cable project

    news A little over two years since it formed with the aim of building fibre-optic submarine cables between Australia, New Zealand and the United States, local telecommunications venture Pacific Fibre has folded, citing an inability to attract sufficient funding for the project.

    When it formed in March 2010, Pacific Fibre’s backers included The Warehouse founder Stephen Tindall, TradeMe founder Sam Morgan and Xero CEO Rod Drury, as well as former Vodafone marketing chief Mark Rushworth, telco veteran John Humphrey and entrepreneur Lance Wiggs. The group planned to construct a 13,000km cable with a capacity of 5.12 terabits per second, connecting the three countries and ready for service in 2013. That capacity, it said, would be five times that of the existing Southern Cross cable. The cable would also have the potential to branch out to reach several Pacific islands.

    At the time, the group acknowledged the project would be difficult. “This is a bold vision which, as realists, we know will not be easy to deliver, it will take a huge effort to complete, and has many risks,” said Tindall in March 2010. “While we have completed early feasibility work it is essential for people to know we now need to determine the level of interest from potential partners before we go to the next stage of a full business case, risk assessment and proof of concept to take to investors and bankers.”

    “We realise the risks are large but are prepared to push through to the next stage. We have released this news today primarily to ensure that any parties who are interested in this space have an opportunity to speak with us during this early planning phase.”

    However, in a new statement released today, Pacific Fibre said it had “resolved to cease operations”, as it was unable to raise the NZ$400 million required to fund the cable build.

    “A 13,000km cable is clearly an audacious thing to try and do. We were fortunate to find supportive shareholders, fantastic staff and early customer support from the likes of REANNZ and Vodafone,” said chairman Sam Morgan. “We’ve spent millions of shareholder funds trying to get this done and despite getting some good investor support we have not been able to find the level of investment required in New Zealand initially and more broadly offshore.”

    Morgan said the global investment market was “undoubtedly difficult” at the moment, but Pacific Fibre knew the project was always going to be hard, regardless of its timing. “We started Pacific Fibre because we know how important it is to connect New Zealanders to global markets. The high cost of broadband in New Zealand makes it hard to connect globally and it is this market failure, not a technical failure, that we tried hard to solve,” said co-founder and director Rod Drury. “We still cannot see how the government’s investment in [New Zealand’s Ultra-Fast Broadband project] makes sense until the price of international bandwidth is greatly reduced.”

    Pacific Fibre pointed out that in September 2011, Australian analyst group Market Clarity reported the cost of bandwidth to the US from New Zealand as 5.8 times greater than the price paid by Australians. “This project had encouraging early momentum and we were pleased to attract a great team and board, and shareholders who invested because they felt passionately that this problem needs solving for New Zealand,” said Morgan. “We believed funding for these long term infrastructure investments would have been more readily available and were confident the business case was solid. We feel like we’ve done everything we can to succeed and we are all hugely disappointed that we have not managed to get there. We’d like to thank our staff, shareholders, customers, partners and supporters,” Morgan concluded.

    opinion/analysis
    It’s sad to see this project cancelled. Australia and New Zealand need all the international submarine cables they can get in order to help drive down international bandwidth costs, an important factor for local telecommunications players given how much Internet content is hosted overseas.

    Image credit: Pacific Fibre


    18 Comments


    1. Posted 01/08/2012 at 5:52 pm | Permalink |

      Hmm, wonder if the pending massive upgrade to capacity on Southern Cross scared them off?

    2. Hubert Cumberdale
      Posted 01/08/2012 at 6:13 pm | Permalink |

      I don’t understand why they didn’t just ask me for the money. I have 200 trillion dollars just sitting here waiting to be invested. I’m sure I could spare some for this project.

    3. Posted 01/08/2012 at 6:40 pm | Permalink |

      Mmm, quite sad. But at least they were realistic and didn’t try the usual corporate claptrap of getting 2/3 of the way through, having chugged through more money than even originally planned and then collapsing the whole thing and running off with whatever’s left.

      I’m sure there will be another opportunity on the horizon for New Zealanders to gain pricing parity with Australia in international bandwidth.

    4. Michael
      Posted 01/08/2012 at 8:56 pm | Permalink |

      Seems like a silly plan, for a number of reasons:

      1) So much content is cached in Australia now (two-thirds of international traffic, according to some ISPs), which makes international bandwidth a lot more efficient.

      2) Going straight to America is unheard of; Hawaii/Guam have plenty of bandwidth and are cheaper to get to (PIPE did it for 200m).

      3) Going straight to New Zealand is odd. It makes sense because of the investors, but prioritising a country with 4 million people over Australia seems strange; I also can’t see why it wouldn’t be simpler to build more connectivity Sydney-Auckland.

    5. Jacob
      Posted 01/08/2012 at 9:44 pm | Permalink |

      It’s likely the 15.7 percent tax the US government is proposing to charge submarine cable operators that killed this project.

      • Elijah B.
        Posted 02/08/2012 at 1:17 am | Permalink |

        If that’s the case then I’m at least pleased the US Gov has been deprived of their robber-baron extortion tax, considering they would have contributed absolutely nothing worthwhile to the project to justify the tax.

    6. Ferretzor
      Posted 02/08/2012 at 10:18 am | Permalink |

      It was my understanding Aus had quite a lot of international bandwidth available already, as in sitting there unlit waiting for a reason to be turned on. Perhaps you really can have too much of a good thing? Still a pity this project fell through though.

      Continually impressed by the engineering required to make something like this happen too. Can you imagine the forces on the cable due to the currents across the northern and southern hemispheres? It’s not like they bolt the damn thing down every now and then; I think it hangs mid-ocean most of the way.

      • Posted 02/08/2012 at 10:20 am | Permalink |

        @Ferretzor

        Indeed, we use around 600-800Gbps on our international links. And we have, before the Southern Cross cable upgrade, around 6200Gbps. After the upgrade, I believe it’ll be closer to 8000Gbps.
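
        To put rough numbers on those figures (taking them at face value; the arithmetic below is mine, not from the comment), the implied utilisation of the international links looks like this:

        ```python
        # Utilisation implied by the quoted figures, taken at face value.
        used_gbps = (600, 800)         # current international usage range (Gbps)
        current_capacity_gbps = 6200   # pre-upgrade lit capacity (Gbps)
        upgraded_capacity_gbps = 8000  # rough post-upgrade capacity (Gbps)

        for used in used_gbps:
            print(f"{used} Gbps is {used / current_capacity_gbps:.0%} of current capacity, "
                  f"{used / upgraded_capacity_gbps:.0%} after the upgrade")
        # Roughly 10-13% utilisation today, falling to about 8-10% post-upgrade.
        ```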

        It is a pity for New Zealand though. Their international backhaul is a lot more expensive than ours.

        • D
          Posted 02/08/2012 at 12:38 pm | Permalink |

          > It is a pity for New Zealand though. Their international backhaul is a lot more expensive than ours.

          Actually it’s not – Southern Cross charge the same for US-NZ as they do for US-AU

          • Posted 02/08/2012 at 12:53 pm | Permalink |

            @ D
            ummm…

            Pacific Fibre pointed out that in September 2011, Australian analyst group Market Clarity reported the cost of bandwidth to the US from New Zealand as 5.8 times greater than the price paid by Australians.

            ??

    7. Matthew
      Posted 02/08/2012 at 11:56 am | Permalink |

      Am I reading it correctly?
      It looks like NZ mostly has fibre-to-the-node-style infrastructure and is upgrading to fibre to the premises.

      http://www.chorus.co.nz/ultrafast-broadband
      “We already have 30,000km of fibre connecting our telephone exchanges and suburban broadband cabinets. This means that today, around 80% of New Zealanders are connected to a network with a fibre backbone”

      • Thrawn
        Posted 02/08/2012 at 3:15 pm | Permalink |

        Pretty much everyone in Australia already has their copper connected to a fibre at the exchange or cabinet. Been the case for well over a decade.

        The difference is that NZ completed their FTTN rollout last year and their average copper distance is now a lot shorter than ours.

        • Posted 02/08/2012 at 3:20 pm | Permalink |

          @Thrawn

          True. However, note they have now cancelled further FTTN rollout and are converting it to FTTP. Why? They haven’t even managed an average of 10Mbps across the network, when they wanted a 30Mbps average.

          Why? The line lengths are still too long and the copper quality is not good enough; in most cases the lines are >1km. You need nodes within about 600m to get 30-50Mbps.

    8. Trevor
      Posted 02/08/2012 at 12:43 pm | Permalink |

      Even though Australia has adequate capacity (today), the lack of competition in this space is what has led to our ridiculous packet charging model. Telstra pioneered this extortion model and it remains one of the biggest inhibitors to Internet innovation, which will only get worse in a post-NBN world where download/upload capacity limits will be unable to keep up with available speeds. Take streaming 1080p video – it is more expensive to watch streaming content in Australia than buying the Blu-ray outright. Imagine how this will affect Australians when we’re trying to stream content with 4x the definition, or replace voice comms with HD telepresence.

      We may have the capacity right now, but what we really need is a lot more competition so Australians on fixed fibre can get access to products that aren’t so expensive per data packet that they essentially cripple our ability to communicate and utilise the Internet (and what it’s morphing into) as the rest of the world is able to.

      Additionally, on the subject of caching content, that is useless for telecommunications, remote access and VPN. There is a huge trend for enterprise to move to cloud services at the moment, but if you have inadequate bandwidth to your cloud servers this makes your job essentially impossible. I’ve managed several networks with relatively small organisations running Terminal Server desktop environments, allowing staff full access to do their jobs from any device in any location… as long as they’re in Australia. Terminal Server, even in heavily optimised environments, is essentially useless from the US or Europe to locally hosted servers. This comes down to ubiquitous high-speed fibre throughout all regions and is not an easy problem to solve, but the faster the networks at our end (read: NBN) the more Australians will be pushing the envelope on what we can do with the technology and the greater the impact of inadequacies in other areas beyond our local terrestrial fibre.

      • PeterA
        Posted 02/08/2012 at 2:00 pm | Permalink |

        Caching content reduces the large draws on international capacity, thus leaving more bandwidth for your other uses.

        So caching helps not just for the cached content, but for the uncached too. (that is what they were saying).
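
        To make that concrete, here is a quick sketch using the two-thirds caching figure mentioned earlier in the thread; the total demand number is purely illustrative:

        ```python
        # Illustrative sketch: how a cache hit ratio reduces the traffic that
        # actually has to cross international links. The 900 Gbps total is a
        # made-up number; the 2/3 ratio is the one quoted earlier in this thread.
        total_demand_gbps = 900      # hypothetical aggregate user demand
        cache_hit_ratio = 2 / 3      # share of traffic served from local caches

        international_gbps = total_demand_gbps * (1 - cache_hit_ratio)
        freed_gbps = total_demand_gbps - international_gbps

        print(f"Crossing international links: {international_gbps:.0f} Gbps")
        print(f"Freed for other (uncached) uses: {freed_gbps:.0f} Gbps")
        ```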

        No, Terminal services will never be acceptable from AU -> US/EU (or vice versa).

        Latency is too high. It is a fundamental physics problem.
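
        For a sense of why this can’t be engineered around, here is a rough propagation-delay estimate; the path length and refractive index are my own illustrative assumptions, not figures from the thread:

        ```python
        # Back-of-the-envelope propagation delay for a trans-Pacific fibre path.
        # Distance and refractive index are illustrative assumptions.
        SPEED_OF_LIGHT_KM_S = 299_792   # vacuum speed of light, km/s
        FIBRE_INDEX = 1.47              # typical refractive index of silica fibre
        PATH_KM = 14_000                # ballpark Sydney to US west coast route

        speed_in_fibre_km_s = SPEED_OF_LIGHT_KM_S / FIBRE_INDEX  # ~204,000 km/s
        one_way_ms = PATH_KM / speed_in_fibre_km_s * 1000
        round_trip_ms = 2 * one_way_ms

        print(f"One-way propagation delay: {one_way_ms:.0f} ms")    # ~69 ms
        print(f"Minimum round-trip time:   {round_trip_ms:.0f} ms") # ~137 ms
        ```

        Every screen update in a remote-desktop session pays at least that round trip, before any routing, queuing or serialisation overhead is added; no amount of extra bandwidth removes it.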

        • Trevor
          Posted 02/08/2012 at 2:16 pm | Permalink |

          Latency doesn’t need to be the problem that it is today – a slight delay in responsiveness is acceptable, but a five-minute delay per click is utterly unusable. Obviously the technology isn’t capable of handling this issue and thus needs a communication architecture that is more flexible in high-latency applications. I foresee Microsoft eventually developing an HTML5 RDC client that should be capable of elegantly surmounting these issues as long as reasonable latency is the limiting factor, but bandwidth constraints will need to be overcome for worldwide ‘cloud desktops’ to become a reality.

          And I completely agree that caching reduces international load, freeing up bandwidth for other uses, but the problem arises when those ‘other uses’ are so fundamental and so heavily used that they saturate available capacity. This certainly isn’t going to happen within the next two or three years, but as I said, once people get used to the possibilities of large local bandwidth, they will want to know why they can’t use the same technologies for international communication and connection. I can see this hobbling local innovation very quickly.

    9. Adrian
      Posted 02/08/2012 at 3:22 pm | Permalink |

      I could kick in a few dollars. Maybe they could launch a kickstarter? :)

    10. Dean
      Posted 02/08/2012 at 4:55 pm | Permalink |

      This seems like an interesting development, if true: http://www.interest.co.nz/business/60485/kiwi-pacific-fibre-cable-project-sunk-us-fears-about-chinese-investment-espionage-it-



