Conroy trusts public’s “common sense” on filter


Communications Minister Stephen Conroy this afternoon said he trusted Australians to get right the mix of content to be blocked under Labor’s controversial mandatory internet filter project, and that the Government remained committed to the initiative.

Facing strong opposition from the Coalition, the Greens and the general public, Conroy last year postponed legislation associated with the project while a review of the Refused Classification category of content (which the filter is intended to block) was carried out by the Minister for Home Affairs for the consideration of federal and state Attorneys-General.

Speaking in a Senate Estimates hearing in Canberra this afternoon, Conroy said the process of public consultation about what should be included in the Refused Classification category had begun, and noted he was “very relaxed” about that fact. “I’m very comfortable [for] all Australians to have their say,” he said. “Here is an opportunity to make your arguments. I trust to the common sense of the Australian public with respect to the classification system.”

Conroy said he had passed on his private viewpoint to the review, notably that he believed content featuring child pornography, bestiality and pro-rape material should be blocked under the mandatory filter.

A number of surveys conducted in the years since the policy was introduced have shown that the Australian public has mixed feelings about the filter. On the one hand, polls run by the Sydney Morning Herald, ZDNet.com.au, Whirlpool and other media outlets have shown stark public opposition to the plan. However, in May 2010, a survey commissioned by groups opposed to the policy found that most parents strongly supported the filter idea, although the more information they received about it, the less likely they were to support it.

A number of ISPs, initially including Telstra, Optus and Primus, and now also including Webshield and a company Conroy described as “IT Extreme”, have already committed to voluntarily implement filtering technology on their networks to block the smaller category of child pornography, but Conroy said today that several other companies, TPG and Internode, had refused to do so. Others are believed to be awaiting the development of an industry code on the matter.

Internet regulation in general
During the hearing, Liberal Senator Simon Birmingham asked the Minister whether the Government remained committed to the filter project.

Conroy said he believed that the debate over the filter had reached the point where “nobody is trying to pretend” that there was any issue with the filtering technology reducing broadband speeds while providing its blocking functionality, or that it was either underblocking or overblocking content. The debate, he intimated, was now about what content should be included in the Refused Classification category of content.

And, on the matter of whether the existing voluntary filter regarding child pornography might be enough to meet policy objectives, Conroy asked: “If you believe a voluntary filter should block child abuse, how would you justify having a voluntary filter not block a bestiality or pro-rape website?” “We’ll be moving to implement our policy, yes,” he said.

In a broader sense, Conroy noted that Governments around the world were speaking to major Internet companies with the intent of regulating the Internet in various ways, stating that the argument that the Internet should be unregulated didn’t hold water. “If the starting point is that there should be no regulation of the net,” said Conroy, “it’s one that I am going to disagree with.”

Conroy pointed out that when spam had become a problem over the past decade, companies had “beaten down the door” trying to get Governments to address the problem in a systematic way, so there was already some regulation in place with respect to the Internet.

“The Internet is becoming a major centre of economic activity,” the Minister added, stating that the online environment opened up significant opportunities for organised crime, and that it was necessary that Governments therefore examine the Internet in terms of privacy and security.

“I think there’s a more mature debate developing around the world and I look forward to having that in Australia as well.”

Image credit: Kim Davies, Creative Commons

20 COMMENTS

  1. I think there should be nothing in the Refused Classification category. If it’s illegal, take it down. Classification is exactly that and should not be about censorship.

    Also, I don’t think anyone has conceded that there’s no chance of over/underblocking or that there’d be no impact on speed. The man’s an idiot.

  2. Ok, who was the idiot who said the filter will slow the internet down?

    That was never going to be an issue, and it’s just given Conroy fodder to throw out whenever anyone mentions valid technical issues with the filter.

    • Censorship has to happen in real time, and the limit on the number of rules the filter checks is heavily dependent on hardware resources.

      If a filter DOESN’T slow down the net, it’s only because it’s a crap filter or it’s costing you thousands.

    • The ISPs that were trialling the filters initially claimed some of the filtering techniques showed drastic slowdowns. Besides, Conroy throws that around because he knows the rest of his arguments lack any real weight, and he is a moron.

          • Since we’re just talking URLs, the static filtering would be a fixed list, so when someone attempts to access a site their URL would be compared to that list; pretty straightforward.

            Dynamic filtering would introduce variables, i.e. you might want to block child sex material for example; now when a site is accessed, the content of the page has to be dynamically compared against a set list of criteria as to whether or not the filter believes the page contains RC material.

            This is also why Conroy says the filter is 100% accurate. He’s right in a narrow sense: with a static list, if it’s on the block list it will be blocked; whether or not it should be on the list at all is another argument.

            It’s only with the dynamic filter that overblocking and underblocking can occur, because it’s all determined by the filter criteria, and of course, due to the extra checking requirements, this is when slowdowns would occur (a rough code sketch of the difference follows at the end of this thread).

          • Yeah, a static filter is easier, but as the trial showed, that doesn’t make it accurate. Was it a dentist’s webpage that got blocked?

            Static filters don’t work very well with “web 2.0” because not all content has a separate URL; it’s dynamic.

            So the only way to filter dynamic web pages with a static filter is to ban the whole site, which is potentially punishing the innocent.

            I expect the static filter will be replaced with a more powerful one in time; they will say it’s to help us.

            Whatever the social problems are, censorship is NOT the answer.

          • “Yeah, a static filter is easier, but as the trial showed, that doesn’t make it accurate. Was it a dentist’s webpage that got blocked?”

            Actually, that shows the filter was accurate: the page was on the list, so it got blocked. Why it was on the list is another matter, because it had to be manually put there by someone rather than having the filter dynamically determine it contained RC material.

            This is part of the ongoing debate about what the RC classification should be, who determines it, and, if your site ends up on the list incorrectly, how it gets removed.
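A minimal sketch of the static/dynamic distinction described in this thread, assuming an entirely hypothetical blacklist and keyword list (nothing here reflects the actual ACMA list or any ISP’s implementation): a static filter only asks “is this exact URL on the list?”, while a dynamic filter has to inspect page content on every request, which is where both the extra per-request work and the over/underblocking risk come from.

```python
# Hypothetical illustration only; not the ACMA blacklist or any real ISP filter.

# Static filtering: a fixed list of exact URLs, compiled by hand in advance.
STATIC_BLACKLIST = {
    "http://example.org/banned-page",   # made-up entry for illustration
}

def static_filter(url: str) -> bool:
    """Block only if the exact URL appears on the list.
    'Accurate' in the narrow sense: whatever is listed gets blocked,
    whether or not it should have been listed in the first place."""
    return url in STATIC_BLACKLIST

# Dynamic filtering: every page body is checked against criteria at request time.
SUSPECT_TERMS = {"banned-term-1", "banned-term-2"}   # made-up criteria

def dynamic_filter(page_body: str) -> bool:
    """Block if the content matches the criteria. This is where
    overblocking/underblocking and the extra per-request cost arise."""
    text = page_body.lower()
    return any(term in text for term in SUSPECT_TERMS)

# A static list cannot distinguish pages that share one URL ("web 2.0" content),
# so the only static option for such a site is to block the whole domain.
```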

  3. What rules?

    It’s pretty well established that the filter will be a static blacklist with no dynamic features about it at all.

  4. He trusts the public to make the right “common sense” argument on the filter. So what, we’ve not been using common sense this whole time when we fundamentally oppose it? Well I’ll be.

  5. Conroy has to trust the public’s common sense, since he has none of his own.

    On the speed issue, the biggest problem is one of scaling. A single page is NOT a single URL that needs to be checked.

    As of this moment – (8:10am) – if I save this webpage into a folder on my hard drive, 92 separate objects are saved: the HTML, plus 91 other objects such as images, javascript modules, cookies, and CSS code.

    So if this page was to pass through Conroy’s URL filter, it would need to do 92 lookups against his “naughty filter” – one for each and every addressable object that goes to make up the page.

    Now, he claims that each lookup takes “one seventieth of a blink of an eye” – (I find that difficult to believe, but let’s take that as the number for now) – add those 92 lookups together, and you have about 1.3 blinks of an eye to check the list 92 times, plus the loading time for each of the objects.

    Now, let’s say 20,000 of your users are trying to access the “filtering server” at the same time; now you add contention time to the delay. You could add more “filtering servers” to improve performance, but now you’re adding extra cost – (hardware, maintenance, administration, compliance activities) – to the solution (the rough arithmetic is sketched in code at the end of this thread).

    It’s quite conceivable that during high load times – (evening peak in particular) – the delay introduced to loading this particular page might be a few seconds. Now, how important is a few seconds? For many people, not very – for some it will be.

    Is it significant? That’s a point of argument and an argument people can (and will) have. But to say there will be no discernible impact on speed is just plain wrong.

    What happens when the “filtering server” goes down? Is the page just allowed, or by default is everything blocked to make sure none of the bad stuff gets through while it’s down?

    Now you’re talking about continuity of service issues that just having the policy at all introduces.

    This is the stuff Conroy just doesn’t seem to get.

    • In addition, it depends on whether any of the URLs on the blacklist are from high-traffic sites or are co-located with high-traffic sites. From Telstra’s report:

      “If the URL somesite.com/really-bad-stuff appears in the blacklist, all somesite traffic is directed via the proxy server.
      Video clips from high traffic sites are very popular with typical Internet users, accounting for up to 10% of traffic. If any content from sites distributing these video clips were to appear on the blacklist the blocking solution would fail because 10% of 40Gb/s of traffic is greater than the 1Gb/s capacity of a proxy server.”

      Conroy likes to skip that part of the report, along with the part which says it is trivial to bypass.

    • You are correct… but… that’s only the tip of the iceberg. It gets worse from there. The first thing that happens after a filter is introduced is that pages switch to HTTPS (and anyhow, pages are switching even without the filter). So then the filter can’t see the URL, and they need to start blocking IP addresses and/or DNS lookups… both of which have their own set of problems.

      Then there are VPNs, a whole new kettle of fish.

      Then there’s the problem that the stuff on the blacklist is secret, and because it is secret it cannot be contested. Thus, the accused never gets to face his/her accuser, never gets to hear the evidence against him/her (presuming there was any evidence), and we overturn a very basic principle of law that is hundreds of years old.

      Then we get on to browser plugins, and what they can do ;-) fun fun fun
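A back-of-the-envelope version of the two capacity arguments made in this thread, using only the figures quoted above (92 objects on this page, Conroy’s claimed “one seventieth of a blink of an eye” per lookup, and the Telstra report’s 10% of 40Gb/s versus a 1Gb/s proxy); the blink duration is an assumed value, and none of these numbers are measurements of any real deployment.

```python
# Rough arithmetic only, using the figures quoted in the comments above.

BLINK_SECONDS = 0.3          # assumed length of "a blink of an eye" (~300 ms)
LOOKUP_BLINKS = 1 / 70       # Conroy's claimed cost per URL lookup
OBJECTS_PER_PAGE = 92        # objects the commenter counted when saving this page

page_blinks = OBJECTS_PER_PAGE * LOOKUP_BLINKS
print(f"Per-page lookup cost: {page_blinks:.2f} blinks "
      f"(~{page_blinks * BLINK_SECONDS * 1000:.0f} ms), before any queuing delay")
# About 1.3 blinks, i.e. roughly 0.4 s at a 300 ms blink -- per page, per user,
# and before contention when thousands of users hit the same filtering server.

# Telstra report figures: if any high-traffic video site lands on the blacklist,
# all of that site's traffic is diverted through the proxy.
TOTAL_TRAFFIC_GBPS = 40.0    # aggregate traffic cited in the report
VIDEO_SHARE = 0.10           # "up to 10% of traffic"
PROXY_CAPACITY_GBPS = 1.0    # capacity of a single proxy server

diverted_gbps = TOTAL_TRAFFIC_GBPS * VIDEO_SHARE
print(f"Diverted traffic: {diverted_gbps:.0f} Gb/s vs proxy capacity "
      f"{PROXY_CAPACITY_GBPS:.0f} Gb/s -> overloaded: {diverted_gbps > PROXY_CAPACITY_GBPS}")
# 4 Gb/s > 1 Gb/s, which is the failure mode the Telstra report describes.
```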

  6. Conroy’s arrogance on this topic really gives me the pip. He doesn’t know what he’s talking about, and that doesn’t seem to bother him at all. It’s a disturbing attitude.

    So much for “that’s why we have experts”. Does anyone listen to them?

    Mechanic: “Your cooling system needs replacing.”
    Customer: OK

    Sysadmin: “Our mailserver needs replacing.”
    Boss: “No it doesn’t.”

  7. Here’s an idea, Sen. Conroy: just reinstate the Howard-era free PC filters, and let the public exercise their own common sense at home!
