The Internet Society of Australia (ISOC-AU) has voiced its criticism of the Federal Government’s mandatory ISP-level content filter, arguing ISPs should have no role in determining what content their customers can access.
In its submission on the Government’s Accountability and Transparency for Refused Classification Material consultation paper, the organisation, which represents the interests of Internet users, said a mandatory ISP-level filter was neither practical nor effective.
“We do not support the Government’s announced policy to require ISPs to block Refused Classification (RC) rated material hosted on overseas servers,” the submission reads. “ISPs should not have a role in determining content that their customers access. Their only proper role is to transfer packets from the sender to the recipient(s).”
ISOC-AU said such a filter would not prevent access to a vast array of “unacceptable material” on the Internet either because it is delivered by means other than the web or because the URL of the material varies with each access.
“As the Enex Testlabs report recognised, with the ease of changing domain names and IP addresses, any such list can never be considered as either complete or current,” the paper reads. “Further, blocking of RC material will not stop children from viewing material that is inappropriate for them.
“The Government’s announcement suggests that the report supports the efficacy (and utility) of blocking URLs on a very small blacklist. However, the report highlighted that it is not feasible to filter traffic accessed through HTTPS, peer-to-peer, instant messaging, or any mechanism other than simple web traffic, including any sites using dynamic database-driven content where the URL varies with each access.”
ISOC-AU also said that filtering all RC content, such as material that includes detailed instruction in crime or drug use, had the potential to block many topics of legitimate public debate.
“However, the use of some drugs, or the nature of some ’crimes’ such as euthanasia or the advocacy of informal voting or a form of voting which is legal but unconventional are the subject of legitimate public discussion and debate, and should not be automatically included in a list of content to be blocked from the Internet,” the report reads.
“There is also quite legitimate debate about whether images that are readily available publicly (painting, photography or sculpture) should be classified as RC and therefore blocked.”
In a media statement, the organisation said that if the Government did proceed with its filtering proposal, a review of the use of the RC classification as the basis for blocking URLs would be required.
ISOC-AU also called for notification to both the site owner and user if a site is found to contain RC material, and for an independent and efficient review process to review the classification of any blocked material based on agreed community standards.
Google continues filter opposition
Google this week also published its submission in which it argued that the wide scope of content to be filtered could include not just child pornography but also socially and politically controversial material.
Iarla Flynn, head of policy at Google Australia, noted in a blog post that the filter would also remove choices for parents as to what they and their children could access online.
“Moreover a filter may give a false sense of security and create a climate of complacency that someone else is managing your (or your children's) online experience,” the blog reads.
“A large proportion of child sexual abuse content is not found on public websites, but in chat-rooms or peer-to-peer networks. The proposed filtering regime will not effectively protect children from this objectionable material.
“Moreover, the filter appears to not work for high volume sites such as Wikipedia, YouTube, Facebook, Twitter, as the impact of the filter on Internet access speeds would be too great.”