Common questions and answers

New Zealand is opposed to making, publishing, viewing and trading images of child sexual abuse. Nothing provides the total solution to people abusing children in this way but in New Zealand, we do what we can to reduce demand, identify offenders, and protect children.

The Department has entered into a partnership with ECPAT New Zealand, part of a global organisation whose purpose is the elimination of child prostitution, child sexual abuse material and the trafficking of children for sexual purposes.

ECPAT operates a hotline through its website (www.childalert.org.nz) so that members of the public can report suspect sites not already identified by the Department.

The Department is also working in partnership with NZ Internet Service Providers by offering them the Digital Child Exploitation Filtering System to protect their customers from accessing these illegal websites inadvertently or otherwise. 

The filter has caused debate in the media and the blogosphere, including some quite misleading and ill-informed claims about its purpose. 

The Department has received many questions about the filter. The Code of Practice provides information about the operation of the filter and, in the interests of ensuring informed discussion, we will publish our responses to common questions below.

What is the intention of the filter?

The Digital Child Exploitation Filtering System (DCEFS) has a very narrow purpose. It blocks access to known websites that contain child sexual abuse material.

It is one of the Department’s measured responses to community expectations that the government and internet service providers (ISPs) should do more to provide a safe internet environment.

It is designed to assist in combating the trade in child sexual abuse material by making it more difficult for persons with a sexual interest in children to access that material.

It is also an educative tool to raise the public’s awareness of this type of offending and the harm caused to victims.

The filtering system complements the information, education and enforcement activity undertaken by the Digital Child Exploitation Team of the Department of Internal Affairs.

The Department is working in partnership with New Zealand ISPs and offering them a choice to protect their customers from accessing these illegal websites inadvertently or otherwise.

It is not a magic bullet that will prevent everyone from accessing any sites that might contain images of child sexual abuse. But it is another important tool in the Department’s operations to fight the sexual abuse of children.

How does it work?

Using a secure link, the system advertises routing information to an Internet Service Provider. That routing information corresponds to a list, created by the Department, of known websites that host child sexual abuse material.

When someone attempts to access an Internet Protocol (IP) address that matches this routing information, that request is sent to the DCEFS for examination.

If the requested URL belongs to a known website, meaning it is currently on the list, the system will present a landing page informing the user that the request has been stopped.

If the URL does not match an item on the list, the user’s request is forwarded on.
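The two-stage decision described above can be illustrated in code. The following is a minimal sketch only, not the Department's actual implementation; the IP addresses, URLs and function names are hypothetical, chosen purely to show why traffic to unrelated sites is never inspected and why legitimate pages on a shared IP address still pass through.

```python
# Illustrative sketch of the DCEFS decision flow described above.
# All addresses and list entries below are hypothetical examples.

ADVERTISED_IPS = {"203.0.113.10"}            # IPs covered by the advertised routes
BLOCKED_URLS = {"http://203.0.113.10/bad"}   # full URLs currently on the filter list

LANDING_PAGE = "LANDING_PAGE"   # user is shown a page saying the request was stopped
FORWARD = "FORWARD"             # request passes through unaffected

def handle_request(dest_ip: str, url: str) -> str:
    """Decide what happens to a single web request."""
    if dest_ip not in ADVERTISED_IPS:
        # Traffic to IPs not on the advertised routes never reaches the filter.
        return FORWARD
    if url in BLOCKED_URLS:
        # The requested URL is currently on the list: stop it and show the landing page.
        return LANDING_PAGE
    # Same IP, but a different site or page: forward the request on.
    return FORWARD
```

Note that because the final check is against the full URL rather than the IP address alone, other websites hosted on the same IP address as a listed site are not blocked.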

Objections have been raised about the filter and the fact that ISPs are taking it up. Is this filter unique?

No. According to a Statistics NZ survey in June 2009, 21 per cent of ISPs offered some form of web content filtering as a free service.

ISPs would be considered remiss if they did not provide, for example, anti-spam filters.

What evidence is there that a filter such as this will have a material effect on reducing access to child sexual abuse material by offenders?

A great deal of traffic goes to websites containing images of child sexual abuse. Websites play a part in transactions to purchase such images and act as a gateway to peer-to-peer services.

During the Department’s two-year trial, 1,055,277 requests for objectionable websites were refused. (Note: these would have included repeated requests by individuals, as well as "pop-ups and pop-unders", where sites provide uninvited links to other objectionable material.)

Why don’t you just close the websites down?

Not every jurisdiction considers the distribution of child sexual abuse images as a serious offence and even fewer have the resources to conduct online investigations.

However, the Department works closely with enforcement agencies in other jurisdictions and international agencies such as Interpol to close sites and rescue children.

From time to time, the Department and other New Zealand enforcement agencies receive information from overseas agencies that have taken action against websites hosting images of child sexual abuse. This information can relate to tens or hundreds of New Zealanders who have accessed images from those sites.

Investigations that result from this information are costly and time-consuming, and can lead to multiple people appearing in court.

There is clear benefit in stopping people from committing this kind of offending: for the victim, the offender, enforcement agencies and the justice system.

What is the Department’s response to concerns that the filter impacts on the civil liberties of New Zealand internet users?

No one has the right to view illegal content that focuses on the sexual abuse of children; just as no one has a right to import illegal books and DVDs.

The filter will focus solely on websites offering clearly illegal, objectionable images of child sexual abuse.

It is a prevention tool, not a law enforcement tool and the anonymity of anyone who is blocked from accessing objectionable sites will be preserved.

The Department is concerned about the sexual abuse of children involved in the creation of the objectionable pictures.

The adults who make, trade or view these in New Zealand are parties to a serious offence. They contribute to an international market that supports and encourages further abuse.

The children who are victims of this activity sometimes suffer the psychological effects of their abuse for many years after the physical offending has ended.

Images that are distributed on the Internet never go away. With each download the person involved is re-victimised.

What assurances are there that the filter will not in future be extended to block content other than that intended?

The Department’s contract for the use of the software that supports the DCEFS constrains its use to filtering child sexual abuse material.

A Code of Practice has been put in place to govern the operation of the system and an Independent Reference Group (IRG) appointed to ensure the Department holds to its promise that the filter will focus solely on objectionable websites.

As the system advises people that they have been blocked, any departure from that stated aim would be widely publicised and participating ISPs would withdraw from using the system.

The fact that the system is voluntary provides an important further assurance that the system will keep to its stated purpose and that concerns about "scope creep" are unfounded.

Should ISPs be concerned with the direction of the filtering system, they are able to withdraw.

Will the IRG actually review/view the list of sites?

The IRG will be able to inspect the filter list and have access to the inspectors’ reports on any of the sites blocked.

They will also be able to check from the Department's premises any particular website on that list if they have concerns about it.

What other activities is the Department involved in to tackle the issue of child sexual abuse imagery?

The focus of the Digital Child Exploitation Team is prosecuting those people who supply, possess or distribute child sexual abuse material. It also works with a number of agencies providing advice on Internet safety.

The Department has focused on the child sexual abuse image trade since the Digital Child Exploitation Team (formerly the Censorship Team) was formed in 1996.

The Digital Child Exploitation Team monitors Internet activities such as newsgroups and Internet Relay Chat, but focuses in particular on peer-to-peer networks, tracking down offenders who trade in child sexual abuse images.

The Department has a history of successful prosecutions against New Zealand residents who trade, make or possess such images.

People who deal in this material can expect to get caught. The unit is part of a world-wide effort combating this problem.

The team has developed specialist software for detecting New Zealand individuals using peer-to-peer networks devoted to the distribution of child sexual abuse images, and has made this software available to over 20 countries.

The Digital Child Exploitation Team is unique in the world in that it carries out all the work associated with bringing offenders before the court.

The dedicated inspectors have the expertise to catch offenders. All inspectors are forensically trained and carry out investigations from detection through the preparation and execution of search warrants to interviewing offenders and providing expert evidence in court.

Up to the end of March 2009 we had secured 324 convictions related to the possession, manufacture or distribution of child sex abuse imagery.

How do you respond to claims by groups such as Techliberty and InternetNZ that the filter threatens the stability of the Internet in New Zealand by providing a single point of failure that could affect the whole country should there be an issue or should hackers attack?

The "single point of failure" comment is completely incorrect. The reality is that every government or service provider’s system is a target for potential attack.

The system has been established to the same high level of protection that any government system requires today.

This system is very stable. If the system were ever to fail, all that would happen is that users requesting child sexual abuse material would be able to access it.

At the end of the day the Department and the team operating the system have the same level of responsibility as any service provider to ensure the end users’ Internet experience is not interrupted. We have built a system to do just that.

ISPs would not connect our system to their network if they thought it would make their networks vulnerable. The filter could give people a false sense of security. This is acknowledged and clearly stated in the Code of Practice.

The Department considers it important that the public has realistic expectations of the system and does not develop a false sense of security.

We have worked for a number of years on educative material and strongly support the Netsafe message that parental supervision is the most effective mechanism to ensure that children have a safe Internet experience.

Opponents of the filter say there is a low incidence of accidental viewing of child abuse material and as such it is not a major source of harm.

The Code of Practice states that, of the potential threats that a child might be exposed to through the Internet, inadvertent exposure to child sexual abuse images is very small.

Our experience from dealing with hundreds of offenders indicates that viewing such material on websites can be a pathway for potential offenders.

In the long term, if it is made more difficult for persons with a sexual interest in children to access this material, the market will decline and fewer children will be exploited.

Internet NZ suggests educating people about different desktop and hosted filters and making them readily available is the best course of action.

Why has the Department not taken this option?

Experience in other countries and with virus protection shows that consumers have a poor record of installing and maintaining software that protects their computers.

Desktop and hosted filters generally work by analysing the content returned to users. They therefore use more resources and do have an impact on the user’s Internet performance.

The Department is currently engaging with a number of third-party providers to obtain information about current parental control software and desktop-based Internet safety systems, with the aim of providing the public with this information, which will be updated regularly.

Why did the Department not announce that the filter went live?

The Department has been very open about the introduction of the filter both during the two-year trial and after.

We have been bringing service providers online gradually to ensure that the system runs at optimal levels. It was intended to have the system operational by 31 March 2010.

Watchdog and Maxnet were involved in the final tuning of the system and were connected as we set about rebuilding the filter list.

We have now written to all ISPs advising them of the filter. We did not wish to pre-empt ISPs telling their own customers that they have joined the filter by making a public announcement ourselves.

To what extent has the public been consulted or informed about the filter and why it is needed?

The Department has been very open about the introduction of the filter including media releases in July and August last year.

The Code of Practice was made available for public comment and a great deal of information has been made available on our website and to blogsites.

Can you point to any international discussion on the subject?

The Oxford Internet Institute, University of Oxford, in a March 2010 report on a forum on Child Protection and Freedom of Expression Online, commented: "The strongest consensus was found in the discussion of blocking access to content that was patently illegal for all, the most clear-cut example of this being child sexual abuse images.

"Child sexual abuse images are illegal in most jurisdictions, and there was almost unanimous agreement that the voluntary blocking of these images at the ISP level was acceptable and appropriate."