[Chapter-delegates] Internet Filtering

Alejandro Pisanty apisan at servidor.unam.mx
Mon Jun 30 14:12:42 PDT 2008


Marcin,

you are so right here.

It is really impressive how fast this group has built a shared body of 
technically sound opinion that supports freedom and rationality in the 
use of the Internet.

To the specific points, one more observation: crime and other forms of 
malfeasance are forms of human conduct, not machine performance. They have 
to be regulated at the level of human conduct: law, ethics, norms, 
agreements - governance, that is.

Technology is a tool. Let's not forget it. You can use it to help detect, 
prevent, correct, and mitigate the damage done by human conduct. But 
to fix human conduct... you have to fix human conduct. Good luck to all 
who try.

Alejandro Pisanty


.  .  .  .  .  .  .  .  .  .  .  .  .  .  .  .  .  .  .  . .  .  .  .  .  .
      Dr. Alejandro Pisanty
UNAM, Av. Universidad 3000, 04510 Mexico DF Mexico

*Mi blog/My blog: http://pisanty.blogspot.com
*LinkedIn profile: http://www.linkedin.com/in/pisanty
*Join the UNAM group on LinkedIn, http://www.linkedin.com/e/gis/22285/4A106C0C8614

---->> Join ISOC Mexico, www.isoc.org
  Participate in ICANN, www.icann.org
.  .  .  .  .  .  .  .  .  .  .  .  .  .  .  .  .  .  .  .  .  .  .  .  .  .


On Mon, 30 Jun 2008, Marcin Cieslak wrote:

> Date: Mon, 30 Jun 2008 18:12:44 +0200
> From: Marcin Cieslak <saper at saper.info>
> To: Omar D. Al-Sahili <osahili at gmail.com>
> Cc: Chapter Delegates <chapter-delegates at elists.isoc.org>
> Subject: Re: [Chapter-delegates] Internet Filtering
> 
> Sivasubramanian Muthusamy wrote:
>> I am far from a person who advocate control of the internet
>
> I will refer to the points below because I think they are important for 
> improving our understanding of the "malicious activity" on the Internet.
> Please do not treat this as a personal response to Sivasubramanian -
> I am actually taking the opportunity to quote some sentences to engage in
> a somewhat deeper discussion.
>
>> There is a lot of malice around, that arises out of the power placed
>> on the malicious underground (here again I would like to say that the
>> "underground" per se is not malicious, but a section of the
>> underground)  that brings down computers and whole networks down with
>> destructive malware. 
>
> It is important in the discussion we have with the stakeholders that various 
> kinds of malice be treated appropriately. One cannot put DDoS, 
> hacking, and unwanted content into one category. I believe that we should stress 
> each and every time that the particular issue should be identified and 
> appropriate measures taken, and that no "catch-all" solutions exist.
>
>> On extreme issues such as child pornography I believe that the Internet
>> community needs to find a way to keep these untraceable content completely
>> out of the net.
>
> I have trouble understanding why "child pornography" has made such a career in 
> recent times. Mentioning it over and over as the example of a 
> "universally accepted wrongdoing" brings more harm than good; I wonder how 
> many people started looking for such content just out of curiosity.
>
> The Internet reflects human activities in the real world more and more 
> accurately. The issue is not to remove the content from the net;
> the issue is to root out the particular criminal activity. There is 
> a lot of expertise in tracking particular kinds of criminals in the real 
> world.
>
> One very often forgotten fact is that the Internet also brings new possibilities 
> for combating crime; for example, it is much easier and safer for a police agent 
> provocateur to act on the Internet. They do not need to risk their lives to 
> engage with the wrongdoers.
>
>> It requires some form of regulatory mechanism. I am not suggesting
>> that such powers be vested with the Government. 
>
> There will not be and cannot be a single regulatory mechanism, since we are 
> dealing with very different policy issues:
>
> 1) Network and protocol-level vandalism
> - is handled today, with some success, by the network operators, for 
> example by managing excess traffic.
>
> 2) Preventive (ex-ante) censorship - blocking the content before it can be 
> seen
> - is handled best by the end-user.
>
> Filtering out pornography is no different from popular spam or ad filtering. 
> I would advocate the use of Privoxy (http://www.privoxy.org/) or similar 
> solutions for many end-users. Indeed, removing pop-under and pop-up 
> advertisements makes it much more difficult to reach unwanted web sites.
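As a rough illustration of the kind of rules such a tool uses, a minimal Privoxy actions-file fragment could look like the sketch below; the host patterns are only examples, not a recommended blocklist:

```
# Block some well-known ad servers (illustrative patterns only)
{ +block{Ad servers} }
.doubleclick.net
.adservice.google.com

# Suppress unsolicited pop-up windows on all sites ("/" matches everything)
{ +filter{unsolicited-popups} }
/
```

Each `{ ... }` section sets actions that apply to the URL patterns listed beneath it; users can edit these rules to suit their own policy.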
>
> In many years of practice I have not been able to reach any explicit 
> pornographic content by accident, mainly because I use appropriate host 
> security and spam- and ad-filtering software. Maybe I am lucky, but I could 
> access specific content only by explicit request, and even then with some effort.
>
> In general I advocate the use of personal filters precisely because every 
> user should be able to define his/her own filtering rules. Every 
> user should also be able to temporarily turn off the filtering should there 
> be a need to access specific, normally unwanted content. This way each user's 
> policy (what is unwanted and what is wanted) can be personalized and can 
> change over time, becoming more effective.
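The per-user model described here - an editable rule set plus a temporary off switch - can be sketched in a few lines of Python; all names here are hypothetical, chosen only to illustrate the idea:

```python
class UserFilter:
    """A personal content filter: each user owns the rules and the switch."""

    def __init__(self, blocked_patterns=None):
        # Each user defines his/her own rules; filtering starts enabled.
        self.blocked_patterns = set(blocked_patterns or [])
        self.enabled = True

    def allow(self, url):
        """Return True if the URL passes this user's current policy."""
        if not self.enabled:          # filtering temporarily switched off
            return True
        return not any(p in url for p in self.blocked_patterns)

    def add_rule(self, pattern):
        # Policies can change over time as the user refines them.
        self.blocked_patterns.add(pattern)


f = UserFilter(["doubleclick.net"])
print(f.allow("http://doubleclick.net/ad"))   # False - blocked by the rule
f.enabled = False                             # user opts out temporarily
print(f.allow("http://doubleclick.net/ad"))   # True - filter bypassed
```

The point of the sketch is the ownership: because the rule set and the `enabled` flag live with the user, the policy can be personalized and revised, which a centrally imposed filter does not allow.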
>
> Whoever takes the responsibility of taking care of others (be it an ISP, the 
> government, etc.) by providing shared policies (global, nation-wide or 
> per-site filters) is taking the possibility of free choice away from the end-users. 
> This, apart from the free-speech disadvantages already mentioned by Oliver, 
> also generates a false sense of security - "we are safe because somebody 
> ELSE takes care of us".
>
> We - in Poland - have analyzed the rules of one of the software packages "blessed" 
> for use in Polish public schools (unfortunately in Polish: 
> http://prawo.vagla.pl/node/6430). The result? It blocked access to many 
> popular blogging sites and to everything about homosexuality.
> The Internet community here has managed to get the issue raised 
> officially in the parliamentary forum (http://prawo.vagla.pl/node/6984).
>
> 3) After-the-fact (ex-post) take-down mechanisms.
>
> This works well only for the most obvious and universally recognized 
> malicious content (for example worms and viruses). Traditionally, in the early 
> days of the Internet, this was handled with the so-called LART mechanism - the 
> informal process of exchanging information between network operators. Today 
> this process has been largely replaced by some sort of cease-and-desist 
> letters, generated manually or automatically, with limited success.
>
> The issue with #3 is similar to that of global filtering - no one is 
> able (and, I presume, no one will ever be able) to provide a global set of 
> policy rules stating what is right or wrong.
>
>> But would there be any harm if such powers are vested with the Internet
>> Community - you and me and those from the Internet Community renowned for
>> their values of freedom and privacy and other rights and values that are
>> characteristic of today's internet?
>
> Yes, because finding a universal solution to wrongdoing on the Internet is a 
> special case of solving the general philosophical question "What is right and 
> what is wrong?". While it is easy to name a few unquestionable cases, things 
> get really nasty when edge cases come into play, for example with 
> political, religious or sexual content.
>
> Therefore I believe that the general strategy of ISOC should be to empower 
> and educate users about making their own explicit choices and about the 
> possibilities of applying appropriate technology.
>
> -- 
>              << Marcin Cieslak // saper at saper.info >>
>
>



