
Hi,

I just took a look at the Boost Wiki page BoostGui. The links

http://www.crystalclearsoftware.com/cgi-bin/boost_wiki/wiki.pl?TableViews
http://www.crystalclearsoftware.com/cgi-bin/boost_wiki/wiki.pl?OutlineViews
http://www.crystalclearsoftware.com/cgi-bin/boost_wiki/wiki.pl?TreeViews

etc. are all hacked.

With best regards,
Johannes

Salut,

On Wednesday 07 September 2005 14:07, Johannes Brunen wrote:
are all hacked.
Well, it is not exactly 'hacked'. Something from a custblock.intercage.com IP is adding unsolicited entries. Since the IP and time are logged in the recent changes section, there could be a possibility for action. However, I'm not sure what to do.

Best wishes,
Peter

P.S. whois 69.50.166.3 - custblock.intercage.com

OrgName:    InterCage, Inc.
OrgID:      INTER-359
Address:    1955 Monument Blvd.
Address:    #236
City:       Concord
StateProv:  CA
PostalCode: 94520
Country:    US

On Wed, 7 Sep 2005 14:15:55 +0200, Peter Schmitteckert (boost) wrote
Salut,
On Wednesday 07 September 2005 14:07, Johannes Brunen wrote:
are all hacked.
Well, it is not exactly 'hacked'. Something from a custblock.intercage.com IP is adding unsolicited entries. Since the IP and time are logged in the recent changes section, there could be a possibility for action.
However, I'm not sure what to do.
I'll take care of it. I have backups of the entire site. I've banned the IP and will also be banning the particular link content so they can't spam anymore. I didn't check, but I suspect the machine is a zombie, since there are actually multiple IPs involved in the attack. Some of the Wiki spammers have many machines to use because Wikis have had IP banning for ages.

Jeff
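
For illustration, a minimal sketch of the kind of IP ban plus link-content ban Jeff describes might look like the hypothetical Python below (the wiki itself is a Perl CGI script, and the banned values here are placeholders, not the real ban lists):

    # Hypothetical sketch of an IP / link-content ban check, not the wiki's real code.

    BANNED_IPS = {"69.50.166.3"}                    # addresses seen spamming (example)
    BANNED_LINK_SUBSTRINGS = ["spam-link.example"]  # placeholder banned link content

    def edit_allowed(remote_ip, edit_text):
        """Reject edits from banned IPs or edits containing banned link content."""
        if remote_ip in BANNED_IPS:
            return False
        text = edit_text.lower()
        return not any(s in text for s in BANNED_LINK_SUBSTRINGS)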

"Jeff Garland" <jeff@crystalclearsoftware.com> writes: [snip]
I'll take care of it. I have backups of the entire site. I've banned the IP and will also be banning the particular link content so they can't spam anymore. I didn't check, but I suspect the machine is a zombie, since there are actually multiple IPs involved in the attack. Some of the Wiki spammers have many machines to use because Wikis have had IP banning for ages.
How about requiring posters to pass a CAPTCHA for each post to the wiki?

--
Jeremy Maitin-Shepard
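
As a rough illustration of Jeremy's suggestion, a minimal challenge/response gate on the edit form could look like the hypothetical sketch below. A real CAPTCHA would use distorted images or an external service; this only shows where such a check would sit in the edit workflow:

    # Hypothetical sketch of a simple challenge/response gate for wiki edits.
    import random

    def make_challenge():
        """Return a (question, expected_answer) pair to embed in the edit form."""
        a, b = random.randint(1, 9), random.randint(1, 9)
        return f"What is {a} + {b}?", str(a + b)

    def challenge_passed(expected_answer, submitted_answer):
        """Accept the edit only if the poster answered the challenge correctly."""
        return submitted_answer.strip() == expected_answer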

On Wed, 07 Sep 2005 13:53:22 -0400, Jeremy Maitin-Shepard wrote
"Jeff Garland" <jeff@crystalclearsoftware.com> writes:
[snip]
I'll take care of it. I have backups of the entire site. I've banned the IP and will also be banning the particular link content so they can't spam anymore. I didn't check, but I suspect the machine is a zombie, since there are actually multiple IPs involved in the attack. Some of the Wiki spammers have many machines to use because Wikis have had IP banning for ages.
How about requiring posters to pass a CAPTCHA for each post to the wiki?
Well, it may come to that if we have more of this kind of large-scale incident. This is only the third time we have had a major spam attack. The previous ones were caught after about 100 pages were spammed. This one added something like 3000 pages before it got shut down.

In general, my goal has been to put as few human usability barriers as possible in the way of using the wiki, while being able to shut out spammers after the first incident and quickly recover all spammed pages. So far it's been working well. In a more typical week we get about 3 spammers changing 1-2 pages per incident. These look like they are done by hand, and hence a CAPTCHA would do nothing to stop them. I've also resisted calls for registration, as I'm fully convinced that people smart enough to run bots to spam from 50 different IP addresses will simply register to work around that barrier. And one of the bots was also smart enough to meter its pace to work around 'throttling' traps, so I don't put it past some of these guys to find a way around CAPTCHA too. In the end, the critical thing is the backup -- no matter how bad the spam, things can be restored easily...

Jeff
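
The 'throttling' traps Jeff mentions amount to per-IP rate limiting of edits. A minimal sketch is below (hypothetical code, not the actual wiki implementation; the window and limit are example values); as he notes, a bot that deliberately spaces out its edits stays under such a limit:

    # Hypothetical sketch of a per-IP edit throttle (the kind of trap a
    # slow, metered bot can deliberately stay under).
    import time
    from collections import defaultdict, deque

    WINDOW_SECONDS = 60   # look at the last minute of activity (example values)
    MAX_EDITS = 5         # allow at most this many edits per IP per window

    recent_edits = defaultdict(deque)

    def throttle_ok(remote_ip, now=None):
        """Return True if this IP is still under the edit limit for the window."""
        now = time.time() if now is None else now
        edits = recent_edits[remote_ip]
        while edits and now - edits[0] > WINDOW_SECONDS:
            edits.popleft()
        if len(edits) >= MAX_EDITS:
            return False
        edits.append(now)
        return True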

"Jeff Garland" <jeff@crystalclearsoftware.com> wrote in message news:20050908014237.M29002@crystalclearsoftware.com...
In general, my goal has been to put as few human usability barriers as possible in the way of using the wiki, while being able to shut out spammers after the first incident and quickly recover all spammed pages. So far it's been working well. In a more typical week we get about 3 spammers changing 1-2 pages per incident. These look like they are done by hand, and hence a CAPTCHA would do nothing to stop them. I've also resisted calls for registration, as I'm fully convinced that people smart enough to run bots to spam from 50 different IP addresses will simply register to work around that barrier. And one of the bots was also smart enough to meter its pace to work around 'throttling' traps, so I don't put it past some of these guys to find a way around CAPTCHA too. In the end, the critical thing is the backup -- no matter how bad the spam, things can be restored easily...
Jeff
I agree with most of what you said. However, putting in place at least mild spam prevention, including searching message content for multiple links outside of the Wiki, will at least control automated spamming. Also, some human spammers may just realize it isn't worth the effort and move on to another wiki site.

Mike
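
A minimal sketch of the external-link filter Mike suggests might look like the hypothetical code below (the wiki host name is taken from the URLs in this thread; the threshold is an assumption for illustration only):

    # Hypothetical sketch of an external-link filter for wiki edits.
    import re
    from urllib.parse import urlparse

    WIKI_HOST = "www.crystalclearsoftware.com"  # links back to the wiki itself are fine
    MAX_EXTERNAL_LINKS = 3                      # example threshold

    URL_RE = re.compile(r"https?://\S+")

    def too_many_external_links(edit_text):
        """Flag edits that add more than a handful of links pointing off-site."""
        external = [u for u in URL_RE.findall(edit_text)
                    if urlparse(u).hostname != WIKI_HOST]
        return len(external) > MAX_EXTERNAL_LINKS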

Michael Goldshteyn wrote:
I agree with most of what you said. However, putting in place at least mild spam prevention, including searching message content for multiple links outside of the Wiki, will at least control automated spamming.
AFAIK Jeff is already doing content filtering, in addition to the IP blacklisting.
Also, some human spammers may just realize it isn't worth the effort and move on to another wiki site.
Possibly, but they are a determined bunch as they are getting paid to do this. And hence they have considerable incentive to work around barriers.

An additional filter to consider... As many spammers use automated tools, it might be worth it to also filter based on a white/black list of user agent IDs. For example, only allowing edits when the UAID is "Mozilla/*Gecko/*Firefox/*", etc. And not allowing for obvious spammers, for example "EmailSiphon*", etc.

UAIDs at:

http://www.psychedelix.com/agents.html -- List of User-Agents (Spiders, Robots, Browser)
http://www.zytrax.com/tech/web/browser_ids.htm -- Browser ID Strings (a.k.a. User Agent ID)

--
-- Grafik - Don't Assume Anything
-- Redshift Software, Inc. - http://redshift-software.com
-- rrivera/acm.org - grafik/redshift-software.com
-- 102708583/icq - grafikrobot/aim - Grafik/jabber.org
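
A minimal sketch of the user-agent white/black list Rene describes might look like the hypothetical code below (the wildcard patterns are the examples from his message; a real deployment would carry much longer lists):

    # Hypothetical sketch of a user-agent white/black list for wiki edits,
    # using the wildcard patterns given in the message above.
    import fnmatch

    UA_WHITELIST = ["Mozilla/*Gecko/*Firefox/*"]  # browsers allowed to edit
    UA_BLACKLIST = ["EmailSiphon*"]               # known spam/harvester tools

    def user_agent_allowed(user_agent):
        """Allow edits only from whitelisted browsers, never from blacklisted tools."""
        if any(fnmatch.fnmatch(user_agent, pat) for pat in UA_BLACKLIST):
            return False
        return any(fnmatch.fnmatch(user_agent, pat) for pat in UA_WHITELIST)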