Blocking trackback spam using .htaccess

This morning I received a trackback spam. It pointed at a rubbish domain –, and came from an IP address using the User Agent Mozilla/4.0 (compatible; MSIE 5.5; Windows 98; Win 9x 4.90).

I took a look at Spamhuntress’ site and, sure enough, she has a post warning that a trackback spam run is about to get underway.

Then I checked my raw log files and found several entries from this User Agent, all from different IP addresses, so banning any single IP address would do nothing to block the spam.

Consequently, I added the following line to my .htaccess file (the regex contains spaces, so it must be quoted, and the dots need escaping):

SetEnvIfNoCase User-Agent "^Mozilla/4\.0 \(compatible; MSIE 5\.5; Windows 98; Win 9x 4\.90\)" spammer=yes

Although this may seem a tad drastic, I trawled through my raw log files and couldn’t find a single legitimate entry for that User Agent.

Be aware that if you intend to use this code, it needs to sit in the context of the surrounding directives in my .htaccess file, i.e. follow the SetEnvIfNoCase line with
deny from env=spammer
(if you are uncertain, be sure to check out my .htaccess file).
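
Putting the two pieces together, a minimal self-contained sketch looks like this (the environment variable name `spammer` is arbitrary, and the `Order`/`Allow`/`Deny` directives assume the Apache 2.2-style access syntax in use at the time):

```apache
# Flag requests whose User-Agent matches the spammer's signature...
SetEnvIfNoCase User-Agent "^Mozilla/4\.0 \(compatible; MSIE 5\.5; Windows 98; Win 9x 4\.90\)" spammer=yes

# ...then refuse them with a 403 Forbidden
Order allow,deny
Allow from all
Deny from env=spammer
```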

You can test the efficacy of this code by going to the Wannabrowser site, entering the User Agent into the HTTP User Agent field and your site’s address in the Location field, then clicking the Load URL button. You should get a 403 result if the code is successfully blocking this User Agent.

UPDATE: Diane let me know that this code was too strict – it was blocking her, and she isn’t on a Windows 98 PC. Spamhuntress pointed me to a script that blocks access to trackbacks instead. I have been using that script and haven’t received any trackback spam since I installed it.

19 thoughts on “Blocking trackback spam using .htaccess”

  1. I would suggest using this only on POST requests, because there ARE quite a few people with this configuration. Another possibility is to only protect one file: the trackback script. That’s possible if you use it in .htaccess. Not sure if that would work in http.conf.

    But generally, what works in .htaccess can be used in http.conf. Or is it httpd.conf?
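
    The POST-only, single-file approach suggested above might be sketched like this (assuming WordPress’s wp-trackback.php is the trackback script; the `<Limit POST>` scoping confines the deny to POST requests, so ordinary page views are unaffected):

```apache
# Flag the suspect User-Agent as before
SetEnvIfNoCase User-Agent "^Mozilla/4\.0 \(compatible; MSIE 5\.5; Windows 98; Win 9x 4\.90\)" spammer=yes

# Deny only POSTs, and only to the trackback script
<Files "wp-trackback.php">
  <Limit POST>
    Order allow,deny
    Allow from all
    Deny from env=spammer
  </Limit>
</Files>
```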

  2. Ah, the Spamhuntress herself. Thanks for the link.

    What I want is probably tricky: to let through legitimate trackbacks, but not the others. I’m probably dreaming.

  3. Ann, I’m not sure that script will work in WordPress 1.5. I have just gone through my raw log files in detail and any posts to the xmlrpc.php file are from “The Incutio XML-RPC PHP Library — WordPress/1.5”


  4. I placed the code suggested by Spamhuntress in my .htaccess file in my top-level directory; trackback spam gets through. I placed it in my /blog/ directory; it gets through.
    Elsewhere, I read a suggestion that I put a star here:, I tried that. It gets through!

    I looked at my raw logs: the spammers are posting, and the log does include “mozilla” in the user agent.

    What am I doing wrong? (I noticed Spamhuntress mentions “crustfree URLs”. What’s a “crust free URL”? And could it be that my URLs have crusts, so I need to do something different?)

  5. To be honest, Lucia, without seeing the .htaccess file it is impossible to say.

    If you want to try emailing me a copy of it I can take a look and see if I can see anything…

  6. Thanks for this article! There is another (related) way of blocking Referrers containing single words which are “Bad”:

    SetEnvIfNoCase Referer ".*(this|is|where|all|the|dirrty|words|go).*" BadReferrer

    Deny from env=BadReferrer

    I found this in the .htaccess file of WikkaWiki. (Note the deny must read env=BadReferrer, or the variable is never checked.)

  7. I have found that my website is on several message boards.

    A few of them are,,, and

    How can I use .htaccess to block these four sites from being indexed by Google, MSN, Yahoo, AltaVista, Excite, and other search engines?

  8. Robin, .htaccess will not enable you to block other sites from being indexed in the search engines. That’s not what .htaccess is for, or capable of.

  9. This might help.

    RewriteEngine On
    # Forbid POSTs to the comment script that arrive with an empty User-Agent
    RewriteCond %{REQUEST_METHOD} POST
    RewriteCond %{REQUEST_URI} ^/?wp-comments-post\.php.*
    RewriteCond %{HTTP_USER_AGENT} ^$
    RewriteRule .* - [F]

    You can also try:

    RewriteCond %{REQUEST_URI} ^/?wp-trackback\.php.*
    – I haven’t tested this second line.
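
    If you want one rule covering both scripts, a sketch might look like this (an assumption on my part, not comment 9’s tested code: the [OR] flag joins the two URI conditions, so the rule fires on an empty-User-Agent POST to either script):

```apache
RewriteEngine On
# Forbid empty-User-Agent POSTs to either the comment or the trackback script
RewriteCond %{REQUEST_METHOD} POST
RewriteCond %{REQUEST_URI} ^/?wp-comments-post\.php [OR]
RewriteCond %{REQUEST_URI} ^/?wp-trackback\.php
RewriteCond %{HTTP_USER_AGENT} ^$
RewriteRule .* - [F]
```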

  10. RewriteCond %{HTTP_USER_AGENT} MSIECrawler [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} MSIE\ 2\.0d [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} MSIE\ 3\.02 [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} MSIE\ [23456]\. [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} Wget [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} Acrobat\ Webcapture [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} HTTrack [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Curl$ [NC]
    RewriteRule ^(.*)$ - [F]

Comments are closed.