Reflections on Web Spam
I'd like to share a few insights based on my own experience dealing with MediaWiki spam.
Masspirates' wiki disallows self-serve signups, and that's obviously a big help in preventing spam. If we decide to allow self-serve signups, we might consider using a CAPTCHA extension called Asirra. Asirra displays photos of cats and dogs (drawn from Petfinder); the user has to identify all the cats. It's a good human test, and it's helped reduce the number of spam accounts on wiki.occupyboston.org.
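For MediaWiki, Asirra was offered as a module of the ConfirmEdit extension. A minimal LocalSettings.php sketch follows; the paths assume the older require_once installation style, so check the ConfirmEdit documentation for the MediaWiki version in use:

```php
# Sketch only: enable the Asirra CAPTCHA via the ConfirmEdit extension.
# Paths below assume ConfirmEdit is installed under $IP/extensions/.
require_once "$IP/extensions/ConfirmEdit/ConfirmEdit.php";
require_once "$IP/extensions/ConfirmEdit/Asirra.php";
$wgCaptchaClass = 'Asirra';

# Require the CAPTCHA at account creation -- the self-serve signup case.
$wgCaptchaTriggers['createaccount'] = true;
```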
In terms of cleaning up spam, page patrolling is the best strategy I've found. When you view Special:NewPages (as a logged-in user), you'll see unpatrolled pages highlighted in yellow. From there, you mark the non-spam pages as "patrolled", and delete the spam. When you run out of unpatrolled pages, you're done.
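The same workflow can be scripted against the MediaWiki API, which is handy when the backlog is large. This is a sketch only: the api.php endpoint URL is a placeholder, and the querying account needs patrol rights for the wiki to report the patrolled flag.

```python
"""Sketch: list unpatrolled new pages via the MediaWiki API.

The endpoint URL is a placeholder; the account used must have
patrol rights, or the API will refuse the !patrolled filter.
"""
import json
import urllib.parse
import urllib.request

API_URL = "https://wiki.example.org/w/api.php"  # placeholder endpoint


def unpatrolled_params(limit=50):
    """Build query parameters for unpatrolled page creations."""
    return {
        "action": "query",
        "list": "recentchanges",
        "rctype": "new",          # page creations only
        "rcshow": "!patrolled",   # changes not yet marked patrolled
        "rcprop": "title|user|timestamp",
        "rclimit": limit,
        "format": "json",
    }


def list_unpatrolled(limit=50):
    """Fetch titles of unpatrolled new pages (requires patrol rights)."""
    url = API_URL + "?" + urllib.parse.urlencode(unpatrolled_params(limit))
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    return [rc["title"] for rc in data["query"]["recentchanges"]]
```

Each title returned is a candidate for review: mark it patrolled if it's legitimate, delete it if it's spam.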
For wiki.occupyboston.org, the amount of web spam seemed to correlate with how often OB was in the news. When there was a lot of hubbub, we'd get 20 pieces of wiki spam a day. Now that OB is (for the most part) just another little group of activists, we barely get any spam at all.