Web content editors, designers and developers all work hard to make their websites interesting, attractive and functional.
A lot of time and money is spent promoting a website. People find the site, talk about it, and link to it from their own websites, blogs, wikis, bulletin boards etc.
Search engines then trawl through the internet looking for links and keywords (among other things). The more links the search engine finds, the more interesting the target website must be (all these people linking to it, it must be good).
The search engine goes away and tots up the scores. The site with the most incoming links is the winner and will sit at the top of the search engine rankings for that month. (It's not really that simple, but I think this is basically what the spammers tell their clients.)
The gobbledegook you receive from the form submission always contains links to websites. The spammer is not trying to get you to click the link. By filling in a form that updates a blog, wiki etc., the spammer wants a link published on the website. The spammer is trying to get as many links as possible pointing at the client's website, increasing that site's search engine ranking. This can lead to the site being listed ahead of other sites for certain searches, increasing the number of potential visitors and paying customers.
How is it done?
A computer program searches for publicly accessible forms. Once a form is found, it adds content into all text fields, a non-existent email address into the email field, and HTML containing a link into the textarea, usually a comment, content or message field.
Any website that accepts content via a form is at risk of receiving spam through it.
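The bot's behaviour can be sketched in a few lines. This is an illustrative sketch only, assuming a simple HTML form; the field names and link are hypothetical.

```python
from html.parser import HTMLParser

# Minimal sketch of a spam bot filling a discovered form:
# every text input gets filler, anything that looks like an email
# field gets a fake address, and the textarea gets a link.
class FormFiller(HTMLParser):
    def __init__(self):
        super().__init__()
        self.payload = {}
        self._in_textarea = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "input" and attrs.get("type", "text") == "text":
            name = attrs.get("name", "")
            if "mail" in name.lower():
                self.payload[name] = "nobody@example.invalid"
            else:
                self.payload[name] = "great post thanks"
        elif tag == "textarea":
            self._in_textarea = attrs.get("name", "comment")

    def handle_endtag(self, tag):
        if tag == "textarea" and self._in_textarea:
            self.payload[self._in_textarea] = (
                '<a href="http://spam.example/">cheap pills</a>'
            )
            self._in_textarea = None

form = ('<form><input type="text" name="name">'
        '<input type="text" name="email">'
        '<textarea name="comment"></textarea></form>')
bot = FormFiller()
bot.feed(form)
```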
Disallowing multiple consecutive submissions
Spammers often reply to their own comments. Checking that a user is not replying to a submission from the same IP address would help reduce the spam flooding our inboxes.
This proves problematic, however, when multiple users behind the same proxy wish to submit the same form, which is quite often the case here.
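A minimal sketch of the check, assuming comments are stored in order with the submitter's IP address (the function and field names are hypothetical):

```python
# Reject a comment if the previous comment in the same thread
# came from the same IP address (a spammer replying to itself).
# Note: users behind the same proxy share an IP, so this check
# also blocks legitimate consecutive submissions from them.
def allow_submission(thread, ip):
    if thread and thread[-1]["ip"] == ip:
        return False
    return True

thread = [{"ip": "203.0.113.7", "text": "first comment"}]
```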
Blocking by keyword
Spammers have to use relevant, readable keywords so the search engines can index them effectively.
Spam could be reduced by blocking the keywords they use, simply banning the names of casino games, popular pharmaceuticals and certain body enhancements.
Drawback: the list could be quite extensive and would have to be maintained.
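A simple version of this filter might look like the following sketch. The banned list here is illustrative and, as noted, would need ongoing maintenance.

```python
# Reject submissions containing any banned keyword.
BANNED = {"casino", "poker", "viagra", "enlargement"}

def contains_banned(text):
    # Lowercase and strip trailing punctuation before comparing.
    words = text.lower().split()
    return any(w.strip(".,!?") in BANNED for w in words)
```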
CAPTCHA
This is a method that displays an automatically generated image of a combination of numbers and letters. The user enters the characters into a text field to validate the form.
A computer program cannot read the image, so the form will not validate.
Drawbacks: the image is sometimes difficult to read, and the form may need to be refreshed or submitted several times before you get a readable one.
This system can also prove difficult or impossible for visually impaired users who rely on screen readers. Providing an audio version of the characters can resolve this.
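The server side of the check can be sketched as follows: generate a random code, hand it to whatever renders the image (and audio), store it in the user's session, then compare the submitted answer. This is a minimal sketch under assumed session handling; the function names are hypothetical.

```python
import random
import string

# Generate a code for the image renderer and remember it
# in the user's session.
def new_captcha(session, length=6):
    code = "".join(random.choices(string.ascii_uppercase + string.digits,
                                  k=length))
    session["captcha"] = code
    return code  # the image/audio renderer would present this string

# Compare the user's answer case-insensitively; each code is
# single-use, so a failed or repeated attempt needs a fresh image.
def check_captcha(session, answer):
    expected = session.pop("captcha", None)
    return expected is not None and answer.strip().upper() == expected

session = {}
code = new_captcha(session)
```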
Hidden form field
Use CSS to hide a text field. A spam program will find the field and enter data; our validation then checks the field, and if it contains data the submission fails.
Drawback: a screen reader will also find the hidden field and ask the user for data, so the form will then fail validation.
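The server-side half of this "honeypot" check is one condition: reject any submission where the hidden field is non-empty. The field name below is illustrative.

```python
# The "website_url" field is rendered but hidden with CSS, so a
# human leaves it empty while a bot fills it in.
HONEYPOT_FIELD = "website_url"

def is_spam(form_data):
    # Any non-whitespace content in the hidden field marks spam.
    return bool(form_data.get(HONEYPOT_FIELD, "").strip())
```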
Web-based distributed services
These were originally developed for use on blogs, but most form data can now be submitted to one of the services.
When a user submits a form, the content is sent to the service, where it is filtered. The service looks for links and keywords, and also compares the content against a database of known spam already submitted. The content is then given a score and sent back to your server. The server then accepts, flags or rejects the content based on the values you set.
Akismet, Defensio and Mollom are some of the web-based distributed services.
Drawback: valid users can be blocked. If a user is wrongly flagged as a spammer, it can be difficult for that user to post data to other websites using the same service.
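The accept/flag/reject step on your own server might look like this sketch. The 0-to-1 score and the two thresholds are assumptions for illustration; each real service defines its own scale and response format.

```python
# Act on the spam score returned by a filtering service.
# Thresholds are the "values you set" mentioned above.
ACCEPT_BELOW = 0.3
REJECT_ABOVE = 0.8

def triage(score):
    if score < ACCEPT_BELOW:
        return "accept"
    if score > REJECT_ABOVE:
        return "reject"
    return "flag"  # hold for human moderation
```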