Monday, December 6, 2010

You Must Agree to Continue

The process of acquiring modifications for digital games can be arduous.  Applying patchsets in the proper order, puzzling out undocumented mod manager software, wresting files from proprietary compressed archives, and defragmenting the hard disk after moving thousands of files of all sizes: these things are difficult enough without being forced to create a new identity on the mod-hosting site just to download the necessary files in the first place.

On the approach to the first hoop, the site's terms of service and community rules are listed.  Agreement is compulsory.  Clickwrap licenses have had a befuddling history but are generally upheld in the courts.  Still, there is little point to them beyond giving forum moderators something to point to when someone complains that their friend, or alternate account, was banned.

The next hoop is lit on fire.  A warning notice explains that failing the CAPTCHA means failing to join, and possibly a ban on making a second attempt.  After squinting, cocking the head, and asking everyone nearby to "shush", perhaps success is had.

What was the intent behind all that nonsense?  The website operator wants a new account to point advertisers to, as proof of many users and pageviews per month.  The operator also wishes to keep out spam and the rude behavior that would drive existing users away from the site.  Therefore they make it clear that anyone who fails to uphold the community's norms may be kicked off, and a trap is set to prevent bots from creating accounts automatically.

It mostly works, which is to say it keeps undesirable behavior at a minimum level that can be policed according to the Terms of Service.  Is there a better way?  By combining all three intents (community building, spam prevention, and the exile of trolls), can a superior system be puzzled out?  Could it even solve the problems with the current approach?

Firstly, nobody ever reads the rules.  Everyone already knows what behavior is expected, and failing that, everyone knows to "do as the Romans do."  Secondly, serious spammers operate in a far more targeted manner, with sophisticated networks of computers and even people doing data entry to defeat the CAPTCHA.

The author proposes to test the prospective users of the service.  Ask them a series of questions of the "What would you do if?" variety.  Let that series of questions serve as a sort of interview and personality test, to ascertain whether the prospective user is the sort of individual the community wants to join.
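
As a rough illustration only, here is a minimal sketch of what such a signup gate might look like.  Nothing in it comes from any real forum: the questions, answer options, point values, and passing threshold are all invented for the example.

```python
# A minimal, hypothetical sketch of a quiz-based signup gate.
# The questions, options, scores, and passing threshold are all
# invented for illustration; a real community would write its own.

QUESTIONS = [
    {
        "prompt": "What would you do if another user posted a mod that "
                  "conflicts with yours and blamed you for the crashes?",
        "options": {
            "a": ("Post a compatibility note and offer to help debug.", 2),
            "b": ("Ignore it and let the thread sort itself out.", 1),
            "c": ("Flame them until a moderator steps in.", 0),
        },
    },
    {
        "prompt": "What would you do if you found a serious bug in a "
                  "popular mod the night before its big release?",
        "options": {
            "a": ("Report it privately to the author with steps to reproduce.", 2),
            "b": ("Mention it publicly in the release thread.", 1),
            "c": ("Say nothing and wait for the fallout.", 0),
        },
    },
]

PASSING_SCORE = 3  # arbitrary threshold for this sketch


def give_quiz():
    """Ask each question on stdin and return the total score."""
    score = 0
    for question in QUESTIONS:
        print(question["prompt"])
        for key, (text, _) in sorted(question["options"].items()):
            print(f"  {key}) {text}")
        choice = input("Your answer: ").strip().lower()
        # Unrecognized answers simply score zero.
        _, points = question["options"].get(choice, ("", 0))
        score += points
    return score


if __name__ == "__main__":
    if give_quiz() >= PASSING_SCORE:
        print("Welcome aboard; your account has been created.")
    else:
        print("This community may not be a good fit.  Feel free to try again later.")
```

The point is not the particular scoring, but that a human has to actually read and think about the community's norms before getting in, which is the work the wall of legalese and the CAPTCHA were supposed to do.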

This method is clearly superior at making sure the user knows what is acceptable in the community.  There has been little research into programming computers to make moral choices, while teaching them to read distorted text has been the subject of vast investment and research, which makes such questions a better test of flesh and blood than a CAPTCHA.  It will also weed out non-native speakers of the forum's default language; while they might be able to puzzle out a series of letters and type them in, they are not going to be able to answer a series of questions about moral quandaries, especially ones run through Google or Babelfish.  Lastly, it is far superior at informing the user just what kind of community they are joining, and may encourage them to stay.

Kingdom of Loathing employs just such a test before players may join its in-game chat.  I found it to be a pleasant place to be digitally present.
