There is someone in this world I find myself disagreeing with a lot. Families are like that, after all. But recently I had a heated debate with them about the state of the Internet, and I conceded that some kind of innovation was required, though I did explain that most of the provided solutions weren’t valid, and didn’t really solve the right problems…
The debate ranged far and wide over the state of the Internet, but the first thing we agreed on is that there is a lot of out-of-date information on the Internet.
And it’s true, there is. Some of it carries dates; a lot of it doesn’t, but a quick check with the likes of the Internet Archive shows how old some of it really is. Now, some of it won’t go out of date very quickly, while some of it will be barely minutes old and already out of date – and herein lies the problem.
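As an aside, this kind of age check is easy to automate: the Internet Archive’s Wayback availability API (`https://archive.org/wayback/available?url=…`) returns snapshot timestamps in the form `YYYYMMDDhhmmss`, and from one of those you can work out roughly how stale a page might be. A minimal sketch of the timestamp arithmetic – the URLs and snapshot values here are just illustrative:

```python
from datetime import datetime, timezone
from typing import Optional

# Wayback Machine snapshot timestamps use the form YYYYMMDDhhmmss (UTC).
WAYBACK_FORMAT = "%Y%m%d%H%M%S"

def snapshot_age_years(timestamp: str, now: Optional[datetime] = None) -> float:
    """Rough age, in years, of a Wayback snapshot timestamp."""
    taken = datetime.strptime(timestamp, WAYBACK_FORMAT).replace(tzinfo=timezone.utc)
    now = now or datetime.now(timezone.utc)
    return (now - taken).days / 365.25

# Example: a snapshot from 1 April 2006, measured against 1 April 2016.
age = snapshot_age_years("20060401000000",
                         now=datetime(2016, 4, 1, tzinfo=timezone.utc))
print(round(age, 1))  # → 10.0
```

Of course, a snapshot only tells you when the page was *captured*, not whether its content is still accurate – which is rather the point of the whole debate.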
How do you know when something’s out of date? Better than that, how do you police it? The suggestion raised was to send a query to the author on the anniversary of publication and ask them if it’s still up to date. Putting aside the multitude of practical problems, the theory is quite sound: presumably the author of something published – especially if they’re a hobbyist – would be aware if what they had written had become out of date.
Unfortunately, reality trumps vision here: the overhead of tracking every page, and its content, for freshness would not only create a truly massive database (not quite the multi-petabyte archive that the IA holds, or however many exabytes Google has, but still huge); you’d also have to send out a great many messages asking people whether their pages were still up to date.
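To see why the bookkeeping alone is daunting, here is what the anniversary-query idea looks like at toy scale – a hypothetical registry mapping URLs to publication dates and contacts (all invented names), picking out which pages hit their anniversary on a given day. Now imagine this dictionary with billions of entries:

```python
from datetime import date

# Hypothetical registry: URL -> (publication date, author contact).
REGISTRY = {
    "https://example.com/howto": (date(2007, 6, 15), "author@example.com"),
    "https://example.com/news":  (date(2009, 3, 2),  "editor@example.com"),
}

def anniversaries_due(today: date) -> list:
    """URLs whose publication anniversary falls on `today`."""
    return [
        url
        for url, (published, _contact) in REGISTRY.items()
        if (published.month, published.day) == (today.month, today.day)
        and today.year > published.year
    ]

print(anniversaries_due(date(2010, 6, 15)))  # → ['https://example.com/howto']
```

The lookup itself is trivial; the problem is everything around it – populating and maintaining the registry, and actually getting answers back.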
Then, realistically, how many of them are going to actually respond, and if they do, how many of them are just going to reply with a quick ‘yes’, even if it isn’t? The answers are ‘not that many’ and ‘most of them’, respectively. The optimists among you will no doubt think I got the order the wrong way around, but sadly not.
So, this led to the second major suggestion: vet sites before they come onto the web. Get them into the system up front. Essentially, give people a licence to publish information on the web. This has several interesting implications.
It would pretty heavily cut down on fraudulent sites. Right now, setting up a site is a free-for-all: anyone can do it, and anyone can own a domain name – even if the name is trademarked or otherwise protected, anyone can still register the domain and wait for the inevitable wrangling. After that, the actual business of publishing and managing content is surprisingly straightforward.
Now, if you put in place some kind of vetting procedure – especially one that requires regular renewal – you would be strongly encouraging site owners to keep their server-side software up to date. That would certainly help reduce the bad behaviour going on out there.
I see two problems with this otherwise glorious view of a policed internet, even putting aside the whole ‘global’ thing. Firstly, the Internet as it stands is very good at one thing due to its structure: it is very, very good at rerouting around failures.
The so-called Great Firewall of China is not exactly the greatest barrier to content out there, and you can bet that whatever policing measures are introduced here to prevent unauthorised publication, it would be rerouted around.
Secondly, and probably more importantly for some people, is the whole censorship angle. So you have an authority of some description, presumably at least partly government funded, authorising publication of material onto the public internet. Now, sure, there are measures in place to withdraw illegal material, but they’re not that reliable, certainly not across global spaces, and again the Internet reroutes around it.
But if you have a government-funded agency proactively (as opposed to reactively) vetting content, you can block content before it is published. If the penny hasn’t dropped yet, think state-published and state-vetted news…
Then we come back to the other matter: globalisation. The ‘net is global. It’s everywhere, and that means if one country implements stricter measures, they mean precisely squat except inside that country. I’ve mentioned the Great Firewall of China – the strict border control there doesn’t affect the US internet content, except to limit material going into China, at least theoretically.
It’s long since been shown, though, that getting through the GFC isn’t exactly the hardest thing to do, which makes you question its practicality in the first place – but practical or not, the fact remains that you have two different countries with different rules. The rules would have to be enforced equally everywhere to have any meaningful impact, and as it stands, that just isn’t going to happen.
A bit like policing the Internet generally, I guess, because it will just reroute around any other blockages you care to place.