




It is extremely evident that the more we try to remove human hands from handling data, letting automated artificial intelligence analyze it and generate intelligible derivative content, the more unsatisfactory the results are. The usual excuse is that the quantity of data today is simply too high for human beings to handle personally.

The problem

Our technological advancements, though astounding, still fall short. Some algorithms work wonders and will keep delivering results for quite a while; Google's search technology and context-sensitive ads are among them. But most others that deal with large volumes of content, rank it by relevance, and serve millions of people do fail. Such failures are immediately brought to the public's attention, and then starts the whole 'is the source reliable?' campaign we have become so used to by now. A recent instance is the news about 15-year-old Tom Vendetta being hired by Google (news article). As it turns out, it was just a juvenile prank, but it ended up revealing a gaping hole in the system and showing just how blind algorithms can be.

Websites like digg help tackle some of these problems by giving human beings the power to control the flow of news and decide what is relevant. TechCrunch once observed that sites like Digg actually steer internet traffic and the pages people visit. That is probably the best way to go, and it is perfectly in the spirit of Web 2.0 (which seems to be doing its own guiding these days). It's more sites like these that we need to make relevance more relevant.
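The digg model described above boils down to something very simple: readers vote, and only stories that cross a popularity threshold make the front page. Here is a minimal sketch of that idea in Python; the `Story` class, the `min_votes` threshold, and the `front_page` function are all hypothetical names of mine, not anything digg actually published.

```python
from dataclasses import dataclass

@dataclass
class Story:
    title: str
    votes: int = 0  # number of readers who "dugg" this story

def front_page(stories, min_votes=10, limit=5):
    """Human-curated relevance: only stories that enough readers
    voted for get promoted, ranked by vote count. Everything else
    stays buried until people decide it matters."""
    promoted = [s for s in stories if s.votes >= min_votes]
    return sorted(promoted, key=lambda s: s.votes, reverse=True)[:limit]

stories = [Story("a", 3), Story("b", 25), Story("c", 12)]
top = front_page(stories)  # "b" and "c" make it; "a" stays buried
```

The point is that the ranking signal comes entirely from people, not from an algorithm guessing at relevance.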

The solution as I see it

The best thing to do at this point (until the automation becomes dependable) is to involve some human intervention in the workings: keep a team that looks into it from time to time so that the content flows smoothly. If someone can invent algorithms to filter out useful data, they can also come up with algorithms to filter out useless data. Based on a rating system, content which falls below a certain rating can be reviewed by a person, and if it's found to be actually irrelevant, removed from the system.

Or maybe Google can acquire digg (hehe!) and integrate it with their own Google News, so that news content is posted by both people and their computers, to be rated by other people. Basically, the more we get people involved, the better off we will be. At least, at the present moment!

External links

Googlebot Tips : to make your pages Googlebot friendly
digg : Driving the internet traffic


1 Comment

Well, that has more to do with how A.I. is being developed than anything else. It's just not the same having algorithms running on servers, checked against a sorting database that determines how information is displayed, versus an A.I. supplied by a central supercomputer designed to contextualize information based on what it knows. In that case, its internal knowledge must be referenced in an interconnecting mesh of not just simple databases, but self-evolving databases driven by the aforementioned discerning A.I. of the central computer, which would need a hive: a sub-computer acting as a watcher for the central computer, running its dictates.....

phew.

Why have we not seen this yet? Because, besides already being in development (what do you think they use supercomputers for?), it is just not possible yet: the need for such a thing isn't great enough. But with folksonomy on its way, and with the hyper-contextuality of information, the need for this very thing has been created by the very human users who are generating such an immense influx of connected information.











Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 2.5 License.
Aditya Mukherjee © 2005-06 | Powered by Blogger