Sven and I went along to a lecture on the "Semantic Web" at the Royal Society last night.
The Roger Needham Lecture, given by Professor Ian Horrocks, clearly explained the ideas behind the "Semantic Web". It was an interesting tour around a field that is just starting to show practical promise. But as Ian said to me afterwards, it is currently limited to discrete domains. It remains to be seen whether all these domains will one day merge to form a Meta-Semantic Web.
The main problem is defining "semantics" - in other words, determining "meaning". "Meaning" is decided by human beings, not machines, and although "inference engines" can infer meaning, they are only as good as their underlying facts - and therein lies the problem. This kind of system only works well if we all think like Mr Spock: logical and ruthlessly honest.
Remember META tags in web pages? They worked well until everybody started tagging their pages inappropriately with terms that didn't really apply. Then things got really silly - we saw META tag trade mark infringement and bulk keyword stuffing - and as a result, search engines now routinely ignore META tags.
The Semantic Web, and the recent trend of "tagging", will suffer from human nature just as META tags did. If these systems are to survive, they need to take account of human nature at the outset.
I don't know about you, but I prefer to think like Captain Kirk rather than Mr Spock. Planet "Semantic Web"? Even Spock might say "does not compute." Quick, Scottie ... Beam Me Up!