When Tim Berners-Lee proposed an information management system in March 1989 — the proposal that became the World Wide Web — he certainly had a vision. Fast forward to Web 2.0 and we have moved from simply linking pages to linking the content on those pages, and then the people behind them, through what we now call social media. It is essentially a computer-mediated phenomenon, offered to readers and social media users alike.
Yes, we have entered a world of engines, pre-calculation and anticipation. With XML, we no longer merely describe content; computers have started to master semantics, and we now give meaning to structured data. With OWL 2, we are giving engines a new dimension: they should be able to anticipate behaviours, feelings and perceptions… or, shall we say it more plainly, the human soul?
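To make "giving meaning to structured data" concrete, here is a minimal sketch using JSON-LD, one of the Semantic Web's lighter-weight formats. The `@context` maps ordinary keys onto a shared vocabulary (schema.org), so an engine can interpret what the fields mean rather than just parse their shape. The example record itself is illustrative, not from any particular system:

```python
import json

# Plain structured data plus an "@context" that binds each key to the
# schema.org vocabulary: machines now know "name" means a person's name,
# not just an arbitrary string field.
doc = {
    "@context": "https://schema.org",
    "@type": "Person",
    "name": "Tim Berners-Lee",
    "jobTitle": "Inventor of the World Wide Web",
}

# Serialize as ordinary JSON; any JSON-LD-aware crawler can recover
# the semantics from the embedded context.
serialized = json.dumps(doc, indent=2)
print(serialized)
```

The same pattern — data plus a pointer to a shared vocabulary — is what lets crawlers move from indexing strings to indexing meaning.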
As we forge ahead with Web 3.0 and build on the concepts currently in place, engines and crawlers are now learning from us. They will soon be able to understand us better, anticipate our needs and reach us beyond our immediate circles. How long before they can predict whether you're going to turn right at the corner to get a coffee or turn left and head straight to an 8 a.m. meeting?
The good news is that, with the mobile revolution, the Internet is now expanding beyond browsers, and even computers, to applications that are remarkably good at collecting all sorts of data.
Once engines and crawlers have mastered who we are and understand everything about us (virtually making us public, which I'd argue is great for democracy, by the way), could we ever return to human interaction? What I would like to see happen with Web 4.0 is for us to regain control of the engine. Only then will we go back to the social and overwrite the media!