NiGHTMARE

The web - past, present and future


I've more or less finished a 2,000 word essay I've got to do for my university. Seeing as it's on the subject of the web, I thought that maybe some people would be interested in reading it.

The only thing that's missing is some examples of websites - sites which use lots of fancy features, sites that don't but still look good, and sites that don't work in certain browsers. If you have any good examples, especially of media-related sites, please let me know. I especially want to find a site that works properly in Explorer but not Netscape, or vice versa.

Tim Berners-Lee described his vision of the World Wide Web as “… a universal linked information system, in which generality and portability are more important than fancy graphics and complex extra facilities”.

- What did he mean by this?
- Has it gone according to plan?
- Can his original vision survive in any form?
- Should it?

The World Wide Web is an Internet-based hypertext system that allows information to be shared across the world. Tim Berners-Lee, a graduate of Oxford University and at the time an employee of the European Particle Physics Laboratory, was the one who, in 1989, invented the WWW and its primary language (Hypertext Markup Language, or HTML). He also wrote the first ever web client (a program used to visit webpages; such programs are now known as browsers) and the first web server a year later. His original intent was to create something that everyone, whoever and wherever they might be, could use for profitable transactions. These transactions could involve almost anything, from the purchase and sale of goods to the free exchange of information.

When Berners-Lee made his comments about generality and portability, what he meant is that it is extremely important for all sites on the web to work on as many different platforms as possible. 'Platforms' includes the many alternative web browsers available, the two most commonly used being Microsoft Internet Explorer and Netscape Navigator - other browsers include Lynx, Konqueror and Mozilla. 'Platforms' also includes different operating systems - besides the several different versions of Windows ('95, '98, 2000, ME, XP, NT4, and so on), there are many other PC operating systems, such as Linux (of which there are many, radically different versions), BeOS and OS/2. 'Platforms' even includes different computer systems - as well as the Apple Macintosh and other, older home computers such as the Amiga, a few modern consoles and more and more mobile phones can now access the web.

By limiting a website to working (either at all or properly) only on a specific browser, OS or computer, the creator of that site is clearly placing a very large restriction on the number of people who can access it. It seems that Berners-Lee thought that the entire web should have as few restrictions as possible, at least in terms of who is actually able to view it, and logically speaking he is right - why shouldn't someone be able to visit a certain website if they want to? As has already been mentioned, Berners-Lee believed that one of the web's primary purposes should be to serve as a source of information, and that this information should be accessible to as many people as possible, no matter what hardware or software they happen to be using.

Unfortunately, Berners-Lee's desire for the entire web to work across multiple platforms has remained unfulfilled, with the vast majority of modern websites seemingly concentrating on style and delivery over content. There are some websites that cannot even be loaded in both of the most popular browsers (i.e. Explorer and Navigator), obviously indicating that the creator of the site either doesn't care about compatibility, or values personal preference above the needs of the site's visitors. These incompatibilities are not necessarily due to major differences between the platforms; the problem can be caused by nothing more than a tiny difference in how each browser interprets the site's code. Embedded applications such as Flash (which will be discussed later), animated images and the BLINK tag serve to obscure information rather than emphasise it, whereas proper mark-up through HTML and other approved standards serves to organise and convey information in an effective manner. Embedded scripts, while attractive and impressive looking, add almost nothing to the informational value of the pages, and provide a distraction in addition to consuming the user's bandwidth and CPU time.
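To give a simple illustration (a made-up fragment rather than one taken from a real site), the same notice can be marked up with a proprietary tag that only some browsers understand, or with standard HTML whose presentation is left to a CSS rule that older browsers simply ignore:

  <!-- Non-standard: BLINK is not part of any W3C standard, and only some
       browsers (notably Netscape) actually blink the text -->
  <BLINK>Important notice</BLINK>

  <!-- Standard mark-up: any browser can render the text itself, and a CSS rule
       such as .notice { font-weight: bold; color: red; } handles the styling -->
  <p class="notice">Important notice</p>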

Berners-Lee and others believe that enhancing standard HTML to allow better graphics and multimedia is completely unnecessary, as the only thing that matters on a site is the information it contains, and HTML really cannot be improved in this area. But there is an alternative to improving HTML: for several years now it has been possible either to embed other languages into HTML or even to replace it entirely, meaning that existing HTML pages still work and fancy new ones can be created without any conflicts. Of course, Berners-Lee would still see the features on these sites as somewhat irrelevant, but most people are more likely to browse through an attractive-looking site than a plain and simple one.
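As a rough sketch of what embedding other languages into HTML looks like in practice (the file name style.css and the greet function are invented purely for illustration), a page can pull in CSS and JavaScript while its actual content remains ordinary HTML:

  <html>
    <head>
      <title>Example page</title>
      <!-- A CSS stylesheet included by reference; browsers that do not
           understand CSS simply display the plain HTML underneath -->
      <link rel="stylesheet" type="text/css" href="style.css">
      <!-- A small piece of JavaScript embedded directly in the page -->
      <script type="text/javascript">
        function greet() { alert("Welcome!"); }
      </script>
    </head>
    <body onload="greet()">
      <p>The content itself is still written in ordinary HTML.</p>
    </body>
  </html>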

There is a major disadvantage to using alternate web languages, however, which is related to Berners-Lee's view that portability is of extreme importance. The disadvantage is that there is no guarantee that all browsers will support these alternate languages, so visitors to a site which uses a non-HTML language must use a browser that actually supports that language, or else they will not be able to view the site at all. The solution to this problem is to try to standardize these alternate languages as well, and to encourage as many of the browser creators as possible to support them.
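In the meantime, one common way of coping with browsers that lack such support (sketched below with invented file names) is to give them fallback content to display instead:

  <!-- A browser without the Flash plug-in falls back to the HTML inside the element -->
  <object data="intro.swf" type="application/x-shockwave-flash" width="400" height="300">
    <p>This animation requires the Flash plug-in.
       <a href="contents.html">Go to the text version of this page instead.</a></p>
  </object>

  <!-- Similarly, NOSCRIPT gives browsers without JavaScript something useful to show -->
  <noscript>
    <p>JavaScript is switched off or unsupported; please use the
       <a href="sitemap.html">site map</a>.</p>
  </noscript>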

There is a worldwide consortium of various companies and organizations dedicated to doing exactly this, as well as to working on many other web-related issues: the World Wide Web Consortium, or W3C, which maintains and updates the various web-related standards. Founded in October 1994, the Consortium is currently directed by Berners-Lee himself. According to its own website, the Consortium has three main roles regarding the web:

1. Vision: W3C promotes and develops its vision of the future of the World Wide Web. Contributions from several hundred dedicated researchers and engineers working for Member organizations, from the W3C Team (led by Tim Berners-Lee, the Web's inventor), and from the entire Web community enable W3C to identify the technical requirements that must be satisfied if the Web is to be a truly universal information space.

2. Design: W3C designs Web technologies to realize this vision, taking into account existing technologies as well as those of the future.

3. Standardization: W3C contributes to efforts to standardize Web technologies by producing specifications (called "Recommendations") that describe the building blocks of the Web. W3C makes these Recommendations (and other technical reports) freely available to all.


Fortunately, nowadays nearly all modern browsers do support the most common alternate web technologies, such as CSS, JavaScript, Flash and Java (PHP is a slightly different case, as it runs on the web server and simply sends ordinary HTML to the browser), although there are rarer languages that are only supported in a few browsers. This support exists either because the developer of the browser programmed it in themselves, or because the developer of the technology created a 'plug-in' for the browser. Because these technologies work in modern browsers, and because they make alterations to standard HTML unnecessary, modern browsers should be able to access both new and old websites with no problems. As long as the server a website is stored on still exists, there is no reason that the site and its content shouldn't still be accessible in 50 years, 100 years, or even longer. If the site was created in standard HTML or one of the other common web languages, it should still be decipherable by future browsers, operating systems and computers.
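For example, a page written purely against the W3C's HTML 4.01 recommendation (a minimal sketch follows) contains nothing tied to a particular browser, operating system or plug-in, so any future software that follows the published specifications should be able to display it:

  <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
    "http://www.w3.org/TR/html4/strict.dtd">
  <html>
    <head>
      <title>A long-lived page</title>
    </head>
    <body>
      <h1>A long-lived page</h1>
      <p>Nothing on this page depends on a particular browser,
         operating system or plug-in.</p>
    </body>
  </html>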

However, this would only be the case as long as all the current features of HTML and the alternate languages remain unchanged. While it might be possible for new features to be added to a given language, existing features in that language would need to stay exactly as they are. Keeping HTML and other web standards relatively unchanged definitely guarantees portability, but it could be argued that sticking to these standards severely limits the development of the web. It is difficult to develop new web-based technologies, or to improve existing ones, if backward and sideways compatibility are considered absolute necessities.

As has already been discussed, sideways compatibility (being able to open a site in different browsers) is essential, and if anything more attention needs to be paid to it than is currently the case. Backward compatibility (being able to open the same site in an older version of the same browser) shouldn't be so much of a concern. Using a newer web language on a site does mean that older browsers, or older versions of current browsers written before that language came into existence, are incapable of accessing it - even XHTML (Extended HTML), which was developed by the W3C itself as an alternative to standard HTML and adds many newer features allowing more advanced web design, is incompatible with older browsers - but this is only really a concern with regard to older computer systems.

There is simply no need for anyone to be using an extremely old version of a browser, as even the latest versions of most browsers, including Explorer and Navigator, work perfectly well on older PCs and older versions of Windows. For example, most versions of Internet Explorer will run on any version of Windows from Windows '95 to Windows XP. Some people might prefer to use Internet Explorer 5.0 instead of 6.0 because they dislike the newer version (it supposedly has a lot more bugs), but there is little reason why they would want to use an even older version. Similarly, Netscape Navigator 6.0 has a completely different look from Navigator 4.0, so many people might prefer using the older version, but Navigator 3.0 looks pretty much the same as Navigator 4.0, so using it would be rather pointless.

Berners-Lee's vision should be seen as the most important principle behind the web by as many people as possible, for as long as possible, but only so long as it isn't detrimental to the health of the web. His vision also doesn't take into account that, while existing web-related standards should remain stable, the number of different standards shouldn't be limited - it certainly shouldn't be reduced, and creating more standards as time goes by is necessary if we don't want the web to stay exactly the way it currently is, just with different sites, forever.


XHTML isn't extended HTML; it is HTML restructured to also be an XML document. This has several advantages: for example, XML is a much stricter language. If you give an HTML parser something invalid it will generally still render the page, but if you give an XML parser an invalid XML document it is obliged to reject it: only 100% properly formed documents are accepted. This is important: the looseness of the HTML syntax means that people tend to make non-compliant pages. As a result it is incredibly difficult to write modern web browsers, because they have to be able to cope with pages that aren't HTML compliant. Also, if a page is ambiguous, how it ends up being displayed may differ from browser to browser. XHTML aims to fix some of these problems; it does not extend the original language.
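For instance (a made-up fragment, not taken from any real page): most HTML parsers will render the first snippet even though its tags are badly nested and never closed, but an XML parser has to reject it. The second snippet is the well-formed XHTML equivalent.

  <!-- Sloppy HTML: badly nested, unclosed, yet browsers render it anyway -->
  <P>Some <B>bold <I>and italic</B> text
  <BR>

  <!-- Well-formed XHTML: lower-case tags, proper nesting, every element closed -->
  <p>Some <b>bold <i>and italic</i></b> text</p>
  <br />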


Fraggle: thanks (and thanks to Boris in #zdoom as well), but it's too late now, as this essay and two others have just been posted off. We were never taught this kind of stuff in the unit the essay was for, so we can't really be expected to be experts in it; hopefully it won't lower my final mark too much ;)

fraggle said:

XHTML isn't extended HTML; it is HTML restructured to also be an XML document. This has several advantages: for example, XML is a much stricter language. If you give an HTML parser something invalid it will generally still render the page, but if you give an XML parser an invalid XML document it is obliged to reject it: only 100% properly formed documents are accepted. This is important: the looseness of the HTML syntax means that people tend to make non-compliant pages. As a result it is incredibly difficult to write modern web browsers, because they have to be able to cope with pages that aren't HTML compliant.

I'm not sure if that's a fault of HTML. HTML itself is quite strict, innit? It's just that people got so used to writing non-compliant HTML, and HTML parsers so happily accept it.

I.e. if all of a sudden all HTML parsers started to reject bad HTML, people would write good HTML :)

I don't see how XML changes anything.

(oh, and XML sucks but that's another matter)

bigbadgangsta said:

My God that is kick ass! Hope you get a 100!

Extremely unlikely; a first is normally ~70%, and that's killer.


My average mark for this year, ignoring the two units I failed, is 58%. I'll let you know what it is once I get the updated results :)

