Chapter 1

Introduction:

The proliferation of digital communication technologies has changed the future of mass communication. This digital age is changing how people receive and send information at every level.

The transmission of multimedia data is made possible by the digitization of visual and audio information. This is a key consideration. Digital conversion translates the information into a binary format, which allows the data to be processed by a computer just like any other binary information. More importantly, the data can be distributed like any other binary file. No matter what the bits represent, they are still just a string of numbers. There is no difference between the coded information of text, sound, or video. The bits can be distributed together, then interpreted and processed by the receiver. These packets of information carry what are called "headers" that tell the receiver what kind of information they contain and how to process it. Any video or multimedia presentation can be acquired anywhere a properly equipped computer or receiver can be connected to a telecommunications system.

Whether it is used for widespread dissemination or interpersonal communication, there is one telecommunications system that is forcing the world to change the way it communicates: the World Wide Web. As more people are connected to the Web, the popular media machine will have to adapt. This means that information providers will one day have to utilize the Web as a fundamental part of their dissemination strategy. The new incarnation of information providers will materialize first in the United States.

“In a peculiarly American way we have often sought technical solutions to social problems. Indeed, more than anything else, this tendency defines American information culture…. And so we attach an enormous importance to new machines, especially information machines, layering them with all our hopes and dreams” (Lubar, 1993, p. 10).

American society openly accepts and often encourages technological advances in communications. This usually means that society makes incremental adjustments as some new device is unveiled that makes communication easier, better, clearer, or more fun. In some cases the device becomes an unexpected status symbol, as with the cellular phone or pager. The World Wide Web is much more than a new gadget; it is a method of transmission. This makes it more significant, but other recently introduced methods of transmission were heralded by some to revolutionize communications and have not. Since Direct Broadcast Satellite television was introduced in the United States it has gained a respectable degree of acceptance, but it has not called into question the future of mass communications and the news media the way the Web has. DBS was just a new way to present the same thing. It can offer more channels, but it is still broadcasting, which means a one-way flow of information.

The World Wide Web is a two-way form of communication that, as its name states, is world wide. It allows individuals to interact from anywhere on earth that there is a connection. As a result, the communications industry will never be the same, and the changes that have come about are only the beginning. Fundamental changes are coming that will alter the methods of information providers in every respect. As technology races forward, the limitations and capabilities of the Web change. News media will have to make adjustments to meet the constraints of the medium and the new demands of the audience. Information providers will have to temper the richness of their presentation to accommodate the limitations of the medium.

The purpose of this thesis is to illustrate the changes that are coming in the way people will receive their news as a result of the advance of digital technologies. It will focus primarily on the World Wide Web as the chief catalyst for the new news media in a digital world. Why should we care? This future news media experience is going to shape the way people see the world around them. Their knowledge and understanding of current events will be shaped by it, for better or worse.

Chapter 2

Basic Explanation of Digital Information

Sampling and Coding

In order to discuss how the Web has changed and will continue to change communications, it is necessary to understand what digital technology is. The basis for digital technologies is ones and zeros; all digital information is binary. How does this happen? At the most basic level the information is coded this way. In computing, text is converted into binary form by assigning each character a value expressed as a string of ones and zeros. A standard binary code for the representation of text was established, called ASCII, which stands for the American Standard Code for Information Interchange. The establishment of a standard protocol for text recognition was essential for the creation of person-to-person, text-based communication on computer systems. The ability of computers to interchange data regardless of the type of system was fundamental to the rise of modern communications technology (Burger, 1993).

The process is a little more difficult when dealing with sound or video. The visual and auditory information found in nature is called analog, meaning the information is made up of continuous signals with smooth fluctuations. The sound or light can be represented by waves: that which we see is made possible by light waves, and that which we hear is created by sound waves. These waves can be digitized after they go through a process of conversion from analog to digital.

Like ASCII text, audio and video information is also coded for digitization. In this case the information is not assigned an arbitrary code; it is sampled. The sampling process involves recording the frequency and amplitude of waves at a set interval.

[Figure 2.1 and Figure 2.2]

Sound and video are composed of waves, and these waves are sampled at consistent intervals so that they can be reproduced. The rate of sampling needed to create a satisfying visual or audio playback is based on the limits of human perception and the limits of the hardware. The intended use or audience is also taken into account. The faster the sampling, the more storage needed for the data and the more elaborate the equipment.

To faithfully reproduce a sound or moving image, the information must be sampled at twice the rate of the highest frequency to be digitally represented. This is known as the Nyquist Theorem. What this means is that anything sampled at less than twice its highest frequency may suffer perceivable quality loss. In some cases a slight loss of quality is acceptable when weighed against limited storage or bandwidth (Burger, 1993).

CD audio is sampled at a rate of about 44,100 times a second (44.1 kHz). This creates the illusion of seamless sound waves. This is a fast sampling rate designed for the highest quality with little regard for storage space. Such a sampling rate is not currently practical for computer multimedia applications, but satisfactory sound can be achieved at slower rates (Holsinger, 1994).
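The Nyquist rule and the CD figure fit together as simple arithmetic, sketched below. The function is our own illustration, and the 20,000 Hz figure is the commonly cited upper limit of human hearing.

```python
# A small sketch of the Nyquist Theorem: the sampling rate must be at
# least twice the highest frequency to be captured.
def nyquist_rate(max_frequency_hz):
    """Minimum sampling rate (in Hz) needed to capture max_frequency_hz."""
    return 2 * max_frequency_hz

# Human hearing tops out near 20,000 Hz, so the minimum rate is 40,000
# samples per second; CD audio's 44,100 Hz leaves a safety margin.
minimum = nyquist_rate(20_000)  # 40000
```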

The Advantages of Digital Communication

Now the information is binary. Since the information is based on numbers, it can be manipulated like any set of numbers. These numbers are what we know as bits, and bits are what computers process, receive, and transmit. These simple bits have already changed communication as we know it.

One of the characteristics that makes a bit so valuable is its ability to travel. Anyone who has used Email has sent ASCII-coded bits to another location. Bits can travel through a variety of media and they can travel quickly: at the speed of light, to be exact (depending on the medium). When compared to analog signals this looks pretty good, and it is actually just one small part of the list of advantages of digital data.

Analog signal transmission—as in traditional television, radio, or phone transmission—is subject to signal attenuation. It is the nature of the waves being transmitted to be susceptible to interference. When data is transmitted digitally some of it can still be lost, but the transmission is much more reliable. In addition, information can be added to the signal to correct any errors that result during transmission. “On your audio CD, one-third of the bits are used for error correction. Similar techniques can be applied to existing television so that each home receives studio-quality broadcast—so much clearer than what you get today that you might mistake it for so-called high definition” (Negroponte, 1995, p. 17). This sort of data correction exists on the Web to help ensure that information gets to its destination without loss.
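The idea of adding extra bits to catch transmission errors can be illustrated with the simplest such scheme, a parity bit. This is only a sketch of the principle; a CD actually uses a far more powerful code (Reed-Solomon) that can correct, not just detect, errors.

```python
# An even-parity check: one extra bit is appended so that the total
# number of 1s is even. A single flipped bit then makes the count odd,
# which the receiver can detect.
def add_parity(bits):
    """Append an even-parity bit to a list of 0/1 data bits."""
    return bits + [sum(bits) % 2]

def check_parity(bits):
    """Return True if no single-bit error is detected."""
    return sum(bits) % 2 == 0

word = add_parity([1, 0, 1, 1])   # parity bit is 1 -> [1, 0, 1, 1, 1]
ok = check_parity(word)           # True: transmission looks clean
word[0] ^= 1                      # simulate a bit flipped in transit
bad = check_parity(word)          # False: the error is detected
```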

The transmission of digital data does not require as much bandwidth as analog signals; the information can be sent down a smaller pipe. An example of this is the difference between an analog television signal and a digital one. The spectrum allocation for one channel of analog television can hold several digital channels that offer a better picture. On the Web, the rules of bandwidth are the same. The amount of time necessary to receive a file depends on the size of the file and the available bandwidth. The ability to compress the data helps make digital information very valuable.
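The bandwidth rule just stated reduces to a single division, sketched below. The function and the example numbers are our own illustration; the 28,800 bit/s figure is a typical modem speed of the mid-1990s, and real transfers take longer because of protocol overhead.

```python
# Back-of-the-envelope download time: file size (in bits) divided by
# the available bandwidth (bits per second).
def transfer_seconds(file_size_bytes, bandwidth_bits_per_sec):
    """Time to receive a file, ignoring protocol overhead and latency."""
    return (file_size_bytes * 8) / bandwidth_bits_per_sec

# A 350,000-byte image over a 28,800 bit/s modem takes about 97 seconds,
# which is why compressing the data matters so much.
seconds = transfer_seconds(350_000, 28_800)
```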

Consider this information in a bigger sense. Anything that can be represented by bits can now be distributed reliably around the world at amazing speed. The dissemination or transport of this information is inexpensive, and if the information is traveling via the Web it never has to leave the atmosphere to find a satellite. This makes digital transfer very attractive for all sorts of applications, especially to those in the information business. The transmission process is limited only by the speed of the processor, the capacity of the transfer media, and the available bandwidth. If the amount of data could be reduced, then faster transmission could be achieved. Hence the creation of data compression, which will be discussed later. First, let's discuss how this grand network we call the World Wide Web got started.

Chapter 3

Dissemination of Information:

The History of the Internet and the World Wide Web

All of these discussions about digital information are important because all of the information on the Web is digital. The World Wide Web is the network through which computers around the world can send or receive these digital signals. It all started here in the United States as a military project. In 1968 the Department of Defense needed a communications network that would be able to function even if portions of the network were suddenly eliminated. It was the height of the Cold War, and the U.S. military needed a communication method that could withstand a nuclear attack. The solution was ARPANET (Advanced Research Projects Agency Network) (Kroll, 1992). This network utilized packets of information that could reliably find their way to a destination. The key was that the protocol used by the network could detect a path that would get the information there reliably.

The first nodes were set up in 1969. Much of the work on the network was being done by computer scientists and engineers at universities, so the nodes were set up at those universities (Prater, 1994, p. 163). The decentralized architecture of the network made it relatively easy to add more computers and expand the network. The network grew quickly and found its greatest use among researchers and universities. By 1983 the network had grown enough to be split into two separate networks: one for research and education, the other for the military.

The fundamental factor in the explosive growth of the ARPANET was the creation of a standard protocol for the transmission of data. TCP/IP (Transmission Control Protocol/Internet Protocol) set the standard and made it possible for different computers to communicate across vast networks. The value of this communication medium led to the creation of the NSFNET: the National Science Foundation invested money to connect universities and research centers, making TCP/IP the standard for the Internet (Prater, 1994, p. 150).

“The technology and networks were adopted by other government agencies and countries, as well as the private business sector. Today, Internet technology and the Internet have found massive acceptance and use by hundreds of thousands of organizations around the world…. As of 1 Feb 1995, the Internet consisted of more than 50,000 networks in 90 countries. Gateways that allow at least Email connectivity extend this reach to 160 countries. At the end of 1994, 5 million computers were indicated as actually reachable - with an estimated total of 20-40 million users. Network growth continues at around 10 percent per month.”[1]

In the late 1980s the joining of the Internet with other similar networks around the world made the Internet even more valuable. It enabled researchers around the world to share their findings. The problem was that it was still reserved for the computer literate. The interface was not user friendly and was usually just text, requiring a knowledge of assorted programming language commands. For the Internet to realize its full potential, a new system would have to be developed. It came through the European Particle Physics Laboratory (CERN). In March of 1989 a scientist named Tim Berners-Lee proposed to CERN a project that would allow researchers to read each other's work over the Internet. Berners-Lee proposed a new language that would include hypertext. This Hypertext Markup Language (HTML) is the language that web pages are written in today. The Hypertext Transfer Protocol (HTTP) was created as the standard for handling these new documents (Magdid, et al., 1995, p. 9).

By July 1992 the idea and software for this new World Wide Web had been disseminated through CERN. It still lacked the sort of widespread impact we see today because the software was designed only for expensive computer workstations. The browser was text based, but the idea of the World Wide Web was gaining acceptance on the Internet. In 1993 the National Center for Supercomputing Applications (NCSA) released a browser that worked with more common computers. Suddenly, the World Wide Web could be utilized by a much larger number of people. The new browsers were more stable, reliable, and relatively user friendly (Magdid, et al., p. 12). Then in 1993 a man named Marc Andreessen suggested an additional HTML tag that would allow a document to include images.[2] This suggestion led the Web to become a truly multimedia medium.

How Multimedia was Introduced

When the first browsers were introduced in 1992 they created a revolution on the Web. The Web had been a textual domain in which only the "techno geeks" could be comfortable. Browsers that utilized a friendlier (relatively speaking) graphic interface and made the transfer and display of images possible created a broader audience. The natural progression was to try to improve upon this basic display of images by including sound. And while we're at it, why not include the moving image? We can transmit plenty of great visual and audio information through coaxial cable.

The idea seems simple enough to the non-techno nerd, but the differences in transmission methods, media, and protocols make it much more difficult. The first transmission of images and sound was simply the FTP (File Transfer Protocol) transfer of a file, the method of moving a data file from one place to another. Once received, the file could be stored and then processed by some application on the client's machine. In 1992 a format for the identification of file type was proposed by Nathaniel S. Borenstein at the ULPAA '92 Conference in Vancouver. Borenstein thought that it would be a great idea for people to be able to do "multimedia email," and he proposed the creation of MIME (Multipurpose Internet Mail Extensions) for use on the Internet. Borenstein's MIME extensions were later incorporated into browsers for the Web.[3] These extensions are now used by browsers to identify what kind of file they are interpreting.
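The job MIME typing performs, mapping a file to a type/subtype label the receiver can use to choose a handler, can be demonstrated with Python's standard mimetypes module, which implements the same registry idea. The helper function here is our own wrapper, offered only as an illustration.

```python
# Sketch of MIME identification: a filename is mapped to a
# "type/subtype" label such as "image/jpeg", which tells the
# receiving program how to interpret the bits.
import mimetypes

def content_type(filename):
    """Guess the MIME type a browser would use for this file."""
    mime_type, _encoding = mimetypes.guess_type(filename)
    return mime_type

jpeg = content_type("photo.jpg")  # "image/jpeg"
audio = content_type("clip.au")   # "audio/basic"
```

The label travels in the message header, so the receiver never has to guess from the raw bits what kind of data has arrived.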

As browsers became more powerful they could process more of the information within the program. Netscape now includes a player for .au and .aif audio files. When combined with the ability to display .gif and .jpg images in a document, this makes for a better, but still very basic, multimedia experience.

The addition of the programming language Java has further increased the capability of web browsing. It enables browsers to update images, trigger sounds, react to user input, and more. There are now over thirty different programs that enhance the capability of certain browsers.[4] These “plugins” are being created by independent software makers to add functionality. There are plugins that allow users to view pre-formatted documents, VRML (Virtual Reality Modeling Language) documents, CAD (Computer Aided Design) documents, and more.[5]