Broadband: Bringing Home the Bits
Committee on Broadband Last Mile Technology
Broadband is a means to an end: It refers to capabilities that people will use, but the question is, how and why? There is much excitement in some quarters, but many consumers who are unsophisticated about information technology and networks and do not yet have experience with broadband connectivity have expectations that are very much at odds with current reality. Indeed, as the discussion below suggests, there is much potential for future applications that enrich or complement traditional content and communications channels, but excitement about them should be tempered by an appraisal of the time frame in which these applications could be realized.
In practice, what broadband customers see today is largely a better version of the Internet access that they enjoyed with dial-up ISP service, featuring Web-page viewing, e-mail access, messaging, and the like. This experience is enriched by improved access to audio materials (most notably, music and Internet radio) and, albeit less frequently today, video. A few new broadband-only applications are available today, such as network backup and storage. This incrementalism may be inevitable for economic reasons, but it also disconnects the application experience of today from that anticipated as a result of enhancements to familiar applications, the introduction of new applications, and the integration of diverse broadband-based activities into everyday life.
As the examples in this chapter illustrate, technology capabilities are one constraint on new applications. Current-generation DSL and cable modem technologies are unable to provide large quantities of high-quality video-on-demand. And distributing content within the home in a useful way--i.e., at least as well as today's conventional consumer electronics (television, radio, and stereo systems) do--remains a significant "systems integration" problem involving broadband hardware and software, in-home networking, and consumer appliance design.
There are, nonetheless, a number of places with experience in new broadband applications. The limited pool of users with broadband at home today, together with a larger set of users who access the Internet at high speeds in the workplace or through the networks of academic institutions, provides some indication of the sorts of applications that could emerge on a mass-market basis. Experimentation in industry and academic laboratories provides another indication of potential applications. These early adopters and their applications may not generalize completely, and not all of these services will necessarily succeed from a business standpoint, but they do illustrate how people respond to the availability of broadband and integrate it into their activities at large. This chapter explores the characteristics of a variety of present and future applications and examines some of their technical and related socioeconomic features.
Key technical characteristics--the bandwidth (upstream and downstream), latency, jitter, addressability, and "on-ness" (always-on), as defined in Chapter 2--distinguish several currently deployed or potential classes of applications. This section outlines the overall characteristics of each class and provides one or more specific examples of applications within each class. Notwithstanding their seeming variety, possible applications by and large depend on a few core, or primitive, signal or traffic types and connection characteristics, such as always-on. These core traffic types are characterized by their basic data rates, by whether they rely on file download or streaming (which in turn may have particular latency and jitter requirements), and the like. Performance and quality trade-offs reflect the interaction between the broadband link and other capabilities such as coding and compression and local storage.
Although there is no rigorous taxonomy of broadband applications, it is useful to draw associations between key characteristics of broadband and major application classes. For example, video-on-demand and other media streaming applications rely on the availability of downstream bandwidth, while information appliances require always-on service even though the bandwidth requirements may be low (see Table 3.1). Also of interest are "composite" applications that rely on a set of capabilities. For example, shared sports viewing requires substantial upstream and downstream bandwidth simultaneously. Furthermore, the composite broadband use in a home may be made up of multiple applications being used simultaneously by different family members.
The primary motivation today for residential broadband access is simply to improve the performance of the overall Web browsing experience. While many factors actually influence the perceived speed of Web browsing--including, most notably, the performance of the server itself and the performance of the server's connection to the rest of the Internet--moving from dial-up speeds to broadband speeds on a consumer's Internet access link will almost always provide dramatic perceived speed improvements in general Internet usage.
In addition to making the general Web experience more enjoyable, this speed improvement can also mean that new types of content become usable by the consumer. There is, for example, a widely held belief among commerce site operators that it is essential to minimize page-load times.1 Commerce sites thus depend on network performance in designing their pages, and any increase in that performance (either on average or for specific users that they can identify) means that they can increase the richness (and hence possibly the value) of their pages. For example, small images might be replaced by higher-resolution pictures that more closely approximate the quality available in print catalogs.
Other Web usage, such as simply reading long articles (for example, from online news sources), becomes more enjoyable with greater bandwidth, and hence the Web is a more attractive medium when the effective speed of information display approaches that experienced in physical page turning. Finally, certain types of real-time applications, such as streaming stock quotes, depend upon speed and timeliness to be valuable. Such applications can often run continuously in a part of the screen and attract user attention intermittently. To be effective, however, these applications need bandwidth sufficient both for their own performance and for that of whatever other network interactions the user may be engaged in at the same time.
Messaging of various kinds continues to show up in surveys as an important application. For example, a Jupiter MediaMetrix assessment of AOL usage for January 2001 reported that of 22 billion minutes spent on AOL's online service, 4.7 billion were spent on AOL e-mail, 2.8 billion on internal instant messaging, and 6.2 billion minutes on AOL instant messaging with users outside of AOL's online service; this contrasts with 2.1 billion inside all AOL content channels.2 Although many saw messaging as an application geared toward entertainment, it is also seeing increased use in a variety of business environments. While not demanding in terms of bandwidth (dial-up bandwidths are sufficient), messaging is enhanced by broadband because the connection is always on.
Many users are familiar with downloading e-mail attachments or software upgrades. But many bulk file transfers are simply not practical without broadband. For example, downloading an entire application that might otherwise be delivered on a CD would require many hours over even the best dial-up connection--a 60-megabyte (MB) file would take about 4 hours on a link with a sustained 35-kbps transfer rate. For most people, this length of time is simply impractical, particularly if the dial-up line is also used for voice communications or is subject to periodic disconnection. On the other hand, a constant connection to the network at even modest broadband speeds may make such transfers reasonable.
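The dial-up figure quoted above is easy to verify. The short Python sketch below computes transfer time from file size and sustained link rate (assuming 1 MB = 10^6 bytes and ignoring protocol overhead):

```python
def download_time(file_size_mb: float, link_kbps: float) -> float:
    """Return transfer time in hours for a file of file_size_mb megabytes
    over a link sustaining link_kbps kilobits per second."""
    bits = file_size_mb * 8 * 1_000_000   # 1 MB taken as 10^6 bytes
    return bits / (link_kbps * 1000) / 3600

# The 60 MB application download discussed in the text:
print(f"{download_time(60, 35):.1f} h at 35 kbps dial-up")     # ~3.8 h
print(f"{download_time(60, 1000):.2f} h at 1 Mbps broadband")  # ~0.13 h
```

At even a modest 1 Mbps broadband rate, the same 60 MB download drops from roughly 4 hours to about 8 minutes, which is the difference between an impractical transfer and a routine one.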
It is important not to underestimate the impact of fast file-downloading capability on a very wide range of applications, including audio and video. Streaming is complicated compared with file downloading, and the main reasons that people use it, other than for real-time delivery, are that the files are so large that users do not want to wait while they download; that the files are too big to store locally conveniently (although storage space is rapidly becoming very inexpensive); and/or that there are intellectual property protection concerns (though applying digital rights management technologies to stored files can provide protection comparable to that of encrypted streams). If one can move music files in a few seconds, videos in a minute or two, or an entire newspaper or book in a minute, many applications become practical. In addition, the economics are becoming more appealing with the spread of very large, cheap storage units. Downloading is of particular value when one wants the content for portable appliances--such as e-book readers or music players--though making this easy for consumers depends on addressing the in-home connectivity issues discussed below. Some typical figures for media bit rates and file sizes, together with user-oriented parameters such as download time, give a practical sense of the relationships between media type, broadband link capacity, and download time (see Table 3.2). Already, surveys correlate audio and video downloading with broadband,3 while indicating that (as of early 2001) fewer than half of home computer users used a media player.4
The interactivity demands of some games were alluded to above. Multiplayer games are of considerable interest because they connect growing numbers of people in a shared activity ("massively-multiplayer role-playing games"), providing both social and demand-stimulating dimensions. As of fall 2000, for example, Everquest involved up to 100,000 simultaneous users out of more than 300,000 paying subscribers. Of those subscribers, 30 percent had broadband connections. According to Sony, which provides Everquest, availability and reliability are key requirements; latency is less important in this game than in the shooter variety; and bandwidth demand is moderated by a design that presents graphics on the client software and transmits only changes in graphics and in game and character state.5
While activities based on Web browsing are generally improved by faster network connectivity, a small number of Internet-based applications are particularly sensitive to connection speed, latency, and response time. The two most prominent are day trading and some forms of multiplayer games (in which delays of as little as 50 milliseconds can impair game play). Note that these activities are not generally done through Web browsers, but rather through special-purpose interface software. Both of these call for functionality not easily achievable through any other means, suggesting they will continue to drive interest in broadband.
The most common model for consumer software distribution is one in which consumers purchase applications on CD-ROM or for Internet download. These are one-time purchases; except for upgrades, there is no recurring cost. In many cases, software vendors choose to sell their software in large bundles, of which the office-application suites are the most common instance. An alternative model being explored by software vendors is the rental of particular applications on a per-use basis. Simple examples include financial planners and simple tax-preparation software built out of Web forms. While greater bandwidth would offer faster download times, it is unclear to what extent acceptance of this model depends on bandwidth or on consumer acceptance of a model in which the individual does not own the software. In many cases (for example, tax preparation software), users may want to control the data locally for privacy and security reasons. Other applications, such as games, could be obtained through rental without raising such concerns.
Network storage applications provide users with an alternative to storing data on local hard drives or on removable storage media such as floppy disks or CD-ROM. There are two major advantages to this service. First, people can use network-based storage rather than run their own local servers to do such things as sharing photos. It is hard to know whether storage will migrate into the home or out of the home when material can be stored in either place--much undoubtedly depends on pricing, confidence about access controls for out-of-the-home storage for certain kinds of materials, and so forth. Second, network-based storage provides redundant off-site storage. This is likely to be attractive to small and home businesses and to people who require disaster recovery (which might well include anyone with a PC who has had a disk crash). Privacy issues can be handled by placing only encrypted data on the remote store. For small businesses it seems likely, for reasons of performance, management, and support, that content will be hosted remotely by commercial Web hosting services rather than at the small business site.
The requirements depend on what sort of data is being stored. For example, photo (or video) storage may require relatively high upstream capacity to permit uploading in a reasonable time (and not tie up the connection). But file-system backups, which normally need to transfer only periodic, incremental updates, depend more on the always-on nature of the connection rather than the bandwidth (unless the volume of modified data is very large). One can imagine the emergence of a generation of operating systems with automatic continuous backup across the network as an option--greatly reducing the likelihood of data loss due to disk crashes or other computer failures.
Several interesting video applications depend on the ability to deliver still photos or short video clips. The emergence of inexpensive--albeit more expensive than their analog counterparts--digital still and video cameras enables easy capture of photos.
Because many audio applications do not demand especially high bandwidth, in notable contrast to video applications, they often work with at least some level of functionality over a fast dial-up connection. All of the currently deployed broadband technologies are fast enough to support the key audio applications that have emerged to date. These include conventional voice similar to telephony; voice as a complement to games and other interactive applications; and a full range of sound applications, beginning with music but including other types of content (e.g., news and other spoken word). As a result, some experience has been gained with the delivery of audio applications over the Internet in general, and via residential broadband in particular. This experience supports a key theme of this chapter--for many applications, the bandwidth provided by broadband services is a necessary but not a sufficient condition by itself to make an application work effectively. Factors such as which home networking technologies are used, the availability of special-purpose appliances, and the nature of user interfaces are also critical enablers of widespread use of audio applications. While there is much interest in broadband for video delivery, this chapter devotes considerable attention to audio as well, both because it is an important application and because understanding of audio applications is better grounded than that of video applications, given early efforts to deploy various audio applications.
Fundamentally, there are two ways to approach audio delivery--a file can be downloaded to a local computer and then played, or the data can be streamed from a remote computer to the local computer, played more or less as it is received. Clearly, the file transfer model is appropriate only for distributing prerecorded material; conversations by their very nature have to be conducted in a streaming mode, and streaming is also essential for "live" content that has high time value (such as commentary on a sporting event). The use of streaming delivery does not mean that the audio is necessarily listened to in real time. While some streaming applications use encryption to make it difficult to keep a copy, others permit a copy to be saved to a file for replay or other later use.
Streaming audio requires an end-to-end network connection that is fast enough to handle the actual encoded size of the audio file on a second-by-second basis (one end may be at a content server located somewhere within the broadband provider's network). In some applications, a technique known as buffering can be used to prevent transient network delays from interrupting playback. Audio is played out from one end of the buffer while newly received audio is added to the other. Still, network delay and jitter must be kept within bounds so that the buffered data are sufficient to imperceptibly smooth over these delays. The acceptable buffer size depends on human factors that vary according to the application. A few-second pause between when a request is made to play a song and when the song starts playing is probably acceptable, but a significantly longer delay is likely to be annoying. Delays of anywhere near this magnitude in a voice conversation are very distracting, however, as is familiar to anyone who has contended with even the half-second round-trip delay on a geosynchronous satellite circuit.
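The buffering idea can be illustrated with a small simulation. In this sketch, the 20 ms packet interval and the uniform random delay model are illustrative assumptions, not figures from the text; a packet whose network delay exceeds the buffer depth misses its scheduled playback time and causes an underrun:

```python
import random

def underruns(buffer_ms: float, jitter_ms: float, n_packets: int = 10_000,
              interval_ms: float = 20.0, seed: int = 1) -> int:
    """Count playback underruns for packets sent every interval_ms,
    each subject to a uniform random network delay up to jitter_ms.
    Playback of packet i is scheduled at buffer_ms + i*interval_ms."""
    rng = random.Random(seed)
    misses = 0
    for i in range(n_packets):
        arrival = i * interval_ms + rng.uniform(0, jitter_ms)
        deadline = buffer_ms + i * interval_ms
        if arrival > deadline:
            misses += 1
    return misses

# A buffer at least as deep as the worst-case jitter never stalls:
print(underruns(buffer_ms=200, jitter_ms=150))  # 0
# An undersized buffer misses deadlines constantly:
print(underruns(buffer_ms=50, jitter_ms=150))   # > 0 (most packets late)
```

This is the trade-off the paragraph describes: a deeper buffer absorbs more jitter but adds start-up delay, which is tolerable for music playback and intolerable for conversation.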
There are other circumstances in which the length of the delay affects acceptability. If one is streaming a live event, the sensation of being live is dependent on the stream delay. If users have access to the event through other media, they may notice even relatively small delays. For example, there are data streams that deliver information on sporting events. If a user runs one of these concurrently with an audio streaming feed of the sporting event, the inconsistencies may be noticeable--for example, a play is reported on the data feed before it is heard over the audio feed.
Another parameter affecting the performance of streaming audio is the packet loss rate. Typically, lost packets are not retransmitted because the resulting delays (the sum of the time it takes to determine that a packet is lost, the time it takes to transmit a request across the network, and the time it takes for the replacement packet to be delivered) would make the playback jerky.6 For typical audio applications, occasional packet loss turns into distortion and sound disruption and causes variable quality as the sound is reproduced. Depending on what the audio is and how often packet loss happens, this effect can be very annoying, though it may not be enough to make the application unsatisfactory. People do accept a certain degree of impairment due to interference when listening to the radio and tolerate brief dropouts when using cell phones or wireless handsets for wireline phones.
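As a rough illustration of why even small loss rates matter, the sketch below (assuming 20 ms audio packets, a duration typical of packetized voice, and independent losses; both are assumptions, not figures from the text) converts a loss rate into an expected count of audible disruptions:

```python
def audio_gaps_per_minute(loss_rate: float, packet_ms: float = 20.0) -> float:
    """Expected number of lost audio packets (audible gaps, absent any
    concealment) per minute of playback, for independent packet losses."""
    packets_per_minute = 60_000 / packet_ms  # ms per minute / ms per packet
    return loss_rate * packets_per_minute

# Even a 1 percent loss rate yields on the order of 30 gaps per minute:
print(audio_gaps_per_minute(0.01))  # 30.0
```

Whether 30 brief disruptions per minute is tolerable depends, as the text notes, on the content and on listener expectations shaped by radio and cell-phone experience.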
In the file download model, the key question is how long the user is willing to wait to receive the file. Simple calculation of the transfer times required for a 5-minute musical recording at different bandwidths yields an indication of the timescales involved (Table 3.3). Note that these times assume that the server transmitting the music has sufficient capacity to support the transfer rate offered by the last mile link and that there is no backbone network congestion that would reduce the effective transfer rate. In many real-world applications, either or both of these may turn out to be the actual limiting factors.
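The entries in a table like Table 3.3 follow from a one-line calculation. The encoding and link rates below are illustrative, and the sketch assumes, as the text notes, that neither the server nor the backbone is the bottleneck:

```python
def transfer_seconds(duration_s: float, encode_kbps: float,
                     link_kbps: float) -> float:
    """Seconds to transfer a recording of duration_s seconds encoded at
    encode_kbps over a last-mile link sustaining link_kbps."""
    return duration_s * encode_kbps / link_kbps

song = 5 * 60  # a 5-minute recording, as in Table 3.3
for link in (56, 384, 1500):  # dial-up, entry DSL, cable-class rates
    print(f"{link:>5} kbps link: {transfer_seconds(song, 128, link):6.1f} s")
# 56 kbps -> ~685.7 s; 384 kbps -> 100.0 s; 1500 kbps -> 25.6 s
```

Note that once the link rate exceeds the encoding rate (here 128 kbps), the transfer completes faster than real time, which is the crossover at which downloading starts to compete with streaming.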
The acceptable download time depends considerably on the details of the application. For example, if the user wants to hear a song as soon as it is selected, taking more than a few seconds to cue up the song may not be satisfactory (and streaming may be a more appropriate approach). If the application downloads a collection of music in the background for later listening, however, longer transfer times may be acceptable. But if the goal is to download a compilation of music and immediately listen to it "off-line" on a portable device, taking more than a few seconds to download each song is probably unacceptable.
Compression is a key determinant of the network performance requirements for an application such as audio. Compression algorithms for media rely on two basic principles--the removal of redundancy and the reduction of irrelevancy in the input signal. While mathematically lossless compression--in which the original signal can be completely restored upon decompression--is used in some archival, legal, and medical applications, the pragmatic goal in most media applications is some degree of perceptual losslessness.
While acceptable spoken voice quality is provided by a data rate as low as 4 kbps, music playback covers a wider range of data rates. Present storage and transmission costs generally mean that the maximum practical compressed signal data rate for many applications is 32 to 64 kbps. MP3-type encoding is commonly used today to compress audio at a variety of compression ratios. MP3 at 64 kbps provides a quality roughly analogous to (analog) FM radio quality--acceptable in some applications, particularly if it is to be played back through a low-quality system, but not as good as a CD played using high-quality equipment. Compact disk (CD) quality using today's compression algorithms requires 128 kbps.7
The gap may also be growing between what generally available bandwidth supports and state-of-the-art audio. Consumer electronics companies are currently beginning to promote a series of super-high-fidelity recording schemes using higher-capacity DVD (digital versatile disk) media that provide a much higher quality than that of CD audio. Multichannel sound proposed for future HDTV-class applications would require a higher bit rate, with 320 kbps being a conservative figure for 5-channel sound.8
The wide range of bandwidth-quality trade-offs for sound is illustrated by radio broadcasting that is being streamed across the Internet. At the low end, services such as spinner.com stream sub-FM radio quality music at roughly 20 kbps. Much content is streamed at rates in the range of 20 to 100 kbps, with the low end serving dial-up users and the high end aimed at users in the workplace or with residential broadband. Toward the high end of that range, the quality lies somewhere between FM radio and CD quality. And at the high end lies uncompressed full-fidelity radio broadcasting at a data rate of 1.4 Mbps, as was demonstrated at the October 2000 meeting of the Internet2 consortium. The majority of applications moving audio over the network today, however, operate toward the lower end of the quality-bandwidth curve.
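The 1.4 Mbps figure for uncompressed full-fidelity audio follows directly from CD audio's sampling parameters (44.1 kHz sampling, 16 bits per sample, two channels):

```python
def pcm_kbps(sample_rate_hz: int, bits_per_sample: int, channels: int) -> float:
    """Raw (uncompressed) PCM data rate in kilobits per second."""
    return sample_rate_hz * bits_per_sample * channels / 1000

# CD audio: 44.1 kHz, 16-bit, stereo -> about 1.4 Mbps, the uncompressed
# rate cited in the text.
print(pcm_kbps(44_100, 16, 2))  # 1411.2
```

Against this baseline, the 20 to 128 kbps rates discussed above represent compression ratios of roughly 10:1 to 70:1, which is why perceptual coding quality varies so widely across that range.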
The range of technology options today supports the observation that there are very different thresholds for what constitutes "minimally acceptable" music quality and what constitutes "high" quality. This is a very subjective matter--many people are willing to listen to AM radio, a large number find FM radio acceptable, and some significantly prefer CDs over FM because of the quality difference. In addition, acceptability varies from one recording to another; some lossy algorithms work reasonably well most of the time, but occasionally, particularly for certain types of music, they produce artifacts that are very audible and annoying to some minority of listeners, who will reject the compression strategy on this basis.
Applications such as telephony also require two-way delivery to and from the home. The same coding issues that arise for other audio also arise here, and there are tighter constraints posed by the more limited upstream bandwidths in today's broadband technologies. However, data rates alone will not compensate for inexpensive or poorly positioned microphones or for ambient noise. If one is to use a broadband connection to the Internet to substitute for conventional voice telephony conversations, a good handset will still be needed. It will, for example, be problematic to hold conversations having good sound quality using the PC analog of a speakerphone that is not close to the speaker's mouth, just as it is with conventional telephony.
To better understand the requirements of audio-based applications over broadband, it is important to examine a set of specific applications and practicalities of each application, including what consumers are likely to expect.
Playback of Music. Today's music-playback applications are attractive to people who like the convenience of playing music on their computers, who want free music via peer-to-peer applications, who want to listen to radio stations that do not broadcast in their geographic region, or who want to listen to events that they cannot get access to in other ways. This content is often not reproduced on high-fidelity equipment. As noted above, options for music content distribution available today are also generally inferior in quality to a well-produced audio CD. In order for network-delivered audio to substitute for audio CDs, at least for people who are particular about sound quality, it will be necessary to move up the quality-bandwidth curve somewhat from where typical applications are today.
While a number of PC-based audio applications have enjoyed widespread use, it is unlikely that consumers will want to be forced to sit near a PC whenever they listen to music. The configuration of a home will depend on household income, personal preference, and the like, but most homes have devices in various locations. For example, the average number of radios per U.S. household in 1998 was 5.6.9 Multiple audio CD players are also commonplace, and many homes have one or more high-performance stereo systems. Normally, each of these is controlled locally by selecting radio stations or inserting CDs and selecting tracks. If these devices are to be replaced by network-based playback, one of two configurations will be required: (1) specialized appliances in each room that are connected to a computer that is in turn connected to the broadband network or (2) specialized appliances that directly connect to the broadband network, probably through a home network. End-to-end streaming audio also depends on the performance of the in-home network's being roughly as good as that of the wide area network. This is generally not a problem with today's technologies; the slowest on the market now run at about 1 Mbps (HPNA 1.0 and HomeRF), and the trend in home networks is to support roughly 10 Mbps (HPNA 2.0, 802.11b, and HomeRF), with higher rates possible in the future (see Box 2.1 in Chapter 2). Depending on whether the desired content is stored locally or not, supporting multiple devices throughout a house may require delivery of multiple channels of sound. Current-technology broadband may be adequate to support two such connections with appropriate coding, but the presence of multiple radios suggests that audio could support demand for higher bandwidths.
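A rough capacity check illustrates the point about multiple simultaneous streams. The 25 percent reserve for protocol overhead and other traffic is an illustrative assumption, not a figure from the text:

```python
def max_streams(link_kbps: float, stream_kbps: float,
                overhead: float = 0.25) -> int:
    """Number of simultaneous audio streams a shared link can carry,
    reserving a fraction 'overhead' of capacity for protocol overhead
    and other household traffic (25% is an illustrative assumption)."""
    return int(link_kbps * (1 - overhead) // stream_kbps)

# 128-kbps (near-CD-quality) streams on the home networks mentioned:
print(max_streams(1_000, 128))   # ~1 Mbps (HPNA 1.0, HomeRF): 5
print(max_streams(10_000, 128))  # ~10 Mbps (HPNA 2.0, 802.11b): 58
```

Under these assumptions, even the slowest home networks on the market comfortably carry a handful of audio streams, consistent with the text's observation that the in-home network is generally not the constraint for audio; the broadband access link, which may sustain far less than 1 Mbps, is the more likely bottleneck.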
Audio applications place considerable demands on the distribution networks within the home. If audio playback is to be available in all the places where people are likely to want the broadband equivalent of radios or CD players, the home network has to be near-ubiquitous. In some homes, this may require wireless or powerline networking, or the addition of Ethernet cabling and jacks. In other homes, many rooms already have a telephone jack, and phone-line networking is a reasonable option. Another option is to distribute the audio signal itself using radio frequency transmitters and receivers. Indeed, appliances are beginning to appear that use radio connections to a "base station" for listening to radio or music that comes over a network. One could also imagine base station technology that does very low power broadcasting on FM radio frequencies (assuming that FCC spectrum use issues can be resolved) in order to leverage the existing installed base of radio receivers within the home--although this does not address the control interface issue.
Audio applications also require a control interface to select programs, switch from one track to another, and so forth. Each playback device requires some sort of input device, such as a keyboard or touchscreen. With a large collection of digital music, navigation becomes complicated, comparable to choosing selections from several shelves full of audio CDs.
There is a variety of architectural options for audio in the home. For example, audio could be streamed directly to the player or restreamed from local storage on the home computer to the listening point. Cost considerations make it unlikely that each playback device would have its own large store for audio built in, so at least some music is likely to be stored in some sort of household audio library, whether on a general-purpose computer or on some sort of specialized network device. Access controls will also be important; one would not want people outside the household to be able to request music from the home music archive (at least not through an appliance-type interface). Intellectual property issues raise additional access-control issues: presumably, one would expect to be able to play music that one has licensed on any appliance within one's home, but this capability is somewhat at odds with the digital rights management systems being proposed by the music industry (such as the Secure Digital Music Initiative, SDMI), which sometimes tie an audio file to a single device. Comparable complications are raised by the prospect of having remote access to central repositories from multiple devices: does one, for instance, maintain a table of repositories and passwords in each device or install digital certificates in each device? Standards are also important--will consumers be able to select appliances from a variety of vendors or will they be locked into purchasing components (and even content) from a single company?
In summary, bandwidth is only one of multiple technology issues. Until these challenges are addressed, playing audio music over the Internet is likely to be an activity that supplements rather than replaces more conventional music-listening options for most people.
Listening to the Radio over the Net. Fundamentally, listening to radio calls for the same types of facilities as those for listening to audio, but there are a few differences and added complications. Radio is more likely to be streamed from the source rather than stored locally in the home in an audio storage server. Also, the control interface will be different: A radio service involves selecting channels rather than individual pieces of music (although the rise of video-on-demand makes one wonder about an audio equivalent). If current products are any guide, the interface may well resemble radios of today, with buttons used to select preset favorite channels. Some means of access to directories of radio stations on the Web, analogous to tuning in stations by frequency, is probably also required. And, faced with a greater number of choices, people may seek out new services along the lines of those being introduced in television services, such as TiVo and Replay. These include program guides and the capability of scanning program guides automatically for programs of interest. Another possible feature would be the capability of saving the last, say, half-hour of a broadcast to permit selecting a particular song or other material for local storage and later playback (there are, of course, interesting intellectual property rights issues to be worked out in this sort of scenario).
Network-Based Voice Telephony. In recent years, there has been growing interest in running telephony over general-purpose data networks, including the public Internet, instead of over the public telephone network. As an application of dial-up Internet service, Internet telephony arose as a less expensive alternative to conventional telephony. The decreased costs to users are a result of several factors: (1) by utilizing the Internet, which is typically made available to residential users on a flat-rate basis, Internet Protocol (IP) telephony avoids the per-minute charges generally assessed by long-distance carriers; (2) because a long-distance call can be placed through a local call to an ISP, these services bypass the per-minute access charges that long distance companies are required to pay local exchange carriers to terminate long-distance calls. From an overall industry perspective, there are also players moving toward replacing specialized telephone gear with IP-based equipment, seeking both to reduce costs and to introduce new functionality. There may also be efficiencies that result from running data and voice over a common network. IP telephony is being used today by some households and within some enterprise networks; it is increasingly also being used internal to the networks of a number of telecommunications carriers. Both deployments raise a series of complex policy issues.10
With residential broadband, which offers much greater bandwidth and always-on connectivity, IP telephony has the potential to move from a relatively marginal, hard-to-use application to a mass-market application. Depending on the architecture of a particular service, it might be a service that consumers run over their Internet connections simply by installing additional software and possibly making arrangements with a third party. Or, it may be a value-added service offered by the broadband service provider. In either case, IP telephony provides a way to bypass the local exchange carrier for telephone service. Price may not be the only selling point: Because IP telephony permits much more rapid innovation in services, the ease with which new features can be added may prove an additional customer draw.
Voice telephony applications do not require especially high bandwidth: 64 kbps--or less with compression--in each direction is sufficient to provide the quality that people are used to from the conventional phone system. But these applications are much more sensitive than the pure "listening" applications to network delay, jitter, and packet loss. Multiway conference calling raises additional architectural and performance issues. It can be implemented in several ways, including a series of point-to-point connections between individual participants and a control unit, or on a distributed basis using multicasting. For multiway conference calling, control of delay and jitter is even more critical because of the number of sites involved. Whether meeting these requirements is best done by increasing network capacity or by incorporating quality-of-service mechanisms into the network--and if the latter, which sort of mechanisms at what places in the network--is an open question.
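To make the 64-kbps figure concrete, the sketch below estimates the on-the-wire rate for one direction of an IP telephony call once packet headers are added. The codec rate, packetization interval, and header sizes are illustrative assumptions (typical defaults for uncompressed 64-kbps voice carried in 20-ms packets), not figures from this report.

```python
# Back-of-the-envelope voice-over-IP bandwidth estimate (illustrative
# sketch; the codec and packet parameters below are common defaults,
# not figures from the report).

def voip_bandwidth_bps(codec_bps=64_000, frame_ms=20,
                       header_bytes=40):  # IP(20) + UDP(8) + RTP(12)
    """Return the on-the-wire bit rate for one direction of a call."""
    payload_bits = codec_bps * frame_ms / 1000   # bits of audio per packet
    packets_per_s = 1000 / frame_ms
    return packets_per_s * (payload_bits + header_bytes * 8)

if __name__ == "__main__":
    bw = voip_bandwidth_bps()
    print(f"64 kbps codec, 20 ms packets: {bw / 1000:.0f} kbps on the wire")
```

Note that header overhead alone raises the nominal 64 kbps to about 80 kbps per direction, which is still modest next to the delay and jitter requirements discussed above.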
Unlike conventional telephony, IP telephony comes in many varieties. In one major class, conversations are transmitted across the Internet end-to-end. Another possibility is to use IP-based voice service part of the way, perhaps only on the local access link, and connect calls to the traditional voice telephone system through a gateway. Here again the key barriers to acceptance are not bandwidth but the integration of network-based voice telephony with convenient handsets and "dialing" (call setup) devices. Given the popularity of cordless phones, it is unlikely that placing and receiving calls from PCs will find mass-market acceptance (although one might speculate about a PC role as a kind of base station). Other features that are important enablers of widespread adoption include the integration of ancillary services such as answering machines, voice mail, caller ID, and call waiting.
A number of companies have deployed IP telephony solutions, and it is a service that some broadband providers have chosen to add to their service bundles. But it is also possible that residential IP telephony may turn out to be a red herring. Voice telephony is important enough to most people that they are not willing to replace it with an unreliable service unless there is a compelling economic justification. Also, the voice telephone system works well--it is reliable, easy to use, and inexpensive (and, it seems, becoming less expensive every week, at least within the United States). Except for people who are tremendously sensitive to the modest costs of long distance today (with rates of $0.05 to $0.07 per minute available as part of various calling plans) or who often place costly international calls (where IP telephony can effectively skirt the very steep tariffs still imposed by some countries), IP telephony may not be attractive unless it comes as an absolutely simple and seamless by-product of a broadband connection.
Audio Filtering and Searching. Audio, radio, and telephony are, for the most part, translations of existing applications to the network environment. This section concludes with a novel audio application--audio searching and filtering--which illustrates the potential for audio applications to pose considerably greater bandwidth requirements. The fundamental idea behind this class of applications is that one can use a computer program to "listen" for certain keywords in one or more audio streams using speech recognition technology. When the program recognizes one of the keywords it is looking for, it takes various actions, such as saving a segment of the audio stream, notifying a person, or putting the stream on speaker. (This, of course, presupposes the availability of large-vocabulary, speaker-independent keyword recognition software.) Key networking issues include how many streams need to be monitored and how large the streams are.
A reason for mentioning this particular application is that for all of the other applications discussed here, the number of channels is basically limited by the ability of a small number of human beings (the members of a household, for example) to pay attention to the audio streams; even if the streams are being recorded for later playback rather than for immediate presentation, a household playing different music in each room and with four people on the phone could use only on the order of 10 concurrent audio channels. With sufficient computing power, one can imagine a search application consuming dozens of audio streams--conceivably even hundreds. Of course, from the point of view of minimizing network traffic, it might be better to push the searching and filtering application into the network, nearer the source of the audio, rather than keeping it close to the edges of the network. But it remains to be seen whether the infrastructure will emerge to make such efficiencies possible. One advantage of filtering and searching in the home, at the edge of the network, is that one can conduct searches privately. It is also the case that the amount of computational power available to search and filter would scale much better if provided at the edges of the network than in the core. That is, if streams are available to the edges of the network (say, via broadcast), then the amount of filtering and searching that can be done is limited only by the number of end points that consumers have. If the filtering must be done within the network, then the capacity can be more difficult to grow in proportion to the number of users. These are the sorts of trade-offs between computing, communications, and storage that inevitably arise when new applications are being envisioned.
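The "order of 10" estimate is easy to check with arithmetic: a handful of music-quality streams fits comfortably in a residential downstream link, while a search application consuming dozens of streams quickly exhausts it. The link and per-stream rates below are illustrative assumptions, not figures from this report.

```python
# Rough aggregate-bandwidth check for concurrent audio streams.
# Stream and link rates are illustrative assumptions.

def max_streams(link_bps, stream_bps=128_000):
    """How many concurrent streams of a given rate a link can carry."""
    return link_bps // stream_bps

# A household's ~10 concurrent channels vs. a search application's dozens:
print(max_streams(1_500_000))    # ~1.5 Mbps DSL-class downstream -> 11
print(max_streams(10_000_000))   # ~10 Mbps shared segment        -> 78
```

A household's attention-limited use fits within a DSL-class link, but a filtering application that wants hundreds of streams would not, which is why pushing the computation toward the source is attractive from a traffic standpoint.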
Video applications--considered broadly--form a useful complement to the audio applications discussed above in terms of understanding what broadband connections may enable and what else other than mere connectivity must be in place. In the public mind, video applications are perhaps the premier consumer applications for broadband, and they exemplify the gap between consumer expectations and what broadband today can actually deliver. Many people have vague ideas that broadband connections will support multiple channels of on-demand, personalized HDTV-quality video--both commercially produced content and interactive videoconference or videophone connections to family, friends, offices, and other destinations. It is not unreasonable to imagine the current or next generation of networks delivering hundreds of channels of broadcast video (including pay-per-view), and there has been experimentation with video-on-demand delivered from broadband providers' local caches, but such services have not been deployed on anything approaching a widespread basis. In practice, most of the video that is available over the Internet for normal users today, even those on a commercial broadband connection, is relatively small images at low and often uneven quality, but improvement in quality over time is expected with improvements in broadband price and performance. Improvements in Internet-based video would not necessarily mean that TV will migrate completely to the Internet: Some observe that existing television is an effective delivery channel for passive entertainment, while others anticipate an ultimate convergence (see Box 3.1).
In addition to sufficient bandwidth and appropriate means of controlling the displays, the growth and multiplication of video applications will be driven in large part by the availability of new capabilities. One such capability is inexpensive flat panel displays (or other technologies delivering the same functionality, such as projectors of some kind) that could be spread throughout the home or office. These technologies could ultimately enable a range of currently exotic applications, from immersive videoconferencing (where one wall of a room simply seems to open into another remote room) to social shared-entertainment experiences (imagine people watching a sports event with friends who are immersively videoconferenced in from a remote location, or a group of musicians playing with each other across the Internet11 ). Cameras are also an important technological enabler, as many of these video applications involve at least some in-home origination of video signals. Current inexpensive video cameras (i.e., so-called webcams) offer only a very small, low-quality image, but the cost-to-performance ratio of digital cameras continues to improve. In-home capabilities for storing and manipulating video--exemplified by incorporation of video capture and editing capabilities into some personal computers and by the advent of digital video recorders--complement the capabilities described above.
Video applications face many of the same issues as audio applications in terms of getting video content to the correct appliances through in-home distribution networks--the computer, the television, or perhaps some type of "videophone" appliance--as well as similar issues of integrating control with actual content distribution. But video permits a number of possible variants with interesting implications for both users and the broadband providers that carry these applications. These include:
As with audio, video can be delivered through two models: streamed video and download-and-play (file transfer). As with audio, interactive video applications require a streaming model. With respect to the download model, video is more demanding than audio: the multigigabyte size of a typical video file makes local disk storage--either temporary or permanent--a challenge, given the capacity of disks today. Simply fetching high-resolution video from a disk to display it after downloading is challenging for consumer-grade personal computers today. Moving high-resolution video around the home from one device to another is also problematic, owing to bandwidth requirements--this calls for in-home networks faster than those typically purchased by consumers today. For streaming, bandwidth requirements for video are much more complex and varied than for audio, owing in part to the much larger range of quality trade-offs and to the fact that video is often accompanied by one or more channels of synchronized audio.
Another complication has to do with the way that video scales. While an audio transmission is always roughly scaled to the frequency range of human hearing, a video transmission defines a "window," a rectangle of pixels that are rendered on some sort of display. If the display has more pixels than the video transmission includes, there are various interpolation methods that can be used to "stretch" the video, but these produce artifacts and quality problems that are quite visible to the human eye if used too liberally. If there are more pixels in the transmission than the display device can render, there are decimation algorithms that can be used to scale the display down; these are less intrusive than the extrapolation/interpolation algorithms if the degree of downsizing is not too great, and they at least permit zooming (magnification) of particular parts of an image. TV screens define one "standard" window; computer monitors define a few others, but it is often the case that one is putting multiple video windows on a single monitor or TV screen (picture-in-picture). If a wide range of flat panel displays becomes commonplace around the home, then matters will become even more complex.
Within a video transmission window, there are issues of pixel depth (i.e., how much color detail each pixel offers) and of compression within and across video frames. There are very sophisticated algorithms available to reduce bandwidth through lossy compression, but these can produce visually annoying artifacts. Another variable is the frame rate--in essence the number of images that are transmitted per second--which will determine how much resolution the video transmission can provide for rapid movement and how jerky the playback will appear. Some applications may call for very low frame rates, while others may require 30 frames per second (a common standard for video) or more to provide a satisfactory viewing experience.
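The interplay of window size, pixel depth, frame rate, and compression can be made concrete with a back-of-the-envelope calculation. The resolution, color depth, frame rate, and compression ratio below are illustrative assumptions chosen to show the orders of magnitude involved, not recommendations.

```python
# Illustrative video bit-rate arithmetic: window size x pixel depth x
# frame rate gives the raw rate, which lossy compression must then
# reduce to something a broadband link can carry.

def uncompressed_bps(width, height, bits_per_pixel, fps):
    """Raw (uncompressed) bit rate of a video window."""
    return width * height * bits_per_pixel * fps

# A standard-definition-like window, 24-bit color, 30 frames/s:
raw = uncompressed_bps(640, 480, 24, 30)
print(f"raw: {raw / 1e6:.0f} Mbps")                    # 221 Mbps
print(f"at 50:1 compression: {raw / 50 / 1e6:.1f} Mbps")  # 4.4 Mbps
```

Even an aggressive 50:1 compression ratio leaves a rate of several megabits per second, well above what a single voice or music stream requires, which is why video dominates any discussion of broadband capacity.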
Latency and jitter--and packet loss rates--are much more serious issues for streaming video than for audio because of the enormously higher data rates involved. Any kind of delay or packet loss snowballs rapidly into a very visible problem (presumably because it is hard to have a big enough buffer in RAM and because there is little redundancy when interframe compression algorithms are used). A body of experience suggests that, for many applications, reasonable results can be obtained by giving priority to a good-quality audio signal and simply doing the best that one can with video, using the remaining bandwidth. This seems to be true in a great deal of talking-head-type video and videoconferencing.
Finally, there are some major shortcomings today in protocols and standards for managing video delivery. Ideally, display devices (or display windows mapped onto display devices) would be able to tell a video source what their capabilities are. The applications that manage these displays would be able to assign priorities to different characteristics of the composite video and audio streams being transmitted to them. At a higher level, there is a need for identifying which of multiple video streams need to take precedence at a given moment. And all of this is complicated by models that stream video signals into the home through a single gateway that redistributes the signals to appropriate appliances using a home network.
With all of these variables, it is hard to tabulate simple bandwidth requirements for various video applications. There is also more room for subjective judgment about what constitutes acceptable video quality--a judgment that is more content-dependent than is the case with audio. In the audio world, it often suffices to differentiate between requirements for voice-quality and music-quality audio. In contrast, a video transmission channel with a given set of parameters may be quite satisfactory for a "talking head" or a concert yet completely unacceptable for a sporting event; that same channel might be reasonable for a video-camera-based portal to a remote location that is shown in a digital picture frame but unusable for a videoconference between individuals. Another variable is the impact of audio in improving perceived video quality.
When video is considered as a personal communications medium, most people probably think of teleconferencing. However, widespread broadband may also make practical a more general capability of telepresence--having a continuous video window open into another space. Whereas teleconferencing brings to mind a fairly formal notion of communication, similar to a telephone call, telepresence can enable much more informal interaction. For example, in a business setting it may enable casual interactions between lab spaces that could permit easier collaborations. Though early telepresence trials were constrained by technical shortcomings, this work also pointed to the significant role that social practices play in the acceptance and usefulness of such applications and suggested that it is difficult to predict when telepresence applications will be successfully implemented.13
In a personal setting, telepresence may enable a parent to have a continuous window on a child at a day care facility, thus enabling a closer ongoing relationship, even with working parents. Telepresence could possibly enable new forms of extended-family relationships over distances. An interesting attribute of telepresence is that it potentially poses higher bandwidth demands than one might expect from videoconferencing applications. This is because the premise of telepresence is that the window is always open, to enable spontaneous observations and interactions. One example that is a simple evolution of telephone use today is school children holding shared homework sessions, connecting their respective homes for many hours of working, chatting, and collaborating on assignments.
Telepresence can encompass not only audio and video, but also haptic interaction, force feedback, and control of remote devices (teleoperation). One especially demanding application of telepresence has been seen in experiments with distributed music performances, which require minimal latency and jitter. Telepresence for music is under consideration for concerts, studio production, and master classes.14
Thus, the bandwidth requirements for telepresence are not limited by the number of people actively engaged in watching the video stream at any given moment. One can easily hypothesize the need for more video streams to be maintained to or from a location than the number of users at that location. Ultimately, such casual real-time applications may drive much higher bandwidth requirements.
Telemetry applications involve primarily numerical data streams. They are expected to grow with the proliferation, and networking, of embedded computing and communications systems--smart appliances and so on--as well as networking capabilities within and from the home. Sensors and controls are being developed for a variety of functions in a household, such as temperature and energy management, utility monitoring, appliance operation, and security. More sophisticated health-monitoring systems are also being developed. For example, it may become possible to undertake skin cancer screening from home, which requires an ability to capture and send high-resolution images.
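The skin-cancer-screening example suggests why upstream capacity matters for image-based telemetry. The sketch below estimates the upload time for a single high-resolution image; the image size and link rates are hypothetical figures chosen for illustration, not from this report.

```python
# Illustrative upload-time arithmetic for image-based home telemetry.
# Image size and upstream rates are hypothetical.

def upload_seconds(image_megabytes, upstream_bps):
    """Seconds to send an image of the given size over an upstream link."""
    return image_megabytes * 8_000_000 / upstream_bps

# A 5 MB high-resolution image over a 128 kbps upstream link:
print(f"{upload_seconds(5, 128_000):.0f} s")   # roughly five minutes
# The same image over a 1 Mbps upstream link:
print(f"{upload_seconds(5, 1_000_000):.0f} s")  # well under a minute
```

The asymmetry of many broadband services works against exactly this kind of application, since the demanding direction here is upstream.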
There is growing interest in telemedicine services that require broadband access. Possible connections include patient-to-doctor (e.g., in rural health care, where travel to the doctor's office is difficult), patient-to-physical therapist (e.g., supporting rehabilitation after a patient returns home following hip surgery), and patient-to-family (e.g., to allow a family to watch a newborn in neonatal intensive care).
Telemetry applications rely critically on the always-on characteristics of broadband and the ability of broadband to multiplex many data streams (for example, to allow a medical device or an appliance to emit and transmit a data stream regardless of what else is going on over the broadband connection), in contrast to dial-up connections. In many cases the data streams involved are low bandwidth. However, some applications, such as webcams or health-monitoring devices that transmit images, could result in demand for capacity that is higher upstream than downstream. Although primarily deployed for planned communication with a medical service, the same broadband connections might also support emergency response capabilities similar to or enhancing today's telephone-based systems. Such services presume, of course, reliable, always-available connections.
Peer-to-peer communication was the original design premise of the Internet. Particularly with the rise of the Web, the focus of communications on the Internet shifted to a client-server model as central Web servers became the primary residence of Internet content. Recently, however, peer-to-peer communication among end systems on the Internet has undergone a renaissance, owing at least in part to the grass-roots movement toward sharing content.
Napster, developed as a way of exchanging MP3-encoded music files, became a widely used peer-to-peer content distribution application. By offloading the file transfer to exchanges between individual computers, it relies much less on third-party servers than would be the traditional practice of many users downloading content from a single server. Napster still relies on a central directory server to provide people with pointers to content, but other peer-to-peer applications have emerged that largely remove this constraint. Gnutella, for example, is a Napster offshoot that allows users to conduct a search among linked, decentralized computers offering content; however, it still depends on some means for users to obtain the Internet address of at least one such linked computer (whether through a Web page, e-mail, or instant messaging). Although these recreational services have received a lot of attention,15 and their fate rests in part on the outcome of litigation and negotiations with the publishing industry, similar technologies have taken off for research activities.16
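The decentralized search that Gnutella-style systems perform can be sketched as a bounded flood: a peer asks its neighbors, who ask their neighbors, until a hop limit expires. The peer graph, file lists, and hop limit below are invented for illustration; real systems add message identifiers, duplicate suppression, and much else.

```python
# Minimal sketch of a Gnutella-style flooding search.  The peer graph
# and shared-file lists are invented for illustration; a real protocol
# involves message IDs, duplicate suppression, and result routing.

from collections import deque

def flood_search(peers, files, start, query, ttl=3):
    """Return peers within `ttl` hops of `start` that share `query`."""
    hits, seen = [], {start}
    frontier = deque([(start, ttl)])
    while frontier:
        node, hops = frontier.popleft()
        if query in files.get(node, ()):
            hits.append(node)
        if hops > 0:
            for nbr in peers.get(node, ()):
                if nbr not in seen:          # visit each peer once
                    seen.add(nbr)
                    frontier.append((nbr, hops - 1))
    return hits

# A chain of four peers; the file lives on the two most distant ones:
peers = {"A": ["B"], "B": ["A", "C"], "C": ["B", "D"], "D": ["C"]}
files = {"C": {"song.mp3"}, "D": {"song.mp3"}}
print(flood_search(peers, files, "A", "song.mp3", ttl=2))  # ['C']
print(flood_search(peers, files, "A", "song.mp3", ttl=3))  # ['C', 'D']
```

The hop limit is what keeps such a search from swamping the network, at the cost of missing content held by distant peers--one reason the text notes that a user still needs the address of at least one well-connected peer.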
The motivations for deployment of these applications are several. Technical arguments include immunity from single-point failure and distribution of traffic load throughout the network. Much of the interest in Napster has, however, stemmed from another factor--the relative protection that peer-to-peer models offer from attempts to control the content distribution. A central Web server is a relatively easy target if one seeks to suppress undesired or illegal activity--in the case of Napster, the distribution of music in violation of copyright--while a distributed network of computers exchanging files is harder to detect and, because it potentially involves thousands or millions of participants, to take action against.17
There are additional compelling arguments for peer-to-peer applications. By their nature, they do not require the installation of servers or arrangements with businesses that offer hosting services (or other capabilities). Thus they offer speed and ease of deploying applications, much in the spirit of the Internet's pure end-to-end model--a new application depends only on software running on the individual computers and adequate network performance, and not on the installation of software on a hosting server. The appeal is twofold: nimbleness that comes from not having to coordinate with any other party when rolling out content or applications, and the freedom and control over one's own content that come from not having to involve a third party. Given such attributes, pilot efforts are under way to use the technology in business, the scientific community, and so forth.18
Although the mass appeal of community-access television is debatable, cable has provided a vehicle for communities, organizations, and individuals to gain some experience and to experiment with video content production. Broadband promises to generalize and build on that experience by enabling a more varied menu of content not constrained by finite studio and broadcast time slots. In the short term, constraints on long-haul bandwidth may preclude wide-area transmission, but most of the interest would be local in any event. Local-interest video programming requires high bandwidth within a community, suggesting that it is most likely to take off with fiber and most likely to be linked early to community-wide fiber networks. The traveling parent who wants to watch the local Little League game will have to settle for very low quality video, or pay very dearly (if that is even possible, since the limiting factors will be community connectivity to the core net, not just the traveling parent's connectivity). However, the ability for remote family and friends to see (literally) local activities is socially valuable; the sharing of family photos, Web sites for children (beginning prenatally in some instances), and other grass-roots activity begun with narrowband suggest the potential for growth.
There are many applications--distinguished by their not requiring delivery of information that changes in real time--that lend themselves to either a model of local hosting or a model in which users upload content to content servers. These include, for example, Web page hosting, making photos available for others to download, and sharing music. (In contrast, there is no substitute for upstream capacity for applications that depend on transmission of delay-sensitive real-time content out of the home, as is required for telephony, videoconferencing, or webcams. Also, home control and other applications that access sensor information and then take control actions must access the actual home.) The content-hosting alternative still requires upstream capacity, but it involves the transfer of content only once each time it is modified; those accessing the content download or stream it from one or more third-party servers located somewhere in the Internet.19 Use of hosting services is a common practice for both business and personal content, and a number of businesses provide services in this area. Third-party hosting offers several advantages. The provider, who specializes in that sort of service, takes on responsibility for appropriate interconnection and colocation arrangements to ensure good performance for users throughout the Internet. A third-party hosting provider also generally provides other desirable functionality, such as redundant facilities, backup power, and data backup.
The choice between these two alternatives--local hosting or use of a hosting provider--depends on many factors. First, there are trade-offs, depending on how often things change versus how often they are used. For example, a home webcam might change once per minute and have to push a new image to a server at that point, but if the image is only accessed rarely, then most of those server updates will have been pointless and the network would be less loaded if users directly accessed the camera. Consumer preference, including such considerations as wishing to maintain personal control over content, also plays a role. The emergence of a "Napster culture" suggests demand for the local hosting approach, but the future of this model is unclear, as is the future of peer-to-peer itself, in part because many of today's broadband services provide limited upstream capacity and because ISPs may discourage or prohibit users from running their own servers or consuming large amounts of upstream bandwidth. The balance might tip as significantly more upstream bandwidth is made available in the local access segment.
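The webcam trade-off described above reduces to comparing two rates: how often the content changes versus how often it is viewed. A minimal sketch, with invented update and viewing rates:

```python
# Push-vs-pull trade-off for infrequently viewed, frequently changing
# content (e.g., a home webcam).  The rates below are invented for
# illustration.

def transfers_per_hour(updates_per_hour, views_per_hour, push=True):
    """Upstream transfers from the home per hour under each model."""
    # Push: one upload per content change, regardless of viewers.
    # Pull: one upload per viewer request, regardless of changes.
    return updates_per_hour if push else views_per_hour

updates, views = 60, 2   # image changes once a minute, viewed rarely
print("push:", transfers_per_hour(updates, views, push=True))   # 60
print("pull:", transfers_per_hour(updates, views, push=False))  # 2
```

When the content changes far more often than it is viewed, pushing every update to a hosting server wastes upstream capacity, and direct access to the camera loads the network less; the comparison reverses for popular, rarely changing content.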
Various business models assume an ability for different kinds of parties to push content into homes--that is, rather than await a specific request, always-on connectivity would enable these parties to transmit content into homes on a variety of schedules. Some of these arrangements would be highly functional--updates to device software, regular and automatic updates to databases maintained in the home, diagnostic probes (which would trigger responses), and so on. Other arrangements may be part of the "price" of a device or service, such as advertising.
Understanding how demand for networked capabilities and services will evolve is extraordinarily difficult. It is apparent that there are multiple broadband applications of interest and that some composite of these is likely to typify future broadband use. To complement rampant speculation, a number of scholarly and corporate entities have begun to develop model homes of the future, which serve as laboratories, showcases, or both, offering potential windows into new options for home life. These possibilities leverage many developments, as explained in a variety of speculative media pieces:
The fusion of technology and materials is making new forms possible. Add the potential of artificial intelligence, biometric sensing, robotics and mass customization, and it's little wonder that designers are imagining a new generation of houses in which people rule their environments, rather than submit to them. Web-linked companies already are rolling out model homes with all the click-and-drag amenities available today. They trumpet a lifestyle in which work, play and shopping are only a palm-held device away. It's the profusion of gadgets, and the dependence on them and the linkages among them, that will define the future of this house.20
These visions imply bandwidth demand associated with both individual household members and devices; people will use networks to communicate with each other, and devices will communicate with each other (and with people) directly, too. The descriptions suggest movement toward more symmetric communication capability--in the limit, equal upstream and downstream capacity--for homes; but how much remains an open question. In the meantime, the descriptions clearly argue for in-home networking and multiple access points in homes, and suggest choices still to be worked out about how data is managed in the home. How the data get sent around or through the home becomes a critical factor in a number of applications, as does the interplay between storage in the home and remote storage. And whether people will have to reboot their homes under various circumstances raises other questions, from how to contain various risks to interactions of the information infrastructure with power supply in the household to disaster recovery options (familiar to businesses and encouraged by commercial insurance).
There are perils in scrutinizing any one application too closely. Although it is important to appreciate how the technical requirements of applications vary, the promise of broadband is simultaneous support for a large number and a wide variety of applications rather than just one or two. Moreover, the broadband vision involves several people in a household using different applications concurrently--perhaps more than one application per individual--as well as more network-based interaction among people in multiple locations (from extended family to work or study collaborators to fellow hobbyists). At the same time, people are already experimenting with--and being subjected to more hype about--mobile computing and communications devices. The eventual context for broadband is thus one of anytime, anywhere networking and an information infrastructure that is pervasive and integral to many places and activities. The individual and social implications of this aggregate of activity suggest new behavioral patterns that themselves may stimulate new applications. The impact on quality of life can be much greater than that suggested by any one application, and its full potential hinges on growth in number of users as well as uses, since some applications involve sharing among households (and/or between households and a variety of public and private institutions).
The enabling technology remains only a piece of the picture, of course, and it interacts with expectations for how it would be used. In the committee's June 2000 workshop, Andrew Cohill, speaking from the experience of the Blacksburg Electronic Village initiative in Blacksburg, Virginia, noted that the aggregate bandwidth demand as conventional applications (communications such as telephony and entertainment such as radio and television) migrate to the Internet would exceed the bandwidth available from today's DSL and cable services. The expectation for significant outflow as well as inflow of content opens up the possibility of new kinds of connections from the home to points outside. For example, a family's (or friends' and family's) virtual private network (VPN) could be established to promote social sharing, much as corporate VPNs enable protected communication among co-workers and others granted access, regardless of location.
The majority of existing and potential broadband applications assume a person at the end of the pipe actively using the content being served, whether he or she is watching a movie, shopping on the Web, or talking to a doctor. With this assumption, there is a potential upper bound on the demand for broadband, as it is limited by the number of people in a typical home. However, some futurists, as well as some commercial appliance vendors, anticipate a demand that is more accurately bounded by the number of information appliances in the home--autonomous consumers and producers of content that rely on the always-on capabilities of a broadband connection.
Although the scenario of the dishwasher that independently calls the repairman for service has met with appropriate skepticism, there are already information appliances in the marketplace and in people's homes. Internet photo frames are a good example. Marketed by various companies, these frames are essentially an LCD (liquid crystal display) with a phone connection packaged as a traditional picture frame. In current versions, the frames connect to the Internet at off-hours (e.g., 3:00 a.m.) and download new photos that have been sent to the appropriate Web site by family members and friends. A simple extension of this idea would be wall art that loads art pieces from various museum collections.
Connecting these displays to live content (perhaps including a time delay) offers the ability to view, say, the London skyline and bridges fresh each day. A poster of Africa in a child's bedroom could be replaced with live webcam images from safari rides and waterholes.21 Returning to the theme of families sharing photos, appliance designers predict aesthetically pleasing (and privacy-preserving) representations of the well-being of a loved one. In one scenario, opening the portable photo frame of a family member while traveling triggers switching an art piece in the home from black-and-white to color.
Returning to the dishwasher, while there are sound objections to appliances summoning service people to the owner's home on their own initiative, it is less far-fetched for an appliance under warranty to preorder new parts upon detecting a failure, or for prescribed medications to directly request refills. Although business models for new Internet services (e.g., online grocery reordering) may not work, some extensions to existing services may prove to be economical and desirable.
Distributed work and education--which depend on e-mail, file transfer, and sometimes on audio- and videoconference capabilities--have long been touted as applications for information networks; both have already benefited from narrowband Internet access. Following significant growth in the 1990s, a sizable minority of companies are believed to offer a telecommuting option to some employees, presumably as a result of the proliferation of personal computing and communications options as well as the impetus provided by a variety of situations (e.g., California earthquakes) that have increased transportation problems.22 At the same time, there have also been reports of dissatisfaction on the part of both employees and employers.
Forecasts have included expectations of growing use of multiple media (e.g., enabling simultaneous transmission of data and voice or of at least two streams of data) and of conferencing involving multiple media, including video as well as audio links. One enabler would be availability of connectivity comparable to the 10 Mbps typical of low-end office local area networks, with more symmetric bandwidth enabling more symmetrical use.
One small study examined reactions of people working at home to a transition to DSL service and found overall satisfaction based on the increase in their productivity attributed to higher-speed connectivity; people also noted that the productivity benefit depended on whether other home-based workers with whom they collaborated also had such connectivity.23 That kind of comment underscores the potential for qualitative change in an activity from widespread availability of a capability--change not visible when availability is unevenly distributed among a population, such as a group of teleworkers.
Distributed education, like distributed work, involves remote access to information and communications. Discussions of distributed education are more likely to involve use of still and moving images with broadband; they also involve conferencing for interaction among multiple students. Note that distributed education is expected to benefit both adults and children.
A new sort of composite application that some have begun to call "tele-webbing," which combines Internet access with conventional television viewing, is beginning to appear. Simplistically, accessing the Web while also watching television would qualify for this description, and indeed it is common for people to engage in other activities while also watching entertainment television that has low attention demands. Thus, the consumer who scans e-mail while watching a sitcom could be said to be tele-webbing. More interesting, however, are cases now emerging where the television watching and Web access are interrelated. For example, many sports Web sites now provide real-time Web applications that feed game statistics to a browser. Having such a site open while watching a televised sports event provides a deeper experience of the event. For an even more real-time experience, experiments have been done with making race-car telemetry information available concurrently with a race broadcast. This allows a measure of user selectivity in how the race is experienced, since the user can focus attention on a particular driver. Finally, various levels of viewer interactivity have been evaluated for making television game shows (which have long elicited vicarious play-along-at-home experiences) truly interactive. All of these ideas involve taking advantage of a second screen that the user can selectively use for added experiences. Importantly, all these applications involve constraints on tolerable latency for the data streams relative to the primary video streams. This class of applications may be another example of where the total bandwidth demand to a home may exceed what the user can consume at any instant because the value of these applications lies at least in part in the user's ability to instantly shift attention from one video feed to another screen full of information.
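The latency constraint on such second-screen data streams can be sketched in code. The class below is a hypothetical illustration, not a description of any deployed system: it buffers timestamped statistics and releases each item only once the (delayed) broadcast has shown the corresponding moment, so the two feeds stay aligned. The delay value and event names are invented for the example.

```python
# Sketch of aligning a real-time statistics feed with a television
# broadcast that reaches viewers with some delay. The delay value
# and events are hypothetical.
from collections import deque

BROADCAST_DELAY_S = 7.0  # assumed encoding/transmission delay of the TV feed

class StatsAligner:
    """Buffers timestamped stat updates and releases each one only
    when the delayed broadcast has caught up to its timestamp."""

    def __init__(self, broadcast_delay_s):
        self.delay = broadcast_delay_s
        self.buffer = deque()  # (event_time_s, payload) in arrival order

    def push(self, event_time_s, payload):
        self.buffer.append((event_time_s, payload))

    def due(self, wall_clock_s):
        """Return stats whose moment has now appeared on screen."""
        out = []
        while self.buffer and self.buffer[0][0] + self.delay <= wall_clock_s:
            out.append(self.buffer.popleft()[1])
        return out

aligner = StatsAligner(BROADCAST_DELAY_S)
aligner.push(100.0, "lap 12: car 3 pits")
aligner.push(103.0, "lap 12: car 7 leads")
print(aligner.due(105.0))  # nothing on screen yet: prints []
print(aligner.due(107.5))  # first event now visible on the broadcast
```

The tolerable latency budget is what makes these applications demanding: a stat that arrives well before or after the on-screen moment it describes loses most of its value.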
Community networking efforts to date provide a window into the interactions and synergistic possibilities presented by greater networking capabilities among people in a given area, who presumably have at least some shared interest in a common set of information or in communicating with each other. With disappearance of a number of the pioneering bulletin-board-type community networks, network communications have tended to become less geographically focused. And as dial-up Internet access via commercial ISPs has become widespread, community networking initiatives have, for the most part, focused less on building local infrastructure and more on content and services. Contemporary approaches to community networks are likely to emphasize a variety of service activities that accompany deployment and facilitate use, such as information resources and training, economic incubation, pilot and demonstration projects, and development of public-private partnerships. But regardless of how it is labeled, the attention to local interests has persisted; it is expressed in the various Web sites established by local governments, schools, libraries, athletic consortia, religious institutions, and so on--a diverse group of sources that defies the categorization of the more controlled local access cable television or local radio station and that offers the potential of upgraded offerings where capabilities are available.
Note that on a small scale, multiunit dwellings (e.g., apartment buildings) can serve as microcommunities: the availability of broadband to individuals in the component units is constrained by decision making of the owner; where the owner is supportive, all units can have this capability, but the reverse tends to be true as well. Also, in some communities, special centers have been established that offer broadband capabilities together with the hardware and software to take advantage of them--a physical portal.24 These communications centers complement the concentrations of demand in such public-interest (and often publicly supported) facilities as medical and education centers of different kinds. Thus, it is important to recognize that community networks have both infrastructural and content dimensions.
There is much potential for future applications that enrich or complement traditional content and communications channels, but excitement about them should be tempered by an appraisal of the time frame in which these applications could be realized and the nontechnical obstacles that retard their deployment. Much of the expectation surrounding broadband involves more than new technology--it also requires a transformation of societal structures, media, and other institutions. This section briefly discusses some of these factors.
One obstacle is the availability of content. A recent television commercial from Qwest exemplified the expectations--being able to access every book ever published in any language and every movie ever made, available on demand over the Internet. In reality, we are some time away from widespread video-on-demand; thousands of channels of "radio" over the Internet; abundant, high-quality educational video content; and so forth.25
In addition to technical obstacles, the familiar chicken-and-egg phenomenon comes into play. Without a mass market of consumers with broadband access, it is hard to develop a business model that justifies investment in new content (or in translating old content). Andrew Sharpless, a new-media businessperson, addressed the committee from his vantage point at the time as a developer of new online services for Discovery Communications. He suggested that at least 10 million households would need to use broadband before meaningful content would emerge, and he noted that cable experience shows that serving 50 million customers is key to lining up advertisers (with online services, a top rating by Jupiter Media Metrix had become key to advertiser interest by 2000).26
Intellectual property rights issues are another large factor--the interests and holdings of broadband providers, users, and rights holders are not necessarily aligned. The 2000-2001 rise of Internet radio raised a set of issues related to content use fees, and the popularity of Napster and other content-sharing technologies heightened rights holders' concerns about control over their intellectual property, making intellectual property more prominent in the development of business plans.27
Finally, although content availability affects demand for broadband, one should not underestimate the volume and value of customer-provided content. Broadband is not only a mass media technology; it is also an interpersonal technology. As noted above, messaging and e-mail are both very popular applications, illustrating the value of broadband for communication as well as content delivery.28 Multiplayer games, one of the few profitable Internet applications today, rely on user-provided content. Telemedicine will rely, in large measure, on user-provided content, plus some professionally prepared patient education materials. Families will generate and want to distribute pictures and home movies.
One category of impact on quality of life derives from broadband's interaction with consumption of media: broadband is associated with the allocation of more time to media consumption overall, in part because it puts Internet use on a par with TV and radio use.29 Whether the increase in media consumption is transient or long term, and what it may imply for (or as a result of) other activities that may receive less time, remains to be seen. It is not known whether the home is truly an infinite sink for bits over time or whether there is some limit that one can compute, based on something like human perception or expectations about other activities going on in the home. In terms of general information access, one could argue that broadband provides limited content beyond that available through dial-up; there is little content available only to broadband users that is not duplicated on cable television today (e.g., C-SPAN). In terms of applications, the prominent examples deal with entertainment. Access to interactive games, or even to a broader assortment of music than one can find on local radio channels, is unlikely to compel public policy support, but these applications, along with day-trading tools and e-commerce, exercise the technology (and, sometimes, the law) and build an experience base for the underlying capabilities.30 Attention to individual categories of information or applications can obscure the larger development, which is a shift to an expectation of ubiquitous access to a variety of information and applications. But ubiquity does not imply endless variety: experience with television shows that consumers limit themselves to seven to nine channels, suggesting both a high cost of searching for acceptable content and that aids to such searching may be an important complement to content innovations per se.
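One crude way to approach the "infinite sink for bits" question is to bound household demand by human perception: each person can attend to only a few audiovisual streams at once. The figures below are illustrative assumptions, not measurements:

```python
# Rough perceptual upper bound on the bits a household can "consume."
# All figures are illustrative assumptions for the sketch.
persons = 4
streams_per_person = 2   # e.g., a primary video plus a second screen
video_mbps = 20.0        # assumed high-quality (HDTV-class) stream
audio_mbps = 1.4         # CD-quality audio per person

per_person_mbps = streams_per_person * video_mbps + audio_mbps
household_mbps = persons * per_person_mbps
print(f"Perceptual ceiling: roughly {household_mbps:.0f} Mbps")
```

The point of the exercise is not the particular number but that it is finite: under any perception-based accounting, household demand has a computable ceiling, even though that ceiling far exceeds what current access technologies deliver.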
The oft-made contrast between children who are exposed to computer-based technology early on and adults who are introduced to it at older ages underscores the potential for cumulative experience to change people's expectations and behavior.
Several emerging applications described above may be more compelling from a policy perspective. Telecommuting can have positive impacts on the environment, local economies, and people's ability to earn a good living. Telemetry and monitoring applications can enhance health care delivery. Basic communications and telepresence applications can help keep children and elderly parents connected. And broadband can be used to deliver more sophisticated (multimedia and interactive) educational content. But many of these applications remain more promise than reality.
There is time to consider and act on possible negative impacts (from the obvious questions about privacy and security to the more idiosyncratic ones relating to cases of excessive use). For example, public- and private-sector attempts to deal with spam originating in the narrowband context are likely to take on new urgency in the broadband context. If past communications media are any guide, people will send information whether there is demand or not, and the prospect of video spam may arouse people even more than fax and e-mail spam have. Security concerns have arisen, associated with the always-on nature of broadband. But with anticipated assimilation of broadband into a technology-intensive household, other concerns will arise. For example, just as people change physical locks after, say, a divorced spouse leaves, a kind of virtual door is developing with broadband, and there may be a kind of virtual set of keys to change, too. This is also a time to address the implications of technology options for the disabled: Some of the envisioned capabilities will make it easier for people with disabilities to remain in their homes; some may require appropriate design for effective use by people with disabilities. Consideration of differences in abilities leads naturally to consideration of human-computer interaction and user interfaces; progress in these areas may facilitate use by all.
While the lag in compelling applications may contain growth in demand for broadband, its silver lining may be to limit the impact of disparities in access and use. Measurements of the disparity are in flux, given progress in deployment and adoption, but significant differences have been noted by region, locality, racial and ethnic groups, income, educational attainment, and age.31 Income and educational attainment tend to drive demographic differences; geographic differences reflect the larger complex of factors governing deployment discussed in Chapter 4. Progress implies deeper understanding of differences in media consumption among population groups and options for diminishing disparities. Consumer advocates report an overall bimodal pattern, with clusters of low-volume and extremely high-volume users of media and differences in terms of what people use communications for. The bottom half pays $60 per month for all services, while the top half pays $200 per month. Different business models are needed to serve the lower half, and observable business models for broadband seem to focus on the upper, more-lucrative half.32 Of course, it is reasonable to expect that during the transition to broadband, even people who could shift will not do so at once, and those who do so first value the capabilities more and will pay more.
The picture painted in this chapter, of multidimensional change in household technology and activities, suggests that raising the floor for residential broadband implies addressing total and life-cycle costs. That is, the broadband-enabled changes in lifestyle and quality of life that could occur presume both network deployment and consumer electronics and applications (software, services), all of which may impinge on household budgets (for acquisition and operation or regular use costs) and requirements for know-how (the aggregate of technologies and activities imply new set-up, use, and maintenance activities). The full, short- and long-term impacts on household economics deserve further study, perhaps in conjunction with the model homes and communities that have been or will be initiated. For example, for connectivity alone, it would be useful to compare the costs of various scenarios, from current options, such as multiple phone lines plus cable or satellite, to broadband connectivity (with or without additional home connections), with different approaches to in-home networking and customer equipment. Business models make different assumptions about demand and willingness to pay.33 Understanding the implications of alternative approaches is a logical element of public- and private-sector planning. The combination of failures and successes in Internet-related and, more generally, media-related services underscores a dearth of social science insight into how people use and respond to new media.
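The kind of connectivity-cost comparison suggested above can be organized as a simple tabulation. Every price here is a hypothetical placeholder standing in for the data such a study would collect, not a survey figure:

```python
# Illustrative monthly-cost comparison of household connectivity
# scenarios. All prices are hypothetical placeholders for the kind
# of comparison the text proposes, not survey data.
scenarios = {
    "second phone line + dial-up ISP + cable TV": 15 + 20 + 40,
    "cable modem + cable TV": 40 + 40,
    "DSL + satellite TV": 50 + 45,
}

# List scenarios from cheapest to most expensive.
for name, monthly_cost in sorted(scenarios.items(), key=lambda kv: kv[1]):
    print(f"{name}: ${monthly_cost}/month")
```

A fuller version of such a comparison would also fold in one-time equipment and in-home networking costs amortized over the service life, which is where the life-cycle framing in the text becomes important.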
Finally, there are the uncertainties relating to what people want and will do. Assimilation of these changes, of course, presumes assuring that the use of these capabilities is perceived as valuable (appealing and relevant) to multiple categories of people. Historically, the introduction of pay-TV programming, which built on the fundamentally familiar medium of television, led many consumers to buy multiple, differentiated services. The new, nonincremental, and interacting broadband options are less familiar than were variations on the TV theme. There is some evidence that willingness to pay increases with consumer control: in the committee's June 2000 workshop, for example, AT&T researcher Andrew Odlyzko compared people's willingness to pay roughly $40 per month for cable television (100 Mbps, consumed about 3 hours per day), $70 per month for wireline telephony (64 kbps, used about 1 hour per day), and $40 to $50 per month for wireless telephony (8 kbps, used less than 10 minutes per day). This line of argument complements that of consumer advocates and others who argue for open access (see Chapter 5 in this report) as a counter to the fear that content coming into the home will be overly controlled by commercial providers. It is not surprising that local efforts that link deployment to economic development tend to feature awareness and training programs,34 while various nonprofit and entrepreneurial efforts seek to generate content that is of interest to specific demographic groups. Recognizing that socioeconomic context affects willingness and ability to use new technology does not necessarily make it easier to devise effective strategies, and trial and error is evident.
1 One rule of thumb, the "8-second rule," states that if it takes longer than 8 seconds for a page to appear on the consumer's screen, there is a high likelihood that the consumer will abandon the site. See, for example, Zona Research, 1999, The Economic Impacts of Unacceptable Web Site Download Speeds, available online at <http://www.zonaresearch.com/deliverables/white_papers/wp17/>.
2 See "AOL's Minutes." 2001. The Washington Post, March 8, p. E11.
3 "Survey Says: DSL Users Addicted to Broadband," April 3, 2001, available online at <http://www.sbc.com/News_Center/1,3950,31,00.html?query=20010403-1>. Pierre Bouvard and Warren Kurtzman. 2000. The Broadband Revolution: How Superfast Internet Access Changes Media Habits in American Households. Arbitron Company, New York. Available online at <http://www.arbitron.com/> and <http://www.colemanresearch.com/>.
4 "Reality Bytes." 2001. The Wall Street Journal, January 29, p. B8, using figures from Jupiter Research and Media Metrix.
5 Robert Gehorsam, personal communication, briefing to CSTB Committee on IT and Creativity, November 9, 2000.
6 With a low-latency network connection and a sufficiently large buffer, limited retransmission may be tenable, but this is not the typical practice in streaming protocols. Indeed, lower performance is observed in applications where a Transmission Control Protocol (TCP) connection (which builds in retransmission of lost data) is used in place of the raw User Datagram Protocol (UDP)-based transport. The time taken by the TCP algorithm to handle packet loss translates into much higher delay and jitter figures.
7 Released in 1980, the CD audio specification (the so-called red book standard) stores audio as uncompressed 16-bit PCM, which requires about 1.4 Mbps. Today, compression algorithms can deliver comparable quality at a fraction of that rate.
8 Fortunately, the bit rate grows less than linearly with the number of channels.
9 U.S. Census Bureau. 2000. Statistical Abstract of the United States. U.S. Census Bureau, Department of Commerce, Washington, D.C., p. 567. Available online at <http://www.census.gov/prod/2001pubs/statab/sec18.pdf>.
10 See Chapter 4 of Computer Science and Telecommunications Board, National Research Council, 2001, The Internet's Coming of Age, National Academy Press, Washington, D.C.
11 Interactive, networked music performances are already being attempted.
12 Hal Varian. 2000. "Cool Media: A New Generation Is Turning the Tables on Television." The Industry Standard. November 20, p. 293.
13 See Steve Harrison and Paul Dourish. 1996. "Re-Place-ing Space: The Roles of Place and Space in Collaborative Systems." Proceedings of the ACM Conference on Computer-Supported Cooperative Work CSCW'96 (Boston, Mass.). ACM, New York. Draft version available online at <http://www.parc.xerox.com/csl/members/dourish/papers/place-paper.html>.
14 Chris Chafe, Stanford University, personal communication, briefing on digital music-making to CSTB Information Technology and Creativity Committee, January 12, 2001.
15 Similar services have also been introduced, such as AIMster, which leverages AOL's Instant Messenger for file transfer.
16 Intel has been encouraging such applications through the Intel Philanthropic Peer to Peer Program (see <http://www.intel.com/cure/program.htm>).
17 For an examination of technical and other factors surrounding intellectual property rights in a networked world, see Computer Science and Telecommunications Board, National Research Council. 2000. The Digital Dilemma. National Academy Press, Washington, D.C.
18 The surge of popularity in peer-to-peer applications also raises speculative questions of whether the Internet really is evolving toward being the basis of a distributed computer. If the answer is yes, then one might think of computer bus speeds as giving some sort of an upper limit to broadband speeds. Today, the 32-bit bus of a 1.5-GHz Pentium 4 can move data at a peak rate of 48 Gbps (32 bits x 1.5 GHz), or at least half that rate sustained in each direction. This view would support the eventual migration toward a fiber to each home (which, as is discussed elsewhere in this report, is something that may take some time to happen). There may also be an accompanying trend that, as speeds increase beyond the human limits of audio and video, bandwidth demands become more symmetric.
19 Note that this model can also be applied to near-real-time content, such as audio or video broadcasts of live events; one copy of a stream can be pushed out to buffers on local content servers for multiple users to access, as demonstrated by Akamai's technology for streaming video. However, unlike content uploaded in advance, live streaming implies a steady-state demand for upstream capacity.
20 Linda Hales. 2001. "Blobs, Pods and People." The Washington Post Magazine, March 25, p. 37.
21 See <http://www.africam.com/>.
22 Patricia Riley, Anu Mandavilli, and Rebecca Heino. 2000. "Observing the Impact of Communication and Information Technology on 'Net-Work'." Telework and the New Workplace of the 21st Century. U.S. Department of Labor, Washington, D.C. Available online at <http://www.dol.gov/dol/asp/public/telework/p2_3.htm>.
23 Riley et al., "Observing the Impact of Communication and Information Technology," 2000.
24 Richard Civille, Michael Gurstein, and Kenneth Pigg. 2001. "Access to What? First Mile Issues for Rural Broadband," white paper; see Appendix C.
25 Unrealistic expectations have been rampant when it comes to home technology, if not the Internet generally. For example, the Washington Post published an article in 1994 suggesting that going online would not support new relationships, online banking, real-time game playing, "basking" in multimedia, hobnobbing with celebrities, or online shopping--most of which have, in fact, happened, at least to some degree, even with low bandwidth. See Jim Kennelly. 1994. "9 Ways Going On-Line Can Change Your Life and 6 Ways It Can't," The Washington Post: Fast Forward, September, pp. 9-13.
26 Andrew Sharpless, personal communication, briefing to the committee, November 1999. He discussed how Discovery Online scaled back its content expectations because of these considerations.
27 For an in-depth exploration of the issues surrounding intellectual property rights in a digital, networked environment, see Computer Science and Telecommunications Board, National Research Council, 2000, The Digital Dilemma, National Academy Press, Washington, D.C.
28 Some argue that the value of communications applications such as messaging is underappreciated compared to content delivery. See Andrew Odlyzko. 2001. "Content Is Not King." First Monday 6(2)(February). Available online at <http://www.firstmonday.org/issues/issue6_2/odlyzko/>.
29 Pierre Bouvard and Warren Kurtzman. 2000. The Broadband Revolution: How Superfast Internet Access Changes Media Habits in American Households. Arbitron Company, New York. Available online at <http://www.arbitron.com/> and <http://www.colemanresearch.com/>.
30 One might draw a limited analogy to the supply of simple games with the Windows operating system and Palm devices to help people get used to manipulating a mouse in the former case and the Graffiti writing system in the latter.
31 See NTIA's Falling Through the Net series.
32 Gene Kimmelman, Consumers Union, personal communication, briefing to the committee, November 1999.
33 For example, Time Warner Cable's $40 per month charge was based on the recognition that consumers were paying $19.95 per month to an ISP and more for a second phone line.
34 Glasgow, Kentucky, was a pioneer in providing broadband, but the experience showed slow adoption and uncertainty about why the capability should be used, necessitating efforts to generate awareness, interest, and use. See Anick Jesdanum. 2000. "Wiring Rural America: Just the Beginning." Associated Press, September 6. Available online at <http://www.msnbc.com/news/452691.asp?cp1=1>.