A Brief History of the Internet

Posted by a forum user · 2022-04-20 06:17

Post an answer

6 answers

Helpful user · 2022-04-11 22:48

The late 1950s and the 1960s were an uneasy time for the United States. From the Cuban missile crisis and the Cold War with the Soviet Union to the outbreak of the Vietnam War and the confrontations created by intervention in Third World countries, the nation was being tested on every front. Americans also lived under the shadow of nuclear missile attack should war with the Soviet Union break out. The U.S. military was convinced that if a major war came, victory would belong to the side with the technological edge, so in the late 1950s the Department of Defense established ARPA (Advanced Research Projects Agency), with information-processing technology among its most important research areas. Other research institutions also pushed ahead with technology development, and computing in particular advanced at great speed.

By the late 1960s, government research units and major universities had the most advanced computer equipment of the day. To share research data, the Department of Defense wanted to link these computers into a network, and ARPA commissioned BBN to develop such a system. A basic requirement was that the system keep working in wartime even if some of the network's lines or equipment were destroyed. Using packet switching, BBN built an experimental network in the western United States in 1969, connecting computers at four universities in California and Utah, and defined the NCP (Network Control Protocol). The network's distinguishing feature was that if part of it failed, the remaining computers could stay connected over other lines. This network was called the ARPANET, the forerunner of the Internet.
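The core idea of packet switching described above can be sketched in a few lines of Python. This is a toy illustration, not the actual NCP or ARPANET implementation: a message is split into independently numbered packets that may arrive out of order and are reassembled by sequence number.

```python
# Toy packet switching: split a message into numbered packets,
# deliver them out of order, and reassemble by sequence number.
def packetize(message: str, size: int = 4) -> list[tuple[int, str]]:
    """Split a message into (sequence_number, chunk) packets."""
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets: list[tuple[int, str]]) -> str:
    """Sort packets by sequence number and rebuild the message."""
    return "".join(chunk for _, chunk in sorted(packets))

packets = packetize("LO AND BEHOLD")
shuffled = list(reversed(packets))   # packets may take different routes and arrive out of order
print(reassemble(shuffled))          # -> LO AND BEHOLD
```

The point of the exercise: because each packet carries its own addressing and ordering information, no single fixed circuit is needed, which is what lets a damaged network keep delivering traffic.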

By 1971 the ARPANET connected more than forty universities, military sites, and government agencies, including Harvard and MIT, and standard protocols for remote terminal emulation (Telnet) and file transfer (FTP) had been defined. In 1972 the network was publicly demonstrated in Washington, D.C., and the INWG (Inter-Networking Group) was formed, with Vinton Cerf (whom some call the father of the Internet; later a vice president at MCI) chosen to chair it and draw up the network's transmission standards. That same year the first electronic mail message was sent over the ARPANET by Ray Tomlinson of BBN. Remote terminal emulation, file transfer, and electronic mail can be considered the Internet's three earliest application services. In 1973 the network made its first connections outside the United States, to Britain and Norway.

In 1974, Cerf and Bob Kahn proposed the TCP/IP protocol suite (Transmission Control Protocol/Internet Protocol), which solved the problem of interconnecting dissimilar computer systems and was widely adopted. In 1976, BBN, Stanford University, and University College London developed the router, making network interconnection far more convenient. The U.S. Department of Defense then made an unexpected decision: it released all of the TCP/IP technology for free use worldwide. A piece of cutting-edge networking technology born under the shadow of war thus ended up fully open for the world's computers to communicate with.
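What TCP/IP gives applications, regardless of the machines on either end, is a reliable byte stream between two addressed hosts. A minimal sketch using only Python's standard library, with a throwaway echo server on localhost, shows the modern face of that idea:

```python
# Minimal TCP echo over localhost: a sketch of the reliable byte
# stream that TCP/IP provides, using only the standard library.
import socket
import threading

def echo_once(listener: socket.socket) -> None:
    """Accept one connection and echo one message back."""
    conn, _ = listener.accept()
    with conn:
        conn.sendall(conn.recv(1024))

server = socket.create_server(("127.0.0.1", 0))   # port 0: let the OS pick a free port
port = server.getsockname()[1]
threading.Thread(target=echo_once, args=(server,), daemon=True).start()

with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"hello, internet")
    reply = client.recv(1024)
print(reply.decode())   # -> hello, internet
```

The two endpoints here could just as easily be different operating systems on different continents; the socket interface hides everything below it, which is precisely the interoperability problem TCP/IP was designed to solve.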

The ARPANET, however, was still a U.S. defense network; its users were vetted (research units and universities with military contracts), and one could not simply connect at will. There was therefore a need for an ARPANET-like network dedicated to computer-science research. In 1981 the National Science Foundation (NSF) funded CSNET (Computer Science Network), which then began operation. Cerf, the original designer of TCP/IP, proposed that the ARPANET and CSNET be linked through gateways, and further that the various subnetworks under CSNET share the same gateway to the ARPANET. Only at this point can the Internet proper be said to have been born.

In 1986 the NSF funded a national research backbone, NSFNET, providing high-speed data transmission so that public and private research institutions and schools could reach the backbone through routers. In 1987 its operation was contracted to professional operators including Merit, IBM, and MCI, giving the research community a better network. After the ARPANET was retired in 1989, NSFNET took over network service for the research community, and in 1991 the Commercial Internet Exchange (CIX) was established, marking the first sprouts of Internet commercialization.

In 1993 President Clinton published a presidential e-mail address so that the public could exchange views with the White House by e-mail, and actively promoted the NII (National Information Infrastructure) initiative to revitalize the U.S. economy. With the arrival of multimedia and easy-to-use WWW browsers, the general public was no longer intimidated by computer networks, and the Internet grew explosively around the world. Today it is a colossus connecting most of the world's countries, more than 104,000 networks, and over thirty million host computers.

Reference: The History of the Internet (网际网络的历史)

Helpful user · 2022-04-12 00:06

The Internet was the result of some visionary thinking by people in the early 1960s who saw great potential value in allowing computers to share information on research and development in scientific and military fields. J.C.R. Licklider of MIT first proposed a global network of computers in 1962, and moved over to the Defense Advanced Research Projects Agency (DARPA) in late 1962 to head the work to develop it. Leonard Kleinrock of MIT and later UCLA developed the theory of packet switching, which was to form the basis of Internet connections. Lawrence Roberts of MIT connected a Massachusetts computer with a California computer in 1965 over dial-up telephone lines. It showed the feasibility of wide area networking, but also showed that the telephone line's circuit switching was inadequate. Kleinrock's packet switching theory was confirmed. Roberts moved over to DARPA in 1966 and developed his plan for ARPANET. These visionaries and many more left unnamed here are the real founders of the Internet.

The Internet, then known as ARPANET, was brought online in 1969 under a contract let by the renamed Advanced Research Projects Agency (ARPA) which initially connected four major computers at universities in the southwestern US (UCLA, Stanford Research Institute, UCSB, and the University of Utah). The contract was carried out by BBN of Cambridge, MA under Bob Kahn and went online in December 1969. By June 1970, MIT, Harvard, BBN, and Systems Development Corp (SDC) in Santa Monica, Cal. were added. By January 1971, Stanford, MIT's Lincoln Labs, Carnegie-Mellon, and Case-Western Reserve U were added. In months to come, NASA/Ames, Mitre, Burroughs, RAND, and the U of Illinois plugged in. After that, there were far too many to keep listing here.

The Internet was designed in part to provide a communications network that would work even if some of the sites were destroyed by nuclear attack. If the most direct route was not available, routers would direct traffic around the network via alternate routes.
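That "route around the damage" behavior can be illustrated with the original four ARPANET sites and a breadth-first search. This is a toy model, not any real routing protocol: real routers run distributed algorithms, but the underlying idea, find any surviving path when a link is down, is the same.

```python
# Toy demonstration of routing around a failure: breadth-first
# search finds an alternate path when a direct link is cut.
from collections import deque

# The first four ARPANET sites and their links (simplified topology).
links = {
    "UCLA": {"SRI", "UCSB"},
    "SRI":  {"UCLA", "Utah"},
    "UCSB": {"UCLA", "Utah"},
    "Utah": {"SRI", "UCSB"},
}

def route(links, src, dst, down=frozenset()):
    """Return a path from src to dst, avoiding any link listed in `down`."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in links[path[-1]]:
            if nxt not in seen and frozenset((path[-1], nxt)) not in down:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no surviving route

print(route(links, "UCLA", "Utah"))  # a direct-ish path, e.g. via SRI or UCSB
print(route(links, "UCLA", "Utah", down={frozenset(("SRI", "Utah"))}))  # detours via UCSB
```

With the SRI-Utah link "destroyed", traffic from UCLA still reaches Utah through UCSB; only if every path is severed does `route` return `None`.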

The early Internet was used by computer experts, engineers, scientists, and librarians. There was nothing friendly about it. There were no home or office personal computers in those days, and anyone who used it, whether a computer professional or an engineer or scientist or librarian, had to learn to use a very complex system.

E-mail was adapted for ARPANET by Ray Tomlinson of BBN in 1972. He picked the @ symbol from the available symbols on his teletype to link the username and address. The telnet protocol, enabling logging on to a remote computer, was published as a Request for Comments (RFC) in 1972. RFCs are a means of sharing developmental work throughout the community. The ftp protocol, enabling file transfers between Internet sites, was published as an RFC in 1973, and from then on RFCs were available electronically to anyone who had use of the ftp protocol.
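Tomlinson's user@host convention is still how mail addresses are taken apart today. A minimal sketch (real address parsing is considerably more involved; this shows only the basic split):

```python
# Tomlinson's convention: everything before the "@" names the user,
# everything after it names the host. This is only the basic split,
# not a full RFC-compliant address parser.
def split_address(addr: str) -> tuple[str, str]:
    user, sep, host = addr.rpartition("@")
    if not sep or not user or not host:
        raise ValueError(f"not a user@host address: {addr!r}")
    return user, host

print(split_address("tomlinson@bbn-tenexa"))   # -> ('tomlinson', 'bbn-tenexa')
```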

Libraries began automating and networking their catalogs in the late 1960s independent from ARPA. The visionary Frederick G. Kilgour of the Ohio College Library Center (now OCLC, Inc.) led networking of Ohio libraries during the '60s and '70s. In the mid 1970s more regional consortia from New England, the Southwest states, and the Middle Atlantic states, etc., joined with Ohio to form a national, later international, network. Automated catalogs, not very user-friendly at first, became available to the world, first through telnet or the awkward IBM variant TN3270 and only many years later, through the web. See The History of OCLC.

The Internet matured in the 70's as a result of the TCP/IP architecture first proposed by Bob Kahn at BBN and further developed by Kahn and Vint Cerf at Stanford and others throughout the 70's. It was adopted by the Defense Department in 1980 replacing the earlier Network Control Protocol (NCP) and universally adopted by 1983.

The Unix to Unix Copy Protocol (UUCP) was invented in 1978 at Bell Labs. Usenet was started in 1979 based on UUCP. Newsgroups, which are discussion groups focusing on a topic, followed, providing a means of exchanging information throughout the world. While Usenet is not considered part of the Internet, since it does not share the use of TCP/IP, it linked unix systems around the world, and many Internet sites took advantage of the availability of newsgroups. It was a significant part of the community building that took place on the networks.

Similarly, BITNET (Because It's Time Network) connected IBM mainframes across the educational community and the world to provide mail services beginning in 1981. Listserv software was developed for this network and later others. Gateways were developed to connect BITNET with the Internet and allowed exchange of e-mail, particularly for e-mail discussion lists. These listservs and other forms of e-mail discussion lists formed another major element in the community building that was taking place.

In 1986, the National Science Foundation funded NSFNet as a cross country 56 Kbps backbone for the Internet. They maintained their sponsorship for nearly a decade, setting rules for its non-commercial government and research uses.

As the commands for e-mail, FTP, and telnet were standardized, it became a lot easier for non-technical people to learn to use the nets. It was not easy by today's standards by any means, but it did open up use of the Internet to many more people in universities in particular. Other departments besides the libraries, computer, physics, and engineering departments found ways to make good use of the nets--to communicate with colleagues around the world and to share files and resources.

While the number of sites on the Internet was small, it was fairly easy to keep track of the resources of interest that were available. But as more and more universities and organizations--and their libraries--connected, the Internet became harder and harder to track. There was more and more need for tools to index the resources that were available.

The first effort, other than library catalogs, to index the Internet was created in 1989, as Peter Deutsch and his crew at McGill University in Montreal, created an archiver for ftp sites, which they named Archie. This software would periodically reach out to all known openly available ftp sites, list their files, and build a searchable index of the software. The commands to search Archie were unix commands, and it took some knowledge of unix to use it to its full capability.
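Archie's basic job, merging file listings gathered from many sites into one searchable index, can be sketched like this. A toy model with made-up site names, not Archie's actual code:

```python
# Toy Archie: merge per-site file listings into one inverted
# index (filename -> sites that carry it), then search it.
site_listings = {
    "ftp.site-a.example": ["gnu-tar.Z", "kermit.tar", "rfc791.txt"],
    "ftp.site-b.example": ["kermit.tar", "xmodem.shar"],
}

index: dict[str, list[str]] = {}
for site, files in site_listings.items():
    for name in files:
        index.setdefault(name, []).append(site)

def search(substring: str) -> dict[str, list[str]]:
    """Return every indexed filename containing `substring`, with its sites."""
    return {name: sites for name, sites in index.items() if substring in name}

print(search("kermit"))
# -> {'kermit.tar': ['ftp.site-a.example', 'ftp.site-b.example']}
```

Real Archie periodically re-fetched the listings and offered substring and regex queries, but the data structure at its heart was essentially this inverted index.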

At about the same time, Brewster Kahle, then at Thinking Machines Corp., developed his Wide Area Information Server (WAIS), which would index the full text of files in a database and allow searches of the files. There were several versions with varying degrees of complexity and capability developed, but the simplest of these were made available to everyone on the nets. At its peak, Thinking Machines maintained pointers to over 600 databases around the world which had been indexed by WAIS. They included such things as the full set of Usenet Frequently Asked Questions files, the full documentation of working papers such as RFCs by those developing the Internet's standards, and much more. Like Archie, its interface was far from intuitive, and it took some effort to learn to use it well.

Peter Scott of the University of Saskatchewan, recognizing the need to bring together information about all the telnet-accessible library catalogs on the web, as well as other telnet resources, brought out his Hytelnet catalog in 1990. It gave a single place to get information about library catalogs and other telnet resources and how to use them. He maintained it for years, and added HyWebCat in 1997 to provide information on web-based catalogs.

In 1991, the first really friendly interface to the Internet was developed at the University of Minnesota. The University wanted to develop a simple menu system to access files and information on campus through their local network. A debate followed between mainframe adherents and those who believed in smaller systems with client-server architecture. The mainframe adherents "won" the debate initially, but since the client-server advocates said they could put up a prototype very quickly, they were given the go-ahead to do a demonstration system. The demonstration system was called a gopher after the U of Minnesota mascot--the golden gopher. The gopher proved to be very prolific, and within a few years there were over 10,000 gophers around the world. It takes no knowledge of unix or computer architecture to use. In a gopher system, you type or click on a number to select the menu selection you want.
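The gopher interaction model, pick an item by number from a menu of resources, is simple enough to sketch in a few lines (the menu entries here are hypothetical):

```python
# Toy gopher-style menu: each numbered entry maps to a title and
# a "selector" string identifying the resource on the server.
menu = {
    1: ("About this server", "/about.txt"),
    2: ("Campus phone book", "/phones"),
    3: ("Other gopher servers", "/others"),
}

def show(menu) -> None:
    """Print the numbered menu, as a gopher client would."""
    for number, (title, _) in sorted(menu.items()):
        print(f"{number}. {title}")

def select(menu, choice: int) -> str:
    """Return the selector for a chosen number; real gopher sends it to the server."""
    title, selector = menu[choice]
    return selector

show(menu)
print(select(menu, 2))   # -> /phones
```

That single interaction, number in, resource out, was the whole user interface, which is why gopher needed no knowledge of unix to use.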

Gopher's usability was enhanced much more when the University of Nevada at Reno developed the VERONICA searchable index of gopher menus. It was purported to be an acronym for Very Easy Rodent-Oriented Netwide Index to Computerized Archives. A spider crawled gopher menus around the world, collecting links and retrieving them for the index. It was so popular that it was very hard to connect to, even though a number of other VERONICA sites were developed to ease the load. Similar indexing software was developed for single sites, called JUGHEAD (Jonzy's Universal Gopher Hierarchy Excavation And Display).

In 1989 another significant event took place in making the nets easier to use. Tim Berners-Lee and others at the European Laboratory for Particle Physics, more popularly known as CERN, proposed a new protocol for information distribution. This protocol, which became the World Wide Web in 1991, was based on hypertext--a system of embedding links in text to link to other text, which you have been using every time you selected a text link while reading these pages. Although started before gopher, it was slower to develop.

The development in 1993 of the graphical browser Mosaic by Marc Andreessen and his team at the National Center For Supercomputing Applications (NCSA) gave the protocol its big boost. Later, Andreessen moved to become the brains behind Netscape Corp., which produced the most successful graphical browser and server until Microsoft declared war and developed its Microsoft Internet Explorer.

Since the Internet was initially funded by the government, it was originally limited to research, education, and government uses. Commercial uses were prohibited unless they directly served the goals of research and education. This policy continued until the early 90's, when independent commercial networks began to grow. It then became possible to route traffic across the country from one commercial site to another without passing through the government funded NSFNet Internet backbone.

Delphi was the first national commercial online service to offer Internet access to its subscribers. It opened up an email connection in July 1992 and full Internet service in November 1992. All pretenses of limitations on commercial use disappeared in May 1995 when the National Science Foundation ended its sponsorship of the Internet backbone, and all traffic relied on commercial networks. AOL, Prodigy, and CompuServe came online. Since commercial usage was so widespread by this time and educational institutions had been paying their own way for some time, the loss of NSF funding had no appreciable effect on costs.

Today, NSF funding has moved beyond supporting the backbone and higher educational institutions to building K-12 and local public library access on the one hand, and research on massive high-volume connections on the other.

Microsoft's full scale entry into the browser, server, and Internet Service Provider market completed the major shift over to a commercially based Internet. The release of Windows 98 in June 1998 with the Microsoft browser well integrated into the desktop shows Bill Gates' determination to capitalize on the enormous growth of the Internet. Microsoft's success over the past few years has brought court challenges to their dominance. We'll leave it up to you whether you think these battles should be played out in the courts or the marketplace.

During this period of enormous growth, businesses entering the Internet arena scrambled to find economic models that work. Free services supported by advertising shifted some of the direct costs away from the consumer--temporarily. Services such as Delphi offered free web pages, chat rooms, and message boards for community building. Online sales have grown rapidly for such products as books, music CDs, and computers, but the profit margins are slim when price comparisons are so easy, and public trust in online security is still shaky. Business models that have worked well are portal sites, which try to provide everything for everybody, and live auctions. AOL's acquisition of Time-Warner was the largest merger in history when it took place and shows the enormous growth of Internet business! The stock market has had a rocky ride, swooping up and down as the new technology companies, the dot-coms, encountered good news and bad. The decline in advertising income spelled doom for many dot-coms, and a major shakeout and search for better business models followed among the survivors.

A current trend with major implications for the future is the growth of high speed connections. 56K modems and the providers who supported them spread widely for a while, but this is the low end now. 56K is not fast enough to carry multimedia, such as sound and video, except in low quality. But new technologies many times faster, such as cable modems and digital subscriber lines (DSL), are predominant now.

Wireless has grown rapidly in the past few years, and travellers search for the wi-fi "hot spots" where they can connect while they are away from the home or office. Many airports, coffee bars, hotels and motels now routinely provide these services, some for a fee and some for free.

The next big growth area is the surge towards universal wireless access, where almost everywhere is a "hot spot". Municipal wi-fi or city-wide access, wiMAX offering broader ranges than wi-fi, and Verizon's EV-DO will joust for dominance in the USA in the months ahead. The battle is both economic and political.

Another trend that is beginning to affect web designers is the growth of smaller devices to connect to the Internet. Small tablets, pocket PCs, smart phones, game machines, and even GPS devices are now capable of tapping into the web on the go, and many web pages are not designed to work on that scale.

As Heraclitus observed around 500 BC, "Nothing is permanent but change."

May you live in interesting times! (ostensibly an ancient Chinese curse)
For more information on Internet history, visit these sites:

Hobbes' Internet Timeline. ©1993-8 by Robert H. Zakon. Significant dates in the history of the Internet.
BBN Timeline. Similar to Hobbes'.
A Brief History of the Internet from the Internet Society. Written by some of those who made it happen.

Helpful user · 2022-04-12 01:41

The Internet is the largest network in the world. It is made up of more than 100 million computers in more than 100 countries covering commercial, academic and government endeavors. Originally developed for the U.S. military, the Internet became widely used for academic and commercial research. Users had access to unpublished data and journals on a variety of subjects. Today, the "Net" has become commercialized into a worldwide information highway, providing data and commentary on every subject and product on earth.

The Internet's surge in growth in the mid 1990s was dramatic, increasing a hundredfold in 1995 and 1996 alone. There were two reasons. Up until then, the major online services (AOL, CompuServe, etc.) provided e-mail, but only to customers of the same service. As they began to connect to the Internet for e-mail exchange, the Internet took on the role of a global switching center. An AOL member could finally send mail to a CompuServe member, and so on. The Internet glued the world together for electronic mail, and today, SMTP, the Internet mail protocol, is the global e-mail standard.
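SMTP itself only moves messages between hosts; what it carries is the familiar header-plus-body message format, which any SMTP-speaking service can read. Python's standard library can build such a message (the addresses below are made up for illustration):

```python
# Building an Internet mail message (RFC 5322 header-plus-body
# layout) with the standard library's email package.
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "member@aol.example"
msg["To"] = "friend@compuserve.example"
msg["Subject"] = "Cross-service mail"
msg.set_content("SMTP lets these two services exchange mail.\n")

print(msg.as_string())
# A real client would now hand msg to smtplib.SMTP(...).send_message(msg).
```

Because both the message format and the SMTP transfer protocol are open standards, an AOL member and a CompuServe member need to agree on nothing beyond them, which is exactly the "global switching center" role described above.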

Secondly, with the advent of graphics-based Web browsers such as Mosaic and Netscape Navigator, and soon after, Microsoft's Internet Explorer, the World Wide Web took off. The Web became easily available to users with PCs and Macs rather than only scientists and hackers at Unix workstations. Delphi was the first proprietary online service to offer Web access, and all the rest followed. At the same time, new Internet service providers (ISPs) came out of the woodwork to offer access to individuals and companies. As a result, the Web grew exponentially, providing an information exchange of unprecedented proportion. The Web has also become "the" storehouse for drivers, updates and demos that are downloaded via the browser as well as a global transport for delivering information by subscription, both free and paid.

Although daily news and information is now available on countless Web sites, long before the Web, information on a myriad of subjects was exchanged via Usenet (User Network) newsgroups. Still thriving, newsgroup articles can be selected and read directly from your Web browser. See Usenet.

Chat rooms provide another popular Internet service. Internet Relay Chat (IRC) offers multiuser text conferencing on diverse topics. Dozens of IRC servers provide hundreds of channels that anyone can log onto and participate in via the keyboard. See IRC.

The Internet started in 1969 as the ARPAnet. Funded by the U.S. government, the ARPAnet became a series of high-speed links between major supercomputer sites and educational and research institutions worldwide, although mostly in the U.S. A major part of its backbone was the National Science Foundation's NSFNet. Along the way, it became known as the "Internet" or simply "the Net." By the 1990s, so many networks had become part of it and so much traffic was not educational or pure research that it became obvious that the Internet was on its way to becoming a commercial venture.

In 1995, the Internet was turned over to large commercial Internet providers (ISPs), such as MCI, Sprint and UUNET, which took responsibility for the backbones and have increasingly enhanced their capacities ever since. Regional ISPs link into these backbones to provide lines for their subscribers, and smaller ISPs hook either directly into the national backbones or into the regional ISPs.

Internet computers use the TCP/IP communications protocol. There are more than 100 million hosts on the Internet, a host being a mainframe or medium to high-end server that is always online via TCP/IP. The Internet is also connected to non-TCP/IP networks worldwide through gateways that convert TCP/IP into other protocols.

Before the Web and the graphics-based Web browser, the Internet was accessed from Unix terminals by academicians and scientists using command-driven Unix utilities. These utilities are still used; however, today, they reside in Windows, Mac and Linux machines as well. For example, an FTP program allows files to be uploaded and downloaded, and the Archie utility provides listings of these files. Telnet is a terminal emulation program that lets you log onto a computer on the Internet and run a program. Gopher provides hierarchical menus describing Internet files (not just file names), and Veronica lets you search Gopher sites. See FTP, Archie, Telnet, Gopher and Veronica.

Ironically, some of the original academic and scientific users of the Internet have developed their own Internet once again. Internet2 is a high-speed academic research network that was started in much the same fashion as the original Internet.
