Wednesday, 14 February 2007
Local Area Network Concepts and Products: Routers and Gateways
Linux IPv6 HOWTO
Internetworking Technology Handbook
An internetwork is a collection of individual networks, connected by intermediate networking devices, that functions as a single large network. Internetworking refers to the industry, products, and procedures that meet the challenge of creating and administering internetworks.
History of Internetworking
The first networks were time-sharing networks that used mainframes and attached terminals. Such environments were implemented by both IBM's Systems Network Architecture (SNA) and Digital's network architecture.
Local-area networks (LANs) evolved around the PC revolution. LANs enabled multiple users in a relatively small geographical area to exchange files and messages, as well as access shared resources such as file servers and printers.
Wide-area networks (WANs) interconnect LANs whose users are geographically dispersed. Some of the technologies used for connecting LANs include T1, T3, ATM, ISDN, ADSL, Frame Relay, radio links, and others. New methods of connecting dispersed LANs are appearing every day.
Today, high-speed LANs and switched internetworks are becoming widely used, largely because they operate at very high speeds and support such high-bandwidth applications as multimedia and videoconferencing.
Internetworking evolved as a solution to three key problems: isolated LANs, duplication of resources, and a lack of network management. Isolated LANs made electronic communication between different offices or departments impossible. Duplication of resources meant that the same hardware and software had to be supplied to each office or department, as did separate support staff. This lack of network management meant that no centralized method of managing and troubleshooting networks existed.
Realizing the Information Future - The Internet and Beyond
- The federal government's promotion of the National Information Infrastructure through an administration initiative and supporting congressional actions;
- The runaway growth of the Internet, an electronic network complex developed initially for and by the research community; and
- The recognition by entertainment, telephone, and cable TV companies of the vast commercial potential in a national information infrastructure.
A national information infrastructure (NII) can provide a seamless web of interconnected, interoperable information networks, computers, databases, and consumer electronics that will eventually link homes, workplaces, and public institutions together. It can embrace virtually all modes of information generation, transport, and use. The potential benefits can be glimpsed in the experiences to date of the research and education communities, where access through the Internet to high-speed networks has begun to radically change the way researchers work, educators teach, and students learn.
To a large extent, the NII will be a transformation and extension of today's computing and communications infrastructure (including, for example, the Internet, telephone, cable, cellular, data, and broadcast networks). Trends in each of these component areas are already bringing about a next-generation information infrastructure. Yet the outcome of these trends is far from certain; the nature of the NII that will develop is malleable. Choices will be made in industry and government, beginning with investments in the underlying physical infrastructure. Those choices will affect and be affected by many institutions and segments of society. They will determine the extent and distribution of the commercial and societal rewards to this country for investments in infrastructure-related technology, in which the United States is still currently the world leader.
1994 is a critical juncture in our evolution to a national information infrastructure. Funding arrangements and management responsibilities are being defined (beginning with shifts in NSF funding for the Internet), commercial service providers are playing an increasingly significant role, and nonacademic use of the Internet is growing rapidly. Meeting the challenge of "wiring up" the nation will depend on our ability not only to define the purposes that the NII is intended to serve, but also to ensure that the critical technical issues are considered and that the appropriate enabling physical infrastructure is put in place.
PVM: Parallel Virtual Machine - A Users' Guide and Tutorial for Networked Parallel Computing
Teach Yourself THE INTERNET in 24 Hours
Part I, "The Basics," takes you through some of the things you'll need to know before you start. You'll get a clear explanation of what the Internet is really like, learn how you can actually use the Internet in real life, find tips on Internet Service Providers, and receive an introduction to the World Wide Web.
Part II, "E-Mail: The Great Communicator," teaches you all you'll need to know about e-mail. Learn basics like reading and sending e-mail, as well as more advanced functions such as attaching documents, creating aliases, and more. You'll also find out all about listservs and how to use them to your advantage.
Part III, "News and Real-Time Communication," shows you many of the things that make the Internet an outstanding tool for communication. You'll learn about newsgroups and how to communicate with thousands of people by clicking your mouse. You'll also learn how to carry on live, real-time conversations over the Internet, as well as get information on some of the hottest new technology such as Net Phones.
Part IV, "The World Wide Web," shows you what is now the most exciting part of the Internet. Learn which browser is best for you, get the basics of Web navigation, and find out how to help your browser with plug-ins. Finally, you'll discover the most powerful tool on the Web today--the search engine--and more importantly, how to use it.
Part V, "Finding Information on the Net," explains some of the other useful functions of the Net. You'll learn how to transfer files and use Gopher. You'll also learn how to access libraries and other resources by using Telnet. Finally, this section will show you how to use the Internet to locate people, places, and things that might not be available directly through the Web.
Part VI, "Getting the Most Out of the Internet," shows you practical ways to use the Internet. You can find resources and techniques on how to get information about entertainment, education, and business. Finally, learn how to use the Internet just to have fun.
Client/Server Computing Second Edition
A strategy being adopted by many organizations is to flatten the management hierarchy. With the elimination of layers of middle management, the remaining individuals must be empowered to make the strategy successful, and information to support rational decision making must be made available to them. Information technology (IT) is an effective vehicle for implementing this strategy, yet it is frequently not used effectively. The client/server model provides power to the desktop, with information available to support the decision-making process and enable decision-making authority.
The Gartner Group, a team of computer industry analysts, noted a widening chasm between user expectations and the ability of information systems (IS) organizations to fulfill them. The gap has been fueled by dramatic increases in end-user comfort with technology (mainly because of prevalent PC literacy); continuous cost declines in pivotal hardware technologies; escalation in highly publicized vendor promises; increasing time delays between vendor promised releases and product delivery (that is, "vaporware"); and emergence of the graphical user interface (GUI) as the perceived solution to all computing problems.
In this book you will see that client/server computing is the technology capable of bridging this chasm. This technology, particularly when integrated into the normal business process, can take advantage of this new literacy, cost-effective technology, and GUI friendliness. In conjunction with a well-architected systems development environment (SDE), it is possible for client/server computing to use the technology of today and be positioned to take advantage of vendor promises as they become real.
The amount of change in computer processing-related technology since the introduction of the IBM PC is equivalent to all the change that occurred during the previous history of computer technology. We expect change in the next few years to accelerate at an even steeper, geometric rate. The increasing rate of change is primarily attributable to the coincidence of four events: a dramatic reduction in the cost of processing hardware, a significant increase in installed and available processing power, the introduction of widely adopted software standards, and the use of object-oriented development techniques. The complexity inherent in the pervasiveness of these changes has prevented most business and government organizations from taking full advantage of the potential to be more competitive through improved quality, increased service, reduced costs, and higher profits. Corporate IS organizations, with an experience based on previous technologies, are often less successful than user groups in putting the new technologies to good use.
Taking advantage of computer technology innovation is one of the most effective ways to achieve a competitive advantage and demonstrate value in the marketplace. Technology can be used to improve service by quickly obtaining the information necessary to make decisions and to act to resolve problems. Technology can also be used to reduce costs of repetitive processes and to improve quality through consistent application of those processes. The use of workstation technology implemented as part of the business process and integrated with an organization's existing assets provides a practical means to achieve competitive advantage and to demonstrate value.
Computer hardware continues its historical trend toward smaller, faster, and lower-cost systems. Competitive pressures force organizations to reengineer their business processes for cost and service efficiencies. Computer technology trends show leading organizations that the application of technology is the key to successfully reengineering business processes.
Unfortunately, we are not seeing corresponding improvements in systems development. Applications developed by in-house computer professionals seem to get larger, run more slowly, and cost more to operate. Existing systems consume all available IS resources for maintenance and enhancements. As personal desktop environments lead users to greater familiarity with a GUI, corporate IS departments continue to ignore this technology. The ease of use and standard look and feel, provided by GUIs in personal productivity applications at the desktop, are creating an expectation in the user community. When this expectation is not met, IS departments are considered irrelevant by their users.
Beyond GUI, multimedia technologies are using workstation power to re-present information through the use of image, video, sound, and graphics. These representations relate directly to the human brain's ability to extract information from images far more effectively than from lists of facts.
Accessing information CAN be as easy as tapping an electrical power utility. What is required is the will among developers to build the skills to take advantage of the opportunity offered by client/server computing.
This book shows how organizations can continue to gain value from their existing technology investments while using the special capabilities that new technologies offer. The book demonstrates how to architect SDEs and create solutions that are solidly based on evolving technologies. New systems can be built to work effectively with today's capabilities and at the same time can be based on a technical architecture that will allow them to evolve and to take advantage of future technologies.
For the near future, client/server solutions will rely on existing minicomputer and mainframe technologies to support applications already in use, and also to provide shared access to enterprise data, connectivity, and security services. To use existing investments and new technologies effectively, we must understand how to integrate these into our new applications. Only the appropriate application of standards-based technologies within a designed architecture will enable this to happen.
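The client/server pattern described above reduces, at its core, to a desktop client sending a request over the network to a server that holds shared data and returns a reply. The sketch below illustrates that request/reply cycle with plain TCP sockets in Python; the "enterprise data" is a stand-in dictionary, and the request string `GET inventory` is an invented example protocol, not drawn from any of the books excerpted here.

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 0  # port 0 lets the OS choose a free port

def serve_once(server_sock):
    """Accept one client, answer one request, then return.

    The server owns the shared data; clients only see replies."""
    conn, _ = server_sock.accept()
    with conn:
        request = conn.recv(1024).decode()
        # Stand-in for shared enterprise data behind the server.
        data = {"GET inventory": "42 units"}
        conn.sendall(data.get(request, "unknown request").encode())

# Server side: bind, listen, and handle one request in a thread.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind((HOST, PORT))
server.listen(1)
port = server.getsockname()[1]
t = threading.Thread(target=serve_once, args=(server,))
t.start()

# Client side: the "desktop" connects and asks for shared data.
with socket.create_connection((HOST, port)) as client:
    client.sendall(b"GET inventory")
    reply = client.recv(1024).decode()

t.join()
server.close()
print(reply)  # -> 42 units
```

In a real deployment the server side would front a minicomputer or mainframe data store and add the connectivity and security services mentioned above; the division of labor, however, is exactly this: the client formulates requests and presents results, while the server mediates all access to the shared resource.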
It will not happen by accident.
Patrick N. Smith with Steven L. Guengerich