Daily Bulletin


  • Written by Giovanni Navarria, Lecturer and Research Fellow, Sydney Democracy Network, School of Social and Political Sciences (SSPS), University of Sydney

This essay is the last in a four-part series commemorating the anniversary of the first message ever sent across the ARPANET, the progenitor of the Internet, on October 29, 1969 - Read: Part 1, Part 2, Part 3.

In today's hyper-tech world, almost any new device (even a fridge, let alone a phone or computer) is born "smart" enough to connect easily to the global network. This is possible because at the core of the worldwide infrastructure we call the Internet lies a set of shared communication standards, procedures and formats called protocols. In the early 1970s, however, when the first four nodes of the ARPANET became fully functional, things were a bit more complicated. Exchanging data between different computers (let alone different computer networks) was not as easy as it is today. There was, at last, a reliable packet-switching network to connect to, but no universal language to communicate through it. Each host, in fact, had its own set of protocols, and to log in users were required to know the host's own 'language'. Using the ARPANET was like being given a telephone and unlimited credit, only to find out that the only users we can call don't speak our language.

Image: ARPANET interface for Xerox PARC's PDP-10. Computer History

Predictably, the new network was scarcely used at the beginning. Beyond the small circle of people directly involved in the project, the much larger crowd of potential users (graduate students, researchers and the many more who might have benefited from it) seemed wholly uninterested in using the ARPANET. The only thing that kept the network going in those early months was people changing jobs. In fact, when researchers relocated to one of the other network sites – for instance from UCLA to Stanford – then, and only then, did the usage of those sites' resources increase. The reason was quite simple: the providential migrants brought the gift of knowledge with them. They knew the procedures in use at the other site, and hence they knew how to "talk" with the host computer in their old department.

To find a solution to this frustrating problem, Roberts and his staff established a specific group of researchers – most of them still graduate students – to develop the host-to-host software. The group was initially called the Network Working Group (NWG) and was led by a UCLA graduate student, Steve Crocker. Later, in 1972, the group changed its name to the International Network Working Group (INWG), and the leadership passed from Crocker to Vint Cerf. In the words of Crocker:

The Network Working Group consists of interested people from existing or potential ARPA network sites. Membership is not closed. The [NWG] is concerned with the HOST software, the strategies for using the network, and initial experience with the network.

The NWG was a special body (the first of its kind) concerned not only with monitoring and questioning the network's technical aspects but, more broadly, with every aspect of it, even the moral or philosophical ones. Thanks to Crocker's imaginative leadership, discussion in the group was facilitated by a highly original and rather democratic method, still in use five decades later. To communicate with the whole group, all a member needed to do was send a simple Request for Comments (RFC). To avoid stepping on anyone's toes, the notes were to be considered "unofficial" and with "no status". Membership of the group was not closed and "notes may be produced at any site by anybody". The minimum length of an RFC was, and still is, "one sentence".

The openness of the RFC process helped encourage participation among the members of a very heterogeneous group of people, ranging from graduate students to professors and program managers. Following a "spirit of unrestrained participation in working group meetings", the RFC method proved to be a critical asset for the people involved in the project. It helped them reflect openly on the aims and goals of the network, within and beyond its technical infrastructure.

The significance of both the RFC method and the NWG goes far beyond the critical part they played in setting up the standards for today’s Internet. Both helped shape and strengthen a new revolutionary culture that in the name of knowledge and problem-solving tends to disregard power hierarchies as nuisances, while highlighting networking as the only path to find the best solution to a problem, any problem. Within this kind of environment, it is not one’s particular vision or idea that counts, but the welfare of the environment itself: that is, the network.

This particular culture informs the whole communication galaxy we today call the Internet; in fact, it is one of its defining elements. The offspring of the marriage between the RFC and the NWG are called web-logs, web forums, email lists and, of course, social media, while Internet-working is now a key aspect of many processes of human interaction, ranging from solving technical issues to finding solutions to more complex social or political matters.

Widening the network

The NWG needed almost two years to write the software, but eventually, by 1970, the ARPANET had its first host-to-host protocol, the Network Control Protocol (NCP). By December 1970 the original four-node network had expanded to 10 nodes and 19 host computers. Four months later, the ARPANET had grown to 15 nodes and 23 hosts.

By this time, despite delivering "data packets" for more than a year, the ARPANET showed almost no sign of "useful interactions that were taking place on [it]". The hosts were plugged in, but they all lacked the right configuration (or knowledge) to properly use the network. To make "the world take notice of packet switching", Roberts and his colleagues decided to give a public demonstration of the ARPANET and its potential at the International Conference on Computer Communication (ICCC), held in Washington, D.C., in October 1972.

The demonstration was a success: "[i]t really marked a major change in the attitude towards the reality of packet switching," said Robert Kahn. It involved – among other things – demonstrations of network measurement tools, displays of the IMPs' network traffic, remote text editing, file transfers and remote logins.

It was just a remarkable panoply of online services, all in that one room with about fifty different terminals.

The demonstration fully succeeded in showing people who were not involved in the original project how packet switching worked. It inspired others to follow the example set by Larry Roberts' network. International nodes located in England and Norway were added in 1973; and in the following years, other packet-switching networks, independent of the ARPANET, appeared worldwide. This passage from a relatively small experimental network to one (in principle) encompassing the whole world confronted the ARPANET's designers with a new challenge: how could different networks, using different technologies and approaches, be made to communicate with each other?

The concept of "Internetting", or "open-architecture networking", first introduced in 1972, illustrates the critical need for the network to expand beyond its restricted circle of host computers.

The existing Network Control Protocol (NCP) didn't meet the requirements: it had been designed to manage host-to-host communication within the same network. To build a truly open, reliable and dynamic network of networks, what was needed was a new general protocol. It took several years, but eventually, by 1978, Robert Kahn and Vint Cerf succeeded in designing it. They called it the Transmission Control Protocol/Internet Protocol (TCP/IP). As Cerf explained:

‘the job of the TCP is merely to take a stream of messages produced by one HOST and reproduce the stream at a foreign receiving HOST without change.’

To give an example: when a user sends or retrieves information across the Internet – e.g., accessing Web pages or uploading files to a server – the TCP on the sender's machine breaks the message into packets and sends them out. The IP is instead the part of the protocol concerned with "the addressing and forwarding" of those individual packets. The IP is a critical part of our daily Internet experience: without it, it would be practically impossible to locate the information we are looking for among the billions of machines connected to the network today.

Image: TCP/IP – how it works. Vint Cerf/Web

On the receiving end, the TCP helps reassemble all the packets into the original messages, checking for errors and sequence order. Thanks to TCP/IP, the exchange of data packets between different and distant networks was finally possible.
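The division of labour Cerf describes can be sketched in a few lines of code. The toy example below (written in Python purely for illustration; it is not the real TCP/IP stack, and the packet size and message text are arbitrary assumptions) shows the basic idea: the sender numbers and splits a message into packets, the packets may arrive in any order, and the receiver uses the sequence numbers to reproduce the original stream without change.

```python
# Illustrative sketch only: a toy model of TCP-style packetisation and reassembly.
# Real TCP also handles acknowledgements, retransmission and flow control.
import random

def packetize(message: str, size: int = 8):
    """Split a message into (sequence_number, chunk) pairs."""
    return [(seq, message[i:i + size])
            for seq, i in enumerate(range(0, len(message), size))]

def reassemble(packets):
    """Rebuild the original message by ordering packets by sequence number."""
    return "".join(chunk for _, chunk in sorted(packets))

original = "Hello, ARPANET! A host-to-host message."   # arbitrary example text
packets = packetize(original)
random.shuffle(packets)                 # packets may arrive out of order
assert reassemble(packets) == original  # the stream is reproduced without change
```

In the real protocol, the addressing and forwarding of each packet across networks is handled by IP, while TCP is responsible for the numbering, reassembly and error checking sketched above.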

Cerf and Kahn's new protocol opened up new avenues of collaboration between the ARPANET and all the other networks around the world that had been inspired by ARPA's work. The foundations for a worldwide network were laid, and the doors were wide open for anyone to join in.

Image: ARPANET, 1969-1977. Wikipedia

Expansion of the ARPANET

In the years that followed, the ARPANET consolidated and expanded, all while remaining virtually unknown to the general public. On July 1, 1975, the network was placed under the direct control of the Defense Communications Agency (DCA). By then there were already 57 nodes in the network. The larger it grew, the more difficult it was to determine who was actually using it: there were, in fact, no tools to check the network users' activity. The DCA began to worry. The mix of a fast growth rate and a lack of control could potentially become a serious issue for national security. Trying to control the situation, the DCA issued a series of warnings against any unauthorised access and use of the network. In his last newsletter before retiring to civilian life, the DCA's appointed ARPANET Network Manager, Major Joseph Haughney, wrote:

Only military personnel or ARPANET sponsor-validated persons working on government contracts or grants may use the ARPANET. […] Files should not be [exchanged] by anyone unless they are files that have been announced as ARPANET-public or unless permission has been obtained from the owner. Public files on the ARPANET are not to be considered public files outside of the ARPANET, and should not be transferred, or their contents given or sold to the general public without permission of DCA or the ARPANET sponsors.

However, these warnings were largely ignored, as most of the networked nodes had, as Haughney put it, "weak or nonexistent host access to the control mechanism". By the early 1980s, the network was essentially an open-access area for both authorised and non-authorised users. This situation was made worse by the drastic drop in computer prices. With the potential number of machines capable of connecting to the network increasing constantly, concern over its vulnerability rose to new heights.

War Games

The 1983 hit film War Games, about a young computer whiz who manages to connect to the supercomputer at NORAD and almost starts World War III from his bedroom, perfectly captured the military's mood towards the network. By the end of that year, the Department of Defense, "in its biggest step to date against illegal penetration of computers" – as The New York Times reported – "split a global computer network into separate parts for military and civilian users, thereby limiting access by university-based researchers, trespassers and possibly spies".

The ARPANET was effectively divided into two distinct networks: one, still called ARPANET, mainly dedicated to research, and the other, called MILNET, a military operational network protected by strong security measures such as encryption and restricted access control.

Image: ARPANET map, 1982. Wikipedia

By the mid-1980s the network was widely used by researchers and developers. But it was also being picked up by a growing number of other communities and networks. The transition towards a privatised Internet took ten more years, and it was largely handled by the National Science Foundation (NSF). The NSF's own network, NSFNET, had been using the ARPANET as its backbone since 1984, but by 1988 the NSF had already initiated the commercialisation and privatisation of the Internet by promoting the development of "private" and "long-haul networks". The role of these private networks was to build new or maintain existing local/regional networks, while providing their users with access to the whole Internet.

The ARPANET was officially decommissioned in 1990, whilst in 1995 the NSFNET was shut down and the Internet effectively privatised. By then, the network - no longer the private enclave of computer scientists and the military - had become the Internet, a new galaxy of communication ready to be fully explored and populated.

The Internet

During its early stages, between the 60s and 70s, the communication galaxy spawned by the ARPANET was not only mostly uncharted space but, compared to today's standards, also mainly empty. It continued as such well into the 90s, before the technology pioneered with the ARPANET project became the backbone of the Internet.

Image: Sir Tim Berners-Lee and the first World Wide Web page. Tim Berners-Lee/Web

In 1992, during its first phase of popularisation, the global networks connected to the Internet exchanged about 100 gigabytes (GB) of traffic per day. Since then, data traffic has grown exponentially along with the number of users and the network's popularity. A decade later, thanks to Tim Berners-Lee's World Wide Web (1989), there was an ever-increasing availability of cheap and powerful tools to navigate the galaxy, not to mention the explosion of social media from 2005 onward. And so 'per day' became 'per second': in 2014 global Internet traffic peaked at 16,000 GB per second, with experts forecasting the number to quadruple before the decade is out.
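To put the two figures quoted above on a common footing, a rough back-of-the-envelope comparison (illustrative only, using the approximate numbers cited in this essay) looks like this:

```python
# Rough comparison of the traffic figures quoted above (approximate, illustrative only).
SECONDS_PER_DAY = 24 * 60 * 60              # 86,400 seconds

traffic_1992 = 100 / SECONDS_PER_DAY        # ~0.0012 GB per second in 1992
traffic_2014 = 16_000                       # GB per second in 2014

print(f"1992: {traffic_1992:.4f} GB/s")
print(f"2014: {traffic_2014:,} GB/s")
print(f"Growth factor: roughly {traffic_2014 / traffic_1992:,.0f}x")   # about 13.8 million
```

In other words, on these figures global traffic grew by a factor of roughly 14 million in just over two decades.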

Still, numbers can sometimes be deceptive, as well as frustratingly confusing for the non-expert reader. What hides beneath their dry technicality is a simple fact: the enduring impact of that first stuttered hello at UCLA on October 29, 1969 has dramatically transcended the apparent technical triviality of making two computers talk to each other. Nearly five decades after Kleinrock and Kline’s experiment in California, the Internet has arguably become a driving force in the daily routines of more than three billion people worldwide. For a growing number of users, a mere minute of life on the Internet is to be part, simultaneously, of an endless stream of shared experiences that include, among other things, watching over 165,000 hours of video, being exposed to 10 million adverts, playing nearly 32,000 hours of music and sending and receiving over 200 million emails.

Albeit at different levels of participation, the lives of almost half of the world population are increasingly shaped by this expanding communication galaxy.

We use the global network for almost everything. 'I'm on the Internet', 'Check the Internet', 'It's on the Internet' and other similar stock phrases have become shorthand for an increasing range of activities: from chatting with friends to looking for love; from going on shopping sprees to studying for a university degree; from playing a game to earning a living; from becoming a sinner to connecting with God; from robbing a stranger to stalking a former lover; the list is virtually endless.

But there is much more to it than this. The expansion of the Internet is deeply entangled with the sphere of politics. The more people embrace this new age of communicative abundance, the more it affects the way in which we exercise our political will in the world. Barack Obama's victory in 2008, the Indignados in Spain in 2011, the Five Star Movement in Italy in 2013, Julian Assange's Wikileaks and Edward Snowden's revelations of the NSA's secret system of surveillance are but a handful of examples that show how, in just the last decade, the Internet has changed the way in which we engage with politics and challenge power. The Snowden files, however, also highlight the other, much darker side of the story: the more networked we become, the more we become exploitable, searchable and monitored, often without realising it.

Several decades after the journey began, we have yet to reach the full potential of the 'Intergalactic Network' imagined by Licklider in the early 1960s. However, the quasi-perfect symbiosis between humans and computers that we experience every day, albeit not without shadows, is arguably one of humanity's greatest accomplishments.

Authors: Giovanni Navarria, Lecturer and Research Fellow, Sydney Democracy Network, School of Social and Political Sciences (SSPS), University of Sydney

Read more http://theconversation.com/how-the-internet-was-born-from-the-arpanet-to-the-internet-68072
