
Peer 2 Peer Introduction

P2P is a strange topic to talk about, since in normal network jargon it refers to connecting computers across a network to share resources and, in some cases, to work as one (cluster computing). In this section, we will talk about P2P software and its history. What attracts users across cyberspace to it, what software is available, and why are the authorities working overtime to make P2P a thing of the past? All that and more as we move on. The traditional way of networking is the client-server architecture. In this type, there are dedicated computers (servers) that let other computers (clients) access and use their resources. In the P2P type, all computers are connected to a network, and at any point of time any computer can act as a server while another computer that uses its resources is the client.
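The idea that any machine can play either role is easy to sketch in a few lines of Python. In this much-simplified illustration, one thread acts as the "server" peer handing out data while the main thread acts as the "client" peer downloading it. The function names and the local-only addresses are our own inventions; real P2P software layers peer discovery, search and transfer protocols on top of this basic idea.

```python
import socket
import threading

# The "server" role: listen on a port and hand out data to one peer.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))      # port 0 = let the OS pick a free port
srv.listen(1)
port = srv.getsockname()[1]

def serve(data: bytes) -> None:
    conn, _ = srv.accept()
    with conn:
        conn.sendall(data)
    srv.close()

# The "client" role: connect to another peer and download everything.
def fetch(port: int) -> bytes:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect(("127.0.0.1", port))
        chunks = []
        while chunk := cli.recv(4096):
            chunks.append(chunk)
        return b"".join(chunks)

# The same machine plays both roles at once -- the essence of P2P.
t = threading.Thread(target=serve, args=(b"hello peer",))
t.start()
received = fetch(port)
t.join()
```

Here both peers happen to live in one process for convenience; in practice each would run on a different machine, with each machine running both the listening and the connecting half.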

Peer-to-peer file-sharing is based on this protocol. When you install software such as Napster or Kazaa on your computer and access the Internet, it connects to other computers to download whatever you have searched for using the client. Based on this protocol, Napster was launched in May 1999. It was the first P2P software, though not in the truest sense of the word, because it still needed users to connect to a central server; once the client was identified, further file transfer was passed on to the nodes. This model was furthered by software such as Kazaa and eDonkey. However, with each new piece of software came a different implementation of the technology. We will talk about this when we pick apart each of these programs.
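Napster's hybrid design, a central index for search with transfers happening directly between peers, can be sketched roughly as follows. The `CentralIndex` class and the peer addresses are hypothetical, purely for illustration; the real Napster protocol was considerably more involved.

```python
# Sketch of a Napster-style hybrid: the central server only maps
# file names to the peers that hold them. The files themselves are
# transferred peer-to-peer and never pass through the server.

class CentralIndex:
    def __init__(self):
        self._index = {}  # file name -> set of peer addresses

    def register(self, peer: str, files: list) -> None:
        """A peer announces which files it is sharing."""
        for name in files:
            self._index.setdefault(name, set()).add(peer)

    def search(self, name: str) -> list:
        """Return the peers holding a file (sorted for stable output)."""
        return sorted(self._index.get(name, set()))

index = CentralIndex()
index.register("10.0.0.5:6699", ["song.mp3", "demo.mp3"])
index.register("10.0.0.7:6699", ["song.mp3"])

peers = index.search("song.mp3")
# The client would now connect to one of `peers` directly to
# download song.mp3; the index server's job ends at the lookup.
```

This is also why shutting down the central server was enough to cripple Napster: with no index, peers had no way of finding each other, even though the files still sat on users' machines.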

Since 1999, there have been many changes in the P2P world. Napster has since closed down owing to a ruling by the American courts. Napster is still available, but in a form that is no longer attractive to P2P users. Newer P2P software such as BitTorrent has reared its head in recent years and is the latest rage amongst P2P users. Filling the gap between Napster and BitTorrent was eDonkey, which was going strong until BitTorrent appeared. eDonkey is still very much available, but its popularity amongst users has dropped considerably.

In the coming section, we will discuss the most-used P2P software worldwide and how to use them. But before you start reading it, remember that downloading material off the Internet, such as movies or music you do not own or hold the copyright to, is illegal.

We do not condone such activities; the information provided here is only for the sake of informing you as a reader, not to give you ways of downloading material illegally. Please be aware of what is legal on the Internet and steer clear of activities that could potentially land you in jail. With that said, let's look at the software mentioned above and understand how it works.


Right Now!

Evolution of the Internet: Right Now

Right now, the Internet is growing at an amazing rate. This rate could perhaps be second only to the burgeoning population of third world countries. Internet use across the planet has grown by a whopping 146 per cent since 2000 and is continuously moving upwards.

The main reason for this is that the Internet has now become an accepted part of mainstream urban life. The Internet is used for almost everything imaginable, from recreation and entertainment to learning about rocket science. However, for every good thing, a million bad ones hit you. The same has happened with the Internet. It is a huge resource for everything under the sun; however, not all the data available is harnessed in the right way or for the right purpose.

There is an information overload, and within a short time you can turn from a learner into a researcher. Another factor is that not all data available on the Internet is true, and it should not be accepted at face value, which users normally do. This has been the case since the Internet came into existence. With more and more users getting online every day, it has become difficult to search and sift through data. Not all the results a search engine provides are the ones you are looking for; depending on the search engine used, the results could be based on which advertiser pays the most. For the statistician in you, here are a few figures to gulp down.

Research to find out the number of Web sites visited by users residing in the US showed a total of 164,961 users in a month, with Yahoo!, Time Warner and MSN the most-visited domains. The Entertainment category comprised a total of 128,863 users, whose most-visited Web sites were Viacom Online, AOL Entertainment, Yahoo! Music and Windows Media, in that order. And these figures are for just one month of users surfing the Internet in the US! If we start collating data from other countries, we will definitely arrive at a mind-boggling figure. So where do we go from here?


The Early Years

In the 1980s, there was widespread development and use of LANs, PCs and workstations, which allowed the Internet to flourish. By 1985, the Internet was refined and well established as a technology, and although it was still limited mostly to researchers and developers, regular computer users were starting to use it for daily activities. Electronic mail, or e-mail, was the most-used application, and the interconnection between different mail systems demonstrated the utility of broad-based electronic communication between people.

Until this time, networking protocols were still being developed. However, the major protocol that was polished at the time was TCP/IP. In fact, by 1990, the ARPANET was decommissioned and TCP/IP had displaced most other Wide Area Networking (WAN) protocols and was fast becoming the accepted protocol for internetworking.

After a while, the World Wide Web came into existence. A consortium called the World Wide Web Consortium (W3C) was formed in 1992, led initially by Tim Berners-Lee (the inventor of the WWW) and Al Vezza. The W3C is the body that has taken on the responsibility for evolving the various protocols and standards associated with the Web.

Soon after this, the commercialisation of the Internet started taking place. There was a marked increase in the number of communities across the Internet; Bulletin Board Services and Usenet groups were now home to more computer users than ever before. With this growth in activity, corporations and businesses also started looking at the Internet in terms of business viability.

In 1994, Pizza Hut started offering pizza ordering on its Web page while First Virtual, the first ‘cyberbank’, launched its online presence. By this time, there were more than 3,864,000 hosts on the Internet. Since then, the Net has grown rapidly and the process continues to this day.



ARPANET was transformed into the Internet as we know it today. The technology involved was a very basic packet switching network that later grew to include packet satellite networks, ground-based packet radio networks and other networks. The basic idea behind the Internet was to connect multiple independent networks of rather arbitrary design, beginning with the ARPANET: in other words, the technical idea of open-architecture networking.

Until that time, there was only one method of networking, which was the circuit switching method. In this method, networks interconnecting at the circuit level passed individual bits on a synchronous basis along a portion of an end-to-end circuit between a pair of end locations.

What this means is that in circuit switching mode, there is a dedicated path along which data travels from the source to the destination. In packet switching mode, this is not the case: the hub or router that acts as the intermediary decides the best path for sending or receiving each data packet, which gives the process more flexibility and increased efficiency.
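As a toy illustration of this per-packet decision, the sketch below picks the cheapest next hop from a cost table, and picks differently once that link's cost rises. The path names and costs are made up for the example; real routers build such tables using routing protocols such as RIP or OSPF.

```python
# Each packet is routed independently: the router consults its
# current cost table and forwards the packet along the cheapest
# known path. If a link degrades, later packets simply take
# another route -- something a fixed circuit cannot do.

cost_table = {"path_A": 4, "path_B": 2, "path_C": 7}

def best_next_hop(table: dict) -> str:
    """Return the path with the lowest current cost."""
    return min(table, key=table.get)

first = best_next_hop(cost_table)   # cheapest link right now
cost_table["path_B"] = 9            # that link becomes congested
second = best_next_hop(cost_table)  # the next packet reroutes
```

In circuit switching, by contrast, the route is fixed for the whole conversation when the circuit is set up, so a congested or failed link stalls everything travelling over it.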

The future of networking was based on this skeleton. Further down the line, TCP/IP (Transmission Control Protocol/Internet Protocol) was developed because the Network Control Protocol (NCP), which was in use at the time, did not have the ability to address networks (and machines) further downstream than a destination IMP. This limitation was the main driving force behind the development of the protocol we know today as TCP/IP.

Further down the line, applications such as Telnet, a very basic application for remote login, were developed, along with tools for file transfer. However, e-mail remained the single biggest innovation of that era. Other applications being developed at the time included packet-based voice communication (the precursor of Internet telephony), various models of file and disk sharing, and early "worm" programs that demonstrated the concept of agents (and, of course, viruses). This form of the Internet, though, was very limited, and soon the commercialisation of the fledgling Internet seemed a very real possibility.