By: Boris Rousseau
The history of mankind has been greatly influenced by our successes and failures in communicating our perception of the world around us to others. Marshall McLuhan, the Canadian communications expert, argued that the technology of communication greatly affects our understanding of the message to be communicated: "the medium is the massage" [1].
In the developed world, telephony is the dominant communications technology. Over 20% of the world's population has access to a telephone [2]. Indeed, an advanced and reliable telephony network is considered necessary for a nation to consider itself developed. Like most everyday technologies, telephony can seem like magic. Strictly speaking, telephony involves reproducing sounds at a distance. Typically the sounds are those produced by the human voice, and they are transmitted as electrical impulses over wires in a similar manner to that proposed by Alexander Graham Bell in 1876. Other transmission technologies such as wireless or radio transmission and optical fiber are also widely used, but rarely provide the direct connection between the home and the network. This direct connection is often known as the access network or 'last mile'.
The network itself is made up of a collection of telephony switches or exchanges that are used to route voice data from one telephone to another. Exchanges are the practical alternative to connecting every telephone to every other telephone using a wire or group of wires for each connection. A phone call is a virtual connection or circuit between two telephones that is maintained by the network for the duration of the call and provides a consistent and reliable flow of voice data between phones.
In Ireland the vast majority of home telephones are analogue. This refers to the data transmission method used. Voice data is represented by electrical impulses that are analogues of the voice that produced them: the strength of the electrical signal varies linearly with the loudness of the human voice that produced it. This is an acceptable method of transmission over short distances. Unfortunately the world is an extremely hostile place for telephone wires and analogue telephony transmissions. There are many factors that can produce small degradations in the signal, cumulatively producing inaudible and unclear reproductions at the other end of the 'phone line'. To preserve the signal's integrity over long distances it becomes necessary to represent the voice as digital data. This happens before transmission in the case of digital transmission systems like Integrated Services Digital Network (ISDN), and between or at telephone exchanges for a normal 'analogue call'. This protects the data's integrity in two ways. The level of the signal is sampled at regular intervals and then approximated, or quantised, to a fixed number within a certain tolerance of the signal's strength or amplitude. The samples are then transmitted as binary data. Simply put, binary or base-2 data represents numbers using a sequence of 1s and 0s. It is much easier to recover the state of a signal where there are only two possible states at any one time than it is if the state is an analogue of something infinitely variable, like the loudness of the human voice. Counter-intuitively, it is also possible to transmit binary voice data faster: the information is conveyed in a more elaborate form, but it is harder to corrupt and can therefore be transmitted faster and using a wider range of transmission technologies, e.g. Ultra High Frequency (UHF) radio or optical fiber.
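To make the idea concrete, here is a minimal sketch in Python of the sampling and quantisation step described above; the 8,000 samples per second and 256 quantisation levels are illustrative assumptions, roughly in line with ordinary telephony.

    import math

    SAMPLE_RATE = 8000   # samples per second (assumed, typical of telephony)
    LEVELS = 256         # 8-bit quantisation: 256 possible signal levels

    def digitise(analogue_signal, duration_s):
        """Sample a signal (a function of time returning values in -1.0..1.0)
        and quantise each sample to one of LEVELS discrete integer levels."""
        samples = []
        for i in range(int(SAMPLE_RATE * duration_s)):
            t = i / SAMPLE_RATE
            amplitude = analogue_signal(t)                         # measure the signal
            level = round((amplitude + 1.0) / 2.0 * (LEVELS - 1))  # approximate it
            samples.append(level)                                  # an integer, sent as binary
        return samples

    # Example: digitise one hundredth of a second of a 1 kHz tone
    tone = lambda t: math.sin(2 * math.pi * 1000 * t)
    print(digitise(tone, 0.01)[:10])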
I've provided only a brief introduction to the complexity of the communications devices that we take for granted. Mobile phones and the wireless network will be explored in more detail in later columns.
[1] Marshall McLuhan et al., The Medium is the Massage, 1967.
[2] State of the World Forum, State of the World Index, 2000.


By: Boris Rousseau
You may have heard of HTML (Hyper-Text Markup Language), the underlying format used for defining all Web pages. The beauty of HTML was its simplicity, but it was not very flexible. If people wanted to add new types of markup (e.g. a way of centring all headings), a whole new version of HTML was needed.
XML was developed to avoid this problem. Instead of simply defining one standard, it is a standard for defining standards (i.e. a meta-language). As long as you follow the rules, you can define what you like; you just need to make the definitions available as well. So, XML is really nothing more than a standardised system for using a text format to represent structured information (most commonly on the Web). This structured information may contain both content (words, pictures, etc.) and some indication of what role that content plays (for example, book, author, title).
XML stands for eXtensible Markup Language. Each of these words describes an important part of what XML is and what it does.
The first word is Extensible, which gives XML much of its strength and flexibility. In XML, you create the tags you want to use. XML extends your ability to describe a document, letting you define meaningful tags for your applications. For example, if your site contains many library terms, you can create a tag specifically for marking up those terms.
The second word in XML is Markup. This is the purpose of XML: to identify elements within your document. By marking up your document, you begin to give meaning to the pieces within. You identify the bits and pieces in a way that gives them value and context. And, with extensible markup you can mark up the document in ways that match your needs.
The third word in XML is Language. This states that XML follows a firm set of rules. It may let you create an extensible set of markup tags, but its structure and syntax remain firm and clearly defined. In technology, the term “language” is often automatically appended to the word “programming” as in “programming language”.
IT people often assume that all languages are for programming and for creating a set of actions. But a language is just a way of describing something – be it a program’s actions or a markup definition. Extensible Markup Language is a means of marking up data, using a specific syntax.
All of this makes XML a great way to share information over the Internet, because:
* it is open: XML can be used to exchange data with other users across different platforms in a platform-independent way.
* it is self-describing: its extensible markup makes it an effective choice for business-to-business and extranet solutions.
* you can share data without any prior coordination: mechanisms in XML allow you to discover the structure of a class of XML documents.
To illustrate this, here is a short example of an XML document describing a library:

    <?xml version="1.0"?>
    <library>
      <book>
        <author>Charles M. Schulz</author>
      </book>
    </library>
As you can see above, an XML document comprises three parts:
* An optional prolog
* The body of the document, consisting of one or more elements in the form of a hierarchical tree, which may also contain character data
* An optional epilogue comprising comments and processing instructions (note: there is no epilogue in the above example)
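As a rough illustration of how software can discover that structure, the following short Python sketch parses the library document above and walks its element tree, printing each tag and its character data; the file name library.xml is an assumption for illustration.

    import xml.etree.ElementTree as ET

    # Parse the library document (file name assumed for illustration)
    root = ET.parse("library.xml").getroot()

    def walk(element, depth=0):
        """Print each element's tag, indented by depth, with any character data."""
        text = (element.text or "").strip()
        print("  " * depth + element.tag + (": " + text if text else ""))
        for child in element:          # recurse into child elements
            walk(child, depth + 1)

    walk(root)   # prints: library, book, author: Charles M. Schulz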
Whether they realize it or not, many people will be coming into contact with XML in the years to come: the web page they are reading may originally have been authored in XML, or the menu on their mobile phone may have been written in it. In particular, business people will find that XML becomes core to the way business computing is done. Already the Irish Tax Office are experimenting with the submission of tax returns in a standard format, specified using XML.

A Truly Smarter Phone

By: Cathal O’Riordan
In 10 years people will visit museums and marvel at the sheer size and weight of the early generation mobile handsets. Hardly worthy of the term 'mobile' by today's standards, these colossal brutes had limited battery power and user features. For example, the Nokia Mobira Senator, circa 1982, weighed 9.8 kg. Thankfully, as advances were made in microelectronics the size of mobile handsets continued to diminish, and they became the small and sometimes unobtrusive devices we use in our everyday lives.
Traditionally mobile phones have been voice-centric, used mainly for making and receiving calls. The achievable data transmission rates over Global System for Mobile Communications (GSM) networks have made it impractical to offer services like e-mail (with attachments), Internet access or video/music streaming, but the recent introduction of the General Packet Radio Service (GPRS) and the forthcoming 3rd Generation (3G) networks make these types of services a reality for customers. Manufacturers of both hardware and software are now facing new challenges as they endeavour to provide the content-rich environments mobile users have come to expect of their desktop systems.
We are now seeing the latest evolution of mobile handsets with the introduction of the Smart Phone to the mobile market. A Smart Phone is a combination of both phone and Personal Digital Assistant (PDA) in a single device. It offers the capabilities of a traditional mobile phone but in addition allows a user to perform many of the tasks currently available on more familiar systems. Heralded as the Swiss-army knife of portable electronics, these devices boast colour-rich displays, Personal Information Management tools (calendar, e-mail, appointments etc.), Internet Browsing, Multimedia Messaging Service (MMS), document/spreadsheet editing software, personalisation (ring tones, colour schemes, images etc.), streaming media, games and digital image capture all packaged in a small convenient unit. This level of functionality comes at a price though. Smart Phones are expected to be marketed as premium priced handsets fetching anywhere between EUR200 and EUR800.
The Smart Phone market is becoming very competitive, with leading companies like Microsoft, Palm, Sony Ericsson, Motorola, and Nokia all entering the arena. Each company is shaping the industry with the expertise it brings from its own marketplace. Software manufacturers such as Microsoft and Palm produce the platforms or operating systems, which are then licensed for distribution with phones from equipment manufacturers. For example, Microsoft, in association with UK-based mobile manufacturer Sendo, is set to launch its first mobile phone operating system, called Smartphone 2002 (the actual phone will be called the Sendo Z100). Code-named "Stinger", it is similar in appearance to the familiar Windows PC platform and will thrust Microsoft into yet another popular consumer market. However, Smartphone 2002 has been fraught with delays and is not expected for release until early 2003. Other leading competitors such as Symbian and Palm have already launched their Smart Phone platforms, which have been adopted by mobile manufacturers and consumers alike. Symbian has been a popular choice amongst the latest phones from Nokia and Sony Ericsson: the Nokia 7650, 3650 and Sony Ericsson P800 all exploit the rich feature-set of the Symbian platform.
Nobody is certain who will emerge victorious in the Smart Phone market. Each mobile manufacturer is backing the platform it thinks is a clear winner, but in this new and unpredictable environment no one is a safe bet. Industry sceptics fear that users will be unable to use the features of a Smart Phone in the way they were intended because of the relative immaturity of high-speed mobile networks. Although the underlying principle of a Smart Phone is to operate both with and without a connection to the Internet or network, for most features such a connection is a necessity. What is apparent, though, is the benefit of the Smart Phone to the consumer. Users will soon enjoy true mobility, with the applications and services they once associated with their PC or laptop available from a mobile phone.

Computer Viruses: Why Do They Infect?

By: Jonathan Brazil
There are many definitions of a virus, but put simply a virus is just a set of malicious instructions telling a computer to do something nasty. This behaviour can be trivial, resulting in the unwanted deletion of personal files. Yes, I did say trivial, when you consider the comparative catastrophe of your personal files being sent to recipients via e-mail and your PC being rendered useless to stop you discovering the misdemeanour. Personal files should be backed up anyway to a separate medium such as floppy, Zip or CD; it does not take a virus to destroy your personal data!
There are several reasons for the rapid spread of viruses in recent times. Popular operating systems that have captured a large market share have done so because of sophisticated features allowing system tasks to be automated via a script of instructions. This gives the system a nicer feel and makes it appear more powerful to the experienced user, but it is also the system's Achilles' heel. If the operating system knows how to process these instructions, there is nothing stopping somebody from writing a script of instructions to perform some less graceful activity.
Viruses spread in a number of ways; e-mail viruses spread by sending a copy of themselves to every e-mail address in your contacts directory, thus prolonging the lifecycle of the virus by potentially infecting more users. However, the biggest cause of virus dispersion is lack of vigilance on the user's behalf. People open files that are e-mailed from people they don't know; these files could do anything. You might receive an e-mail virus from someone you know, but despite the impersonal facade of e-mail, people still add personal touches. If the e-mail has no greeting, no meaningful content or an attachment name that makes no sense, then don't open it. If it follows the aforementioned pattern and you weren't expecting it, then it probably wasn't for you and you don't want to see what it can do. Vigilance is the only sure way to protect your system; don't be tempted to click into the unknown.
On an interesting note, you should be aware that many e-mails warning of viruses are in fact hoaxes. They waste time, energy and computing resources by spreading rumours. One of the first of these hoaxes was the notorious "Good Times" hoax, started in December 1994 and still going around in various forms, warning readers not to read any e-mail with "Good Times" as the subject. As a good rule of thumb, check an authoritative website to see if you are dealing with a hoax before mailing virus warnings to all your friends and colleagues.
Viruses have plagued mankind since the dawn of time and it seems only fitting that, as we complete our transition into the computer era, viruses should remain the scourge of the general public. The common cold is no longer the only infectious thing we have to worry about.
There are many virus protection programs. Every computer should really have one installed, with its list of known viruses regularly updated (this can be done automatically over the Internet for most programs).

Time- and distance-based billing models are dead!

By: Conor Ryan
To survive in the rapidly expanding services market, service providers have to overhaul their traditional billing and accounting models. The flat-rate model (a monthly subscription offering unlimited usage for a fixed price) is a 'going-out-of-business' model, and it is widely accepted amongst consumers and providers that future billing for present and emerging services will be content- and usage-based.
This poses the question as to why consumers are not already being charged for content to a greater extent. Up to now the widespread and continued usage of the flat-rate model has been attributed to the fact that most existing service providers are accustomed to distance- and time-based billing. It is the comfort zone: it has always worked in the past, it will always work in the future and, above all, it is simple. However, a lot of new services, and some existing ones, are Internet Protocol (IP) based. In an IP world, geography is irrelevant, so distance-based billing just does not work. Time-based billing fails even more catastrophically, as it is perceived to discourage the user from using the network. Also, until recently, service providers were locked into the ubiquitous flat-rate business model by their fear that nobody would pay for content when they could get it, or something similar, elsewhere for free (much as with freeware software). The service providers could also, justifiably, explain their reluctance to wring value from content (through usage-, value-, service-, application- or transaction-based charging) by pointing to the absence of sophisticated billing systems adapted to the IP environment, able to extract detailed network usage information, exchange usage and accounting information, and so on.
Consumers' growing Quality of Service expectations have also highlighted severe inadequacies in the service providers' flat-rate business models. Consider a simple comparison of the two models: in the traditional model a consumer requests the download of a 300-megabyte movie, but the actual download takes 332 megabytes because of retransmission issues. The provider can only charge for 300 megabytes and hence loses out on 32 megabytes, while also failing to provide a quality service (perhaps because of a fault with the network provider). In a content-based model the consumer is charged for the movie, not the megabytes: the movie download costs EUR19.99 irrespective of the time or size of the download, or of where it is downloaded from.
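As a back-of-the-envelope sketch of the difference, the following Python fragment contrasts the two charging schemes for the movie download described above; the per-megabyte rate is an assumption chosen purely for illustration.

    # Volume-based charging: the customer pays per megabyte of content delivered,
    # so the provider can bill only for the 300 MB movie, not the 332 MB carried.
    RATE_PER_MB = 0.05                # assumed rate, EUR per megabyte
    content_mb = 300                  # size of the movie itself
    transmitted_mb = 332              # actual traffic, including retransmissions

    volume_charge = content_mb * RATE_PER_MB
    unbilled_mb = transmitted_mb - content_mb     # 32 MB carried but never billed

    # Content-based charging: the customer pays for the movie itself,
    # irrespective of how many megabytes (or how long) it took to deliver.
    content_charge = 19.99            # EUR, as in the example above

    print(f"Volume-based charge:  EUR {volume_charge:.2f} ({unbilled_mb} MB unbilled)")
    print(f"Content-based charge: EUR {content_charge:.2f}")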
The Telecommunications Software Systems Group (TSSG) has spent the last five years researching billing and charging software systems to support such a billing model. The TSSG has recently gained recognition for this research through receipt of a substantial grant from Enterprise Ireland to fund the development of a commercial rating product labelled the Rating Bureau System (RBS).
