The growth in the use of electronic commerce (e-commerce) by the business sector has been remarkable since its inception only a few years ago. From federal governments to multinational companies to one-person start-ups, e-commerce is increasingly viewed as an essential business strategy for the future. Ease of transaction, widening markets, and reduced overheads are factors that make e-commerce solutions more and more attractive, as is evident from the growth of online sales.
Here are a few of the issues and problems which influence the development of Internet, e-commerce and e-business applications.
The World Wide Web was developed as a way of distributing documents within the large laboratory at CERN in Geneva. Because many of the developers of the technology were unaware of its potential, there are a variety of problems associated with its huge expansion.
Probably the best known of these is that the Internet is running out of space for identifying computers. The present technology used to transfer data around the Internet is such that in the relatively near future we will run out of space to hold these unique addresses. Happily this is a problem that has been recognised, and groups of researchers around the world have developed new technologies which will eventually overcome it; one of these technologies is a new version of the protocol used to move data over the Internet.
Web servers are what are known as stateless servers. This means that, in their pure form, they keep no memory of what has previously happened to them between requests; for example, when a Web server processes a request for a page, it has no direct knowledge of whether the request was made by the same browser that asked for a previous page.
While this was not serious when Web servers were mainly used for distributing documents (their original use), it is a major problem in e-commerce. One example of this is the shopping cart, or as it is known in the United Kingdom, the shopping trolley. When you visit an e-tailer and buy goods you interact with a simulation of a shopping cart which keeps details of the items you have chosen. At the end of your interaction a Web page, commonly called a checkout page, displays the contents of the shopping cart and presents you with the total cost of your purchases. Web servers as originally envisaged are unable to do this, as they have no knowledge of any previous visit: they would not be able to remember the earlier purchases.
In the relatively early days of the Web this was recognised as a problem, and a form of programming known as Common Gateway Interface (CGI) programming was developed which enabled a Web server to have a memory. A number of other, more recent technologies have been developed to handle this issue. The first is cookies: these are chunks of data which are stored on the computer running the Web browser and which can be accessed by the browser throughout its interaction with a particular Web site. Such cookies can, for example, store the data associated with a shopping cart. Another technology used to store state is servlets; this is a technology associated with Java, and often employing cookies, which enables the programmer to develop reusable code that can be plugged into a server and which keeps data persistently within the Web server.
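The cookie mechanism described above can be sketched in a few lines. This is a minimal illustration in Python, not any particular server framework: the cart is serialised into a cookie value that the browser sends back with every request, which is how a stateless server "remembers" the shopping cart. The function names (`add_item`, `read_cart`) and the cart format are invented for the example.

```python
from http.cookies import SimpleCookie

def add_item(cookie_header: str, item: str) -> str:
    """Return a Set-Cookie value with `item` appended to the cart."""
    cookie = SimpleCookie(cookie_header)
    items = cookie["cart"].value.split("|") if "cart" in cookie else []
    items.append(item)
    cookie["cart"] = "|".join(items)       # serialise the cart back into the cookie
    return cookie["cart"].OutputString()

def read_cart(cookie_header: str) -> list[str]:
    """Parse the cart contents out of the Cookie header the browser sends."""
    cookie = SimpleCookie(cookie_header)
    return cookie["cart"].value.split("|") if "cart" in cookie else []
```

Each response hands the updated cookie back to the browser, and each new request carries the whole cart with it; the server itself stores nothing between requests.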
Another problem with Web servers, again arising from their original purpose, is the fact that Web pages were designed to be static: they were files stored on a computer and delivered in their stored form to anyone using a browser to access them. Many e-commerce and e-business applications need something much more dynamic; for instance, there are a variety of financial service sites on the Internet which supply customers with up-to-date stock and share prices. These prices are held on Web pages and need to change very frequently – often every few seconds. A number of add-on technologies have been developed to deal with this problem.
One early solution is known as a Server Side Include, in which parts of a Web page are marked as dynamic and, before the page is sent to the browser, are updated with data that has changed. Servlets are also used to produce dynamic pages; for example, they can be programmed to return particular Web pages to a browser with content loaded in from a database. Another technology which has become very popular over the last two years is known generically as dynamic pages. This is a more flexible version of Server Side Includes which allows the Java programmer to insert data into a Web page at specified points on a real-time basis.
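The idea behind all of these technologies is the same: the stored page contains marked insertion points, and the server fills them with fresh data just before the page is sent to the browser. A sketch, with the page text and the share-price example invented for illustration:

```python
from string import Template

# The stored page: static markup with marked insertion points.
PAGE = Template("<html><body>ACME share price: $price (updated $time)</body></html>")

def render(price: str, time: str) -> str:
    # In a real server these values would come from a live data feed
    # or a database; here they are simply passed in.
    return PAGE.substitute(price=price, time=time)
```

Every request runs the substitution afresh, so the browser always sees the latest price even though the page template itself never changes.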
Security and privacy
The Internet is not a particularly secure place. There are two aspects to this: the first is that information is widely published on the Internet which can be used for criminal and near-criminal activities. The second aspect is that, since the Internet is an open system, details of its underlying technologies are freely available to anybody. This means that the way data passes around the Internet is in the public domain; the consequence is that, in theory, anybody with the right tools can eavesdrop on data passing from one computer on the Internet to another.
It is worth examining the first problem. You have already met one of the consequences of data being freely published on the Internet: the fact that spammers can use programs known as address harvesters to send huge amounts of unsolicited e-mail to users. There are far more serious manifestations of this problem; for instance, a phenomenon that has emerged in the last three years is cyberstalking. This is where a user of the Internet discovers the details of another user's e-mail account and harasses them electronically: sending them e-mails, contacting them through newsgroups and intruding into the chat rooms that they use.
The possession of an e-mail address can even provide the means by which someone can bring down part of a networked system. It is fairly easy to set up a computer to send many thousands of e-mails to a computer which handles e-mail communication for a company or organisation; the volume of e-mails can be so high that the computer is unable to carry out its main function: that of enabling staff of the company or organisation to send and receive e-mails.
The second aspect of security is that the data flows across the World Wide Web, and the protocols used to communicate with computers on the Internet, are public. This means that anybody who wants to break into a computer which has a connection to the Internet, or anybody who wants to read the data passing through it, has a considerable advantage. There is, however, a contrary point of view which holds that by keeping security information open, any security breaches can be plugged quickly by patches produced by an informed community of developers.
There are considerable gains for the criminal in being able to access a 'secure' system. For instance, a criminal who can read the details of a credit card passing along a transmission line from a browser to a Web server can use that data to buy goods over the net and remain undiscovered until the next credit card statement is delivered to the card holder; in this respect they have a considerable advantage over the criminal who simply steals the card. A criminal who wants to sabotage a network – perhaps a disgruntled former employee of the company – can send a program over the Internet which is then executed on the internal network of the company and deletes key files. A commercial spy can monitor the data being sent down a communication line and discover that it is passing from a company to a well-known research and development organisation which specialises in niche products. This information, even just the name of the R&D company, is valuable to any competitor.
When the Internet and the World Wide Web were developed, security was low on the agenda. There were two reasons for this: the first is that the developers of the embryonic Internet were grappling with what was then novel technology, and most of their focus was on fundamental aims such as establishing and maintaining reliable communications; the second is that very few people realised at the time that the Internet was going to be used for commercial purposes.
Programming and abstraction
In the early 1990s programming an application for the Internet was a difficult proposition. Java, when it appeared in 1996, enabled developers to treat another computer on a network essentially as if it were an input or output device; the programming code required to send data to, or receive data from, another computer differed only slightly from that needed to write to and read from files.
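That abstraction – a network connection that reads and writes like a file – can be shown directly. This sketch uses Python rather than Java, and `socketpair()` to create two connected endpoints in one process so the example is self-contained; in a real application the two endpoints would be on different machines.

```python
import socket

# Two connected endpoints standing in for two networked computers.
a, b = socket.socketpair()
writer = a.makefile("w")   # file-like view of one endpoint
reader = b.makefile("r")   # file-like view of the other

writer.write("hello over the network\n")  # looks exactly like file I/O
writer.flush()
line = reader.readline()                  # so does reading the reply

writer.close(); reader.close(); a.close(); b.close()
```

Once the connection exists, the programmer works with `write` and `readline` just as they would with a local file; the transport details disappear behind the file abstraction.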
Distributed objects are objects which are stored on computers in a network, and to which messages can be sent as if they were objects residing on the computer which is sending the messages. In this way a programmer develops software for a distributed system in the same way that they would for a single computer: by defining classes and by executing code involving objects defined by those classes, with the code sending messages to the objects; the actual details of how the messages are transported are hidden from the programmer.
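The usual way this hiding is achieved is with a local proxy: the programmer sends ordinary method calls to the proxy, and the proxy forwards them over some transport to the real object elsewhere. In this sketch the transport is a plain function call standing in for the network, and the class and method names are invented, not any particular middleware API.

```python
class Account:
    """The 'remote' object, notionally living on another computer."""
    def __init__(self, balance: int) -> None:
        self.balance = balance

    def deposit(self, amount: int) -> int:
        self.balance += amount
        return self.balance

class Proxy:
    """Local stand-in: forwards every method call to the remote object."""
    def __init__(self, remote: Account) -> None:
        self._remote = remote  # in reality a network address, not a direct reference

    def __getattr__(self, name: str):
        def call(*args):
            # Here the call would be marshalled, sent over the network,
            # executed remotely, and the result sent back.
            return getattr(self._remote, name)(*args)
        return call

account = Proxy(Account(100))
new_balance = account.deposit(50)   # reads like an ordinary local method call
```

The calling code never mentions the network; replacing the forwarding function with real marshalling and sockets would not change a line of it.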
E-commerce practitioners speak of a Web year. This is the time in which an e-commerce company has to bring to implementation a system that would conventionally take a calendar year to develop. Current estimates are that a calendar year is equivalent to seven Web years. Nowhere is there more of an imperative for companies to develop products and services quickly, together with the computing infrastructure needed to support them, than in e-commerce. In software engineering terms this has given rise to a number of software development methods which are loosely described by the term rapid application development. In technology terms it has generated a number of ideas which go some way along the path that ends with providing facilities that enable companies to develop systems by simply bolting components together, with many of the components being specified using templates.
The rapid development of object-oriented programming languages such as C++ and Java has meant that the last five years have seen a growth of technologies that allow a developer to build software components that can be reused time and time again, in applications other than those for which they were originally developed.
Structure and data
A problem increasingly experienced by Internet companies is that they have to interchange a large quantity of data, and such data inherently lacks structure. For example, HTML has proved to be a durable markup language for developing Web pages; however, there are no facilities within the language to indicate whether an item of data, say a three-digit number, represents the price of a product or an hourly rate charged by a company employee.
There are also problems with browsers. There are two main browsers used on the World Wide Web: Internet Explorer and Netscape Navigator. Each of these browsers can display the pages they process in different ways, especially if the pages use sophisticated facilities of HTML.
There is a further problem with browsers, one even more significant than that described in the previous paragraph. Networking technologies are now being used in conjunction with other technologies, such as those associated with mobile phones and television. This has led to the emergence of a variety of different markup languages which are targeted at specific devices; for instance, there is a markup language called WML (Wireless Markup Language) which is used to display documents on Internet mobile phones. The diversity of such languages means that the overhead of maintaining several versions of a document for different media can be huge.
Happily a technology has been developed, known as XML, which can be used to indicate structure in a document. There are also a number of tools available which allow a developer to keep a single version of a document, expressed in a language defined using XML, and easily convert it into a form that can be displayed on a variety of media, including television, Internet phones and the various World Wide Web browsers.
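The three-digit-number example from earlier shows what XML adds: the markup itself says what a value means. The element and attribute names below (`item`, `price`, `hourly-rate`) are invented for illustration, not any standard schema.

```python
import xml.etree.ElementTree as ET

DOCUMENT = """
<invoice>
  <item><name>Widget</name><price currency="GBP">120</price></item>
  <consultancy><hourly-rate currency="GBP">120</hourly-rate></consultancy>
</invoice>
"""

root = ET.fromstring(DOCUMENT)
price = root.find("./item/price")
rate = root.find("./consultancy/hourly-rate")
# Both elements carry the same three-digit number, but the markup tells a
# program which one is a product price and which an hourly rate - something
# HTML, which only describes presentation, cannot express.
```

A program exchanging this document can act on the tags rather than guessing from position or formatting, which is exactly the structure HTML lacks.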
A distributed transaction is a sequence of operations applied to a number of distributed databases which together form a single functional step. For instance, a transaction which moves a sum of money from one account belonging to a customer to another account owned by the same customer consists of two operations: debiting one account and crediting the other. There are a number of problems associated with distributed transactions. One of the most serious is deadlock: a transaction applied at one server may be waiting for data which is currently held on another server, while the other server is waiting for some resource that is held on the first. For example, the first server may hold the account data that the second server needs to complete a transaction, while the second server may hold other account data that the first needs in order to proceed.
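The circular wait just described, and one classic remedy, can be sketched with two threads standing in for two servers. If every transaction acquires its locks in one agreed order, the cycle (each side holding what the other needs) cannot form. The account names and amounts are invented; real systems use more elaborate schemes such as deadlock detection or two-phase commit.

```python
import threading

accounts = {"A": 100, "B": 100}
locks = {name: threading.Lock() for name in accounts}

def transfer(src: str, dst: str, amount: int) -> None:
    # Always lock the alphabetically smaller account first, so concurrent
    # transfers A->B and B->A can never deadlock waiting on each other.
    first, second = sorted((src, dst))
    with locks[first], locks[second]:
        accounts[src] -= amount
        accounts[dst] += amount

# Two opposing transfers running at once - the situation that deadlocks
# if each thread grabs its own account's lock first.
t1 = threading.Thread(target=transfer, args=("A", "B", 30))
t2 = threading.Thread(target=transfer, args=("B", "A", 10))
t1.start(); t2.start(); t1.join(); t2.join()
```

Without the `sorted` ordering, thread one could hold A's lock while waiting for B's, and thread two hold B's while waiting for A's: exactly the two-server stand-off in the text.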
Designing a distributed system can also be a problem. For example, the fact that computers in a distributed system are joined by communication media which can stretch over thousands of miles adds a further dimension to the design process, because response time can become an issue. Another, equally serious problem is that of reliability: a hardware malfunction, for instance, can bring down a poorly designed distributed system.
As an example of one design problem that a distributed systems developer has to deal with, consider that of replicated data. A replicated database is a database which exists in identical form at a number of points in a distributed system. There are two reasons for having replicated databases. The first is reliability: when a system contains a number of replicated databases and one of them becomes unavailable – perhaps because of a hardware fault – another database can take over its role. The second reason is to improve response time. A designer of a distributed system will try to place a database close to its users, typically connected via a fast local area network. Often the original database is a long distance away and can only be accessed by means of slow Internet links; hence replicating the database and placing the replica near the users often results in a large reduction in response time.
However, replication comes at a cost: each replicated database needs to hold up-to-date data and must coordinate with the other databases in order to do this. This gives rise to synchronisation traffic over the network which supports the databases, and can itself result in a very slow response time. Designing for data replication – choosing the amount of replication and the location of the replicated data so that response time is reduced, but traffic is not increased to the point where all the gains are nullified – is an art.
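The trade-off can be made concrete by counting the messages. This toy key-value store (all names invented, and with the propagation done synchronously for simplicity) serves every read locally, but each write must be pushed to every other replica – the synchronisation traffic the paragraph above describes.

```python
class ReplicatedStore:
    """Toy model: n identical copies of one key-value database."""
    def __init__(self, n_replicas: int) -> None:
        self.replicas = [dict() for _ in range(n_replicas)]
        self.sync_messages = 0   # network traffic spent keeping copies in step

    def write(self, key, value, via: int = 0) -> None:
        self.replicas[via][key] = value
        for i, replica in enumerate(self.replicas):
            if i != via:                 # propagate to every other copy
                replica[key] = value
                self.sync_messages += 1  # one message per remote replica

    def read(self, key, via: int):
        return self.replicas[via][key]   # served from the local copy: no network hop

store = ReplicatedStore(3)
store.write("rate", 4.5)                 # one write -> two sync messages
```

Reads are free of network traffic wherever the replica is local, but write traffic grows with the number of replicas – which is why choosing how much to replicate, and where, is the art the text refers to.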