Friday, June 12, 2009

Genuine Leather Handbag

Genuine Leather Handbag

Think style with the genuine leather Trieste bag, detailed with decorative metal and a detachable pouch. An essential, elegant handbag in Dark Chocolate Brown with a suedette lining, it goes well with your work wear and serves as an all-season bag.

Details


Genuine leather with suedette lining, hanging branding charm, 1 zipper pocket inside, magnetic lock closure, silver metal detailing
Material : Genuine Leather

Genuine Leather Messenger Bag


Genuine Leather Messenger Bag

Overview

Everyday transport for your essential utilities, the Cream Roger is a stylish messenger bag with a secure cushioned pocket and flap closure. The adjustable broad patterned canvas shoulder belt adds style and comfort when carrying.
 
Details
Broad canvas shoulder belt, special section for laptop, 2 slip pockets inside, 1 mobile pocket inside, magnetic closures
Material : Genuine Leather
 

 

Brown Leather Wallet

Brown Leather Wallet

Composed in genuine leather with 3-color stitch styling, this Brown Keller is a 3-fold wallet with 2 bill compartments, multiple card slots, slip pockets and a detachable picture and card pouch on the inside.

Details

Construction : 2 bill compartments, 7 card slots, 3 slip pockets, 2 photo pockets and 1 pouch pocket with a detachable picture and card pouch
Color : Brown
Material : Genuine Leather

Leather Belt

Leather Belt

Quick Overview

Constructed in genuine full-grain leather, snuffed with a resin finish and edged with orange inking, this Jet Black leather belt goes well with your casual dressing.
 
Construction : Leather belt edged with metal buckle
Material : Genuine Leather
Buckle : Antique silver metal buckle, matt finish

History of Linux

The Unix operating system was conceived and implemented in the 1960s and first released in 1970. Its wide availability and portability meant that it was widely adopted, copied and modified by academic institutions and businesses, with its design being influential on authors of other systems.
The GNU Project, started in 1984 by Richard Stallman, had the goal of creating a "complete Unix-compatible software system"[10] composed entirely of free software. The next year, Stallman created the Free Software Foundation, and in 1989 he wrote the GNU General Public License (GNU GPL). By the early 1990s, many of the programs required in an operating system (such as libraries, compilers, text editors, a Unix shell, and a windowing system) were completed, although low-level elements such as device drivers, daemons, and the kernel were stalled and incomplete.[11] Linus Torvalds has said that if the GNU kernel had been available at the time (1991), he would not have decided to write his own.[12]

Friday, April 24, 2009

Origin of returned content

The origin of the content sent by the server is called:

  • static, if it comes from an existing file already stored on the server's filesystem;
  • dynamic, if it is generated on the fly by another program or script invoked by the web server.

Serving static content is usually much faster (from 2 to 100 times) than serving dynamic content, especially if the latter involves data pulled from a database.
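
As a rough sketch of the difference (not any particular server's implementation), the Python fragment below returns static content straight from a file under a document root, while the dynamic variant rebuilds its body on every request from a hypothetical "articles" table reached through a generic database connection:

import os
import time

DOC_ROOT = "/var/www"   # hypothetical document root, as in the path translation example below

def serve_static(path):
    # Static: the bytes already exist in a file on the filesystem and are returned as-is.
    with open(os.path.join(DOC_ROOT, path.lstrip("/")), "rb") as f:
        return f.read()

def serve_dynamic(db):
    # Dynamic: the body is built for each request, here from a hypothetical
    # "articles" table reached through any DB-API style connection object.
    rows = db.execute("SELECT title FROM articles ORDER BY id DESC LIMIT 5")
    items = "".join("<li>%s</li>" % title for (title,) in rows)
    return ("<html><body><p>Generated at %s</p><ul>%s</ul></body></html>"
            % (time.ctime(), items)).encode("utf-8")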


Path translation


Web servers are able to map the path component of a Uniform Resource Locator (URL) into:

  • a local file system resource (for static requests);
  • an internal or external program name (for dynamic requests).

For a static request the URL path specified by the client is relative to the Web server's root directory.

Consider the following URL as it would be requested by a client:

http://www.example.com/path/file.html

The client's web browser will translate it into a connection to www.example.com with the following HTTP/1.1 request:

GET /path/file.html HTTP/1.1
Host: www.example.com
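
The same request can also be issued programmatically; the short Python sketch below uses the standard http.client module to send an equivalent GET for /path/file.html (http.client adds the Host header that HTTP/1.1 requires):

import http.client

# Connect to the host named in the URL and send the GET request shown above.
conn = http.client.HTTPConnection("www.example.com", 80)
conn.request("GET", "/path/file.html")
response = conn.getresponse()
print(response.status, response.reason)
conn.close()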

The web server on www.example.com will append the given path to the path of its root directory. On Unix machines, this is commonly /var/www. The result is the local file system resource:

/var/www/path/file.html

The web server will then read the file, if it exists, and send a response to the client's web browser. The response will describe the content of the file and contain the file itself.
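
A minimal Python sketch of this mapping is shown below; /var/www is the document root from the example above, and the check against ".." escapes is an added precaution rather than something the text above specifies:

import os
from urllib.parse import urlsplit

DOC_ROOT = "/var/www"   # document root from the example above (Unix-style paths assumed)

def translate_path(url):
    # Keep only the path component of the URL; scheme, host and query are ignored.
    path = urlsplit(url).path
    # Append it to the document root and normalise the result.
    candidate = os.path.normpath(os.path.join(DOC_ROOT, path.lstrip("/")))
    # Refuse paths that use ".." to escape the document root.
    if not candidate.startswith(DOC_ROOT + os.sep):
        raise ValueError("path escapes the document root")
    return candidate

print(translate_path("http://www.example.com/path/file.html"))
# prints: /var/www/path/file.html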

Load limits

A web server (program) has defined load limits: it can handle only a limited number of concurrent client connections (usually between 2 and 60,000, by default between 500 and 1,000) per IP address (and TCP port), and it can serve only a certain maximum number of requests per second, depending on:

  • its own settings;
  • the HTTP request type;
  • content origin (static or dynamic).

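As a rough illustration of the connection limit described above, the Python sketch below caps a threaded server with a semaphore; the figure of 500 is an arbitrary choice within the default range mentioned, not a standard value:

import socket
import threading

MAX_CONNECTIONS = 500        # arbitrary cap within the default range quoted above
slots = threading.BoundedSemaphore(MAX_CONNECTIONS)

def handle(conn):
    try:
        conn.recv(4096)      # read the request (parsing is omitted in this sketch)
        conn.sendall(b"HTTP/1.1 200 OK\r\nContent-Length: 2\r\n\r\nok")
    finally:
        conn.close()
        slots.release()      # free the slot for the next client

def serve(port=8080):
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("", port))
    srv.listen()
    while True:
        slots.acquire()      # block once MAX_CONNECTIONS clients are already being served
        conn, _addr = srv.accept()
        threading.Thread(target=handle, args=(conn,), daemon=True).start()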

Web server


  1. A computer program that is responsible for accepting HTTP requests from clients (user agents such as web browsers), and serving them HTTP responses along with optional data contents, which usually are web pages such as HTML documents and linked objects (images, etc.).
  2. A computer that runs a computer program as described above.
Common features

Although web server programs differ in detail, they all share some basic common features.

  1. HTTP: every web server program operates by accepting HTTP requests from the client and providing an HTTP response to the client. The HTTP response usually consists of an HTML document, but it can also be a raw file, an image, or some other type of document (defined by MIME types). If some error is found in the client request, or while trying to serve it, the web server has to send an error response, which may include a custom HTML or text message to better explain the problem to end users. A minimal sketch of such a server follows this list.
  2. Logging: web servers usually also have the capability of logging detailed information about client requests and server responses to log files; this allows the webmaster to collect statistics by running log analyzers on these files.
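
A minimal, runnable sketch of both features, using Python's standard http.server module; the port, log file name and log format are arbitrary choices rather than part of the definition above:

import http.server

class Handler(http.server.SimpleHTTPRequestHandler):
    # SimpleHTTPRequestHandler accepts HTTP requests, maps the URL path to a file
    # under the current directory and sends back an HTTP response; unknown paths
    # produce an error response (404).
    def log_message(self, fmt, *args):
        # Logging: append one line per request to an access log file.
        with open("access.log", "a") as log:
            log.write("%s - - [%s] %s\n" % (self.client_address[0],
                                            self.log_date_time_string(), fmt % args))

if __name__ == "__main__":
    http.server.HTTPServer(("", 8000), Handler).serve_forever()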

In practice, many web servers also implement the following features:

  1. Authentication: optional authorization (a request for a user name and password) before allowing access to some or all kinds of resources.
  2. Handling of static content (file content recorded in the server's filesystem(s)) and dynamic content, by supporting one or more related interfaces (SSI, CGI, SCGI, FastCGI, JSP, ColdFusion, PHP, ASP, ASP.NET, server APIs such as NSAPI and ISAPI, etc.).
  3. HTTPS support (via SSL or TLS) to allow secure (encrypted) connections to the server on the standard port 443 instead of the usual port 80.
  4. Content compression (e.g. gzip encoding) to reduce the size of responses and lower bandwidth usage; a sketch follows this list.
  5. Virtual hosting to serve many web sites using one IP address.
  6. Large file support, to be able to serve files whose size is greater than 2 GB on a 32-bit OS.
  7. Bandwidth throttling to limit the speed of responses so as not to saturate the network and to be able to serve more clients.
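
For item 4, the Python sketch below shows the idea behind content compression: the body is gzip-encoded and the response headers advertise it. Whether a real server compresses a given response depends on the client's Accept-Encoding header and on server configuration; this is only an illustration:

import gzip

def compress_response(body, accept_encoding):
    # Only compress when the client declared support for gzip.
    if "gzip" not in accept_encoding:
        return body, {"Content-Length": str(len(body))}
    compressed = gzip.compress(body)
    return compressed, {"Content-Encoding": "gzip",
                        "Content-Length": str(len(compressed))}

body, headers = compress_response(b"<html>" + b"x" * 10000 + b"</html>", "gzip, deflate")
print(headers)   # the compressed length is far smaller than the original 10,013 bytes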

Web site design


A web site is a collection of information about a particular topic or subject. Designing a web site is the arrangement and creation of the web pages that in turn make up the site. A web page holds the information for which the web site is developed. A web site might be compared to a book, where each page of the book is a web page.

There are many aspects (design concerns) in this process, and due to the rapid development of the Internet, new aspects may emerge. For non-commercial web sites, the goals may vary depending on the desired exposure and response. For typical commercial web sites, the basic aspects of design are:

  • The content: the substance and information on the site should be relevant to the site and should target the area of the public that the website is concerned with.
  • The usability: the site should be user-friendly, with the interface and navigation simple and reliable.
  • The appearance: the graphics and text should follow a single style that flows throughout, to show consistency. The style should be professional, appealing and relevant.
  • The visibility: the site must also be easy to find via most, if not all, major search engines and advertisement media.

A web site typically consists of text and images. The first page of a web site is known as the home page or index. Some web sites use what is commonly called a splash page. Splash pages might include a welcome message, language or region selection, or a disclaimer. Each web page within a web site is an HTML file which has its own URL. Once the web pages are created, they are typically linked together using a navigation menu composed of hyperlinks. Faster browsing speeds have led to shorter attention spans and more demanding online visitors, and this has resulted in less use of splash pages, particularly on commercial web sites.

Once a web site is completed, it must be published or uploaded in order to be viewable by the public over the Internet. This may be done using an FTP client, as in the sketch below. Once published, the webmaster may use a variety of techniques to increase the traffic, or hits, that the web site receives. These may include submitting the web site to a search engine such as Google or Yahoo, exchanging links with other web sites, and creating affiliations with similar web sites.
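
A hedged sketch of that upload step using Python's standard ftplib module; the host name, credentials, directory and file name below are placeholders, not real values:

from ftplib import FTP

# Placeholder host, credentials and paths; replace with the hosting account's details.
ftp = FTP("ftp.example.com")
ftp.login("username", "password")
ftp.cwd("/public_html")                      # a typical web root on shared hosts (assumption)
with open("index.html", "rb") as page:
    ftp.storbinary("STOR index.html", page)  # upload the home page
ftp.quit()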

Applications of databases

Databases are used in many applications, spanning virtually the entire range of computer software. Databases are the preferred method of storage for large multiuser applications, where coordination between many users is needed. Even individual users find them convenient, and many electronic mail programs and personal organizers are based on standard database technology. Software database drivers are available for most database platforms so that application software can use a common Application Programming Interface to retrieve the information stored in a database. Two commonly used database APIs are JDBC and ODBC.
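
JDBC and ODBC are the APIs named above; as an analogous sketch in Python, the standard DB-API follows the same pattern (connect, execute a query, fetch rows). The example uses the built-in sqlite3 driver and a made-up contacts table so that it is self-contained:

import sqlite3

# An in-memory database with a made-up table, so the sketch is self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE contacts (name TEXT, email TEXT)")
conn.execute("INSERT INTO contacts VALUES (?, ?)", ("Ada", "ada@example.com"))

# The application retrieves stored information through the common API,
# not through any engine-specific file format.
for name, email in conn.execute("SELECT name, email FROM contacts"):
    print(name, email)
conn.close()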

Database security

Database security denotes the system, processes, and procedures that protect a database from unintended activity.

Security is usually enforced through access control, auditing, and encryption.

  • Access control ensures and restricts who can connect to the database and what they can do to it.
  • Auditing logs what action or change has been performed, when, and by whom (a small application-level sketch follows this list).
  • Encryption: since security has become a major issue in recent years, many commercial database vendors provide built-in encryption mechanisms. Data is encoded natively into the tables and deciphered "on the fly" when a query comes in. Connections can also be secured and encrypted if required, using DSA, MD5, SSL or legacy encryption standards.
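
As a hedged, application-level illustration of the auditing point above (real database systems usually provide built-in audit facilities; the tables and columns here are invented), each change can be recorded together with who performed it and when:

import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")
conn.execute("CREATE TABLE audit_log (who TEXT, action TEXT, at TEXT)")

def audited_execute(user, sql, params=()):
    # Apply the change, then record who did what and when in the audit log.
    conn.execute(sql, params)
    conn.execute("INSERT INTO audit_log VALUES (?, ?, ?)",
                 (user, sql, datetime.now(timezone.utc).isoformat()))
    conn.commit()

audited_execute("alice", "INSERT INTO accounts (balance) VALUES (?)", (100.0,))
print(conn.execute("SELECT * FROM audit_log").fetchall())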

Enforcing security is one of the major tasks of the DBA.

In the United Kingdom, legislation protecting the public from unauthorized disclosure of personal information held on databases falls under the Office of the Information Commissioner. United Kingdom based organizations holding personal data in electronic format (databases for example) are required to register with the Data Commissioner.


Post-relational database models

Products offering a more general data model than the relational model are sometimes classified as post-relational. The data model in such products incorporates relations but is not constrained by the Information Principle, which requires that all information is represented by data values in relations.

Some of these extensions to the relational model actually integrate concepts from technologies that pre-date the relational model. For example, they allow representation of a directed graph with trees on the nodes.
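
A hedged sketch of the kind of data such models allow: an ordinary relation holds the directed edges, while each row of a nodes table carries a non-atomic, tree-shaped value that a strictly relational design under the Information Principle would instead flatten into further relations. The graph and field names below are invented for illustration:

import json
import sqlite3

conn = sqlite3.connect(":memory:")
# The directed graph itself is an ordinary relation of edges...
conn.execute("CREATE TABLE edges (src TEXT, dst TEXT)")
conn.executemany("INSERT INTO edges VALUES (?, ?)", [("a", "b"), ("b", "c"), ("a", "c")])
# ...while each node row carries a whole tree in its 'detail' column, rather than
# a single atomic value as the Information Principle would require.
conn.execute("CREATE TABLE nodes (id TEXT PRIMARY KEY, detail TEXT)")
tree = {"label": "a", "children": [{"label": "a1"},
                                   {"label": "a2", "children": [{"label": "a2-i"}]}]}
conn.execute("INSERT INTO nodes VALUES (?, ?)", ("a", json.dumps(tree)))

detail = conn.execute("SELECT detail FROM nodes WHERE id = ?", ("a",)).fetchone()[0]
print(json.loads(detail)["children"][1]["label"])   # prints: a2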

Some products implementing such models have been built by extending relational database systems with non-relational features. Others, however, have arrived in much the same place by adding relational features to pre-relational systems. Paradoxically, this allows products that are historically pre-relational, such as PICK and MUMPS, to make a plausible claim to be post-relational in their current architecture.