Tuesday, February 3, 2009

The History of Computers

It is difficult to identify any one device as the earliest computer, partly because the term "computer" has been subject to different interpretations over time. Originally, the term "computer" referred to a person who performed numerical calculations (a human computer), often with the help of a mechanical calculating device.

The history of the modern computer begins with two separate technologies - automated calculation and programmability.

Examples of early mechanical calculating devices include the abacus, the slide rule, and arguably the astrolabe and the Antikythera mechanism (which dates from around 150-100 BC). Hero of Alexandria (c. AD 10-70) built a mechanical theater which performed a play lasting 10 minutes, operated by a complex system of ropes and drums that can be considered a means of deciding which parts of the mechanism performed which actions, and when. This is the essence of programmability.

What Is a Computer?

The first devices that resemble modern computers date to the mid 20th century (1940-1945), although the computer concept and various machines similar to computers existed earlier. Early electronic computers were the size of a large room, consuming as much power as several hundred modern personal computers (PCs). Modern computers are based on small integrated circuits and are millions to billions of times more capable while occupying a fraction of the space.

Today, simple computers may be made small enough to fit in a wristwatch and be powered by a watch battery. Personal computers, in various forms, are icons of the Information Age and are what most people think of as a "computer", but the most common form of computer in use today is the embedded computer. Embedded computers are small, simple devices used to control other devices - for example, they can be found in machines ranging from fighter aircraft to industrial robots, digital cameras, and toys.

Thursday, January 29, 2009

What Should You Know About Your Graphic Designer?

The color printing a company has done is often the first thing a person sees in connection with that company. This is the very point of most advertising. You’re trying to target people who either don’t know who you are or haven’t had any interest before in shopping at your store.

This means your advertising is going to be a very important part of your marketing push, and the backbone of any good advertising is the design. Many companies choose to hire outside graphic designers in order to get their work accomplished, and if this is what your company plans on trying, there are certain details you should be aware of, and questions you should be sure to ask before hiring anyone.

One of the most important questions is whether or not they’ve designed something like what you’re looking to have done. If a brochure is your color printing of choice, ask them directly what experience they have with brochures. Hopefully they’ll have actual samples of brochures they’ve worked on to show you, but if they don’t, be sure to confirm their experience with the style you want.

Now, I’m not saying to never hire someone who hasn’t done exactly what you’re looking for. After all, they might’ve worked on a number of other styles that are close enough to brochures that they’ll do a perfectly fine job, but then, maybe they’ve only handled simple designs like posters and don’t know how to approach the more complicated styling of brochures.

Only you’ll be able to decide whether or not they’re right for you; just be sure you’re aware that different ads need distinctly different skills to make.

Online technical support

Virus And PC Protection Help
Viruses can not only compromise your privacy; they can also degrade your PC's performance and even hang your PC.
Many computer users have replaced their PCs, believing the machine was no longer performing well enough, when all they really needed to do was remove the viruses from their computer to restore good performance.

You can prevent and treat problems with viruses and other hostile software by learning fundamental preventive procedures, and by learning how to take protective steps to defend your PC while eliminating hostile software.

A computer virus is a piece of malicious software capable of interfering with the normal operation of your PC. A virus can damage your hard disk or delete your files. Computer virus problems generally occur when you access the internet, so do not download suspicious files or folders. Additionally, always keep backups of your files.

Askpcexperts provides virus removal help, and guarantees that you need not worry about computer virus problems. To prevent viruses, Askpcexperts suggests that you avoid illegitimate websites and ads.

Monday, January 8, 2007

Special-Purpose Supercomputers

Special-purpose supercomputers are high-performance computing devices with a hardware architecture dedicated to a single problem. This allows the use of specially programmed FPGA chips or even custom VLSI chips, allowing higher price/performance ratios by sacrificing generality. They are used for applications such as astrophysics computation and brute-force codebreaking.

Examples of special-purpose supercomputers:
Deep Blue, for playing chess
Reconfigurable computing machines or parts of machines
GRAPE, for astrophysics and molecular dynamics
Deep Crack, for breaking the DES cipher

Supercomputer Programming

The parallel architectures of supercomputers often dictate the use of special programming techniques to exploit their speed. Special-purpose Fortran compilers can often generate faster code than C or C++ compilers, so Fortran remains the language of choice for scientific programming, and hence for most programs run on supercomputers. To exploit the parallelism of supercomputers, programming environments such as PVM and MPI are used for loosely connected clusters, and OpenMP for tightly coordinated shared-memory machines.
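The message-passing style that MPI embodies - each process owns its slice of the data, computes a partial result, and sends it to a coordinating rank - can be sketched in plain Python using the standard multiprocessing module. This is only a conceptual stand-in, not MPI itself; real supercomputer codes would use MPI from Fortran or C.

```python
# A rough sketch of MPI-style message passing, approximated with
# Python's standard multiprocessing module. Each worker process
# plays the role of an MPI rank; Pipe.send() stands in for MPI_Send,
# and the final gather loosely mirrors an MPI_Reduce at the root.
from multiprocessing import Process, Pipe

def worker(conn, chunk):
    # Each "rank" computes a partial sum over its own data, then
    # sends the result back over a point-to-point channel.
    conn.send(sum(chunk))
    conn.close()

def parallel_sum(data, n_workers=4):
    # Scatter: split the data and hand one slice to each worker.
    chunks = [data[i::n_workers] for i in range(n_workers)]
    pipes, procs = [], []
    for chunk in chunks:
        parent, child = Pipe()
        p = Process(target=worker, args=(child, chunk))
        p.start()
        pipes.append(parent)
        procs.append(p)
    # Gather: combine partial results at the "root".
    total = sum(conn.recv() for conn in pipes)
    for p in procs:
        p.join()
    return total

if __name__ == "__main__":
    print(parallel_sum(list(range(1000))))  # equals sum(range(1000))
```

The same scatter/compute/gather shape underlies most message-passing programs, whatever the language.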

Supercomputer Operating Systems

Supercomputer operating systems, today most often variants of Linux or UNIX, are every bit as complex as those for smaller machines, if not more so. Their user interfaces tend to be less developed, however, as the OS developers have limited programming resources to spend on non-essential parts of the OS (i.e., parts not directly contributing to the optimal utilization of the machine's hardware). This stems from the fact that because these computers, often priced at millions of dollars, are sold to a very small market, their R&D budgets are often limited. (The advent of Unix and Linux allows reuse of conventional desktop software and user interfaces.)
Interestingly, this has been a continuing trend throughout the supercomputer industry, with former technology leaders such as Silicon Graphics taking a back seat to companies such as NVIDIA, who have been able to produce cheap, feature-rich, high-performance, and innovative products due to the vast number of consumers driving their R&D.

Historically, until the early-to-mid-1980s, supercomputers usually sacrificed instruction set compatibility and code portability for performance (processing and memory access speed). For the most part, supercomputers up to this time (unlike high-end mainframes) had vastly different operating systems. The Cray-1 alone had at least six different proprietary OSs largely unknown to the general computing community. Similarly, different and incompatible vectorizing and parallelizing compilers for Fortran existed. This trend would have continued with the ETA-10 were it not for the initial instruction set compatibility between the Cray-1 and the Cray X-MP, and the adoption of UNIX operating system variants (such as Cray's Unicos and today's Linux).
For this reason, in the future, the highest performance systems are likely to have a UNIX flavor but with incompatible system-unique features (especially for the highest-end systems at secure facilities).

Processing Techniques in Supercomputers

Vector processing techniques were first developed for supercomputers and continue to be used in specialist high-performance applications. Vector processing techniques have trickled down to the mass market in DSP architectures and SIMD processing instructions for general-purpose computers.
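The difference between scalar and vector execution can be sketched in plain Python. This is only an illustration of the programming model: a real vector processor or SIMD unit applies one instruction to many elements at once in hardware, whereas both functions below execute element by element.

```python
def scalar_add(a, b):
    # Scalar style: an explicit loop, one element per "instruction",
    # as on a conventional CPU without vector units.
    out = []
    for i in range(len(a)):
        out.append(a[i] + b[i])
    return out

def vector_add(a, b):
    # Vector style: the operation is expressed over whole arrays at
    # once. On SIMD hardware this maps to a handful of wide
    # instructions rather than a per-element loop.
    return [x + y for x, y in zip(a, b)]

assert scalar_add([1, 2, 3], [4, 5, 6]) == vector_add([1, 2, 3], [4, 5, 6]) == [5, 7, 9]
```

Array libraries and DSP instruction sets expose exactly this whole-array style, which is why vectorized code both reads more cleanly and runs faster on SIMD-capable hardware.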

Modern video game consoles in particular use SIMD extensively, and this is the basis for some manufacturers' claims that their game machines are themselves supercomputers. Indeed, some graphics cards have the computing power of several teraFLOPS. The applications to which this power could be applied were limited by the special-purpose nature of early video processing. As video processing has become more sophisticated, graphics processing units (GPUs) have evolved to become more useful as general-purpose vector processors, and an entire computer science sub-discipline has arisen to exploit this capability: General-Purpose Computing on Graphics Processing Units (GPGPU).

Supercomputer Design

Supercomputers using custom CPUs traditionally gained their speed over conventional computers through the use of innovative designs that allow them to perform many tasks in parallel, as well as complex detail engineering. They tend to be specialized for certain types of computation, usually numerical calculations, and perform poorly at more general computing tasks. Their memory hierarchy is very carefully designed to ensure the processor is kept fed with data and instructions at all times—in fact, much of the performance difference between slower computers and supercomputers is due to the memory hierarchy. Their I/O systems tend to be designed to support high bandwidth, with latency less of an issue, because supercomputers are not used for transaction processing.

As with all highly parallel systems, Amdahl's law applies, and supercomputer designs devote great effort to eliminating software serialization, and using hardware to accelerate the remaining bottlenecks.
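Amdahl's law quantifies why eliminating serialization matters so much: if a fraction p of a program can be parallelized, the speedup on n processors is 1 / ((1 - p) + p/n), so the serial fraction caps the achievable speedup no matter how many processors are added. A small calculation makes this concrete:

```python
def amdahl_speedup(parallel_fraction, n_processors):
    # Amdahl's law: speedup = 1 / ((1 - p) + p / n).
    # The serial fraction (1 - p) limits overall speedup regardless
    # of how many processors are thrown at the problem.
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_processors)

# With 95% of the work parallelized, speedup can never exceed
# 1 / 0.05 = 20x, even on an arbitrarily large machine.
print(amdahl_speedup(0.95, 16))      # well under 20x
print(amdahl_speedup(0.95, 10**6))   # approaches, never reaches, 20x
```

This is why supercomputer designers spend hardware on accelerating the residual serial bottlenecks rather than simply adding more processors.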

Software Tools for Supercomputers

Software tools for distributed processing include standard APIs such as MPI and PVM, and open-source software solutions such as Beowulf and openMosix, which facilitate the creation of a sort of "virtual supercomputer" from a collection of ordinary workstations or servers. Technology like ZeroConf (Rendezvous/Bonjour) paves the way for the creation of ad hoc computer clusters. An example of this is the distributed rendering function in Apple's Shake compositing application: computers running the Shake software merely need to be near each other, in networking terms, to automatically discover and use each other's resources. While no one has yet built an ad hoc computer cluster that rivals even yesteryear's supercomputers, the line between desktop, or even laptop, and supercomputer is beginning to blur, and is likely to continue to blur as built-in support for parallelism and distributed processing increases in mainstream desktop operating systems. An easy programming language for supercomputers remains an open research topic in computer science.
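The "virtual supercomputer" idea - farming independent pieces of work out to whatever workers are available and gathering the results - can be sketched with Python's standard process pool. Here local processes stand in for cluster nodes, and the tile-rendering task name is purely hypothetical, chosen to echo the distributed-rendering example above.

```python
# A minimal sketch of cluster-style work distribution, using a local
# process pool as a stand-in for networked machines. The scheduler
# (the pool) hands each independent task to whichever worker is free.
from concurrent.futures import ProcessPoolExecutor

def render_tile(tile_id):
    # Hypothetical stand-in for an expensive, independent task,
    # e.g. rendering one tile of a frame.
    return tile_id * tile_id

def render_frame(n_tiles, n_workers=4):
    # map() distributes the tiles across workers and collects the
    # results back in order, like gathering rendered tiles into a frame.
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        return list(pool.map(render_tile, range(n_tiles)))

if __name__ == "__main__":
    print(render_frame(8))
```

Real cluster frameworks add node discovery, fault tolerance, and data movement on top, but the scatter-work/gather-results shape is the same.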