Wednesday, September 15, 2010

CELLPHONE HAZARDS

Dos and Don'ts

- Use landlines instead of mobiles wherever possible.

- Don't use a cellphone if the signal is weak; that's when the handset puts out more radiation.

- Talk less, text more; invest in a hands-free kit.

- Opt for a low SAR (specific absorption rate) phone and avoid sleeping with it.

- Children below eight should not be allowed to use mobile phones except in emergencies.

- It's difficult, but try not to use a cellphone for more than one to two minutes at a stretch.

- Try to live away from a cellphone tower; ideally, more than five km away.

7 Reasons Why You Should Worry

The data is mixed and experts differ. But there's growing concern that the ubiquitous wireless technology has the potential to become the next public health disaster.

Heating trouble

Cellphones generate heat and these radiofrequencies are absorbed mostly by the head and neck. Heavy use increases the risk of raised cellular, tissue and body temperature.

Pregnancy and fertility

Pregnant women are more likely to have children with problems. Cellphones can affect male fertility, too.

Genetic danger

Cellphone and cellphone tower radiation stress cells, releasing DNA-damaging free radicals and stress proteins that can cause degenerative brain diseases such as Alzheimer's, Parkinson's or multiple sclerosis.

Sleep disorders

Cellphone use may activate the brain's stress system, making people more alert, reducing the ability to wind down and cutting into sleep.

Children run a greater risk

With their developing nervous systems, growing tissues and thinner skulls, children run a greater risk of energy absorption. It can lead to memory loss, an inability to learn and behavioural issues.

Long-term effects

More than 10 years of heavy cellphone use can double the risk of brain cancer and quadruple the risk of ear tumours. Long-term use, especially on one ear, may damage tissues in the inner ear and lead to deafness.

Cancer and tumours

Some researchers hold that radiofrequency fields are tumour initiators or they increase the uptake of carcinogens in cells. The World Health Organisation Interphone study in 2008 suggested a connection between long-term use and three types of tumours: glioma, cancer of the parotid (a salivary gland near the ear) and acoustic neuroma (a tumour that occurs where the ear meets the brain).

-- 7 hours of talk time a week and 56 messages on average. That's how badly hooked metro Indians are on cellphones.

--Compiled by Damayanti Datta

Wednesday, August 4, 2010

VIRUS

Computer virus
From Wikipedia, the free encyclopedia
Not to be confused with Malware.
A computer virus is a computer program that can copy itself[1] and infect a computer. The term "virus" is also commonly but erroneously used to refer to other types of malware, including but not limited to adware and spyware programs that do not have the reproductive ability. A true virus can spread from one computer to another (in some form of executable code) when its host is taken to the target computer; for instance because a user sent it over a network or the Internet, or carried it on a removable medium such as a floppy disk, CD, DVD, or USB drive.[2]

Viruses can increase their chances of spreading to other computers by infecting files on a network file system or a file system that is accessed by another computer.[3][4]

As stated above, the term "computer virus" is sometimes used as a catch-all phrase to include all types of malware, even those that do not have the reproductive ability. Malware includes computer viruses, computer worms, Trojan horses, most rootkits, spyware, dishonest adware and other malicious and unwanted software, including true viruses. Viruses are sometimes confused with worms and Trojan horses, which are technically different. A worm can exploit security vulnerabilities to spread itself automatically to other computers through networks, while a Trojan horse is a program that appears harmless but hides malicious functions. Worms and Trojan horses, like viruses, may harm a computer system's data or performance. Some viruses and other malware have symptoms noticeable to the computer user, but many are surreptitious or simply do nothing to call attention to themselves. Some viruses do nothing beyond reproducing themselves.

Contents
1 History
1.1 Academic work
1.2 Science Fiction
1.3 Virus programs
2 Infection strategies
2.1 Nonresident viruses
2.2 Resident viruses
3 Vectors and hosts
4 Methods to avoid detection
4.1 Avoiding bait files and other undesirable hosts
4.2 Stealth
4.2.1 Self-modification
4.2.2 Encryption with a variable key
4.2.3 Polymorphic code
4.2.4 Metamorphic code
5 Vulnerability and countermeasures
5.1 The vulnerability of operating systems to viruses
5.2 The role of software development
5.3 Anti-virus software and other preventive measures
5.4 Recovery methods
5.4.1 Virus removal
5.4.2 Operating system reinstallation
6 See also
7 References
8 Further reading
9 External links

History
Academic work
The first academic work on the theory of computer viruses (although the term "computer virus" had not yet been coined) was done in 1949 by John von Neumann, who gave lectures at the University of Illinois on the "Theory and Organization of Complicated Automata". Von Neumann's work was later published as the "Theory of self-reproducing automata".[5] In his essay von Neumann postulated that a computer program could reproduce.

In 1972 Veith Risak published his article "Selbstreproduzierende Automaten mit minimaler Informationsübertragung" (Self-reproducing automata with minimal information exchange).[6] The article describes a fully functional virus written in assembler language for a SIEMENS 4004/35 computer system.

In 1980 Jürgen Kraus wrote his diplom thesis "Selbstreproduktion bei Programmen" (Self-reproduction of programs) at the University of Dortmund.[7] In his work Kraus postulated that computer programs can behave in a way similar to biological viruses.

In 1984 Fred Cohen from the University of Southern California wrote his paper "Computer Viruses - Theory and Experiments".[8] It was the first paper to explicitly call a self-reproducing program a "virus", a term introduced by his mentor Leonard Adleman.

An article that describes "useful virus functionalities" was published by J.B. Gunn under the title "Use of virus functions to provide a virtual APL interpreter under user control" in 1984.[9]

Science Fiction
The Terminal Man, a science fiction novel by Michael Crichton (1972), told (as a sideline story) of a computer with telephone modem dialing capability, which had been programmed to dial phone numbers at random until it hit a modem that was answered by another computer. It then attempted to program the answering computer with its own program, so that the second computer would also begin dialing random numbers in search of yet another computer to program. The program was assumed to spread exponentially through susceptible computers.

The actual term 'virus' was first used in David Gerrold's 1972 novel, When HARLIE Was One. In that novel, a sentient computer named HARLIE writes viral software to retrieve damaging personal information from other computers to blackmail the man who wants to turn him off.

Virus programs
The Creeper virus was first detected on ARPANET, the forerunner of the Internet, in the early 1970s.[10] Creeper was an experimental self-replicating program written by Bob Thomas at BBN Technologies in 1971.[11] Creeper used the ARPANET to infect DEC PDP-10 computers running the TENEX operating system.[12] Creeper gained access via the ARPANET and copied itself to the remote system, where the message "I'm the creeper, catch me if you can!" was displayed. The Reaper program was created to delete Creeper.[13]

A program called "Elk Cloner" was the first computer virus to appear "in the wild" — that is, outside the single computer or lab where it was created.[14] Written in 1981 by Richard Skrenta, it attached itself to the Apple DOS 3.3 operating system and spread via floppy disk.[14][15] This virus, created as a practical joke when Skrenta was still in high school, was injected into a game on a floppy disk. On its 50th use the Elk Cloner virus would activate, infecting the computer and displaying a short poem beginning "Elk Cloner: The program with a personality."

The first PC virus in the wild was a boot sector virus dubbed (c)Brain,[16] created in 1986 by the Farooq Alvi brothers in Lahore, Pakistan, reportedly to deter piracy of the software they had written.[17] However, analysts have claimed that the Ashar virus, a variant of Brain, possibly predated it based on code within the virus.[original research?]

Before computer networks became widespread, most viruses spread on removable media, particularly floppy disks. In the early days of the personal computer, many users regularly exchanged information and programs on floppies. Some viruses spread by infecting programs stored on these disks, while others installed themselves into the disk boot sector, ensuring that they would be run when the user booted the computer from the disk, usually inadvertently. PCs of the era would attempt to boot first from a floppy if one had been left in the drive. Until floppy disks fell out of use, this was the most successful infection strategy and boot sector viruses were the most common in the wild for many years.[1]

Traditional computer viruses emerged in the 1980s, driven by the spread of personal computers and the resultant increase in BBS, modem use, and software sharing. Bulletin board-driven software sharing contributed directly to the spread of Trojan horse programs, and viruses were written to infect popularly traded software. Shareware and bootleg software were equally common vectors for viruses on BBS's.[citation needed]

Macro viruses have become common since the mid-1990s. Most of these viruses are written in the scripting languages for Microsoft programs such as Word and Excel and spread throughout Microsoft Office by infecting documents and spreadsheets. Since Word and Excel were also available for Mac OS, most could also spread to Macintosh computers. Although most of these viruses did not have the ability to send infected e-mail, those that did took advantage of the Microsoft Outlook COM interface.[citation needed]

Some old versions of Microsoft Word allow macros to replicate themselves with additional blank lines. If two macro viruses simultaneously infect a document, the combination of the two, if also self-replicating, can appear as a "mating" of the two and would likely be detected as a virus unique from the "parents".[18]

A virus may also send a web address link as an instant message to all the contacts on an infected machine. If the recipient, thinking the link is from a friend (a trusted source) follows the link to the website, the virus hosted at the site may be able to infect this new computer and continue propagating.

Viruses that spread using cross-site scripting were first reported in 2002,[19] and were academically demonstrated in 2005.[20] There have been multiple instances of cross-site scripting viruses in the wild, exploiting websites such as MySpace and Yahoo.

Infection strategies
In order to replicate itself, a virus must be permitted to execute code and write to memory. For this reason, many viruses attach themselves to executable files that may be part of legitimate programs. If a user attempts to launch an infected program, the virus' code may be executed simultaneously. Viruses can be divided into two types based on their behavior when they are executed. Nonresident viruses immediately search for other hosts that can be infected, infect those targets, and finally transfer control to the application program they infected. Resident viruses do not search for hosts when they are started. Instead, a resident virus loads itself into memory on execution and transfers control to the host program. The virus stays active in the background and infects new hosts when those files are accessed by other programs or the operating system itself.

Nonresident viruses
Nonresident viruses can be thought of as consisting of a finder module and a replication module. The finder module is responsible for finding new files to infect. For each new executable file the finder module encounters, it calls the replication module to infect that file.

Resident viruses
Resident viruses contain a replication module that is similar to the one employed by nonresident viruses. This module, however, is not called by a finder module. Instead, the virus loads the replication module into memory when it is executed and ensures that this module is executed each time the operating system is called to perform a certain operation. The replication module can be called, for example, each time the operating system executes a file. In this case the virus infects every suitable program that is executed on the computer.

Resident viruses are sometimes subdivided into a category of fast infectors and a category of slow infectors. Fast infectors are designed to infect as many files as possible. A fast infector, for instance, can infect every potential host file that is accessed. This poses a special problem when using anti-virus software, since a virus scanner will access every potential host file on a computer when it performs a system-wide scan. If the virus scanner fails to notice that such a virus is present in memory the virus can "piggy-back" on the virus scanner and in this way infect all files that are scanned. Fast infectors rely on their fast infection rate to spread. The disadvantage of this method is that infecting many files may make detection more likely, because the virus may slow down a computer or perform many suspicious actions that can be noticed by anti-virus software. Slow infectors, on the other hand, are designed to infect hosts infrequently. Some slow infectors, for instance, only infect files when they are copied. Slow infectors are designed to avoid detection by limiting their actions: they are less likely to slow down a computer noticeably and will, at most, infrequently trigger anti-virus software that detects suspicious behavior by programs. The slow infector approach, however, does not seem very successful.

Vectors and hosts
Viruses have targeted various types of transmission media or hosts. This list is not exhaustive:

Binary executable files (such as COM files and EXE files in MS-DOS, Portable Executable files in Microsoft Windows, Mach-O files in Mac OS X, and ELF files in Linux)
Volume Boot Records of floppy disks and hard disk partitions
The master boot record (MBR) of a hard disk
General-purpose script files (such as batch files in MS-DOS and Microsoft Windows, VBScript files, and shell script files on Unix-like platforms).
Application-specific script files (such as Telix-scripts)
System specific autorun script files (such as Autorun.inf file needed by Windows to automatically run software stored on USB Memory Storage Devices).
Documents that can contain macros (such as Microsoft Word documents, Microsoft Excel spreadsheets, AmiPro documents, and Microsoft Access database files)
Cross-site scripting vulnerabilities in web applications (see XSS Worm)
Arbitrary computer files. An exploitable buffer overflow, format string, race condition or other exploitable bug in a program which reads the file could be used to trigger the execution of code hidden within it. Most bugs of this type can be made more difficult to exploit in computer architectures with protection features such as an execute disable bit and/or address space layout randomization.
PDFs, like HTML, may link to malicious code. PDFs can also be infected with malicious code.

In operating systems that use file extensions to determine program associations (such as Microsoft Windows), the extensions may be hidden from the user by default. This makes it possible to create a file that is of a different type than it appears to the user. For example, an executable may be created named "picture.png.exe", in which the user sees only "picture.png" and therefore assumes that this file is an image and most likely is safe.
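
As a rough illustration (not part of the original article), a short Python check can reveal the extension that actually determines the program association; the file name below is simply the hypothetical example from the paragraph above.

```python
import os

def real_extension(filename: str) -> str:
    """Return the last extension, which is what decides the program association on Windows."""
    return os.path.splitext(filename)[1].lower()

suspicious_name = "picture.png.exe"  # hypothetical file name from the example above
print(real_extension(suspicious_name))  # -> ".exe", not ".png"
```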

An additional method is to generate the virus code from parts of existing operating system files by using the CRC16/CRC32 data. The initial code can be quite small (tens of bytes) and unpack a fairly large virus. This is analogous to a biological "prion" in the way it works but is vulnerable to signature based detection. This attack has not yet been seen "in the wild".

Methods to avoid detection
In order to avoid detection by users, some viruses employ different kinds of deception. Some old viruses, especially on the MS-DOS platform, make sure that the "last modified" date of a host file stays the same when the file is infected by the virus. This approach does not fool anti-virus software, however, especially software that maintains and dates cyclic redundancy checks (CRCs) of files in order to detect changes.
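
A minimal sketch of the kind of integrity check described above, assuming a previously recorded baseline: compute a CRC32 of a file and compare it with the stored value to detect modification. The file name and baseline value here are illustrative, not taken from the article.

```python
import zlib

def file_crc32(path: str) -> int:
    """Compute the CRC32 checksum of a file, reading it in chunks."""
    crc = 0
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            crc = zlib.crc32(chunk, crc)
    return crc & 0xFFFFFFFF

# Hypothetical baseline recorded when the file was known to be clean.
baseline = {"example_program.exe": 0x1C291CA3}

path = "example_program.exe"
if file_crc32(path) != baseline.get(path):
    print(f"{path} has changed since the baseline was recorded")
```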

Some viruses can infect files without increasing their sizes or damaging the files. They accomplish this by overwriting unused areas of executable files. These are called cavity viruses. For example, the CIH virus, or Chernobyl Virus, infects Portable Executable files. Because those files have many empty gaps, the virus, which was 1 KB in length, did not add to the size of the file.

Some viruses try to avoid detection by killing the tasks associated with antivirus software before it can detect them.

As computers and operating systems grow larger and more complex, old hiding techniques need to be updated or replaced. Defending a computer against viruses may demand that a file system migrate towards detailed and explicit permission for every kind of file access.

Avoiding bait files and other undesirable hosts
A virus needs to infect hosts in order to spread further. In some cases, it might be a bad idea to infect a host program. For example, many anti-virus programs perform an integrity check of their own code. Infecting such programs will therefore increase the likelihood that the virus is detected. For this reason, some viruses are programmed not to infect programs that are known to be part of anti-virus software. Another type of host that viruses sometimes avoid are bait files. Bait files (or goat files) are files that are specially created by anti-virus software, or by anti-virus professionals themselves, to be infected by a virus. These files can be created for various reasons, all of which are related to the detection of the virus:

Anti-virus professionals can use bait files to take a sample of a virus (i.e. a copy of a program file that is infected by the virus). It is more practical to store and exchange a small, infected bait file, than to exchange a large application program that has been infected by the virus.
Anti-virus professionals can use bait files to study the behavior of a virus and evaluate detection methods. This is especially useful when the virus is polymorphic. In this case, the virus can be made to infect a large number of bait files. The infected files can be used to test whether a virus scanner detects all versions of the virus.
Some anti-virus software employs bait files that are accessed regularly. When these files are modified, the anti-virus software warns the user that a virus is probably active on the system.
Since bait files are used to detect the virus, or to make detection possible, a virus can benefit from not infecting them. Viruses typically do this by avoiding suspicious programs, such as small program files or programs that contain certain patterns of 'garbage instructions'.

A related strategy to make baiting difficult is sparse infection. Sometimes, sparse infectors do not infect a host file that would be a suitable candidate for infection in other circumstances. For example, a virus can decide on a random basis whether to infect a file or not, or a virus can only infect host files on particular days of the week.

Stealth
Some viruses try to trick antivirus software by intercepting its requests to the operating system. A virus can hide itself by intercepting the antivirus software's request to read the file and handling the request itself instead of passing it to the OS. The virus can then return an uninfected version of the file to the antivirus software, so that the file seems "clean". Modern antivirus software employs various techniques to counter stealth mechanisms of viruses. The only completely reliable method to avoid stealth is to boot from a medium that is known to be clean.

Self-modification
Most modern antivirus programs try to find virus-patterns inside ordinary programs by scanning them for so-called virus signatures. A signature is a characteristic byte-pattern that is part of a certain virus or family of viruses. If a virus scanner finds such a pattern in a file, it notifies the user that the file is infected. The user can then delete, or (in some cases) "clean" or "heal" the infected file. Some viruses employ techniques that make detection by means of signatures difficult but probably not impossible. These viruses modify their code on each infection. That is, each infected file contains a different variant of the virus.
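
A toy sketch of the signature scanning idea described above: search a file's bytes for known byte patterns. The signature names and bytes below are made up for illustration; real scanners use large, curated databases and far more sophisticated matching.

```python
SIGNATURES = {
    # Made-up byte patterns standing in for real virus signatures.
    "Example.Virus.A": bytes.fromhex("deadbeef00ff"),
    "Example.Virus.B": b"\x90\x90\xeb\xfe",
}

def scan_file(path: str) -> list:
    """Return the names of any signatures whose byte pattern appears in the file."""
    with open(path, "rb") as f:
        data = f.read()
    return [name for name, pattern in SIGNATURES.items() if pattern in data]

matches = scan_file("suspect.bin")  # hypothetical file
print(matches or "no known signatures found")
```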

Encryption with a variable key
A more advanced method is the use of simple encryption to encipher the virus. In this case, the virus consists of a small decrypting module and an encrypted copy of the virus code. If the virus is encrypted with a different key for each infected file, the only part of the virus that remains constant is the decrypting module, which would (for example) be appended to the end. In this case, a virus scanner cannot directly detect the virus using signatures, but it can still detect the decrypting module, which still makes indirect detection of the virus possible. Since these would be symmetric keys, stored on the infected host, it is in fact entirely possible to decrypt the final virus, but this is probably not required, since self-modifying code is such a rarity that its presence may be reason for virus scanners to at least flag the file as suspicious.

An old but compact method of encryption involves XORing each byte in a virus with a constant, so that the exclusive-or operation has only to be repeated for decryption. It is suspicious for code to modify itself, so the code that performs the encryption/decryption may be part of the signature in many virus definitions.
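
A brief sketch (not from the article) of why this scheme is symmetric: XORing with the same constant byte twice returns the original data, which is why the transformation is trivial for analysts to reverse and why the tiny decryption loop itself often becomes the signature.

```python
def xor_bytes(data: bytes, key: int) -> bytes:
    """XOR every byte with a single constant key byte (0-255); applying it twice is the identity."""
    return bytes(b ^ key for b in data)

original = b"example data"
encoded = xor_bytes(original, 0x5A)
assert xor_bytes(encoded, 0x5A) == original  # the same operation decodes it
```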

Polymorphic code
Polymorphic code was the first technique that posed a serious threat to virus scanners. Just like regular encrypted viruses, a polymorphic virus infects files with an encrypted copy of itself, which is decoded by a decryption module. In the case of polymorphic viruses, however, this decryption module is also modified on each infection. A well-written polymorphic virus therefore has no parts which remain identical between infections, making it very difficult to detect directly using signatures. Antivirus software can detect it by decrypting the viruses using an emulator, or by statistical pattern analysis of the encrypted virus body. To enable polymorphic code, the virus has to have a polymorphic engine (also called mutating engine or mutation engine) somewhere in its encrypted body. See Polymorphic code for technical detail on how such engines operate.[21]

Some viruses employ polymorphic code in a way that constrains the mutation rate of the virus significantly. For example, a virus can be programmed to mutate only slightly over time, or it can be programmed to refrain from mutating when it infects a file on a computer that already contains copies of the virus. The advantage of using such slow polymorphic code is that it makes it more difficult for antivirus professionals to obtain representative samples of the virus, because bait files that are infected in one run will typically contain identical or similar samples of the virus. This will make it more likely that the detection by the virus scanner will be unreliable, and that some instances of the virus may be able to avoid detection.

Metamorphic code
To avoid being detected by emulation, some viruses rewrite themselves completely each time they are to infect new executables. Viruses that utilize this technique are said to be metamorphic. To enable metamorphism, a metamorphic engine is needed. A metamorphic virus is usually very large and complex. For example, W32/Simile consisted of over 14,000 lines of assembly language code, 90% of which is part of the metamorphic engine.[22][23]

Vulnerability and countermeasures
The vulnerability of operating systems to viruses
Just as genetic diversity in a population decreases the chance of a single disease wiping out a population, the diversity of software systems on a network similarly limits the destructive potential of viruses. This became a particular concern in the 1990s, when Microsoft gained market dominance in desktop operating systems and office suites. The users of Microsoft software (especially networking software such as Microsoft Outlook and Internet Explorer) are especially vulnerable to the spread of viruses. Microsoft software is targeted by virus writers due to its desktop dominance, and is often criticized for including many errors and holes for virus writers to exploit. Integrated and non-integrated Microsoft applications (such as Microsoft Office), applications with scripting languages that have access to the file system (for example Visual Basic Script (VBS)), and applications with networking features are also particularly vulnerable.

Although Windows is by far the most popular target operating system for virus writers, viruses also exist on other platforms. Any operating system that allows third-party programs to run can theoretically run viruses. Some operating systems are less secure than others. Unix-based operating systems (and NTFS-aware applications on Windows NT based platforms) only allow their users to run executables within their own protected memory space.

An Internet-based experiment revealed that there were cases when people willingly pressed a particular button to download a virus. Security analyst Didier Stevens ran a half-year advertising campaign on Google AdWords which said "Is your PC virus-free? Get it infected here!". The result was 409 clicks.[24][25]

As of 2006[update], there are relatively few security exploits targeting Mac OS X (with a Unix-based file system and kernel).[26] The number of viruses for the older Apple operating systems, known as Mac OS Classic, varies greatly from source to source, with Apple stating that there are only four known viruses, and independent sources stating there are as many as 63 viruses. Many Mac OS Classic viruses targeted the HyperCard authoring environment. The difference in virus vulnerability between Macs and Windows is a chief selling point, one that Apple uses in their Get a Mac advertising.[27] In January 2009, Symantec announced the discovery of a trojan that targets Macs.[28] This discovery did not gain much coverage until April 2009.[28]

While Linux and Unix in general have always natively blocked normal users from making changes to the operating system environment, Windows users are generally not blocked from making such changes. This difference has continued partly due to the widespread use of administrator accounts in contemporary versions like Windows XP. In 1997, when a virus for Linux was released, known as "Bliss", leading antivirus vendors issued warnings that Unix-like systems could fall prey to viruses just like Windows.[29] The Bliss virus may be considered characteristic of viruses, as opposed to worms, on Unix systems. Bliss requires that the user run it explicitly, and it can only infect programs that the user has access to modify. Unlike Windows users, most Unix users do not log in as an administrator except to install or configure software; as a result, even if a user ran the virus, it could not harm their operating system. The Bliss virus never became widespread, and remains chiefly a research curiosity. Its creator later posted the source code to Usenet, allowing researchers to see how it worked.[30]

The role of software development
Because software is often designed with security features to prevent unauthorized use of system resources, many viruses must exploit software bugs in a system or application to spread. Software development strategies that produce large numbers of bugs will generally also produce potential exploits.

Anti-virus software and other preventive measures
Many users install anti-virus software that can detect and eliminate known viruses after the computer downloads or runs the executable. There are two common methods that an anti-virus software application uses to detect viruses. The first, and by far the most common, method of virus detection is using a list of virus signature definitions. This works by examining the content of the computer's memory (its RAM and boot sectors) and the files stored on fixed or removable drives (hard drives, floppy drives), and comparing those files against a database of known virus "signatures". The disadvantage of this detection method is that users are only protected from viruses that pre-date their last virus definition update. The second method is to use a heuristic algorithm to find viruses based on common behaviors. This method has the ability to detect novel viruses that anti-virus security firms have yet to create a signature for.
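
A minimal sketch of the first (signature-list) method, here simplified to whole-file SHA-256 hashes standing in for signatures; the hash value and file name are placeholders. A heuristic scanner, by contrast, would look at behaviour or structure rather than compare against a fixed list.

```python
import hashlib

# Hypothetical database of hashes of known-malicious files.
KNOWN_BAD_SHA256 = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def sha256_of(path: str) -> str:
    """Hash a file in chunks so large files do not need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

if sha256_of("download.exe") in KNOWN_BAD_SHA256:  # hypothetical file
    print("matches a known virus signature")
```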

Some anti-virus programs are able to scan opened files in addition to sent and received e-mails "on the fly" in a similar manner. This practice is known as "on-access scanning". Anti-virus software does not change the underlying capability of host software to transmit viruses. Users must update their software regularly to patch security holes. Anti-virus software also needs to be regularly updated in order to recognize the latest threats.

One may also minimize the damage done by viruses by making regular backups of data (and the operating system) on different media that are either kept unconnected to the system most of the time, read-only, or otherwise inaccessible (for example, because they use different file systems). This way, if data is lost to a virus, one can start again using the backup, which should preferably be recent.

If a backup session on optical media like CD and DVD is closed, it becomes read-only and can no longer be affected by a virus (so long as a virus or infected file was not copied onto the CD/DVD). Likewise, an operating system on a bootable CD can be used to start the computer if the installed operating systems become unusable. Backups on removable media must be carefully inspected before restoration. The Gammima virus, for example, propagates via removable flash drives.[31][32]

Recovery methods
Once a computer has been compromised by a virus, it is usually unsafe to continue using the same computer without completely reinstalling the operating system. However, a number of recovery options exist after a computer has been infected. The appropriate action depends on the type and severity of the virus.

Virus removal
One possibility on Windows Me, Windows XP, Windows Vista and Windows 7 is a tool known as System Restore, which restores the registry and critical system files to a previous checkpoint. Often a virus will cause a system to hang, and a subsequent hard reboot will render a system restore point from the same day corrupt. Restore points from previous days should work provided the virus is not designed to corrupt the restore files or also exists in previous restore points.[33] Some viruses, however, disable System Restore and other important tools such as Task Manager and Command Prompt. An example of a virus that does this is CiaDoor. However, many such viruses can be removed by rebooting the computer, entering Windows safe mode, and then using system tools.

Administrators have the option to disable such tools from limited users for various reasons (for example, to reduce potential damage from and the spread of viruses). A virus can modify the registry to do the same even if the Administrator is controlling the computer; it blocks all users including the administrator from accessing the tools. The message "Task Manager has been disabled by your administrator" may be displayed, even to the administrator.[citation needed]

Users running a Microsoft operating system can access Microsoft's website to run a free scan, provided they have their 20-digit registration number. Many websites run by anti-virus software companies provide free online virus scanning, with limited cleaning facilities (the purpose of the sites is to sell anti-virus products). Some websites allow a single suspicious file to be checked by many antivirus programs in one operation.

Operating system reinstallation
Reinstalling the operating system is another approach to virus removal. It involves either reformatting the computer's hard drive and installing the OS and all programs from original media, or restoring the entire partition from a clean backup image. User data can be restored by booting from a Live CD, or by putting the hard drive into another computer and booting from that computer's operating system, taking great care not to infect the second computer by executing any infected programs on the original drive. Once the system has been restored, precautions must be taken to avoid reinfection from any restored executable files.

These methods are simple to carry out, may be faster than disinfecting a computer, and are guaranteed to remove any malware. If the operating system and programs must be reinstalled from scratch, the time and effort needed to reinstall, reconfigure, and restore user preferences must be taken into account. Restoring from an image is much faster, totally safe, and restores the exact configuration to the state it was in when the image was made, with no further trouble.

See also
Adware
Antivirus software
Computer insecurity
Computer worm
Crimeware
Cryptovirology
Linux malware
List of computer virus hoaxes
List of computer viruses
List of computer viruses (all)
Malware
Mobile viruses
Multipartite virus
Spam
Spyware
Trojan horse (computing)
Virus hoax


References
1.^ a b Dr. Solomon's Virus Encyclopedia, 1995, ISBN 1897661002, Abstract at http://vx.netlux.org/lib/aas10.html
2.^ Jussi Parikka (2007) "Digital Contagions. A Media Archaeology of Computer Viruses", Peter Lang: New York. Digital Formations-series. ISBN 978-0-8204-8837-0, p. 19
3.^ http://www.bartleby.com/61/97/C0539700.html
4.^ http://www.actlab.utexas.edu/~aviva/compsec/virus/whatis.html
5.^ von Neumann, John (1966). "Theory of Self-Reproducing Automata". Essays on Cellular Automata (University of Illinois Press): 66–87. http://cba.mit.edu/events/03.11.ASE/docs/VonNeumann.pdf. Retrieved June 10, 2010.
6.^ Risak, Veith (1972), "Selbstreproduzierende Automaten mit minimaler Informationsübertragung", Zeitschrift für Maschinenbau und Elektrotechnik, http://www.cosy.sbg.ac.at/~risak/bilder/selbstrep.html
7.^ Kraus, Jürgen (February 1980), Selbstreproduktion bei Programmen, http://vx.netlux.org/lib/pdf/Selbstreproduktion%20bei%20programmen.pdf
8.^ Cohen, Fred (1984), Computer Viruses - Theory and Experiments, http://all.net/books/virus/index.html
9.^ Gunn, J.B. (June 1984). "Use of virus functions to provide a virtual APL interpreter under user control". ACM SIGAPL APL Quote Quad archive (ACM New York, NY, USA) 14 (4): 163–168. ISSN 0163-6006. http://portal.acm.org/ft_gateway.cfm?id=801093&type=pdf&coll=GUIDE&dl=GUIDE&CFID=93800866&CFTOKEN=49244432.
10.^ "Virus list". http://www.viruslist.com/en/viruses/encyclopedia?chapter=153310937. Retrieved 2008-02-07.
11.^ Thomas Chen, Jean-Marc Robert (2004). "The Evolution of Viruses and Worms". http://vx.netlux.org/lib/atc01.html. Retrieved 2009-02-16.
12.^ Jussi Parikka (2007) "Digital Contagions. A Media Archaeology of Computer Viruses", Peter Lang: New York. Digital Formations-series. ISBN 978-0-8204-8837-0, p. 50
13.^ See page 86 of Computer Security Basics by Deborah Russell and G. T. Gangemi. O'Reilly, 1991. ISBN 0937175714
14.^ a b Anick Jesdanun (1 September 2007). "School prank starts 25 years of security woes". CNBC. http://www.cnbc.com/id/20534084/. Retrieved 2010-01-07.
15.^ "The anniversary of a nuisance". http://www.cnn.com/2007/TECH/09/03/computer.virus.ap/.
16.^ Boot sector virus repair
17.^ http://www.youtube.com/watch?v=m58MqJdWgDc
18.^ Vesselin Bontchev. "Macro Virus Identification Problems". FRISK Software International. http://www.people.frisk-software.com/~bontchev/papers/macidpro.html.
19.^ Berend-Jan Wever. "XSS bug in hotmail login page". http://seclists.org/bugtraq/2002/Oct/119.
20.^ Wade Alcorn. "The Cross-site Scripting Virus". http://www.bindshell.net/papers/xssv/.
21.^ http://www.virusbtn.com/resources/glossary/polymorphic_virus.xml
22.^ Perriot, Fredrick; Peter Ferrie and Peter Szor (May 2002). "Striking Similarities" (PDF). http://securityresponse.symantec.com/avcenter/reference/simile.pdf. Retrieved September 9, 2007.
23.^ http://www.virusbtn.com/resources/glossary/metamorphic_virus.xml
24.^ Need a computer virus?- download now
25.^ http://blog.didierstevens.com/2007/05/07/is-your-pc-virus-free-get-it-infected-here/
26.^ "Malware Evolution: Mac OS X Vulnerabilities 2005-2006". Kaspersky Lab. 2006-07-24. http://www.viruslist.com/en/analysis?pubid=191968025. Retrieved August 19, 2006.
27.^ Apple - Get a Mac
28.^ a b Sutter, John D. (22 April 2009). "Experts: Malicious program targets Macs". CNN.com. http://www.cnn.com/2009/TECH/04/22/first.mac.botnet/index.html. Retrieved 24 April 2009.
29.^ McAfee. "McAfee discovers first Linux virus". news article. http://math-www.uni-paderborn.de/~axel/bliss/mcafee_press.html.
30.^ Axel Boldt. "Bliss, a Linux "virus"". news article. http://math-www.uni-paderborn.de/~axel/bliss/.
31.^ "Symantec Security Summary — W32.Gammima.AG." http://www.symantec.com/security_response/writeup.jsp?docid=2007-082706-1742-99
32.^ "Yahoo Tech: Viruses! In! Space!" http://tech.yahoo.com/blogs/null/103826
33.^ "Symantec Security Summary — W32.Gammima.AG and removal details." http://www.symantec.com/security_response/writeup.jsp?docid=2007-082706-1742

Sunday, August 1, 2010

Friday, July 16, 2010

History of computer graphics

From Wikipedia, the free encyclopedia
This article is about graphics created using computers. For the article about the scientific study of computer graphics, see Computer graphics (computer science). For other uses, see Computer graphics (disambiguation).

A Blender 2.45 screenshot.
A 2D projection of a 3D projection of a 4D pentachoron performing a double rotation about two orthogonal planes.

Computer graphics are graphics created using computers and, more generally, the representation and manipulation of image data by a computer.

The development of computer graphics has made computers easier to interact with, and better for understanding and interpreting many types of data. Developments in computer graphics have had a profound impact on many types of media and have revolutionized animation, movies and the video game industry.


The term computer graphics has been used in a broad sense to describe "almost everything on computers that is not text or sound"[1]. Typically, the term computer graphics refers to several different things:

the representation and manipulation of image data by a computer
the various technologies used to create and manipulate images
the images so produced, and
the sub-field of computer science which studies methods for digitally synthesizing and manipulating visual content, see study of computer graphics
Computer graphics is the lifeline of today's computer world. Today, computers and computer-generated images touch many aspects of daily life. Computer imagery is found on television, in newspapers (for example, in weather reports), and in all kinds of medical investigation and surgical procedures. A well-constructed graph can present complex statistics in a form that is easier to understand and interpret. In the media "such graphs are used to illustrate papers, reports, theses", and other presentation material.[2]

Many powerful tools have been developed to visualize data. Computer-generated imagery can be categorized into several different types: 2D, 3D, and animated graphics. As technology has improved, 3D computer graphics have become more common, but 2D computer graphics are still widely used. Computer graphics has emerged as a sub-field of computer science which studies methods for digitally synthesizing and manipulating visual content. Over the past decade, other specialized fields have developed, such as information visualization and scientific visualization, which is more concerned with "the visualization of three dimensional phenomena (architectural, meteorological, medical, biological, etc.), where the emphasis is on realistic renderings of volumes, surfaces, illumination sources, and so forth, perhaps with a dynamic (time) component".[3]

Computer Graphics
The advance in computer graphics was to come from one MIT student, Ivan Sutherland. In 1961 Sutherland created a computer drawing program called Sketchpad. Using a light pen, Sketchpad allowed one to draw simple shapes on the computer screen, save them and even recall them later. The light pen itself had a small photoelectric cell in its tip. This cell emitted an electronic pulse whenever it was placed in front of a computer screen and the screen's electron gun fired directly at it. By simply timing the electronic pulse with the current location of the electron gun, it was easy to pinpoint exactly where the pen was on the screen at any given moment. Once that was determined, the computer could then draw a cursor at that location.

Sutherland seemed to find the perfect solution for many of the graphics problems he faced. Even today, many standards of computer graphics interfaces got their start with this early Sketchpad program. One example of this is in drawing constraints. If one wants to draw a square, for example, one doesn't have to worry about drawing four lines perfectly to form the edges of the box. One can simply specify that one wants to draw a box, and then specify the location and size of the box. The software will then construct a perfect box, with the right dimensions and at the right location. Another example is that Sutherland's software modeled objects, not just pictures of objects. In other words, with a model of a car, one could change the size of the tires without affecting the rest of the car, or stretch the body of the car without deforming the tires.

These early computer graphics were vector graphics, composed of thin lines, whereas modern-day graphics are raster-based, using pixels. The difference between vector graphics and raster graphics can be illustrated with a shipwrecked sailor. He creates an SOS sign in the sand by arranging rocks in the shape of the letters "SOS." He also has some brightly colored rope, with which he makes a second "SOS" sign by arranging the rope in the shapes of the letters. The rock SOS sign is similar to raster graphics: every pixel has to be individually accounted for. The rope SOS sign is equivalent to vector graphics: the computer simply sets the starting point and ending point for the line and perhaps bends it a little between the two end points. The disadvantages of vector files are that they cannot represent continuous-tone images and they are limited in the number of colors available. Raster formats, on the other hand, work well for continuous-tone images and can reproduce as many colors as needed.
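
To make the rope-versus-rocks analogy concrete, here is a small Python sketch (not from the original text) that rasterizes a vector line, given only its two endpoints, into the individual pixels that draw it, using Bresenham's line algorithm.

```python
def rasterize_line(x0: int, y0: int, x1: int, y1: int):
    """Convert a vector line (two endpoints) into the raster pixels that draw it (Bresenham)."""
    pixels = []
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy
    while True:
        pixels.append((x0, y0))
        if x0 == x1 and y0 == y1:
            break
        e2 = 2 * err
        if e2 >= dy:
            err += dy
            x0 += sx
        if e2 <= dx:
            err += dx
            y0 += sy
    return pixels

print(rasterize_line(0, 0, 5, 2))  # the "rope" endpoints become individual "rocks"
```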

Also in 1961 another student at MIT, Steve Russell, created the first video game, Spacewar. Written for the DEC PDP-1, Spacewar was an instant success and copies started flowing to other PDP-1 owners and eventually even DEC got a copy. The engineers at DEC used it as a diagnostic program on every new PDP-1 before shipping it. The sales force picked up on this quickly enough and when installing new units, would run the world's first video game for their new customers.

E. E. Zajac, a scientist at Bell Telephone Laboratories (BTL), created a film called "Simulation of a two-gyro gravity attitude control system" in 1963. In this computer-generated film, Zajac showed how the attitude of a satellite could be altered as it orbits the Earth. He created the animation on an IBM 7090 mainframe computer. Also at BTL, Ken Knowlton, Frank Sindon and Michael Noll started working in the computer graphics field. Sindon created a film called Force, Mass and Motion illustrating Newton's laws of motion in operation. Around the same time, other scientists were creating computer graphics to illustrate their research. At Lawrence Radiation Laboratory, Nelson Max created the films "Flow of a Viscous Fluid" and "Propagation of Shock Waves in a Solid Form." Boeing Aircraft created a film called "Vibration of an Aircraft."

It wasn't long before major corporations started taking an interest in computer graphics. TRW, Lockheed-Georgia, General Electric and Sperry Rand are among the many companies that were getting started in computer graphics by the mid-1960s. IBM was quick to respond to this interest by releasing the IBM 2250 graphics terminal, the first commercially available graphics computer.

Ralph Baer, a supervising engineer at Sanders Associates, came up with a home video game in 1966 that was later licensed to Magnavox and called the Odyssey. While very simplistic, and requiring fairly inexpensive electronic parts, it allowed the player to move points of light around on a screen. It was the first consumer computer graphics product.

Also in 1966, Sutherland at MIT invented the first computer controlled head-mounted display (HMD). Called the Sword of Damocles because of the hardware required for support, it displayed two separate wireframe images, one for each eye. This allowed the viewer to see the computer scene in stereoscopic 3D. After receiving his Ph.D. from MIT, Sutherland became Director of Information Processing at ARPA (Advanced Research Projects Agency), and later became a professor at Harvard.

Dave Evans was director of engineering at Bendix Corporation's computer division from 1953 to 1962, after which he worked for the next five years as a visiting professor at Berkeley. There he continued his interest in computers and how they interfaced with people. In 1968 the University of Utah recruited Evans to form a computer science program, and computer graphics quickly became his primary interest. This new department would become the world's primary research center for computer graphics.

In 1967 Sutherland was recruited by Evans to join the computer science program at the University of Utah. There he perfected his HMD. Twenty years later, NASA would re-discover his techniques in their virtual reality research. At Utah, Sutherland and Evans were highly sought after consultants by large companies but they were frustrated at the lack of graphics hardware available at the time so they started formulating a plan to start their own company.

A student by the name of Ed Catmull started at the University of Utah in 1970 and signed up for Sutherland's computer graphics class. Catmull had just come from The Boeing Company and had been working on his degree in physics. Growing up on Disney, Catmull loved animation yet quickly discovered that he didn't have the talent for drawing. Now Catmull (along with many others) saw computers as the natural progression of animation and they wanted to be part of the revolution. The first animation that Catmull saw was his own. He created an animation of his hand opening and closing. It became one of his goals to produce a feature-length motion picture using computer graphics. In the same class, Fred Parke created an animation of his wife's face. Because of Evans' and Sutherland's presence, UU was gaining quite a reputation as the place to be for computer graphics research, so Catmull went there to learn 3D animation.

The UU computer graphics laboratory was attracting people from all over, and John Warnock was one of those early pioneers; he would later found Adobe Systems and create a revolution in the publishing world with his PostScript page description language. Tom Stockham led the image processing group at UU, which worked closely with the computer graphics lab. Jim Clark was also there; he would later found Silicon Graphics, Inc.

The first major advance in 3D computer graphics was created at UU by these early pioneers, the hidden-surface algorithm. In order to draw a representation of a 3D object on the screen, the computer must determine which surfaces are "behind" the object from the viewer's perspective, and thus should be "hidden" when the computer creates (or renders) the image.
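
A minimal sketch of one common hidden-surface approach, the depth (z) buffer: for every pixel, keep only the surface sample closest to the viewer. The sample values below are invented for illustration; the early Utah work actually explored several different hidden-surface algorithms.

```python
WIDTH, HEIGHT = 4, 3
depth = [[float("inf")] * WIDTH for _ in range(HEIGHT)]   # distance of the nearest surface seen so far
color = [[None] * WIDTH for _ in range(HEIGHT)]           # colour of that nearest surface

def plot(x: int, y: int, z: float, c: str) -> None:
    """Keep the sample only if it is nearer to the viewer than what is already stored at the pixel."""
    if z < depth[y][x]:
        depth[y][x] = z
        color[y][x] = c

# Two overlapping surface samples at the same pixel; the nearer one (smaller z) wins.
plot(1, 1, 5.0, "red")
plot(1, 1, 2.0, "blue")
print(color[1][1])  # -> "blue": the surface in front hides the one behind it
```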

Image types
2D computer graphics

Raster graphic sprites (left) and masks (right).

2D computer graphics are the computer-based generation of digital images—mostly from two-dimensional models, such as 2D geometric models, text, and digital images, and by techniques specific to them. The word may stand for the branch of computer science that comprises such techniques, or for the models themselves.

2D computer graphics are mainly used in applications that were originally developed upon traditional printing and drawing technologies, such as typography, cartography, technical drawing, advertising, etc. In those applications, the two-dimensional image is not just a representation of a real-world object, but an independent artifact with added semantic value; two-dimensional models are therefore preferred, because they give more direct control of the image than 3D computer graphics, whose approach is more akin to photography than to typography.

Pixel art
Pixel art is a form of digital art, created through the use of raster graphics software, where images are edited on the pixel level. Graphics in most old (or relatively limited) computer and video games, graphing calculator games, and many mobile phone games are mostly pixel art.

Vector graphics

Example showing effect of vector graphics versus raster (bitmap) graphics.

Vector graphics formats are complementary to raster graphics, which is the representation of images as an array of pixels, as it is typically used for the representation of photographic images.[4] There are instances when working with vector tools and formats is best practice, and instances when working with raster tools and formats is best practice. There are times when both formats come together. An understanding of the advantages and limitations of each technology and the relationship between them is most likely to result in efficient and effective use of tools.

3D computer graphics
3D computer graphics in contrast to 2D computer graphics are graphics that use a three-dimensional representation of geometric data that is stored in the computer for the purposes of performing calculations and rendering 2D images. Such images may be for later display or for real-time viewing.

Despite these differences, 3D computer graphics rely on many of the same algorithms as 2D computer vector graphics in the wire frame model and 2D computer raster graphics in the final rendered display. In computer graphics software, the distinction between 2D and 3D is occasionally blurred; 2D applications may use 3D techniques to achieve effects such as lighting, and primarily 3D applications may use 2D rendering techniques.

3D computer graphics are often referred to as 3D models. Apart from the rendered graphic, the model is contained within the graphical data file. However, there are differences. A 3D model is the mathematical representation of any three-dimensional object. A model is not technically a graphic until it is visually displayed. Due to 3D printing, 3D models are not confined to virtual space. A model can be displayed visually as a two-dimensional image through a process called 3D rendering, or used in non-graphical computer simulations and calculations.

Computer animation

An example of computer animation produced using motion capture.

Computer animation is the art of creating moving images via the use of computers. It is a subfield of computer graphics and animation. Increasingly it is created by means of 3D computer graphics, though 2D computer graphics are still widely used for stylistic, low bandwidth, and faster real-time rendering needs. Sometimes the target of the animation is the computer itself, but sometimes the target is another medium, such as film. It is also referred to as CGI (computer-generated imagery or computer-generated imaging), especially when used in films.

Virtual entities may contain and be controlled by assorted attributes, such as transform values (location, orientation, and scale) stored in an object's transformation matrix. Animation is the change of an attribute over time. Multiple methods of achieving animation exist; the rudimentary form is based on the creation and editing of keyframes, each storing a value at a given time, per attribute to be animated. The 2D/3D graphics software will interpolate between keyframes, creating an editable curve of a value mapped over time, resulting in animation. Other methods of animation include procedural and expression-based techniques: the former consolidates related elements of animated entities into sets of attributes, useful for creating particle effects and crowd simulations; the latter allows an evaluated result returned from a user-defined logical expression, coupled with mathematics, to automate animation in a predictable way (convenient for controlling bone behavior beyond what a hierarchy offers in skeletal system set up).
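
A small sketch of the keyframe idea just described: store a (time, value) pair per keyframe and let the software interpolate in between. Linear interpolation is used here for brevity; real packages expose editable curves.

```python
def interpolate(keyframes, t: float) -> float:
    """Linearly interpolate an animated attribute between (time, value) keyframes."""
    keyframes = sorted(keyframes)
    if t <= keyframes[0][0]:
        return keyframes[0][1]
    if t >= keyframes[-1][0]:
        return keyframes[-1][1]
    for (t0, v0), (t1, v1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            u = (t - t0) / (t1 - t0)
            return v0 + u * (v1 - v0)

# Animate an object's x position: at frame 0 it is at 0.0, at frame 24 it is at 10.0.
keys = [(0.0, 0.0), (24.0, 10.0)]
print(interpolate(keys, 12.0))  # -> 5.0, halfway between the two keyframes
```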

To create the illusion of movement, an image is displayed on the computer screen then quickly replaced by a new image that is similar to the previous image, but shifted slightly. This technique is identical to the illusion of movement in television and motion pictures.

Concepts and principles
Image
An image or picture is an artifact that resembles a physical object or person. The term includes two-dimensional objects like photographs and sometimes includes three-dimensional representations. Images are captured by optical devices, such as cameras, mirrors, lenses, telescopes and microscopes, and by natural objects and phenomena, such as the human eye or water surfaces.

A digital image is a representation of a two-dimensional image in binary format as a sequence of ones and zeros. Digital images include both vector images and raster images, but raster images are more commonly used.

Pixel

In the enlarged portion of the image, individual pixels are rendered as squares and can be easily seen.

In digital imaging, a pixel (or picture element[5]) is a single point in a raster image. Pixels are normally arranged in a regular 2-dimensional grid, and are often represented using dots or squares. Each pixel is a sample of an original image, where more samples typically provide a more accurate representation of the original. The intensity of each pixel is variable; in color systems, each pixel typically has three components, such as red, green, and blue.

Graphics
Graphics are visual presentations on some surface, such as a wall, canvas, computer screen, paper, or stone to brand, inform, illustrate, or entertain. Examples are photographs, drawings, line art, graphs, diagrams, typography, numbers, symbols, geometric designs, maps, engineering drawings, or other images. Graphics often combine text, illustration, and color. Graphic design may consist of the deliberate selection, creation, or arrangement of typography alone, as in a brochure, flier, poster, web site, or book without any other element. Clarity or effective communication may be the objective, association with other cultural elements may be sought, or merely, the creation of a distinctive style.


Arambilet: Dots on the I's, D-ART 2009 Online Digital Art Gallery, exhibited at the IV09 and CG09 computer graphics conferences at Pompeu Fabra University, Barcelona, and Tianjin University, China; permanent exhibition at London South Bank University.

Rendering
A 2D/3D scene file contains objects in a strictly defined language or data structure. It would contain geometry, viewpoint, texture, lighting, and shading information as a description of the virtual scene. The data contained in the scene file is then passed to a rendering program to be processed and output to a digital image or raster graphics image file. The rendering program is usually built into the computer graphics software, though others are available as plug-ins or entirely separate programs; examples include mental images' mental ray and Disney Pixar's RenderMan. The term "rendering" may be by analogy with an "artist's rendering" of a scene. Though the technical details of rendering methods vary, the general challenges to overcome in producing a 2D image from a 3D representation stored in a scene file are outlined collectively as the graphics pipeline along a rendering device, such as a GPU. A GPU is a purpose-built device able to assist a CPU in performing complex rendering calculations. If a scene is to look relatively realistic and predictable under virtual lighting, the rendering software should solve the rendering equation. The rendering equation doesn't account for all lighting phenomena, but is a general lighting model for computer-generated imagery. 'Rendering' is also used to describe the process of calculating effects in a video editing file to produce final video output.
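
The rendering equation mentioned above is usually written as follows (standard form, not quoted from this article): the light leaving a point x in direction ω_o is the point's own emission plus all incoming light, weighted by the surface's reflectance and the angle of incidence.

\[
L_o(x, \omega_o) = L_e(x, \omega_o) + \int_{\Omega} f_r(x, \omega_i, \omega_o)\, L_i(x, \omega_i)\, (\omega_i \cdot n)\, d\omega_i
\]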

3D projection
3D projection is a method of mapping three-dimensional points to a two-dimensional plane. As most current methods for displaying graphical data are based on planar two-dimensional media, the use of this type of projection is widespread, especially in computer graphics, engineering and drafting.
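A minimal sketch of one such mapping, a pinhole-style perspective projection (the function name and the focal-length parameter f are assumptions made for this example, not details from the article):

def project_perspective(x, y, z, f=1.0):
    """Project a 3D point (x, y, z) onto the image plane z = f.

    Assumes a pinhole camera at the origin looking down +z;
    f is the distance from the camera to the image plane.
    """
    if z <= 0:
        raise ValueError("point must be in front of the camera (z > 0)")
    # Similar triangles: projected coordinates shrink as depth grows.
    return (f * x / z, f * y / z)

# A point twice as far away projects to coordinates half as large.
print(project_perspective(1.0, 2.0, 4.0))  # (0.25, 0.5)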
Ray tracing
Ray tracing is a technique for generating an image by tracing the path of light through pixels in an image plane. The technique is capable of producing a very high degree of photorealism; usually higher than that of typical scanline rendering methods, but at a greater computational cost.
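One basic building block of a ray tracer is the intersection test between a ray and a scene object. The sketch below shows a ray-sphere intersection; the names and the tiny test scene are invented for illustration rather than taken from the article:

import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest hit, or None.

    origin, direction and center are (x, y, z) tuples; direction should be
    normalised. Solves the quadratic |o + t*d - c|^2 = r^2 for t.
    """
    ox, oy, oz = origin
    dx, dy, dz = direction
    cx, cy, cz = center
    # Vector from the sphere centre to the ray origin.
    lx, ly, lz = ox - cx, oy - cy, oz - cz
    b = 2.0 * (dx * lx + dy * ly + dz * lz)
    c = lx * lx + ly * ly + lz * lz - radius * radius
    disc = b * b - 4.0 * c          # a == 1 for a normalised direction
    if disc < 0:
        return None                 # the ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

# A ray fired down +z from the origin hits a unit sphere centred 5 units away.
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0

A full ray tracer repeats such tests for every pixel's ray against every object, then shades the nearest hit, which is where the high computational cost comes from.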
Shading

Image: Example of shading.

Shading refers to depicting depth in 3D models or illustrations by varying levels of darkness. It is a process used in drawing for depicting levels of darkness on paper by applying media more densely or with a darker shade for darker areas, and less densely or with a lighter shade for lighter areas. There are various techniques of shading, including cross-hatching, where perpendicular lines of varying closeness are drawn in a grid pattern to shade an area. The closer the lines are together, the darker the area appears; likewise, the farther apart the lines are, the lighter the area appears. The term has recently been generalized to refer to the application of shaders.
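In computer graphics the same idea is carried out numerically by a shading model. The sketch below uses simple Lambertian (diffuse) shading, one common model; the function and argument names are chosen for the example rather than taken from the article:

def lambert_shade(normal, light_dir, surface_color):
    """Diffuse (Lambertian) shading: brightness falls off with the cosine
    of the angle between the surface normal and the light direction.

    normal and light_dir are unit-length (x, y, z) tuples;
    surface_color is an (r, g, b) triple in the range 0.0-1.0.
    """
    nx, ny, nz = normal
    lx, ly, lz = light_dir
    intensity = max(0.0, nx * lx + ny * ly + nz * lz)  # clamp back-facing light to 0
    r, g, b = surface_color
    return (r * intensity, g * intensity, b * intensity)

# A surface facing the light directly is fully lit.
print(lambert_shade((0, 0, 1), (0, 0, 1), (1.0, 0.5, 0.0)))  # (1.0, 0.5, 0.0)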
Texture mapping
Texture mapping is a method for adding detail, surface texture, or colour to a computer-generated graphic or 3D model. Its application to 3D graphics was pioneered by Dr Edwin Catmull in 1974. A texture map is applied (mapped) to the surface of a shape, or polygon. This process is akin to applying patterned paper to a plain white box. Multitexturing is the use of more than one texture at a time on a polygon.[6] Procedural textures (created from adjusting parameters of an underlying algorithm that produces an output texture), and bitmap textures (created in an image editing application) are, generally speaking, common methods of implementing texture definition from a 3D animation program, while intended placement of textures onto a model's surface often requires a technique known as UV mapping.
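As a rough sketch of what a texture lookup does once UV coordinates exist (the function name, the nearest-neighbour sampling choice and the checkerboard data are assumptions for this example, not details from the article):

def sample_texture(texture, u, v):
    """Nearest-neighbour texture lookup.

    texture is a list of rows of (r, g, b) texels; (u, v) are coordinates
    in the range 0.0-1.0 produced by UV mapping, with (0, 0) at one corner.
    """
    height = len(texture)
    width = len(texture[0])
    # Map the continuous UV coordinates to discrete texel indices.
    x = min(int(u * width), width - 1)
    y = min(int(v * height), height - 1)
    return texture[y][x]

# A 2x2 checkerboard: sampling near each corner returns that corner's texel.
checker = [[(255, 255, 255), (0, 0, 0)],
           [(0, 0, 0), (255, 255, 255)]]
print(sample_texture(checker, 0.1, 0.1))  # (255, 255, 255)
print(sample_texture(checker, 0.9, 0.1))  # (0, 0, 0)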
Anti-aliasing
Rendering resolution-independent entities (such as 3D models) for viewing on a raster (pixel-based) device such as an LCD display or CRT television inevitably causes aliasing artifacts, mostly along geometric edges and the boundaries of texture details; these artifacts are informally called "jaggies". Anti-aliasing methods rectify such problems, resulting in imagery more pleasing to the viewer, but can be somewhat computationally expensive. Various anti-aliasing algorithms (such as supersampling) can be employed and then tuned for the best trade-off between rendering performance and the quality of the resultant imagery; a graphics artist should consider this trade-off if anti-aliasing methods are to be used. A pre-anti-aliased bitmap texture being displayed on a screen (or screen location) at a resolution different from the resolution of the texture itself (such as a textured model in the distance from the virtual camera) will exhibit aliasing artifacts, while any procedurally-defined texture will always show aliasing artifacts because it is resolution-independent; techniques such as mipmapping and texture filtering help to solve texture-related aliasing problems.
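A minimal sketch of the supersampling idea mentioned above (the function names, the n x n sample grid and the hard-edge test scene are invented for illustration):

def supersample_pixel(shade, px, py, n=4):
    """Anti-alias one pixel by supersampling.

    shade(x, y) returns an (r, g, b) colour for a point in continuous image
    coordinates; the pixel covering [px, px+1) x [py, py+1) is sampled on an
    n x n grid and the results are averaged.
    """
    total = [0.0, 0.0, 0.0]
    for i in range(n):
        for j in range(n):
            # Sample at the centre of each sub-pixel cell.
            x = px + (i + 0.5) / n
            y = py + (j + 0.5) / n
            c = shade(x, y)
            for k in range(3):
                total[k] += c[k]
    return tuple(v / (n * n) for v in total)

# A hard black/white edge at x = 0.5 averages to grey across the pixel,
# which is exactly the softened "jaggy" edge anti-aliasing aims for.
edge = lambda x, y: (1.0, 1.0, 1.0) if x < 0.5 else (0.0, 0.0, 0.0)
print(supersample_pixel(edge, 0, 0))  # (0.5, 0.5, 0.5)

The cost is visible directly in the code: an n x n grid means n squared shading evaluations per pixel, which is why supersampling is considered relatively expensive.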
Volume rendering

Image: Volume-rendered CT scan of a forearm, with different colour schemes for muscle, fat, bone, and blood.

Volume rendering is a technique used to display a 2D projection of a 3D discretely sampled data set. A typical 3D data set is a group of 2D slice images acquired by a CT or MRI scanner.

Usually these slices are acquired in a regular pattern (e.g., one slice every millimetre) and usually have a regular number of image pixels in a regular pattern. This is an example of a regular volumetric grid, in which each volume element, or voxel, is represented by a single value obtained by sampling the immediate area surrounding the voxel.
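One very simple way to project such a voxel grid to 2D is a maximum intensity projection, sketched below; the function name and the tiny two-slice data set are made up for the example, and real volume renderers typically use more sophisticated compositing:

def max_intensity_projection(volume):
    """Project a 3D voxel grid to a 2D image by keeping, for each (row, col),
    the maximum value along the depth (slice) axis.

    volume is indexed as volume[slice][row][col], e.g. one entry per CT slice.
    """
    depth = len(volume)
    rows = len(volume[0])
    cols = len(volume[0][0])
    return [[max(volume[d][r][c] for d in range(depth))
             for c in range(cols)]
            for r in range(rows)]

# Two 2x2 slices: the projection keeps the brighter voxel at each position.
slices = [
    [[10, 0], [0, 5]],
    [[3, 7], [0, 9]],
]
print(max_intensity_projection(slices))  # [[10, 7], [0, 9]]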

3D modeling
3D modeling is the process of developing a mathematical, wireframe representation of any three-dimensional object, called a "3D model", via specialized software. Models may be created automatically or manually; the manual modeling process of preparing geometric data for 3D computer graphics is similar to plastic arts such as sculpting. 3D models may be created using multiple approaches: use of NURBS curves to generate accurate and smooth surface patches, polygonal mesh modeling (manipulation of faceted geometry), or polygonal mesh subdivision (advanced tessellation of polygons, resulting in smooth surfaces similar to NURBS models). A 3D model can be displayed as a two-dimensional image through a process called 3D rendering, used in a computer simulation of physical phenomena, or animated directly for other purposes. The model can also be physically created using 3D printing devices.
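As a minimal sketch of the polygonal-mesh representation mentioned above (the unit tetrahedron and the variable names are an invented example, not a model from the article), a mesh is essentially a list of vertices plus faces that index into it:

# A 3D model stored as vertices (x, y, z) and triangular faces (vertex indices).
vertices = [
    (0.0, 0.0, 0.0),
    (1.0, 0.0, 0.0),
    (0.0, 1.0, 0.0),
    (0.0, 0.0, 1.0),
]

faces = [
    (0, 1, 2),
    (0, 1, 3),
    (0, 2, 3),
    (1, 2, 3),
]

# Derive the wireframe edge list from the faces (each edge stored once).
edges = set()
for a, b, c in faces:
    for u, v in ((a, b), (b, c), (c, a)):
        edges.add((min(u, v), max(u, v)))

print(len(vertices), len(edges), len(faces))  # 4 6 4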

Pioneers in graphic design
Charles Csuri
Charles Csuri is a pioneer in computer animation and digital fine art and created the first computer art in 1964. Csuri was recognized by the Smithsonian as the father of digital art and computer animation, and as a pioneer of computer animation by the Museum of Modern Art (MoMA) and ACM SIGGRAPH.
Donald P. Greenberg
Donald P. Greenberg is a leading innovator in computer graphics. Greenberg has authored hundreds of articles and served as a teacher and mentor to many prominent computer graphic artists, animators, and researchers such as Robert L. Cook, Marc Levoy, and Wayne Lytle. Many of his former students have won Academy Awards for technical achievements and several have won the SIGGRAPH Achievement Award. Greenberg was the founding director of the NSF Center for Computer Graphics and Scientific Visualization.
A. Michael Noll
Noll was one of the first researchers to use a digital computer to create artistic patterns and to formalize the use of random processes in the creation of visual arts. He began creating digital computer art in 1962, making him one of the earliest digital computer artists. In 1965, Noll, along with Frieder Nake and Georg Nees, was among the first to publicly exhibit computer art. During April 1965, the Howard Wise Gallery exhibited Noll's computer art along with random-dot patterns by Bela Julesz.

Image: A modern render of the Utah teapot, an iconic model in 3D computer graphics created by Martin Newell, 1975.

Other pioneers
Jim Blinn
Arambilet
Benoît B. Mandelbrot
Henri Gouraud
Bui Tuong Phong
Pierre Bézier
Paul de Casteljau
Daniel J. Sandin
Alvy Ray Smith
Ton Roosendaal
Ivan Sutherland
Steve Russell
The study of computer graphics
The study of computer graphics is a sub-field of computer science which studies methods for digitally synthesizing and manipulating visual content. Although the term often refers to three-dimensional computer graphics, it also encompasses two-dimensional graphics and image processing.

As an academic discipline, computer graphics studies the manipulation of visual and geometric information using computational techniques. It focuses on the mathematical and computational foundations of image generation and processing rather than purely aesthetic issues. Computer graphics is often differentiated from the field of visualization, although the two fields have many similarities.

Applications
Computational biology
Computational physics
Computer-aided design
Computer simulation
Digital art
Education
Graphic design
Infographics
Information visualization
Rational drug design
Scientific visualization
Video Games
Virtual reality
Web design

Thursday, July 15, 2010

computer graphics

Industry survey reveals the determination of the games sector to avoid the mistakes of the music industry
Can the games industry avoid the pitfalls of the music industry? Many in the sector certainly hope so.

A survey of selected delegates and speakers at the forthcoming London Games Conference has revealed a determination within the games industry not to repeat the mistakes made by the music industry with regard to digital distribution.

Stuart Dinsey, managing director of Intent Media, an organizer of the conference, commented: “Our survey has shown that the games sector is determined to learn from the mistakes made by the music industry by embracing digital distribution earlier and more efficiently. Whilst the claims of parity in three years with retail sales may be wide of the mark, it’s very clear that digital distribution is gaining a momentum that would have been unthinkable even a year ago.”

The London Games Conference, due to be held at BAFTA on Tuesday 27th October, is set to explore the various aspects of digital distribution as well as examine successful businesses that have closed the gap between creators and customers. Delegates will also hear two research-based presentations examining the growth of the digital sector across all genres.

“Digital distribution is the defining issue within the games industry over the next three years. The long-term success of every player within games will be judged on how they develop their digital strategies. The London Games Conference is set to be just the beginning of an industry wide debate that will continue long after the 27th,” Dinsey continued.

Companies participating at the conference include Microsoft, Sony, Playfish, Jagex, Direct2Drive, Sega and Chart Track.


