Sunday, April 18, 2010

Free Download Firefox 3.6

The Web is all about innovation, and Firefox sets the pace with dozens of new features to deliver a faster, more secure and customizable Web browsing experience for all.

User Experience. The enhancements to Firefox provide the best possible browsing experience on the Web. The new Firefox smart location bar, affectionately known as the "Awesome Bar," learns as people use it, adapting to user preferences and offering better fitting matches over time.

Performance. Firefox is built on top of the powerful new Gecko platform, resulting in a safer, easier to use and more personal product.

Security. Firefox raises the bar for security. The new malware and phishing protection helps protect against viruses, worms, Trojans and spyware to keep people safe on the Web.

Click Here for Download

Customization. Everyone uses the Web differently, and Firefox lets users customize their browser with more than 5,000 add-ons.

Saturday, April 17, 2010

Google Chrome 5.0.375.9 Beta

Google Chrome is a browser that combines a minimal design with sophisticated technology to make the web faster, safer, and easier.

One box for everything

Type in the address bar and get suggestions for both search and web pages.

Thumbnails of your top sites

Access your favorite pages instantly with lightning speed from any new tab.

Incognito mode

Don't want pages you visit to show up in your web history? Choose incognito mode for private browsing.

Safe browsing

Google Chrome warns you if you're about to visit a suspected phishing, malware or otherwise unsafe website.

For information about alpha and developer builds, check out the Chrome dev channel here.

Click Here for Download

Friday, April 16, 2010

Free Download GMail Drive

GMail Drive creates a virtual filesystem around your Google Mail account, allowing you to use Gmail as a storage medium.

GMail Drive creates a virtual filesystem on top of your Google Gmail account and enables you to save and retrieve files stored on your Gmail account directly from inside Windows Explorer. GMail Drive literally adds a new drive to your computer under the My Computer folder, where you can create new folders and copy or drag-and-drop files into it.

Ever since Google started to offer users a Gmail e-mail account, which includes storage space of 6000 megabytes, you have had plenty of storage space but not a lot to fill it up with. With GMail Drive you can easily copy files to your Google Mail Account and retrieve them again.

When you create a new file using GMail Drive, it generates an e-mail and posts it to your account. The e-mail appears in your normal Inbox folder, and the file is attached as an e-mail attachment. GMail Drive periodically checks your mail account (using the Gmail search function) to see if new files have arrived and to rebuild the directory structures. But basically, GMail Drive acts like any other hard drive installed on your computer.
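The storage scheme described above (one e-mail per file, with the file carried as an attachment) can be sketched with Python's standard e-mail library. This is only an illustrative sketch, not GMail Drive's actual code; the subject-line convention and the addresses are hypothetical.

```python
import tempfile
from email.message import EmailMessage
from pathlib import Path

def file_to_message(path: Path, mailbox_addr: str) -> EmailMessage:
    """Wrap a local file in an e-mail addressed to the storage account."""
    msg = EmailMessage()
    msg["To"] = mailbox_addr
    msg["From"] = mailbox_addr
    # A structured subject line lets the client later rebuild the
    # directory tree by searching the mailbox, much as GMail Drive
    # uses the Gmail search function.
    msg["Subject"] = f"GMAILFS: /{path.name}"
    msg.add_attachment(path.read_bytes(), maintype="application",
                       subtype="octet-stream", filename=path.name)
    return msg

# Demo with a throwaway file standing in for a document to "store".
demo = Path(tempfile.mkdtemp()) / "notes.txt"
demo.write_bytes(b"hello")
msg = file_to_message(demo, "me@example.com")
print(msg["Subject"])  # GMAILFS: /notes.txt
```

Retrieval would run the same logic in reverse: search the mailbox for the subject prefix, then decode each attachment back to a file on disk.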

Click Here for Download

Thursday, April 15, 2010

Free Download Ad-Aware 2010 8.2

Ad-Aware gives you comprehensive malware protection. With real-time monitoring, threat alerts, and automatic updates you can rest easy knowing that you are protected.

  • Shop, bank, and make travel arrangements online - We keep you safe from password stealers, keyloggers, spyware, trojans, online fraudsters, identity thieves and other potential cyber criminals.
  • Control your privacy - Erase tracks left behind while surfing the Web - on browsers such as Internet Explorer, Opera, and Firefox - in one easy click.
  • Get Peace of Mind - Know that your personal information is kept safe from dangerous intruders and prying eyes.
Click Here for Download

Friday, April 9, 2010

Free Download CloneCD 5.3.1.4

CloneCD is the perfect tool to make backup copies of your music and data CDs, regardless of copy protection. CloneCD's award-winning user interface allows you to copy almost any CD in just a few mouse clicks.
Since the release of 5.0, CloneCD is not only able to copy CDs but also most DVD formats, such as DVD-R, DVD-RW, DVD+R, DVD+RW, DVD+R Dual Layer and DVD-RAM. The movies are copied 1:1 and therefore not modified (compressed). Note that to copy movie DVDs you also require AnyDVD.
CloneCD also works with other formats such as ISO and UDF files and copies CDs/DVDs with the new SafeDisc 3 Copy Protection System. CloneCD allows you to create perfect 1:1 copies of your valuable original compact discs. Should your copy-protected music CD not play in your car audio, the backup created by CloneCD will.
SlySoft combines knowledge and innovation with many years of experience and direct communication with customers to provide constant improvements, making CloneCD the highest-quality copying application around.

Thursday, April 8, 2010

Free Download Nero Burning Rom 9.4.26.0

Nero is the next generation of the world's most trusted integrated digital media and home entertainment software suite. It features new cutting-edge functionality that makes enjoying digital media content simple.
This easy-to-use yet powerful multimedia suite gives you the freedom to create, rip, copy, burn, edit, share, and upload online. Whatever you want – music, video, photo, and data – enjoy and share with family and friends anytime, anywhere.
With the easy-to-use Nero StartSmart command center, your digital life has never been more flexible, feasible, and fun.
Nero 9 Highlights:
Fast and easy rip, burn, Autobackup, and copy functions
Backup files to CDs, DVDs, and Blu-ray Discs
Copy, burn, share, upload, and create music mixes like a DJ
Quick photo and video upload to My Nero and other online communities
Watch, record, pause, and customize your live TV experience

Free Download Anti Vir Personal 9.0.0.418

Avira AntiVir Personal - FREE Antivirus is a reliable free antivirus solution that constantly and rapidly scans your computer for malicious programs such as viruses, Trojans, backdoor programs, hoaxes, worms, dialers, etc. It monitors every action executed by the user or the operating system and reacts promptly when a malicious program is detected.
Detects and removes more than 150,000 viruses
Always among the winners of comparison test featured in computer journals
The resident Virus Guard serves to monitor file movements automatically, e.g. downloading of data from the internet
Scanning and repair of macro viruses
Protection against previously unknown macro viruses
Protection against Trojans, worms, backdoors, jokes and other harmful programs
AntiVir protection against viruses, worms and Trojans
AntiDialer protection against expensive dialers
AntiRootkit protection against hidden rootkits
AntiPhishing protection against phishing
AntiSpyware protection against spyware and adware
NetbookSupport for laptops with low resolution
QuickRemoval eliminates viruses at the push of a button
Easy operation
Internet-Update Wizard for easy updating
Protection against previously unknown boot record viruses and master boot record viruses

Free Download Flash Player 10.1.51.66 Beta 2 (IE)

Adobe Flash Player is the high performance, lightweight, highly expressive client runtime that delivers powerful and consistent user experiences across major operating systems, browsers, mobile phones and devices.
Installed on over 750 million Internet-connected desktops and mobile devices, Flash Player enables organizations and individuals to build and deliver great digital experiences to their end users.
Immersive experiences with Flash video, content and applications with full-screen mode.
Low-bandwidth, high-quality video with advanced compression technology.
High-fidelity text using the advanced text rendering engine.
Real-time dynamic effects with filters for Blur, DropShadow, Glow, Bevel, Gradient Glow, Gradient Bevel, Displacement Map, Convolution, and Color Matrix.
Innovative media compositions with 8-bit video alpha channels.
Blend modes, radial gradient, and stroke enhancements.
Additional image formats: GIF, Progressive JPEG, and PNG.

Click Here for Download

Monday, April 5, 2010

Free Download Windows Media Player 11

Windows Media Player 11 for Windows XP offers great new ways to store and enjoy all your music, video, pictures, and recorded TV. Play it, view it, and sync it to a portable device for enjoying on the go or even share with devices around your home—all from one place.

  • Simplicity In Design - Bring a whole new look to your digital entertainment.
  • More of the Music You Love - Breathe new life into your digital music experience.
  • All Your Entertainment in One Place - Store and enjoy all of your music, video, pictures, and recorded TV.
  • Enjoy Everywhere - Stay connected with your music, video, and pictures no matter where you are.

Sunday, April 4, 2010

Film

Film encompasses individual motion pictures, the field of film as an art form, and the motion picture industry. Films (also referred to as movies or motion pictures) are produced by recording images from the world with cameras, or by creating images using animation techniques or visual effects.

Films are cultural artifacts created by specific cultures, which reflect those cultures, and, in turn, affect them. Film is considered to be an important art form, a source of popular entertainment and a powerful method for educating — or indoctrinating — citizens. The visual elements of cinema give motion pictures a universal power of communication. Some films have become popular worldwide attractions by using dubbing or subtitles that translate the dialogue.

Films are made up of a series of individual images called frames. When these images are shown rapidly in succession, a viewer has the illusion that motion is occurring. The viewer cannot see the flickering between frames due to an effect known as persistence of vision, whereby the eye retains a visual image for a fraction of a second after the source has been removed. Viewers perceive motion due to a psychological effect called beta movement.

The origin of the name "film" comes from the fact that photographic film (also called film stock) has historically been the primary medium for recording and displaying motion pictures. Many other terms exist for an individual motion picture, including picture, picture show, moving picture, photo-play and flick. A common name for film in the United States is movie, while in Europe the term cinema is preferred. Additional terms for the field in general include the big screen, the silver screen, the cinema and the movies.

History

A clip from the Charlie Chaplin silent film, The Bond (1918)

Preceding film by thousands of years, plays and dances had elements common to film: scripts, sets, costumes, production, direction, actors, audiences, storyboards, and scores. Much of the terminology later used in film theory and criticism already applied, such as mise en scène (roughly, the entire visual picture at any one time). However, moving visual and aural images were not recorded for replay as they are in film.

The camera obscura was pioneered by Alhazen in his Book of Optics (1021),[1][2][3] and later near the year 1600, it was perfected by Giambattista della Porta. Light is inverted through a small hole or lens from outside, and projected onto a surface or screen, creating a moving image, but it is not preserved in a recording.

In the 1860s, mechanisms for producing two-dimensional drawings in motion were demonstrated with devices such as the zoetrope, mutoscope and praxinoscope. These machines were outgrowths of simple optical devices (such as magic lanterns) and would display sequences of still pictures at sufficient speed for the images on the pictures to appear to be moving, a phenomenon called persistence of vision. Naturally the images needed to be carefully designed to achieve the desired effect, and the underlying principle became the basis for the development of film animation.

With the development of celluloid film for still photography, it became possible to directly capture objects in motion in real time. An 1878 experiment by Eadweard Muybridge in the United States using 24 cameras produced a series of stereoscopic images of a galloping horse, arguably the first "motion picture," though it was not called by this name. This technology required a person to look into a viewing machine to see the pictures, which were separate paper prints attached to a drum turned by a handcrank. The pictures were shown at a variable speed of about 5 to 10 pictures per second, depending on how rapidly the crank was turned. Commercial versions of these machines were coin-operated.

A frame from Roundhay Garden Scene, the world's earliest film produced using a motion picture camera, by Louis Le Prince, 1888

By the 1880s the development of the motion picture camera allowed the individual component images to be captured and stored on a single reel, and led quickly to the development of a motion picture projector to shine light through the processed and printed film and magnify these "moving picture shows" onto a screen for an entire audience. These reels, so exhibited, came to be known as "motion pictures". Early motion pictures were static shots that showed an event or action with no editing or other cinematic techniques.

Ignoring Dickson's early sound experiments (1894), commercial motion pictures were purely visual art through the late 19th century, but these innovative silent films had gained a hold on the public imagination. Around the turn of the twentieth century, films began developing a narrative structure by stringing scenes together to tell stories. The scenes were later broken up into multiple shots of varying sizes and angles. Other techniques, such as camera movement, were realized as effective ways to portray a story on film. Rather than leave the audience in silence, theater owners would hire a pianist, an organist or even a full orchestra to play music fitting the mood of the film at any given moment. By the early 1920s, most films came with a prepared list of sheet music for this purpose, with complete film scores being composed for major productions.

A shot from Georges Méliès Le Voyage dans la Lune (A Trip to the Moon) (1902), an early narrative film.

The rise of European cinema was interrupted by the outbreak of World War I, while the film industry in the United States flourished with the rise of Hollywood, typified most prominently by the great innovative work of D.W. Griffith in The Birth of a Nation (1915) and Intolerance (1916). However, in the 1920s, European filmmakers such as Sergei Eisenstein, F. W. Murnau, and Fritz Lang, in many ways inspired by the meteoric wartime progress of film through Griffith, along with the contributions of Charles Chaplin, Buster Keaton and others, quickly caught up with American film-making and continued to further advance the medium. In the 1920s, new technology allowed filmmakers to attach to each film a soundtrack of speech, music and sound effects synchronized with the action on the screen. These sound films were initially distinguished by calling them "talking pictures", or talkies.

The next major step in the development of cinema was the introduction of so-called "natural" color. While the addition of sound quickly eclipsed silent film and theater musicians, color was adopted more gradually as methods evolved making it more practical and cost effective to produce "natural color" films. The public was relatively indifferent to color photography as opposed to black-and-white,[citation needed] but as color processes improved and became as affordable as black-and-white film, more and more movies were filmed in color after the end of World War II, as the industry in America came to view color as essential to attracting audiences in its competition with television, which remained a black-and-white medium until the mid-1960s. By the end of the 1960s, color had become the norm for film makers.

Since the decline of the studio system in the 1960s, the succeeding decades saw changes in the production and style of film. Various New Wave movements (including the French New Wave, Indian New Wave, Japanese New Wave and New Hollywood) and the rise of film school educated independent filmmakers were all part of the changes the medium experienced in the latter half of the 20th century. Digital technology has been the driving force in change throughout the 1990s and into the 21st century.

Theory

Film theory seeks to develop concise and systematic concepts that apply to the study of film as art. It was started by Ricciotto Canudo's The Birth of the Sixth Art. Formalist film theory, led by Rudolf Arnheim, Béla Balázs, and Siegfried Kracauer, emphasized how film differed from reality, and thus could be considered a valid fine art. André Bazin reacted against this theory by arguing that film's artistic essence lay in its ability to mechanically reproduce reality not in its differences from reality, and this gave rise to realist theory. More recent analysis spurred by Jacques Lacan's psychoanalysis and Ferdinand de Saussure's semiotics among other things has given rise to psychoanalytical film theory, structuralist film theory, feminist film theory and others. On the other hand, critics from the analytical philosophy tradition, influenced by Wittgenstein, try to clarify misconceptions used in theoretical studies and produce analysis of a film's vocabulary and its link to a form of life.

Language

Film is considered to have its own language. James Monaco wrote a classic text on film theory titled "How to Read a Film". Director Ingmar Bergman famously said, "[Andrei] Tarkovsky for me is the greatest [director], the one who invented a new language, true to the nature of film, as it captures life as a reflection, life as a dream." One example of this language is a sequence of back-and-forth images of one actor's left profile speaking, followed by another actor's right profile speaking, then a repetition of this, a pattern the audience understands to indicate a conversation. Another example is zooming in on the forehead of an actor with an expression of silent reflection, then cutting to a scene of a younger actor who vaguely resembles the first, indicating that the first actor is having a memory of their own past.

Montage

Parallels to musical counterpoint have been developed into a theory of montage, extended from the complex superimposition of images in early silent film[citation needed] to even more complex incorporation of musical counterpoint together with visual counterpoint through mise en scene and editing, as in a ballet or opera; e.g., as illustrated in the gang fight scene of director Francis Ford Coppola’s film, Rumble Fish.

Criticism

Film criticism is the analysis and evaluation of films. In general, these works can be divided into two categories: academic criticism by film scholars and journalistic film criticism that appears regularly in newspapers and other media.

Film critics working for newspapers, magazines, and broadcast media mainly review new releases. Normally they only see any given film once and have only a day or two to formulate opinions. Despite this, critics have an important impact on films, especially those of certain genres. Mass marketed action, horror, and comedy films tend not to be greatly affected by a critic's overall judgment of a film. The plot summary and description of a film that makes up the majority of any film review can still have an important impact on whether people decide to see a film. For prestige films such as most dramas, the influence of reviews is extremely important. Poor reviews will often doom a film to obscurity and financial loss.

The impact of a reviewer on a given film's box office performance is a matter of debate. Some claim that movie marketing is now so intense and well financed that reviewers cannot make an impact against it. However, the cataclysmic failure of some heavily promoted movies which were harshly reviewed, as well as the unexpected success of critically praised independent movies, indicates that extreme critical reactions can have considerable influence. Others note that positive film reviews have been shown to spark interest in little-known films. Conversely, there have been several films in which film companies had so little confidence that they refused to give reviewers an advance viewing, hoping to avoid widespread panning of the film. However, this usually backfires, as reviewers are wise to the tactic and warn the public that the film may not be worth seeing, and the films often do poorly as a result.

It is argued that journalistic film critics should be known only as film reviewers, while true film critics are those who take a more academic approach to films. This line of work is more often known as film theory or film studies. These film critics attempt to understand how film and filming techniques work, and what effect they have on people. Rather than having their works published in newspapers or appear on television, their articles are published in scholarly journals, or sometimes in up-market magazines. They also tend to be affiliated with colleges or universities.

Industry

The making and showing of motion pictures became a source of profit almost as soon as the process was invented. Upon seeing how successful their new invention, and its product, was in their native France, the Lumières quickly set about touring the Continent to exhibit the first films privately to royalty and publicly to the masses. In each country, they would normally add new, local scenes to their catalogue and, quickly enough, found local entrepreneurs in the various countries of Europe to buy their equipment and photograph, export, import and screen additional product commercially. The Oberammergau Passion Play of 1898[citation needed] was the first commercial motion picture ever produced. Other pictures soon followed, and motion pictures became a separate industry that overshadowed the vaudeville world. Dedicated theaters and companies formed specifically to produce and distribute films, while motion picture actors became major celebrities and commanded huge fees for their performances. Already by 1917, Charlie Chaplin had a contract that called for an annual salary of one million dollars.

From 1931 until the introduction of videotape recorders in 1956, film was also the only image storage and playback system for television programming.

In the United States today, much of the film industry is centered around Hollywood. Other regional centers exist in many parts of the world, such as Mumbai-centered Bollywood, the Indian film industry's Hindi cinema which produces the largest number of films in the world.[4] Whether the ten thousand-plus feature length films a year produced by the Valley pornographic film industry should qualify for this title is the source of some debate.[citation needed] Though the expense involved in making movies has led cinema production to concentrate under the auspices of movie studios, recent advances in affordable film making equipment have allowed independent film productions to flourish.

Profit is a key force in the industry, due to the costly and risky nature of filmmaking; many films have large cost overruns, a notorious example being Kevin Costner's Waterworld. Yet many filmmakers strive to create works of lasting social significance. The Academy Awards (also known as "the Oscars") are the most prominent film awards in the United States, providing recognition each year to films, ostensibly based on their artistic merits.

There is also a large industry for educational and instructional films made in lieu of or in addition to lectures and texts.

Associated fields

Derivative academic fields of study may both interact with and develop independently of filmmaking, as in film theory and analysis. Fields of academic study have been created that are derivative of or dependent on the existence of film, such as film criticism, film history, divisions of film propaganda in authoritarian governments, or psychological studies of subliminal effects (e.g., of a flashing soda can during a screening). These fields may further create derivative fields, such as a movie review section in a newspaper or a television guide. Sub-industries can spin off from film, such as popcorn makers and toys. Sub-industries of pre-existing industries may deal specifically with film, such as product placement in advertising.

Terminology used

Most people use "film" and "movie" interchangeably[citation needed]. "Film" is more often used when considering artistic, theoretical, or technical aspects, as in studies in a university class. "Movies" more often refers to entertainment or commercial aspects, as in where to go for fun on a date. For example, a book titled "How to Read a Film" would be about the aesthetics or theory of film, while "Let's Go to the Movies" would be about the history of entertaining movies. "Motion pictures" or "moving pictures" are films and movies. A "DVD", "videotape", "video" or "vid" is a digital reproduction of an analogue film, or a product with all of the elements of an analogue film but made in an electromagnetic storage medium. "Film" refers to the medium onto which a visual art is shot, and to this end it may seem improper for a digitally originated work to be referred to as a "film" and the act of shooting as "filming," yet these terms are still used. "Silent films" need not be silent, but are films and movies without audible dialogue, though they may have a musical soundtrack. "Talkies" refers to early movies or films having audible dialogue or analogue sound, not just a musical accompaniment. "Cinema" either broadly encompasses both films and movies, or is roughly synonymous with "film", both capitalized when referring to a category of art. The "silver screen" refers to classic black-and-white films made before color, not to contemporary films without color.

The expression "Sight and Sound", as in the film journal of the same name, means "film". The following icons also mean film: a "candle and bell", as in the films of Tarkovsky; an image of a segment of film stock; a two-faced Janus image; and an image of a movie camera in profile.

"Widescreen" and "Cinemascope" refer to a larger width-to-height ratio in the frame, compared to earlier historic aspect ratios. A "feature-length film", or "feature film", is of conventional full length, usually 60 minutes or more, and can commercially stand by itself without other films in a ticketed screening. A "short" is a film that is not as long as a feature-length film and is usually screened with other shorts or preceding a feature-length film. An "independent" is a film made outside of the conventional film industry.

A "screening" or "projection" is the projection of a film or video on a screen at a public or private theater, usually of a film, but also of a video or DVD when it is of sufficient projection quality. A "double feature" is a screening of two independent, stand-alone feature films. A "viewing" is a watching of a film. A "showing" is a screening or viewing on an electronic monitor. "Sales" refers to tickets sold at a theater or, more currently, rights sold for individual showings. A "release" is the distribution, and often the simultaneous screening, of a film. A "preview" is a screening in advance of the main release.

"Hollywood" may be used either as a pejorative adjective, shorthand for asserting an overly commercial rather than artistic intent or outcome, as in "too Hollywood", or as a descriptive adjective to refer to a film originating with people who ordinarily work near Los Angeles.

Expressions for Genres of film are sometimes used interchangeably for "film" in a specific context, such as a "porn" for a film with explicit sexual content, or "cheese" for films that are light, entertaining and not highbrow.

Any film may also have a "sequel", which portrays events following those in the film. A sequel in story terms may even be released first, e.g. Star Wars Episode IV, which appeared decades before the episodes that precede it in the story.

Preview

A preview performance refers to a showing of a movie to a select audience, usually for the purposes of corporate promotion, before the public film premiere itself. Previews are sometimes used to judge audience reaction, which, if unexpectedly negative, may result in recutting or even refilming certain sections.

Trailer

Trailers or previews are film advertisements for films that will be exhibited in the future at a cinema, on whose screen they are shown. The term "trailer" comes from their having originally been shown at the end of a film programme. That practice did not last long, because patrons tended to leave the theater after the films ended, but the name has stuck. Trailers are now shown before the film (or the A movie in a double feature program) begins.

Film, or other art form?

Film may be combined with performance art and still be considered or referred to as a "film", for example, when there is a live musical accompaniment to a silent film. Another example is audience-participation films, as at a midnight-movie screening of The Rocky Horror Picture Show, where the audience dresses up in costumes from the film and loudly performs a karaoke-like reenactment along with it. Performance art that merely incorporates film as a component is usually not called a film, but a film that could stand alone yet is accompanied by a performance may still be referred to as a film.

The act of making a film can, in and of itself, be considered a work of art, on a different level from the film itself, as in the films of Werner Herzog.

Similarly, the playing of a film can be considered to fall within the realm of political protest art, as in the subtleties within the films of Tarkovsky. A "road movie" can refer to a film put together from footage from a long road trip or vacation.

Education and Propaganda

Film is used for education and propaganda. When the purpose is primarily educational, a film is called an "educational film". Examples are recordings of lectures and experiments, or more marginally, a film based on a classic novel.

Film may be propaganda, in whole or in part, such as the films made by Leni Riefenstahl in Nazi Germany, US war film trailers during World War II, or artistic films made under Stalin by Eisenstein. They may also be works of political protest, as in the films of Wajda, or more subtly, the films of Andrei Tarkovsky.

The same film may be considered educational by some, and propaganda by others, such as some of the films of Michael Moore.

Production

At its core, the means to produce a film depend on the content the filmmaker wishes to show, and the apparatus for displaying it: the zoetrope merely requires a series of images on a strip of paper. Film production can therefore take as little as one person with a camera (or without it, such as Stan Brakhage's 1963 film Mothlight), or thousands of actors, extras and crewmembers for a live-action, feature-length epic.

The necessary steps for almost any film can be boiled down to conception, planning, execution, revision, and distribution. The more involved the production, the more significant each of the steps becomes. In a typical production cycle of a Hollywood-style film, these main stages are defined as:

  1. Development
  2. Pre-production
  3. Production
  4. Post-production
  5. Distribution

This production cycle usually takes three years: the first year is taken up with development, the second with pre-production and production, and the third with post-production and distribution.

The bigger the production, the more resources it takes, and the more important financing becomes; most feature films are not only artistic works, but for-profit business entities.

Crew

A film crew is a group of people hired by a film company, employed during the "production" or "photography" phase, for the purpose of producing a film or motion picture. Crew are distinguished from cast, the actors who appear in front of the camera or provide voices for characters in the film. The crew interacts with but is also distinct from the production staff, consisting of producers, managers, company representatives, their assistants, and those whose primary responsibility falls in pre-production or post-production phases, such as writers and editors. Communication between production and crew generally passes through the director and his/her staff of assistants. Medium-to-large crews are generally divided into departments with well defined hierarchies and standards for interaction and cooperation between the departments. Other than acting, the crew handles everything in the photography phase: props and costumes, shooting, sound, electrics (i.e., lights), sets, and production special effects. Caterers (known in the film industry as "craft services") are usually not considered part of the crew.

Technology

Film stock consists of transparent celluloid, acetate, or polyester base coated with an emulsion containing light-sensitive chemicals. Cellulose nitrate was the first type of film base used to record motion pictures, but due to its flammability was eventually replaced by safer materials. Stock widths and the film format for images on the reel have had a rich history, though most large commercial films are still shot on 35 mm film and distributed to theaters as 35 mm prints.

Originally, moving picture film was shot and projected at various speeds using hand-cranked cameras and projectors; though 1000 frames per minute (16⅔ frame/s) is generally cited as a standard silent speed, research indicates most films were shot between 16 frame/s and 23 frame/s and projected from 18 frame/s on up (often reels included instructions on how fast each scene should be shown).[5] When sound film was introduced in the late 1920s, a constant speed was required for the sound head. 24 frames per second was chosen because it was the slowest (and thus cheapest) speed which allowed for sufficient sound quality. Improvements since the late 19th century include the mechanization of cameras, allowing them to record at a consistent speed; quiet camera design, allowing sound recorded on set to be usable without requiring large "blimps" to encase the camera; more sophisticated film stocks and lenses, allowing directors to film in increasingly dim conditions; and the development of synchronized sound, allowing sound to be recorded at exactly the same speed as its corresponding action. The soundtrack can be recorded separately from shooting the film, but for live-action pictures many parts of the soundtrack are usually recorded simultaneously.
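The speed arithmetic above is simple enough to sketch in a few lines; the frame rates are the figures cited in this section, while the 1,440-frame scene is a hypothetical example:

```python
# Frame-rate arithmetic for the speeds cited above.
silent_fpm = 1000                 # commonly cited silent standard: 1000 frames per minute
silent_fps = silent_fpm / 60      # = 16 2/3 frames per second
sound_fps = 24                    # constant speed adopted for sound film

# A hypothetical 1,440-frame scene runs for different durations at each speed:
print(round(1440 / silent_fps, 1))  # 86.4 seconds at silent speed
print(1440 / sound_fps)             # 60.0 seconds at sound speed
```

The same footage therefore plays noticeably faster when a silent-era reel is run at sound speed, which is why silent films often look comically sped up on modern equipment.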

As a medium, film is not limited to motion pictures, since the technology developed as the basis for photography. It can be used to present a progressive sequence of still images in the form of a slideshow. Film has also been incorporated into multimedia presentations, and often has importance as primary historical documentation. However, historic films pose problems of preservation and storage, and the motion picture industry is exploring many alternatives. Most movies on cellulose nitrate base have been copied onto modern safety films. Some studios save color films through the use of separation masters: three B&W negatives each exposed through red, green, or blue filters (essentially a reverse of the Technicolor process). Digital methods have also been used to restore films, although the continuing obsolescence of digital formats makes them (as of 2006) a poor choice for long-term preservation. Film preservation of decaying film stock is a matter of concern to both film historians and archivists, and to companies interested in preserving their existing products in order to make them available to future generations (and thereby increase revenue). Preservation is generally a higher concern for nitrate and single-strip color films, due to their high decay rates; black and white films on safety bases and color films preserved on Technicolor imbibition prints tend to hold up much better, assuming proper handling and storage.

Some films in recent decades have been recorded using analog video technology similar to that used in television production. Modern digital video cameras and digital projectors are gaining ground as well. These approaches are extremely beneficial to moviemakers, especially because footage can be evaluated and edited without waiting for the film stock to be processed. Yet the migration is gradual, and as of 2005 most major motion pictures are still recorded on film.

Independent

Independent filmmaking often takes place outside of Hollywood, or other major studio systems. An independent film (or indie film) is a film initially produced without financing or distribution from a major movie studio. Creative, business, and technological reasons have all contributed to the growth of the indie film scene in the late 20th and early 21st century.

On the business side, the costs of big-budget studio films also lead to conservative choices in cast and crew. There is a trend in Hollywood towards co-financing (over two-thirds of the films put out by Warner Bros. in 2000 were joint ventures, up from 10% in 1987).[6] A hopeful director is almost never given the opportunity to get a job on a big-budget studio film unless he or she has significant industry experience in film or television. Also, the studios rarely produce films with unknown actors, particularly in lead roles.

Before the advent of digital alternatives, the cost of professional film equipment and stock was also a hurdle to being able to produce, direct, or star in a traditional studio film.

But the advent of consumer camcorders in 1985, and more importantly, the arrival of high-resolution digital video in the early 1990s, have lowered the technology barrier to movie production significantly. Both production and post-production costs have been significantly lowered; today, the hardware and software for post-production can be installed in a commodity-based personal computer. Technologies such as DVDs, FireWire connections, professional-level non-linear editing software like Adobe Premiere Pro, Sony Vegas, and Apple's Final Cut Pro, and consumer-level software such as Apple's Final Cut Express, iMovie, and Microsoft's Windows Movie Maker make movie-making relatively inexpensive.

Since the introduction of DV technology, the means of production have become more democratized. Filmmakers can conceivably shoot and edit a movie, create and edit the sound and music, and mix the final cut on a home computer. However, while the means of production may be democratized, financing, distribution, and marketing remain difficult to accomplish outside the traditional system. Most independent filmmakers rely on film festivals to get their films noticed and sold for distribution. The arrival of internet-based video outlets such as YouTube and Veoh has further changed the filmmaking landscape in ways that are still to be determined.

Open content film

An open content film is much like an independent film, but it is produced through open collaborations; its source material is available under a license more permissive than traditional copyright, allowing other parties to create fan fiction or derivative works. Like independent filmmaking, open content filmmaking takes place outside of Hollywood or other major studio systems.

Fan film

A fan film is a film or video inspired by a film, television program, comic book or a similar source, created by fans rather than by the source's copyright holders or creators. Fan filmmakers have traditionally been amateurs, but some of the more notable films have actually been produced by professional filmmakers as film school class projects or as demonstration reels. Fan films vary tremendously in length, from short faux-teaser trailers for non-existent motion pictures to rarer full-length motion pictures.

Distribution

When it is initially produced, a feature film is often shown to audiences in a movie theater or cinema. The identity of the first theater designed specifically for cinema is a matter of debate; candidates include Tally's Electric Theatre, established 1902 in Los Angeles[7], and Pittsburgh's Nickelodeon, established 1905.[8] Thousands of such theaters were built or converted from existing facilities within a few years.[9] In the United States, these theaters came to be known as nickelodeons, because admission typically cost a nickel (five cents).

Typically, one film is the featured presentation (or feature film). Before the 1970s, there were "double features"; typically, a high quality "A picture" rented by an independent theater for a lump sum, and a "B picture" of lower quality rented for a percentage of the gross receipts. Today, the bulk of the material shown before the feature film consists of previews for upcoming movies and paid advertisements (also known as trailers or "The Twenty").

Historically, all mass marketed feature films were made to be shown in movie theaters. The development of television has allowed films to be broadcast to larger audiences, usually after the film is no longer being shown in theaters. Recording technology has also enabled consumers to rent or buy copies of films on VHS or DVD (and the older formats of laserdisc, VCD and SelectaVision — see also videodisc), and Internet downloads may be available and have started to become revenue sources for the film companies. Some films are now made specifically for these other venues, being released as made-for-TV movies or direct-to-video movies. The production values on these films are often considered to be of inferior quality compared to theatrical releases in similar genres, and indeed, some films that are rejected by their own studios upon completion are distributed through these markets.

The movie theater pays an average of about 50-55% of its ticket sales to the movie studio, as film rental fees.[10] The actual percentage starts with a number higher than that, and decreases as the duration of a film's showing continues, as an incentive to theaters to keep movies in the theater longer. However, today's barrage of highly marketed movies ensures that most movies are shown in first-run theaters for less than 8 weeks. There are a few movies every year that defy this rule, often limited-release movies that start in only a few theaters and actually grow their theater count through good word-of-mouth and reviews. According to a 2000 study by ABN AMRO, about 26% of Hollywood movie studios' worldwide income came from box office ticket sales; 46% came from VHS and DVD sales to consumers; and 28% came from television (broadcast, cable, and pay-per-view).[10]
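The declining rental schedule described above can be illustrated with a toy calculation. The weekly grosses and percentages here are hypothetical, chosen only to show the "starts high, then decreases" pattern and an overall studio share near the cited 50-55% average:

```python
# Hypothetical weekly grosses and a declining film-rental schedule for one film.
weekly_gross = [1_000_000, 600_000, 350_000, 200_000]
rental_rate = [0.60, 0.55, 0.50, 0.45]   # studio's share declines over the run

studio_take = sum(g * r for g, r in zip(weekly_gross, rental_rate))
total_gross = sum(weekly_gross)
print(round(studio_take))                    # 1195000
print(round(studio_take / total_gross, 3))   # 0.556, near the ~50-55% average
```

The declining schedule is the incentive mechanism described above: the longer a theater keeps a film, the larger the fraction of each ticket it retains.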

Animation

Animation is the technique in which each frame of a film is produced individually, whether generated as a computer graphic, or by photographing a drawn image, or by repeatedly making small changes to a model unit (see claymation and stop motion), and then photographing the result with a special animation camera. When the frames are strung together and the resulting film is viewed at a speed of 16 or more frames per second, there is an illusion of continuous movement (due to the persistence of vision). Generating such a film is very labor intensive and tedious, though the development of computer animation has greatly sped up the process.
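The labor involved scales directly with the frame count; a quick back-of-the-envelope calculation makes the point (the 90-minute runtime is a hypothetical example, at the standard sound-film speed mentioned earlier):

```python
# How many individual frames a hand-made animation requires.
fps = 24          # standard sound-film speed
minutes = 90      # hypothetical feature-length runtime
total_frames = fps * 60 * minutes
print(total_frames)   # 129600 individually produced images
```

Even at the bare 16 frame/s threshold for the illusion of movement, the same runtime would still require 86,400 separate images, which is why computer assistance so dramatically changed the economics of animation.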

File formats like GIF, QuickTime, Shockwave and Flash allow animation to be viewed on a computer or over the Internet.

Because animation is very time-consuming and often very expensive to produce, the majority of animation for TV and movies comes from professional animation studios. However, the field of independent animation has existed at least since the 1950s, with animation being produced by independent studios (and sometimes by a single person). Several independent animation producers have gone on to enter the professional animation industry.

Limited animation is a way of increasing production and decreasing costs of animation by using "short cuts" in the animation process. This method was pioneered by UPA and popularized by Hanna-Barbera, and adapted by other studios as cartoons moved from movie theaters to television.[11]

Although most animation studios are now using digital technologies in their productions, there is a specific style of animation that depends on film. Cameraless animation, made famous by moviemakers like Norman McLaren, Len Lye and Stan Brakhage, is painted and drawn directly onto pieces of film, and then run through a projector.

Future state

While motion picture films have been around for more than a century, film is still a relative newcomer in the pantheon of fine arts. In the 1950s, when television became widely available, industry analysts predicted the demise of local movie theaters. Despite competition from television's increasing technological sophistication over the 1960s and 1970s, such as the development of color television and large screens, motion picture cinemas continued. In fact, with the rise of television's predominance, film came to be more respected as an artistic medium, in contrast with the generally low opinion of the quality of average television content. In the 1980s, when the widespread availability of inexpensive videocassette recorders enabled people to select films for home viewing, industry analysts again wrongly predicted the death of the local cinema.

In the 1990s and 2000s, the development of DVD players, home theater amplification systems with surround sound and subwoofers, and large LCD or plasma screens enabled people to select and view films at home with greatly improved audio and visual reproduction. These new technologies provided audio and visual quality that in the past only local cinemas had been able to offer: a large, clear widescreen presentation of a film with a full-range, high-quality multi-speaker sound system. Once again, industry analysts predicted the demise of the local cinema. Local cinemas are changing in the 2000s, moving toward digital screens, a new approach that allows for easier and quicker distribution of films (via satellite or hard disks) and may give local theaters a reprieve from their predicted demise. The cinema now faces a new challenge from home video in the form of a new high-definition format, Blu-ray, which can provide full HD 1080p video playback at near-cinema quality. Video formats are gradually catching up with the resolutions and quality that film offers: 1080p Blu-ray offers a pixel resolution of 1920×1080, a leap from the DVD offering of 720×480 and the paltry 330×480 offered by the first home video standard, VHS. The maximum resolutions that film currently offers are 2485×2970 or 1420×3390; UHD, a future digital video format, will offer a massive resolution of 7680×4320, surpassing all current film resolutions. The only viable competitor to these new innovations is IMAX, which can play film content at an extreme 10000×7000 resolution.
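The gaps between the resolutions cited above are starker when compared as total pixel counts; a small sketch using the figures from this section:

```python
# Total pixel counts for the video formats cited above.
formats = {
    "VHS":     (330, 480),
    "DVD":     (720, 480),
    "Blu-ray": (1920, 1080),
    "UHD":     (7680, 4320),
    "IMAX":    (10000, 7000),
}
for name, (width, height) in formats.items():
    print(f"{name}: {width * height:,} pixels")
# Blu-ray carries exactly 6x the pixels of DVD; UHD exactly 16x Blu-ray.
```

Seen this way, each generation is not an incremental step but a multiplicative jump, which is what makes the comparison to film's native resolution meaningful.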

Despite the rise of these new technologies, the growth of the home video market, and a surge in online copyright infringement, 2007 was a record year for film, with the highest-ever box-office grosses. Many expected film to suffer as a result of the effects listed above, but it has flourished, strengthening film studios' expectations for the future.

Sunday, March 28, 2010

Philosophy of science

The philosophy of science is concerned with the assumptions, foundations, methods and implications of science. The field is defined by an interest in one of a set of "traditional" problems or an interest in central or foundational concerns in science. In addition to these central problems for science as a whole, many philosophers of science consider these problems as they apply to particular sciences (e.g. philosophy of biology or philosophy of physics). Some philosophers of science also use contemporary results in science to draw philosophical morals.

Although most practitioners are philosophers, several prominent scientists have contributed to the field and still do. Other prominent scientists have felt that the practical effect on their work is limited: “Philosophy of science is about as useful to scientists as ornithology is to birds,” according to physicist Richard Feynman.

Philosophy of science focuses on metaphysical, epistemic and semantic aspects of science. Ethical issues such as bioethics and scientific misconduct are usually considered ethics or science studies rather than philosophy of science.

Nature of scientific concepts and statements

Demarcation

Karl Popper contended that the central question in the philosophy of science was distinguishing science from non-science.[1]

Early attempts by the logical positivists grounded science in observation while non-science was non-observational and hence nonsense.[2] Popper claimed that the central feature of science was that science aims at falsifiable claims (i.e. claims that can be proven false, at least in principle).[3]

No single unified account of the difference between science and non-science has been widely accepted by philosophers, and some regard the problem as unsolvable or uninteresting.[4]

This problem has taken center stage in the debate regarding evolution and intelligent design. The vast majority of opponents of intelligent design claim that it does not meet the criteria of science and should thus not be treated on equal footing as evolution.[5] Those who defend intelligent design either attempt to validate the view as meeting the criteria of science or challenge the coherence of this distinction.[6]

Scientific realism and instrumentalism

Two central questions about science are (1) what are the aims of science and (2) how should one interpret the results of science? Scientific realists claim that science aims at truth and that one ought to regard scientific theories as true, approximately true, or likely true. Conversely, a scientific antirealist or instrumentalist argues that science does not aim (or at least does not succeed) at truth and that we should not regard scientific theories as true.[7] Some antirealists claim that scientific theories aim at being instrumentally useful and should only be regarded as useful, but not true, descriptions of the world.[8] More radical antirealists, like Thomas Kuhn and Paul Feyerabend, have argued that scientific theories do not even succeed at this goal, and that later, more accurate scientific theories are not "typically approximately true" as Popper contended.[9][10]

Realists often point to the success of recent scientific theories as evidence for the truth (or near truth) of our current theories.[11][12][13][14][15] Antirealists point to either the history of science,[16][17] epistemic morals,[8] the success of false modeling assumptions,[18] or widely termed postmodern criticisms of objectivity as evidence against scientific realisms.[19] Some antirealists attempt to explain the success of our theories without reference to truth[8][20] while others deny that our current scientific theories are successful at all.[9][10]

Scientific explanation

In addition to providing predictions about future events, we often take scientific theories to offer explanations for those that occur regularly or have already occurred. Philosophers have investigated the criteria by which a scientific theory can be said to have successfully explained a phenomenon, as well as what gives a scientific theory explanatory power. One early and influential theory of scientific explanation was put forward by Carl G. Hempel and Paul Oppenheim in 1948. Their Deductive-Nomological (D-N) model of explanation says that a scientific explanation succeeds by subsuming a phenomenon under a general law.[21] Although ignored for a decade, this view was subjected to substantial criticism, resulting in several widely believed counter examples to the theory.[22]

In addition to their D-N model, Hempel and Oppenheim offered other statistical models of explanation which would account for statistical sciences.[21] These theories have received criticism as well.[22] Salmon attempted to provide an alternative account for some of the problems with Hempel and Oppenheim's model by developing his statistical relevance model.[23][24] In addition to Salmon's model, others have suggested that explanation is primarily motivated by unifying disparate phenomena or primarily motivated by providing the causal or mechanical histories leading up to the phenomenon (or phenomena of that type).[24]

Analysis and reductionism

Analysis is the activity of breaking an observation or theory down into simpler concepts in order to understand it. Analysis is as essential to science as it is to all rational enterprises. For example, the task of describing mathematically the motion of a projectile is made easier by separating out the force of gravity, angle of projection and initial velocity. After such analysis it is possible to formulate a suitable theory of motion.

Reductionism in science can have several different senses. One type of reductionism is the belief that all fields of study are ultimately amenable to scientific explanation. Perhaps a historical event might be explained in sociological and psychological terms, which in turn might be described in terms of human physiology, which in turn might be described in terms of chemistry and physics.

Daniel Dennett invented the term greedy reductionism to describe the assumption that such reductionism was possible. He claims that it is just 'bad science', seeking to find explanations which are appealing or eloquent, rather than those that are of use in predicting natural phenomena. He also says that:

There is no such thing as philosophy-free science; there is only science whose philosophical baggage is taken on board without examination. (Daniel Dennett, Darwin's Dangerous Idea, 1995)

Arguments made against greedy reductionism through reference to emergent phenomena rely upon the fact that self-referential systems can be said to contain more information than can be described through individual analysis of their component parts. Examples include systems that contain strange loops, fractal organization and strange attractors in phase space. Analysis of such systems is necessarily information-destructive because the observer must select a sample of the system that can be at best partially representative. Information theory can be used to calculate the magnitude of information loss and is one of the techniques applied by Chaos theory.

Grounds of validity of scientific reasoning

Empirical verification

Science relies on evidence to validate its theories and models. The predictions implied by those theories and models should be in agreement with observation. Ultimately, observations reduce to those made by the unaided human senses: sight, hearing, etc. To be accepted by most scientists, several impartial, competent observers should agree on what is observed. Observations should be repeatable, e.g., experiments that generate relevant observations can be (and, if important, usually will be) done again. Furthermore, predictions should be specific; one should be able to describe a possible observation that would falsify the theory or a model that implies the prediction.

Nevertheless, while the basic concept of empirical verification is simple, in practice, there are difficulties as described in the following sections.

Induction

It is not possible for scientists to have tested every incidence of an action, and found a reaction. How is it, then, that they can assert, for example, that Newton's Third Law is universally true? They have, of course, tested many, many actions, and in each one have been able to find the corresponding reaction. But can we be sure that the next time we test the Third Law, it will be found to hold true?

One solution to this problem is to rely on the notion of induction. Inductive reasoning maintains that if a situation holds in all observed cases, then the situation holds in all cases. So, after completing a series of experiments that support the Third Law, one is justified in maintaining that the Law holds in all cases.

Explaining why induction commonly works has been somewhat problematic. One cannot use deduction, the usual process of moving logically from premise to conclusion, because there is simply no syllogism that will allow such a move. No matter how many times 17th century biologists observed white swans, and in how many different locations, there is no deductive path that can lead them to the conclusion that all swans are white. This is just as well, since, as it turned out, that conclusion would have been wrong. Similarly, it is at least possible that an observation will be made tomorrow that shows an occasion in which an action is not accompanied by a reaction; the same is true of any scientific law.

One answer has been to conceive of a different form of rational argument, one that does not rely on deduction. Deduction allows one to formulate a specific truth from a general truth: all crows are black; this is a crow; therefore this is black. Induction somehow allows one to formulate a general truth from some series of specific observations: this is a crow and it is black; that is a crow and it is black; no crow has been seen that is not black; therefore all crows are black.
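The swan example above can be made concrete in a few lines of code. This is a toy sketch with invented data: an inductive generalization survives every observed case, yet is overturned by a single new observation.

```python
# Enumerative induction: generalize "all swans are white" from uniform cases.
def all_white(swans):
    return all(color == "white" for color in swans)

observed = ["white", "white", "white", "white"]
assert all_white(observed)        # every observed case supports the rule

observed.append("black")          # then a single black swan is observed
assert not all_white(observed)    # the generalization is refuted
```

No number of confirming cases in the first list could have ruled out the fifth observation, which is precisely the asymmetry between induction and deduction that the problem of induction turns on.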

The problem of induction is one of considerable debate and importance in the philosophy of science: is induction indeed justified, and if so, how?

Test of an isolated theory impossible

According to the Duhem-Quine thesis, after Pierre Duhem and W.V. Quine, it is impossible to test a theory in isolation. One must always add auxiliary hypotheses in order to make testable predictions. For example, to test Newton's Law of Gravitation in our solar system, one needs information about the masses and positions of the Sun and all the planets. Famously, the failure to predict the orbit of Uranus in the 19th century led, not to the rejection of Newton's Law, but rather to the rejection of the hypothesis that there are only seven planets in our solar system. The investigations that followed led to the discovery of an eighth planet, Neptune. If a test fails, something is wrong. But there is a problem in figuring out what that something is: a missing planet, badly calibrated test equipment, an unsuspected curvature of space, etc.
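The logical point can be sketched as a toy model (all names and values here are purely illustrative): a prediction tests the conjunction of a law and its auxiliary hypotheses, so a failed test by itself cannot say which conjunct is at fault.

```python
# A prediction succeeds only if the law AND every auxiliary hypothesis holds.
def prediction_succeeds(law_holds, auxiliaries):
    return law_holds and all(auxiliaries.values())

auxiliaries = {
    "only_seven_planets": False,      # the auxiliary that actually failed
    "instruments_calibrated": True,
    "space_is_flat": True,
}
# The Uranus prediction fails, but the failure alone cannot locate the flaw:
assert not prediction_succeeds(law_holds=True, auxiliaries=auxiliaries)
```

From the outside, this failure is indistinguishable from one caused by a false law or a badly calibrated instrument; identifying the flawed conjunct required further investigation, which in the historical case produced Neptune.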

One consequence of the Duhem-Quine thesis is that any theory can be made compatible with any empirical observation by the addition of suitable ad hoc hypotheses.

This thesis was accepted by Karl Popper, leading him to reject naïve falsification in favor of 'survival of the fittest', or most falsifiable, of scientific theories. In Popper's view, any hypothesis that does not make testable predictions is simply not science. Such a hypothesis may be useful or valuable, but it cannot be said to be science. Confirmation holism, developed by W.V. Quine, states that empirical data are not sufficient to make a judgment between theories. In this view, a theory can always be made to fit with the available empirical data. However, the fact that empirical evidence does not serve to determine between alternative theories does not necessarily imply that all theories are of equal value, as scientists often use guiding principles such as Occam's Razor.

One result of this view is that specialists in the philosophy of science stress the requirement that observations made for the purposes of science be restricted to intersubjective objects. That is, science is restricted to those areas where there is general agreement on the nature of the observations involved. It is comparatively easy to agree on observations of physical phenomena, harder to agree on observations of social or mental phenomena, and difficult in the extreme to reach agreement on matters of theology or ethics (and thus the latter remain outside the normal purview of science).

Theory-dependence of observations

When making observations, scientists peer through telescopes, study images on electronic screens, record meter readings, and so on. Generally, on a basic level, they can agree on what they see, e.g., the thermometer shows 37.9 °C. But if these scientists have very different ideas about the theories that supposedly explain these basic observations, they can interpret them in very different ways. Ancient "scientists" interpreted the rising of the Sun in the morning as evidence that the Sun moved; later scientists deduced that the Earth rotates. While some scientists may conclude that certain observations confirm a specific hypothesis, skeptical co-workers may yet suspect that something is wrong with the test equipment, for example. Observations interpreted by a scientist's theories are said to be theory-laden.

Observation involves both perception and cognition. That is, one does not make an observation passively, but is actively engaged in distinguishing the phenomenon being observed from surrounding sensory data. Therefore, observations depend on an underlying understanding of the way in which the world functions, and that understanding may influence what is perceived, noticed, or deemed worthy of consideration. More importantly, most scientific observation must be done within a theoretical context in order to be useful. For example, when one observes a measured increase in temperature, that observation is based on assumptions about the nature of temperature and its measurement, as well as assumptions about the way the instrument used to measure the temperature functions. Such assumptions are necessary in order to obtain scientifically useful observations (such as, "the temperature increased by two degrees").

Empirical observation is used to determine the acceptability of some hypothesis within a theory. When someone claims to have made an observation, it is reasonable to ask them to justify their claim. Such justification must include reference to the theory – operational definitions and hypotheses – in which the observation is embedded. That is, the observation is framed in terms of the theory that also contains the hypothesis it is meant to verify or falsify (though of course the observation should not be based on an assumption of the truth or falsity of the hypothesis being tested). This means that the observation cannot serve as an entirely neutral arbiter between competing hypotheses, but can only arbitrate between the hypotheses within the context of the underlying theory.

Thomas Kuhn denied that it is ever possible to isolate the hypothesis being tested from the influence of the theory in which the observations are grounded. He argued that observations always rely on a specific paradigm, and that it is not possible to evaluate competing paradigms independently. By "paradigm" he meant, essentially, a logically consistent "portrait" of the world, one that involves no logical contradictions and that is consistent with observations that are made from the point of view of this paradigm. More than one such logically consistent construct can paint a usable likeness of the world, but there is no common ground from which to pit two against each other, theory against theory. Neither is a standard by which the other can be judged. Instead, the question is which "portrait" some set of people judges to be the most promising in terms of scientific "puzzle solving".

For Kuhn, the choice of paradigm was sustained by, but not ultimately determined by, logical processes. The individual's choice between paradigms involves setting two or more "portraits" against the world and deciding which likeness is most promising. In the case of a general acceptance of one paradigm or another, Kuhn believed that it represented the consensus of the community of scientists. Acceptance or rejection of some paradigm is, he argued, a social process as much as a logical process. Kuhn's position, however, is not one of relativism.[25] According to Kuhn, a paradigm shift will occur when a significant number of observational anomalies in the old paradigm have made the new paradigm more useful. That is, the choice of a new paradigm is based on observations, even though those observations are made against the background of the old paradigm. A new paradigm is chosen because it does a better job of solving scientific problems than the old one.

The fact that observation is embedded in theory does not mean observations are irrelevant to science. Scientific understanding derives from observation, but the acceptance of scientific statements is dependent on the related theoretical background or paradigm as well as on observation. Coherentism, skepticism, and foundationalism are alternatives for dealing with the difficulty of grounding scientific theories in something more than observations. And, of course, further, redesigned testing may resolve differences of opinion.

Coherentism

Induction attempts to justify scientific statements by reference to other specific scientific statements. It must avoid the problem of the criterion, in which any justification must in turn be justified, resulting in an infinite regress. The regress argument has been used to justify one way out of the infinite regress, foundationalism. Foundationalism claims that there are some basic statements that do not require justification. Both induction and falsification are forms of foundationalism in that they rely on basic statements that derive directly from immediate sensory experience.

The way in which basic statements are derived from observation complicates the problem. Observation is a cognitive act; that is, it relies on our existing understanding, our set of beliefs. An observation of a transit of Venus requires a huge range of auxiliary beliefs, such as those that describe the optics of telescopes, the mechanics of the telescope mount, and an understanding of celestial mechanics. At first sight, the observation does not appear to be 'basic'.

Coherentism offers an alternative by claiming that statements can be justified by their being a part of a coherent system. In the case of science, the system is usually taken to be the complete set of beliefs of an individual scientist or, more broadly, of the community of scientists. W. V. Quine argued for a coherentist approach to science, as did E. O. Wilson, though Wilson uses the term consilience (notably in his book of that name). An observation of a transit of Venus is justified by its being coherent with our beliefs about optics, telescope mounts and celestial mechanics. Where this observation is at odds with one of these auxiliary beliefs, an adjustment in the system will be required to remove the contradiction.

Ockham's razor

William of Ockham (c. 1295–1349) … is remembered as an influential nominalist, but his popular fame as a great logician rests chiefly on the maxim known as Ockham's razor: Entia non sunt multiplicanda praeter necessitatem ["entities must not be multiplied beyond necessity"]. No doubt this represents correctly the general tendency of his philosophy, but it has not so far been found in any of his writings. His nearest pronouncement seems to be Numquam ponenda est pluralitas sine necessitate [Plurality must never be posited without necessity], which occurs in his theological work on the Sentences of Peter Lombard (Super Quattuor Libros Sententiarum (ed. Lugd., 1495), i, dist. 27, qu. 2, K). In his Summa Totius Logicae, i. 12, Ockham cites the principle of economy, Frustra fit per plura quod potest fieri per pauciora [It is futile to do with more things that which can be done with fewer]. (Kneale and Kneale, 1962, p. 243)

The practice of scientific inquiry typically involves a number of heuristic principles that serve as rules of thumb for guiding the work. Prominent among these are the principles of conceptual economy or theoretical parsimony that are customarily placed under the rubric of Ockham's razor, named after the 14th-century Franciscan friar William of Ockham, who is credited with giving the maxim many pithy expressions, not all of which have yet been found among his extant works.[26]

The motto is most commonly cited in the form "entities should not be multiplied beyond necessity", generally taken to suggest that the simplest explanation tends to be the correct one. As interpreted in contemporary scientific practice, it advises opting for the simplest theory among a set of competing theories that have comparable explanatory power, discarding assumptions that do not improve the explanation. The "other things being equal" clause is a critical qualification, which rather severely limits the utility of Ockham's razor in real practice, as theorists rarely if ever find themselves presented with competing theories of exactly equal explanatory adequacy.

Among the many difficulties that arise in trying to apply Ockham's razor is the problem of formalizing and quantifying the "measure of simplicity" that is implied by the task of deciding which of several theories is the simplest. Although various measures of simplicity have been brought forward as potential candidates from time to time, it is generally recognized that there is no such thing as a theory-independent measure of simplicity. In other words, there appear to be as many different measures of simplicity as there are theories themselves, and the task of choosing between measures of simplicity appears to be every bit as problematic as the job of choosing between theories. Moreover, it is extremely difficult to identify the hypotheses or theories that have "comparable explanatory power", though it may be readily possible to rule out some of the extremes. Ockham's razor also does not say that the simplest account is to be preferred regardless of its capacity to explain outliers, exceptions, or other phenomena in question. The principle of falsifiability requires that any exception that can be reliably reproduced should invalidate the simplest theory, and that the next-simplest account which can actually incorporate the exception as part of the theory should then be preferred to the first. As Albert Einstein put it, "The supreme goal of all theory is to make the irreducible basic elements as simple and as few as possible without having to surrender the adequate representation of a single datum of experience".
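
The difficulty of formalizing simplicity can be made concrete. One widely used candidate measure (itself theory-laden, as the paragraph above notes) is the Akaike information criterion (AIC), which trades a model's goodness of fit against the number of free parameters it uses. The sketch below uses invented residual figures, purely for illustration:

```python
import math

def aic(rss, n, k):
    """Akaike information criterion for a least-squares fit:
    n observations, k free parameters, residual sum of squares rss.
    Lower is better; the 2*k term is the 'simplicity' penalty."""
    return n * math.log(rss / n) + 2 * k

n = 50                                  # hypothetical number of data points
simple_fit = aic(rss=12.0, n=n, k=2)    # e.g. a straight-line model
complex_fit = aic(rss=11.5, n=n, k=5)   # e.g. a quartic: fits slightly better

# The simpler model is preferred despite its slightly worse fit, because
# the small improvement does not justify three extra parameters.
preferred = "simple" if simple_fit < complex_fit else "complex"
print(preferred)
```

Even here, the choice of AIC over rival criteria (BIC, minimum description length, and others) is itself a choice among measures of simplicity, which is precisely the regress described above.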

Objectivity of observations in science

It is vitally important for science that the information about the surrounding world and the objects of study be as accurate and as reliable as possible. To this end, the measurements that are the source of this information must be as objective as possible. Before the invention of measuring tools (like weights, meter sticks, clocks, etc.), the only sources of information available to humans were their senses (vision, hearing, taste, touch, the sense of heat, the sense of gravity, etc.). Because human senses differ from person to person (due to wide variations in personal chemistry, deficiencies, inherited flaws, etc.), there were no objective measurements before the invention of these tools. The consequence was the lack of a rigorous science.

With the advent of the exchange of goods, trade, and agriculture there arose a need for such measurements, and science (arithmetic, geometry, mechanics, etc.) based on standardized units of measurement (stadia, pounds, seconds, etc.) was born. To abstract further from unreliable human senses and make measurements more objective, science uses measuring devices (like spectrometers, voltmeters, interferometers, thermocouples, counters, etc.) and, more recently, computers. In most cases, the less the human involvement in the measuring process, the more accurate and reliable the scientific data are. Currently most measurements are made by a variety of mechanical and electronic sensors directly linked to computers, which further reduces the chance of human error or contamination of the information. This has made it possible to achieve the astonishing accuracy of modern measurements. For example, the current relative accuracy of measurements of mass is about 10⁻¹⁰, of angles about 10⁻⁹, and of time and length intervals in many cases of the order of 10⁻¹³ to 10⁻¹⁵. This has made it possible to measure, say, the distance to the Moon with sub-centimeter accuracy (see the Lunar laser ranging experiment), to measure the slight movement of tectonic plates using the GPS system with sub-millimeter accuracy, or even to measure variations in the distance between two mirrors separated by several kilometers as slight as 10⁻¹⁸ m, three orders of magnitude less than the size of a single atomic nucleus (see LIGO).
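
These relative accuracies can be checked with back-of-the-envelope arithmetic; the sketch below uses rounded, illustrative figures for the distances involved:

```python
import math

# Lunar laser ranging: roughly 1 cm uncertainty over the mean
# Earth-Moon distance of about 384,400 km (rounded figure).
moon_distance_m = 3.844e8
lunar_fractional = 0.01 / moon_distance_m   # fractional accuracy of the ranging

# LIGO: displacement sensitivity of ~1e-18 m versus a nuclear
# size of ~1e-15 m, i.e. how many orders of magnitude smaller.
orders_below_nucleus = math.log10(1e-15 / 1e-18)

print(f"{lunar_fractional:.1e}", orders_below_nucleus)
```

Sub-centimeter accuracy over the Earth-Moon distance corresponds to a fractional accuracy of a few parts in 10¹¹, and the LIGO sensitivity is indeed three orders of magnitude below a nuclear diameter, consistent with the figures quoted above.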

Another question about the objectivity of observations relates to the so-called "experimenter's regress", as well as to other problems identified by the sociology of scientific knowledge: the people who carry out the observations or experiments always have cognitive and social biases that lead them, often unconsciously, to introduce their own interpretations and theories into their description of what they are 'seeing'. Some of these arguments can be shown to be of limited scope when analysed from a game-theoretic point of view.

Philosophy of particular sciences

In addition to addressing the general questions regarding science and induction, many philosophers of science are occupied by investigating philosophical or foundational problems in particular sciences. The late 20th and early 21st centuries have seen a rise in the number of practitioners of the philosophy of particular sciences.

Philosophy of biology

Philosophy of biology deals with epistemological, metaphysical, and ethical issues in the biological and biomedical sciences. Although philosophers of science and philosophers generally have long been interested in biology (e.g., Aristotle, Descartes, and even Kant), philosophy of biology only emerged as an independent field of philosophy in the 1960s and 1970s. Philosophers of science then began paying increasing attention to developments in biology, from the rise of neo-Darwinism in the 1930s and 1940s to the discovery of the structure of deoxyribonucleic acid (DNA) in 1953 to more recent advances in genetic engineering. Other key ideas, such as the reduction of all life processes to biochemical reactions and the incorporation of psychology into a broader neuroscience, are also addressed.

Philosophy of chemistry

Philosophy of chemistry considers the methodology and underlying assumptions of the science of chemistry. It is explored by philosophers, chemists, and philosopher-chemist teams.

The philosophy of science has centered on physics for the last several centuries, and during the last century in particular, it has become increasingly concerned with the ultimate constituents of existence, or what one might call reductionism. Thus, for example, considerable attention has been devoted to the philosophical implications of special relativity, general relativity, and quantum mechanics. In recent years, however, more attention has been given to both the philosophy of biology and chemistry, which both deal with more intermediate states of existence.

In the philosophy of chemistry, for example, we might ask, given quantum reality at the microcosmic level, and given the enormous distances between electrons and the atomic nucleus, how is it that we are unable to put our hands through walls, as physics might predict? Chemistry provides the answer, and so we then ask what it is that distinguishes chemistry from physics.

In the philosophy of biology, which is closely related to chemistry, we inquire about what distinguishes a living thing from a non-living thing at the most elementary level. Can a living thing be understood in purely mechanistic terms, or is there, as vitalism asserts, always something beyond mere quantum states?

Issues in the philosophy of chemistry may not be as deeply conceptually perplexing as the quantum mechanical measurement problem in the philosophy of physics, and may not be as conceptually complex as optimality arguments in evolutionary biology. However, interest in the philosophy of chemistry stems in part from the ability of chemistry to connect the "hard sciences", such as physics, with the "soft sciences", such as biology, which gives it a rather distinctive role as the central science.

Philosophy of mathematics

Philosophy of mathematics is the branch of philosophy that studies the philosophical assumptions, foundations, and implications of mathematics.

Recurrent themes include:

  • What are the sources of mathematical subject matter?
  • What is the ontological status of mathematical entities?
  • What does it mean to refer to a mathematical object?
  • What is the character of a mathematical proposition?
  • What is the relation between logic and mathematics?
  • What is the role of hermeneutics in mathematics?
  • What kinds of inquiry play a role in mathematics?
  • What are the objectives of mathematical inquiry?
  • What gives mathematics its hold on experience?
  • What are the human traits behind mathematics?
  • What is mathematical beauty?
  • What is the source and nature of mathematical truth?
  • What is the relationship between the abstract world of mathematics and the material universe?
  • What is a number?
  • Are mathematical proofs exercises in tautology?
  • Why does it make sense to ask whether "1+1=2" is true?
  • How do we know whether a mathematical proof is correct?

Philosophy of physics

Philosophy of physics is the study of the fundamental, philosophical questions underlying modern physics, the study of matter and energy and how they interact. The main questions concern the nature of space and time, atoms and atomism, the predictions of cosmology, the interpretation of quantum mechanics, the foundations of statistical mechanics, causality, determinism, and the nature of physical laws. Classically, several of these questions were studied as part of metaphysics (for example, those about causality, determinism, and space and time).

Philosophy of social science

Positivism and scientism

The French philosopher Auguste Comte (1798-1857) established the epistemological perspective of positivism in The Course in Positive Philosophy, a series of texts published between 1830 and 1842. These texts were followed by the 1844 work, A General View of Positivism (published in English in 1865). The first three volumes of the Course dealt chiefly with the physical sciences already in existence (mathematics, astronomy, physics, chemistry, biology), whereas the latter two emphasised the inevitable coming of social science. Observing the circular dependence of theory and observation in science, and classifying the sciences in this way, Comte may be regarded as the first philosopher of science in the modern sense of the term.[27] For him, the physical sciences had necessarily to arrive first, before humanity could adequately channel its efforts into the most challenging and complex "Queen science" of human society itself. Comte offers an evolutionary system proposing that society undergoes three phases in its quest for the truth according to a general 'law of three stages'. These are (1) the theological, (2) the metaphysical, and (3) the positive.[28]

Comte's positivism laid the initial foundations for sociology, social research, and social science in general. In psychology, a positivistic approach has historically been favoured in behaviourism. In the early 20th century, logical positivism, a stricter version of Comte's basic thesis but a broadly independent movement, sprang up in Vienna and grew to become one of the dominant movements in Anglo-American philosophy and the analytic tradition. Logical positivists (or 'neopositivists') reject metaphysical speculation and attempt to reduce statements and propositions to pure logic.

The positivist perspective, however, has been associated with 'scientism': the view that the methods of the natural sciences may be applied to all areas of investigation, be it philosophical, social scientific, or otherwise. Among most social scientists and historians, orthodox positivism has long since fallen out of favor. Today, practitioners of both the social and the physical sciences recognize the distorting effect of observer bias and structural limitations. This scepticism has been facilitated by a general weakening of deductivist accounts of science by philosophers such as Thomas Kuhn, and by new philosophical movements such as critical realism and neopragmatism. Positivism has also been espoused by 'technocrats' who believe in the inevitability of social progress through science and technology.[29] The philosopher-sociologist Jürgen Habermas has critiqued pure instrumental rationality, arguing that it makes scientific thinking something akin to ideology itself.[30]

Philosophy of economics

Philosophy of economics is the branch of philosophy which studies philosophical issues relating to economics. It can also be defined as the branch of economics which studies its own foundations and morality.

Philosophy of psychology

Philosophy of psychology refers to issues at the theoretical foundations of modern psychology. Some of these issues are epistemological concerns about the methodology of psychological investigation. For example:

  • What is the most appropriate methodology for psychology: mentalism, behaviorism, or a compromise?
  • Are self-reports a reliable data gathering method?
  • What conclusions can be drawn from null hypothesis tests?
  • Can first-person experiences (emotions, desires, beliefs, etc.) be measured objectively?
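
The question about null hypothesis tests can be made concrete. In the minimal sketch below, with invented reaction-time data, a one-sample t-test only tells us whether the data would be surprising if the null hypothesis were true; what further conclusion that licenses is exactly the epistemological question at issue:

```python
import math
import statistics

# Hypothetical reaction times (ms); null hypothesis: true mean is 300 ms.
data = [312, 318, 305, 322, 310, 315, 308, 320, 311, 316]
null_mean = 300.0

n = len(data)
t = (statistics.mean(data) - null_mean) / (statistics.stdev(data) / math.sqrt(n))

# Critical value from standard t-tables: two-tailed, alpha = 0.05, df = 9.
t_critical = 2.262
reject_null = abs(t) > t_critical

# 'Reject' means the data are improbable under the null hypothesis;
# it is NOT the probability that the null hypothesis is true.
print(reject_null, round(t, 2))
```

Note what the code does not compute: the probability that the null hypothesis is true given the data. Bridging that gap requires further (e.g. Bayesian) assumptions, which is one reason the inferential status of such tests remains philosophically contested.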

Other issues in the philosophy of psychology are philosophical questions about the nature of mind, brain, and cognition, and are perhaps more commonly thought of as part of cognitive science or the philosophy of mind.

Philosophy of psychology also closely monitors contemporary work conducted in cognitive neuroscience, evolutionary psychology, and artificial intelligence, questioning what they can and cannot explain in psychology.

Philosophy of psychology is a relatively young field, because psychology only became a discipline of its own in the late 1800s. Philosophy of mind, by contrast, has been a well-established discipline since before psychology was a field of study at all. It is concerned with questions about the very nature of mind, the qualities of experience, and particular issues like the debate between dualism and monism.

Neurophilosophy has also become a field of its own through the work of Paul and Patricia Churchland.

Social accountability

Scientific openness

A very broad issue affecting the neutrality of science concerns the areas that science chooses to explore, that is, what parts of the world and of humanity are studied by science. Since the areas science could investigate are theoretically infinite, the question arises as to what science should attempt to question or find out.

Philip Kitcher in his "Science, Truth, and Democracy"[31] argues that scientific studies that attempt to show one segment of the population as being less intelligent, successful or emotionally backward compared to others have a political feedback effect which further excludes such groups from access to science. Thus such studies undermine the broad consensus required for good science by excluding certain people, and so proving themselves in the end to be unscientific.

See also The Mismeasure of Man.

Critiques of scientific method

Paul Feyerabend argued that no description of scientific method could possibly be broad enough to encompass all the approaches and methods used by scientists. Feyerabend objected to prescriptive scientific method on the grounds that any such method would stifle and cramp scientific progress. Feyerabend claimed, "the only principle that does not inhibit progress is: anything goes."[32] However, there have been many opponents to his theory. Alan Sokal and Jean Bricmont wrote the essay "Feyerabend: Anything Goes" about his belief that science is of little use to society.

Sociology, anthropology and economics of science

In his book The Structure of Scientific Revolutions Kuhn argues that the process of observation and evaluation take place within a paradigm. 'A paradigm is what the members of a community of scientists share, and, conversely, a scientific community consists of men who share a paradigm'.[33] On this account, science can be done only as a part of a community, and is inherently a communal activity.

For Kuhn, the fundamental difference between science and other disciplines is in the way in which the communities function. Others, especially Feyerabend and some post-modernist thinkers, have argued that there is insufficient difference between social practices in science and other disciplines to maintain this distinction. It is apparent that social factors play an important and direct role in scientific method, but that they do not serve to differentiate science from other disciplines. Furthermore, although on this account science is socially constructed, it does not follow that reality is a social construct. (See Science studies and the links there.) Kuhn’s ideas are equally applicable to both realist and anti-realist ontologies.

There are, however, those who maintain that scientific reality is indeed a social construct, to quote Quine:

Physical objects are conceptually imported into the situation as convenient intermediaries not by definition in terms of experience, but simply as irreducible posits comparable, epistemologically, to the gods of Homer . . . For my part I do, qua lay physicist, believe in physical objects and not in Homer's gods; and I consider it a scientific error to believe otherwise. But in point of epistemological footing, the physical objects and the gods differ only in degree and not in kind. Both sorts of entities enter our conceptions only as cultural posits.[34]

A major development in recent decades has been the study of the formation, structure, and evolution of scientific communities by sociologists and anthropologists including Michel Callon, Bruno Latour, John Law, Anselm Strauss, Lucy Suchman, and others. Some of their work has been previously loosely gathered in actor network theory. Here the approach to the philosophy of science is to study how scientific communities actually operate.

Continental philosophy of science

In the Continental philosophical tradition, science is viewed from a world-historical perspective. One of the first philosophers to support this view was Georg Wilhelm Friedrich Hegel. Philosophers such as Ernst Mach, Pierre Duhem and Gaston Bachelard also wrote their works with this world-historical approach to science. Nietzsche advanced the thesis in "On the Genealogy of Morals" that the motive for the search for truth in the sciences is a kind of ascetic ideal.

All of these approaches involve a historical and sociological turn to science, with a special emphasis on lived experience (a kind of Husserlian "life-world"), rather than a progress-based or anti-historical approach as done in the analytic tradition. Two other approaches to science include Edmund Husserl's phenomenology and Martin Heidegger's hermeneutics.

The largest effect on the Continental tradition with respect to science was Martin Heidegger's assault on the theoretical attitude in general, which of course includes the scientific attitude. For this reason, one could suggest that the philosophy of science in the Continental tradition has not developed much further, owing to its inability to overcome Heidegger's criticism.

Notwithstanding this, there have been a number of important works, especially those of a Kuhnian precursor, Alexandre Koyré. Another important development was Foucault's analysis of historical and scientific thought in The Order of Things and his study of power and corruption within the "science" of madness.

Several post-Heideggerian authors contributing to the Continental philosophy of science in the second half of the 20th century include Jürgen Habermas (e.g., "Truth and Justification", 1998), Carl Friedrich von Weizsäcker ("The Unity of Nature", 1980), and Wolfgang Stegmüller ("Probleme und Resultate der Wissenschaftstheorie und Analytischen Philosophie", 1973-1986).