By 1908 there were thousands of storefront Nickelodeons, Gems and Bijous across North America. A few theaters from the nickelodeon era are still showing films today. The 1913 opening of the Regent Theater in New York City signaled a new respectability for the medium, and the start of the two-decade heyday of American cinema design.
The cinema of the United States, consisting mainly of major film studios (also known metonymously as Hollywood), along with some independent films, has had a large effect on the global film industry since the early 20th century. Classical Hollywood cinema, a style that developed from the 1910s through the 1960s, is still typical of most films made in America today.
In the last years of the 20th century and the early years of the 21st century, the idea of “synergy” dominated the motion-picture industry in the United States, and an unprecedented wave of mergers and acquisitions pursued this ultimately elusive concept. Simply put, synergy implied that consolidating related media and entertainment properties under a single umbrella could strengthen every facet of a coordinated communications empire. Motion pictures, broadcast television, cable and satellite systems, radio networks, theme parks, newspapers and magazines, book publishers, manufacturers of home entertainment products, sports teams, Internet service providers—these were among the different elements that came together in various corporate combinations under the notion that each would boost the others. News Corporation Ltd., originally an Australian media company, started the trend by acquiring Twentieth Century–Fox in 1985. The Japanese manufacturing giant Sony Corporation acquired Columbia Pictures Entertainment, Inc., from The Coca-Cola Company in 1989. Another Japanese firm, Matsushita, purchased Universal Studios (as part of Music Corporation of America, or MCA) in 1990; it then was acquired by Seagram Company Ltd. (1995), became part of Vivendi Universal Entertainment (2000), and merged with the National Broadcasting Co., Inc. (2004), a subsidiary of the Comcast Corporation. Paramount Pictures, as Paramount Communications, Inc., became part of Viacom Inc. In perhaps the most striking of all ventures, Warner Communications merged with Time Inc. to become Time Warner Inc., which in turn came together with the Internet company America Online (AOL) to form AOL Time Warner in 2001. The company then changed its name again, back to Time Warner Inc., in 2003; it was purchased by AT&T in 2018 and renamed WarnerMedia. 
The Disney Company itself became an acquirer, adding Miramax Films, the television network American Broadcasting Company, the cable sports network ESPN, and, in 2019, 20th Century Fox, among other properties. The volume of corporate reshuffling and realignment had an undoubted impact on the studios involved. Nevertheless, the potential for success of such synergistic entities—and, more particularly, the positive or negative effect on their motion-picture units—remained an open question.
It could well be argued, however, that motion-picture companies’ corporate links with the wider media world and emergent communications forms such as the Internet fostered receptivity to new technologies that rapidly transformed film production in the 1990s and into the 21st century. As early as 1982, the Disney film Tron made extensive use of computer-generated images, which were introduced in a short special-effects sequence in which a human character is deconstructed into electronic particles and reassembled inside a computer. A few years later computer-generated imagery was greatly facilitated when it became possible to transfer film images into a computer and manipulate them digitally. The possibilities became apparent in director James Cameron’s Terminator 2: Judgment Day (1991), in images of the shape-changing character T-1000.
In the 1990s computer-generated imagery (CGI) made rapid strides and became a standard feature not only of Hollywood action-adventure films but also of nearly any work that required special visual effects. Examples of landmark films utilizing the new technologies included Steven Spielberg’s Jurassic Park (1993); Independence Day (1996), directed by Roland Emmerich; and The Matrix (1999), written and directed by Larry (later Lana) Wachowski and Andy (later Lilly) Wachowski. In Spielberg’s film, based on a best-selling novel by Michael Crichton, a number of long-extinct dinosaur species are re-created through genetic engineering. At the special-effects firm Industrial Light and Magic, models of the dinosaurs were scanned into computers and animated realistically to produce the first computer-generated images of lifelike action, rather than fantasy scenes. In Independence Day, a film combining the science-fiction and disaster genres in which giant alien spaceships attack Earth, an air battle was programmed in a computer so that each individual aircraft maneuvered, fired its weapons, and dueled with other flying objects in intricate patterns of action that would have been too time-consuming and costly to achieve by traditional special-effects means. By the end of the 1990s, the developing new technologies were displayed perhaps more fully than ever before in the Wachowskis’ spectacular film, in which the computer functions as both a central subject and a primary visual tool. For a scene in which actor Keanu Reeves appears to be dodging bullets that pass by him in a visible slow-motion trajectory, a computer program determined what motion-picture and still images were to be photographed, and then the computer assembled the images into a complete visual sequence.
In part through the expensive and lavish effects attained through the new technologies, American cinema at the end of the 20th century sustained and even widened its domination of the world film marketplace. Domestically, the expansion of ancillary products and venues—which during the 1990s were dominated by the sale and rental of video cassettes and then DVDs for home viewing as well as by additional cable and satellite outlets for movie presentation—produced new revenues that were becoming equal to, or in some cases more important than, income from theatrical exhibition. Nevertheless, exhibition outlets continued to grow, with new “megaplex” theatres offering several dozen cinemas, while distribution strategies called for opening major commercial films on 1,000 or more—sometimes as many as 3,000 by the late 1990s—screens across the country. The competition for box-office returns became something of a spectator sport, with the media reporting every Monday on the previous weekend’s multimillion-dollar grosses and ranking the top-10 films by ticket sales. The exhibition environment seemed to demand more than ever that film production be geared to the tastes of teenage spectators who frequented the suburban mall cinemas on weekends, and commentators within the industry as well as outside it observed what they regarded as the diminished quality of mainstream films. As if reflecting that judgment, in 1996 only one major studio film, Jerry Maguire, was among the five nominees for best picture at the annual Academy of Motion Picture Arts and Sciences awards ceremony (the other nominees were an American independent film, Fargo; an Australian work, Shine; a film from Britain, Secrets & Lies; and the winner, an international production with British stars and based on a novel written by a Canadian, The English Patient).
The motion-picture industry’s emphasis on pleasing the youth audience with special effects-laden blockbusters and genre works such as teen-oriented horror films and comedies inevitably diminished the role of directors as dominant figures in the creative process, further reducing the status that Hollywood directors had attained in the auteur-oriented 1960s and ’70s. Still, more than a handful of filmmakers, several of them veterans of that earlier era, maintained their prestige as artists practicing in a commercial medium. Two of the most prominent, who had launched their careers in the early 1970s, were Steven Spielberg and Martin Scorsese. In addition to Jurassic Park, Spielberg’s works in the 1990s included Schindler’s List (1993, winner of an Academy Award for best picture), Amistad (1997), and Saving Private Ryan (1998), with A.I. Artificial Intelligence (2001), Munich (2005), Lincoln (2012), and Bridge of Spies (2015) among his subsequent films. Scorsese directed GoodFellas (1990), The Age of Innocence (1993), Casino (1995), Kundun (1997), Gangs of New York (2002), The Departed (2006; winner of an Academy Award for best picture), and The Irishman (2019), the latter of which made use of CGI to make veteran actors look decades younger.
The actor-director Clint Eastwood was also prolific in this period, winning the best picture Academy Award with Unforgiven (1992) and directing such other films as Mystic River (2003), Million Dollar Baby (2004; Academy Award for best picture and best director), Letters from Iwo Jima (2006), Gran Torino (2008), Invictus (2009), American Sniper (2014), Sully (2016), and The Mule (2018). Stanley Kubrick died before the release of Eyes Wide Shut (1999), his first film since Full Metal Jacket (1987). Two decades passed between Terrence Malick’s Days of Heaven (1978) and The Thin Red Line (1998), but he became more prolific after the turn of the 21st century, directing The New World (2005), The Tree of Life (2011), Knight of Cups (2015), and A Hidden Life (2019).
- Benjamin Hale
- The Origin of Movies. Movies and motion pictures originated in the late 1800s, with the invention of “motion toys” designed to trick the eye into seeing an illusion of motion from a display of still frames in quick succession, such as the thaumatrope and the zoetrope.
- The First Movie. In 1878, Eadweard Muybridge created what is often called the first motion picture by placing twelve cameras along a racetrack and rigging them to capture shots in quick sequence as a horse crossed in front of their lenses.
- 1900s Movies. The 1900s were a time of great advancement for film and motion-picture technology. Exploration into editing, backdrops, and visual flow motivated aspiring filmmakers to push into new creative territory.
- 1910s Hollywood. The Squaw Man (1914). According to industry myth, the first movie made in Hollywood was Cecil B. DeMille’s The Squaw Man in 1914, when its director decided at the last minute to shoot in Los Angeles; but In Old California, an earlier film by D.W. Griffith, had been filmed entirely in the village of Hollywood in 1910.
The growth of American film is a remarkable story of innovation, reflecting the country’s evolving cultural landscapes and technical breakthroughs. From the early 1900s’ flickering silent films to the stunning digital effects of the twenty-first century, the silver screen has been a mirror to society’s ambitions, anxieties, and aspirations.
The first huge success of American cinema, as well as its largest experimental achievement to that point, was The Great Train Robbery (1903), directed by Edwin S. Porter.
Rise of Hollywood
In early 1910, director D.W. Griffith was sent by the American Mutoscope and Biograph Company to the west coast with his acting troupe, consisting of the actors Blanche Sweet, Lillian Gish, Mary Pickford, and Lionel ...
His first three films—Blind Husbands (1918), The Devil’s Passkey (1919), and Foolish Wives (1922)—constitute an obsessive trilogy of adultery; each features a sexual triangle in which an American wife is seduced by a Prussian army officer. Even though all three films were enormously popular, the great sums Stroheim was spending on the extravagant production design and costuming of his ...