Even if we go to the cinema to relax, without thinking too much about its inner workings, film remains the result of an incredible technological evolution marked by groundbreaking firsts.
Kinetograph: The First Camera
The camera is arguably the most iconic tool of the film industry, and it has undergone numerous transformations. The first was the Kinetograph, invented in 1891 by Thomas Edison and William Dickson, building on the work of Étienne-Jules Marey, who had designed an instrument to study the flight of birds. The Kinetograph used 35-millimeter celluloid film perforated on both sides (also an Edison invention), and the moving image had to be viewed through a peephole in a wooden box called the Kinetoscope.
Invention of Film
A camera needs a recording medium, and this is where film comes in, another major symbol of cinema. It was invented in 1888 by John Carbutt and commercialized the following year by George Eastman. Eastman developed a nitrate-based film coated with a gelatin emulsion containing silver halide crystals, which reacted to light to capture images. Initially 35 millimeters wide, film later became available in 16 and then 8 millimeters, making amateur filmmaking possible.
First Movie Screening: A Moment of Wonder
L’arrivée d’un train en gare de La Ciotat (The Arrival of a Train at La Ciotat Station)
The first paid film screening took place in 1895 in the Indian Salon of the Grand Café in Paris. It was organized by the Lumière brothers, inspired by their father, who had witnessed the marvels of the Kinetoscope in 1894. To modern audiences, these films may not seem like “movies” in the artistic sense, as they had no scripts or creative direction. However, Arrival of a Train at La Ciotat left the well-to-do spectators in awe.
Where Was the First Movie Theater Built?
The Indian Salon of the Grand Café, where the Lumière brothers’ film was shown, could be considered the first movie theater. However, this was a repurposed space, not originally designed for cinema. The first buildings dedicated to film screenings appeared in the early 20th century: Le Petit Journal in 1904, Cinéma-Théâtre in 1906… The Pathé cinema chain, known for its comfortable theaters, was also founded in 1906.
First Sound Film
The Jazz Singer
Until this point, all films were silent. Music to accompany the visuals was provided by a live musician or a phonograph.
However, in 1927, The Jazz Singer, produced by the Warner brothers, became the first sound film. It contained only 354 spoken words, but this was a revolution in cinema history. The arrival of sound created a clear divide between the silent era and the new age of “talkies.”
First Film Studio
The concept of the “first film studio” depends on how one defines a studio. If we consider a studio as a dedicated space for filming, the title likely goes to Thomas Edison’s Black Maria, built in 1893 in New Jersey, USA. This tar-paper-covered wooden structure was the world’s first film production facility, designed to shoot short clips for Edison’s Kinetoscope.
It featured a rotating base to follow sunlight, as artificial lighting was not yet advanced enough for indoor filming. The Black Maria primarily produced experimental clips, such as Fred Ott’s Sneeze (1894), as well as vaudeville acts and athletic demonstrations. However, it lacked narrative storytelling or special effects, focusing instead on technical novelty.
Around the same time, the Lumière brothers in Lyon, France, were pioneering cinema in a different way. While they did not build a traditional studio, their workshop became the birthplace of the Cinématographe, a device that could capture, project, and print films.
The Lumière brothers are best known for hosting the first public film screening on December 28, 1895, in Paris. Their films, such as Workers Leaving the Lumière Factory and The Arrival of a Train, were documentary-style “actualités” shot outdoors or in factory settings. Though groundbreaking, their work did not involve a controlled studio environment.
The first modern film studio, designed specifically for narrative filmmaking and special effects, was built by Georges Méliès in 1897 in Montreuil, France. Méliès, a master illusionist, transformed his family estate into a glass-roofed facility with artificial lighting, trapdoors, and painted backdrops. This studio allowed him to control lighting and weather conditions, enabling the creation of fantastical films like A Trip to the Moon (1902). Méliès pioneered techniques such as stop-motion, double exposure, and hand-painted color, establishing cinema as a storytelling medium and inspiring future filmmakers.
While Edison’s Black Maria and the Lumière brothers’ workshop laid the technical and commercial foundations of cinema, Méliès’ studio marked the transition of film into an art form. Other early studios, such as Robert Paul’s in London (1898) and Nordisk Film in Denmark (1906), further expanded the industry by experimenting with special effects and producing large-scale narrative films.
The “War of Currents” refers to the late 19th-century rivalry between two inventors: Thomas Edison and Nikola Tesla, who was supported by the prominent American industrialist and engineer George Westinghouse. Edison advocated for direct current (DC), while Tesla championed alternating current (AC). Both types of current can power electrical appliances and communication devices, but they behave differently when transmitted over distances.
Historically, DC generators appeared first. They operated at low voltage and could power electric light bulbs and carry signals, but only over short distances of 1-2 km: over longer runs, the resistance of the line grew and the voltage dropped.
AC is transmitted at higher voltages, allowing for more efficient energy transmission over long distances. Its energy can power not only low-power lighting devices but also, for example, industrial enterprises with heavy machinery. AC machines received a boost in development in the 1870s with Pavel Yablochkov’s creation of an arc lamp powered by alternating current (see question 8), but the “War of Currents” truly began in the 1880s, when Nikola Tesla started working for Edison’s company and demonstrated his AC motor designs based on a rotating magnetic field.
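Why higher voltage matters can be seen from a back-of-the-envelope estimate (an idealized textbook calculation, not a description of any actual 1880s network). A line with total conductor resistance $R$ delivering power $P$ at voltage $V$ carries a current $I = P/V$ and dissipates

$$P_{\text{loss}} = I^2 R = \frac{P^2 R}{V^2},$$

so the loss falls with the square of the voltage. Delivering 100 kW through a line with $R = 5\ \Omega$ at Edison’s roughly 110 volts would require a current of about 900 amperes and dissipate megawatts in the wires, far more than the load itself, whereas at 10,000 volts the same power needs only 10 amperes and loses about 500 watts. Transformers, which step AC voltage up for transmission and back down for use, are what made the high-voltage option practical.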
Who Were Edison and Tesla?
Thomas Edison. Circa 1906. Image: Library of Congress
Thomas Edison (1847-1931) was an American inventor and entrepreneur. In his youth, he worked as a telegraph operator, was interested in electrical engineering, and enthusiastically repaired and improved devices. Around 1869, Edison sold his first patent for an improved stock ticker — a system for telegraphing stock market data. In 1873, he visited England, where he became familiar with the complex laboratory machinery used by telegraph engineers for communication. After returning to the USA, he organized a laboratory and later a whole “invention factory”: in 1875-1876 in Menlo Park, and in 1887 in West Orange. He became famous after inventing the phonograph in 1877.
After establishing the production of dynamos, cables, and light bulbs, Edison launched DC power stations in London and New York in 1882. In 1889, the Edison General Electric Company was formed through the merger of three of Edison’s manufacturing companies that had been operating since the early 1880s and his patent company, founded in 1878. He was 42 years old at the time.
Nikola Tesla (1856-1943) was an engineer and inventor of Serbian origin. He became famous for developing AC motors and for the idea that electricity and messages could be transmitted wirelessly. He went down in history as an extravagant and even controversial figure, his biography shrouded in mysteries and myths. In 1882, while pondering a problem he had first formulated in 1878 as a student in Graz, the 26-year-old Tesla hit upon a way to use the phenomenon of a rotating magnetic field, which would allow the construction of an AC electric motor.
What Did They Disagree About?
In the same year, 1882, Tesla arrived at the Paris office of the Continental Edison Company, where he proposed the idea of the new motor. Instead of considering the proposal, the company sent Tesla to Strasbourg to deal with a problematic power station at the railway station. Tesla helped launch it, but when he returned to Paris for the promised fee, the company refused to pay.
Not abandoning his ideas in electrical engineering, Tesla went to the USA and took a job as a repair engineer for DC generators at Edison Machine Works. The company asked Tesla to improve its DC machines and promised a substantial bonus, reportedly $50,000. Tesla soon presented 24 variants of new generators. Edison approved the work but refused to pay, joking that the Serbian immigrant did not understand American humor well. According to another version of this event, Tesla himself offered to sell his patent to the company for $50,000 and was simply laughed at. Either way, from that moment on, Edison became Tesla’s lifelong enemy.
Tesla left Edison’s company, but George Westinghouse, who was well-versed in the industry and knew the weaknesses of DC power stations, saw potential in Tesla’s developments and wanted to use his ideas to overcome these shortcomings.
What Was Westinghouse’s Interest?
Illustration from Westinghouse Electric Company catalog. 1888. Image: Wikimedia Commons
Westinghouse’s AC stations and devices began to compete with Edison’s near-monopoly from the mid-1880s. Moreover, Westinghouse began using transformers, which allowed high AC voltage to be stepped down so that it could also be used for home lighting. Before transformers appeared, Edison and Westinghouse could have occupied separate niches (Edison in lighting and communication, Westinghouse in long-distance power transmission and heavy industry); voltage transformation allowed Westinghouse to compete with Edison on his own ground, lighting.
What Role Did the Simple Electric Meter Play in the “War of Currents”?
Edison developed an electricity meter in his laboratory and received a patent for it in 1881. There were several such patents afterward. Electric measuring devices existed before this; scientists actively began inventing and manufacturing them from the early 19th century when the first electric batteries appeared and electricity research began to shift from recording qualitative characteristics of the phenomenon to quantitative ones.
Edison’s development is important because the new meter became an element of a centralized energy supply and lighting system, turning electricity into a commercial commodity. It was a kind of final touch, making the whole system complete: power plant, wires, lamps, and a meter recording the relationship between the company and the consumer.
Electricity supply according to this system was organized by Edison’s company in New York in 1882. In Edison’s plan, the device for measuring electrical energy was to work on the same principle as the gas meter already familiar to consumers. However, unlike gas, electricity has neither weight nor smell, so it had to be measured indirectly. Edison’s meter was based on Faraday’s law of electrolysis, according to which the mass of substance deposited on an electrode is directly proportional to the amount of electricity passed through it.
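In modern notation (a standard textbook statement of Faraday’s first law of electrolysis, not Edison’s own formula), the mass deposited on the electrode is

$$m = \frac{Q}{F}\cdot\frac{M}{z},$$

where $Q$ is the electric charge passed through the cell, $F \approx 96{,}485\ \text{C/mol}$ is the Faraday constant, $M$ is the molar mass of the deposited metal, and $z$ is its valence. Weighing the electrode therefore gives $Q$, and since Edison’s system held its voltage roughly constant, the energy to be billed to the customer was simply proportional to that charge.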
A peculiarity of this meter design was that users could not take readings themselves. Once a month, company employees would disconnect the meters, remove the electrodes, and take them to the laboratory, where consumption was calculated from the weighing data; a bill was then sent by mail. The method was inconvenient and opaque for consumers. Moreover, the electrolytic principle was suitable for a DC system but not for an AC one.
In the context of the “War of Currents,” the invention of the meter was important in the sense that it secured the status of electricity as a mass measurable commodity, allowing the construction of an electricity market. The introduction of AC generators implied a redistribution of this market.
Is It True That Everyone Was Afraid of Alternating Current?
The myriad of telephone, telegraph, and power lines over the streets of New York City in a photo of the Great Blizzard of 1888. An AC line that fell during the storm led to the electrocution of a boy that spring. Image: Public Domain, Wikimedia
Electricity was a fashionable topic in the American press and public discourse of the 1880s. It was associated with hopes for progress and described as a revolutionary technology capable of illuminating, moving, and even healing. However, there was also fear. In addition to Edison’s company, several other companies in New York offered lighting and telegraph service. As journalists remarked, these companies had wrapped the city in so many wires that they covered the sky. Not all the wires were maintained in good condition; there were exposed sections.
The number of accidents related to electricity was growing. From May 1887 to September 1889, seventeen New York residents died from electric shock. The fear of electricity intensified and was used in the “War of Currents.” One could say that Edison tried to play on the increasing fear and indignation of New Yorkers. The New York press immediately switched from stories about the comparative merits of gas and electric lighting to reports of “deaths by wire,” and each such report further fueled public fear of high-voltage alternating current.
What Does the Electric Chair Have to Do With This?
Execution by electricity. Illustration from Scientific American. 1888
The history of the introduction of execution by electricity can be read as a story of public confrontation between Edison’s and Westinghouse’s companies. This cunning PR campaign pursued economic interests but wrapped itself in the idea of humane technological progress.
From the 1860s, debates about humanizing the death penalty had resumed in the USA: the condemned were to die without physical suffering. In 1885 (according to other sources, in 1887), a commission was created in New York State that for several years consulted with doctors, engineers, and lawyers about the most humane and technologically advanced methods of execution. In the commission’s report, electricity was given priority: the method was described as painless, reliable, and fast, that is, as civilized as possible. The guillotine and shooting, according to the report, were also reliable and fast, but they left visible damage to the body. Execution by electricity was legalized in 1888.
One of the leading figures in the technical implementation of the execution law was Harold P. Brown. In the late 1870s he had designed and installed arc lamps in Chicago and worked as an electrical engineering consultant; from 1887, after moving from Chicago to New York, he began collaborating with Thomas Edison. Experiments were first conducted on animals, with technical assistance from Edison’s laboratory in West Orange. Brown used alternating current; among other animals, a horse and several calves were killed. Thanks to Edison’s lobbying efforts, it was alternating current that was chosen as the lethal instrument against criminals. Owing to the public campaign Edison launched, alternating current gained a reputation as an excessively powerful and dangerous technology, unsuitable for everyday use.
The first person sentenced to death in the electric chair was the murderer William Kemmler. Westinghouse felt that his business interests were under threat, so he hired the best lawyers for Kemmler. They built the defense on the argument that execution by electricity was unconstitutional and that its lethal force had not been proven. The public condemned Kemmler’s defenders for merely protecting the company’s economic interests. The execution took place in 1890; it was long and painful. Nevertheless, this first experience was deemed a success and paved the way for the subsequent development of the technology.
In 1901, Edison’s film company shot a series of films about the activities of US President William McKinley. One episode was dedicated to the execution of his assassin, the anarchist Leon Czolgosz. For this, the carrying out of the sentence was reconstructed at Auburn Prison (New York State). The film showed the technological details, and the execution was presented in a neutral, scientific-medical manner, without emotion or suffering.
Who Actually Invented the Light Bulb?
It’s difficult to name a single creator of the light bulb. Many inventors contributed to it.
The use of electricity to create an artificial light source was discussed from the beginning of the 19th century. In 1802, Vasily Petrov, a corresponding member of the St. Petersburg Academy of Sciences, discovered the phenomenon of the electric arc, that is, the occurrence of an electric discharge in gas. Inventors experimented with different power source designs and electrode materials. For example, during the coronation of Alexander II in 1856, illumination was organized using arc lamps made by inventor Alexander Shpakovsky.
In 1873, electrical engineer Alexander Lodygin created incandescent electric lamps that, with the air pumped out of the bulb, could burn for more than 700 hours. He received patents for his invention in Russia and several European countries.
One successful and subsequently widely used technology based on the electric arc was the carbon lamp created by Pavel Yablochkov, the so-called “electric candle.” In 1875, Yablochkov went to Paris, where he constructed an industrial prototype of the lamp, and in 1878 his “Russian light” lighting system was demonstrated at the World Exhibition in Paris. At around the same time, the English engineer Joseph Swan proposed a lamp with a carbon rod in the form of a filament.
Thomas Edison, in turn, created a lamp that surpassed the others in reliability and became commercially successful. Most likely, he was familiar with Yablochkov’s “candle” and Lodygin’s incandescent lamp, but he improved the design. He experimented with various filament materials, trying platinum, boron, and chromium; in 1879 he received a patent confirming the invention of an incandescent lamp with a platinum filament, and in 1880 a patent for an incandescent lamp with a carbon filament. Trying to improve the filament further, he also experimented with high-quality Bristol board and natural plant fibers: at his request, botanists collected samples in South America, Cuba, and Australia. Bamboo turned out to be the most suitable.
On January 1, 1880, a large-scale demonstration of the new lighting system took place at Edison’s laboratory in Menlo Park, which was attended by more than 3,000 people. After the successful presentation, Edison received many requests for lighting installation. Mass distribution of his lamps began. By the beginning of the “War of Currents,” Edison’s invention was an integral part of the electricity market.
When and How Did the “War of Currents” End?
The final act of the confrontation between Edison and Tesla was the World’s Fair in Chicago in 1893. Westinghouse, together with Tesla, won the tender to illuminate the event and installed 200,000 lamps powered by alternating current. Then, in 1894, Westinghouse built a large hydroelectric power station at Niagara Falls operating on polyphase AC generators; it transmitted energy over long distances at high voltage. This was a triumph for alternating current and a defeat for direct current.
However, the rivalry between direct and alternating current did not end there. Soviet electrification offers an example. In the early 1930s, Soviet engineers and economists returned to the question of choosing a current system for a unified, centralized network of high-voltage transmission lines. The historical victory of alternating current was obvious, but in the USSR this technology seemed unsatisfactory and short-sighted: when energy is transmitted over distances of more than 250-300 km, a limitation of the three-phase AC system becomes apparent, namely reactance, which leads to large energy losses. Direct current did not suffer these reactive losses, and its infrastructure made it possible to save on compensating equipment while maintaining high voltage.
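In simplified textbook terms (a sketch of the general physics, not the Soviet engineers’ actual calculations): besides its resistance, a long AC line has a series inductive reactance $X$ that grows with length, and the active power it can carry between two ends with voltages $V_1$ and $V_2$ separated by a phase angle $\delta$ is limited roughly by

$$P \approx \frac{V_1 V_2}{X}\,\sin\delta,$$

so the longer the line, the less power it can transmit without compensating equipment. A DC line has no reactance; only the resistive loss $I^2 R$ remains, which is why very long, very high-power links looked attractive in direct current.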
Under these conditions, direct current became the embodiment of the idea of scaling up electrification: the longer the transmission line and the greater the transmitted power, the higher (not lower!) the throughput and the economic benefit. It was with direct current that engineers of the 1930s associated the prospect of transmitting millions of kilowatts over thousands of kilometers and, with it, the country’s technological and material abundance.
Looking ahead, we note that DC lines never became the only or universal solution in the Soviet system, but they did prove their effectiveness, and high-voltage DC links are still used in Russia and around the world today.