When and Why Did the U.S. Transition from 110V to 120V Supply?

Why Did the Voltage Level Increase from 110V to 120V in North America?

The electric power transmission and distribution system in the United States is more complex than in IEC-based countries. Nations like the UK, Australia, and most of Europe and Asia (except Japan) use 230V for single-phase and 400/415V for three-phase systems. In contrast, the U.S. operates with multiple voltage levels, including 120V (formerly 110V), 208V, 240V, 277V, and 480V. Canada follows a similar setup, adding 347V and 600V options for commercial and industrial use.

In most American homes, the electrical panel does not supply only 110V. Instead, it delivers both 120V and 240V from a single-phase, split-phase service. The 120V connection, between one hot leg and the neutral, is used for small appliances like lamps and TVs. Meanwhile, 240V, measured between the two hot legs (often labeled L1 and L2), powers heavy loads such as dryers, ovens, and heaters. This is similar in concept to the 230V single-phase supply used in countries following IEC standards.
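
To make the split-phase arrangement concrete, here is a minimal Python sketch (not from the original article; the waveform math is standard AC theory) showing how two 120V legs that are mirror images of each other give 240V between them, while each leg still measures 120V to neutral:

```python
# Minimal sketch (illustrative, not from the article): the North American
# split-phase service. Both hot legs come from one transformer winding with
# a center-tapped neutral, so leg 2 is the inverse of leg 1.
import math

V_LEG_RMS = 120.0  # nominal RMS voltage of each hot leg to neutral

def leg_voltage(rms: float, angle_deg: float, inverted: bool = False) -> float:
    """Instantaneous voltage of a sine-wave leg at a given electrical angle."""
    v = math.sqrt(2) * rms * math.sin(math.radians(angle_deg))
    return -v if inverted else v

for angle in (30, 90, 150, 210, 270):
    l1 = leg_voltage(V_LEG_RMS, angle)                 # L1 to neutral
    l2 = leg_voltage(V_LEG_RMS, angle, inverted=True)  # L2 to neutral
    print(f"{angle:3d} deg  L1-N: {l1:7.1f} V   L2-N: {l2:7.1f} V   L1-L2: {l1 - l2:7.1f} V")

# L1-L2 is always twice the leg voltage, i.e. a 240 V RMS waveform,
# while each leg to neutral remains a 120 V RMS supply.
```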

Around a hundred years ago, the U.S. standard voltage was 110V, but it slowly rose to 120V ±5%, meaning the voltage at your outlet can vary slightly depending on load and wiring distance. In the 1920s, homes commonly used 110V, which increased to 115V during the 1930s and 117V in the 1950s, before finally stabilizing at 120V in the 1960s, where it remains today.
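
The ±5% figure simply defines the band around the nominal 120V that the service is expected to stay within; the few lines below (purely illustrative) work out that range:

```python
# Quick check of the 120 V +/-5% service band (illustrative only).
nominal_v = 120.0
tolerance = 0.05
low_v = nominal_v * (1 - tolerance)
high_v = nominal_v * (1 + tolerance)
print(f"Acceptable outlet voltage: {low_v:.0f} V to {high_v:.0f} V")  # 114 V to 126 V
```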

The story begins during the War of Currents between AC and DC systems. Thomas Edison likely chose 110V so that around 100V would reach the outlet after accounting for voltage drops in early wiring systems. At that time, insulation materials and safety standards were poor, so using a lower voltage reduced the risk of electric shock. Moreover, incandescent bulbs ran well at around 100V without burning out too quickly—similar to how Japan later standardized its system at 100V for the same reason.

By the 1930s, improvements in light bulb design and the growing use of electric motors encouraged utilities to raise voltage levels to 115V/230V. Later, when electrical systems became more reliable, the National Electrical Code (NEC) officially standardized 120V and 240V across North America. This higher voltage allowed better efficiency and supported new electrical appliances without major redesigns.

Some older appliances and plugs are still rated for 110V or 115V, since they work safely within the acceptable range. Conversely, you might see labels like 125V/250V on modern NEMA receptacles, indicating their maximum safe operating voltage. Because many devices were already built to handle 110–120V, this transition was smooth for both manufacturers and consumers.

Comprehensive resource: Why Does the USA Use 120V While Most of the World Uses 230V?

Even today, many people casually refer to the system as “110/220”, out of habit from the earlier standard. The same happens in 230V regions, where people still call it “220V”, even though the official change occurred in 1989. This shows how old electrical terms can persist long after the standards themselves evolve.

Good to Know: The switch to alternating current (AC) power was a major factor in the change. In the early days, the U.S. mainly used direct current (DC) at 110–120V for lighting and small appliances. When AC power began to replace DC, engineers kept the same general voltage level—around 120V—so that the new AC systems could work with existing DC wiring and equipment.

This made the transition smoother and less costly, as homes and industries didn’t need to completely rebuild their electrical infrastructure. Think of it like upgrading a road from two lanes to four without changing where it runs—the traffic system improves, but the old paths still work with the new design. The adoption of AC not only standardized voltage but also made long-distance power transmission more efficient and practical across the United States.

When and Why Did They Change from 110V to 120V?

The U.S. residential power supply did not jump from 110V to 120V overnight—it happened gradually, as technology and safety standards improved. This slow transition reflected practical needs and efforts to create a more reliable national standard for homes and businesses.

The change from 110V to 120V began in the 1920s, when engineers and utilities realized that slightly higher voltage improved efficiency and reduced energy losses in wires. At the same time, the growing adoption of alternating current (AC) made it easier to distribute power over long distances compared to direct current (DC). By increasing voltage to 120V, the U.S. could deliver more stable power while keeping older 110V-rated devices safely compatible.

The upgrade also supported the development of new household appliances, electric motors, and lighting systems that performed better and lasted longer at the higher voltage. Over time, electrical codes and standards officially adopted 120V as the nominal value, allowing for a ±5% variation. This balance between safety, performance, and practicality made 120V the enduring standard across North America today.

See also: Why 3-Phase Power? Why Not 6, 12 or More for Power Transmission?

When the Transition Occurred

In the late 19th and early 20th centuries, Thomas Edison’s early direct current (DC) power systems operated at around 110V. This level was chosen mainly to suit the incandescent light bulbs of the time, which worked best at lower voltages and posed less risk of electric shock. Later, alternating current (AC) power, promoted by George Westinghouse and Nikola Tesla, gained popularity because transformers made it possible to easily step voltage up or down for long-distance transmission and safer household use. To maintain compatibility with existing lighting systems, the initial AC standard was also set near 110V.

As technology advanced, the nominal voltage gradually increased—from 110V to 115V, then to 117V during the 1940s and 1950s. Many older appliances from that era are still labeled 110V, 115V, or 117V, reflecting this slow and steady adjustment as electrical efficiency and equipment reliability improved.

Final standardization occurred in the 1960s and 1970s, when the National Electrical Code (NEC) established 120/240V at 60 Hz as the standard across North America. The change became widely accepted around 1967, and NEC revisions in 1968 and 1984 further reinforced it. By the early 1970s, the industry had officially raised voltage ratings from 110/220V to 120/240V, ensuring consistent power delivery and better performance for modern appliances.

Good to Know: The U.S. also adopted 240V circuits for high-power appliances such as electric stoves, dryers, and air conditioners. These are supplied across both hot legs of the same split-phase panel that feeds the standard 120V outlets, allowing ordinary household receptacles and heavy-duty connections to coexist safely within the same electrical system.

Why the Transition Occurred

The gradual rise from 110V to 120V and its eventual standardization happened for several practical and technical reasons that improved efficiency, safety, and reliability across the growing U.S. electrical system.

In the early years of electricity, incandescent light bulbs were the main electrical load. These bulbs worked best within a narrow voltage range where they produced good brightness without burning out too quickly. As filament materials and manufacturing techniques improved, bulbs could safely handle slightly higher voltages, making the shift toward 120V both safe and beneficial for lighting performance.

As households began using more electrical appliances—such as irons, washing machines, and refrigerators—the overall power demand increased. Raising the voltage allowed more power (P = V × I) to be delivered without needing much thicker or more expensive wiring. A higher voltage reduces the current (I) required for the same power, which not only cuts costs but also improves energy efficiency.
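
As a rough illustration of that point, the sketch below compares the current drawn by the same load at 110V and 120V; the 1.5 kW figure is an assumed example value, not taken from the article:

```python
# Illustrative sketch: current required for the same appliance power
# at 110 V versus 120 V, using P = V * I for a simple resistive load.
def current_a(power_w: float, voltage_v: float) -> float:
    return power_w / voltage_v

POWER_W = 1500.0  # assumed 1.5 kW load (e.g. a small heater)

i_110 = current_a(POWER_W, 110.0)
i_120 = current_a(POWER_W, 120.0)
print(f"At 110 V: {i_110:.2f} A")                               # ~13.64 A
print(f"At 120 V: {i_120:.2f} A")                               # ~12.50 A
print(f"Current reduction: {(1 - i_120 / i_110) * 100:.1f} %")  # ~8.3 %
```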

Read More: Why is a Battery Rated in Ah (Ampere hour), Not in VA?

Another driving factor was the need to regulate voltage more effectively across distribution and transmission transformers. As the number of electric motors in residential and commercial systems grew, the National Electrical Code (NEC) updated motor ratings to align with the 120V standard, ensuring consistent performance and uniform design across the electrical industry.

From an efficiency standpoint, higher voltage reduces line losses in conductors. Since heat loss (I²R) depends on the square of the current, even a small increase from 110V to 120V results in lower current and noticeably less energy lost as heat. Though the difference seems minor, it scales up significantly across millions of homes and power lines.
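
The sketch below puts numbers on that claim; the load and line resistance are assumed, illustrative values, and the resistance cancels out of the final ratio in any case:

```python
# Illustrative sketch: relative I^2 * R conductor loss when the same power
# is delivered at 110 V versus 120 V over identical wiring.
POWER_W = 1500.0   # assumed load, watts
R_LINE_OHM = 0.2   # assumed conductor resistance, ohms (cancels in the ratio)

def line_loss_w(power_w: float, voltage_v: float, r_ohm: float) -> float:
    i = power_w / voltage_v   # current drawn for the given power
    return i ** 2 * r_ohm     # heat dissipated in the conductors

loss_110 = line_loss_w(POWER_W, 110.0, R_LINE_OHM)
loss_120 = line_loss_w(POWER_W, 120.0, R_LINE_OHM)
print(f"Loss at 110 V: {loss_110:.1f} W")                     # ~37.2 W
print(f"Loss at 120 V: {loss_120:.1f} W")                     # ~31.2 W
print(f"Reduction: {(1 - loss_120 / loss_110) * 100:.1f} %")  # ~16 %, i.e. 1 - (110/120)^2
```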

Finally, the National Electrical Manufacturers Association (NEMA) helped establish 120V as the national standard during the 1920s. This move unified appliance manufacturers, electricians, and utility companies under a common voltage level, ensuring that electrical products and home wiring systems were fully compatible and reliable throughout the United States.
