A Power Problem
I recently moved into my new house, and the first thing I planned to do was to turn one of the rooms into a home theater. I'd been dreaming of this for a long time and was excited to get started. I was talking to a friend of mine who happened to be an electrician, and he warned me that I needed to be careful when choosing the equipment so that I didn't overload the circuit breakers. Heeding his warning, I went home and checked the rating of the circuit breaker for my home theater room. It said 15 amps.
Okay, I thought; now I know that together the equipment can draw no more than 15 amps of current. I got out the home theater catalog and started looking at TVs and surround sound systems. I immediately saw a problem. All of the equipment was rated in watts of power and not amps of current! I knew that power and current were related in some way, but I wasn't sure how.
Power and Watts
In search of answers, I dug out my old physics textbook and started reading. It turns out that power is a measure of how much energy is used over a period of time. There are many different units that can be used to express power, such as horsepower for cars. However, electrical power is almost universally given in watts. One watt is equal to one joule per second. This made sense when I considered that power is a measure of how much energy, measured in joules, is used per unit of time, like a second. I guess it's just easier to say watts instead of joules per second.
Power, Current and Voltage
All this new information was great, but I still wasn't sure how to solve my problem, so I kept reading. The book went on to say that electrical power used by a device can be calculated by multiplying the current by the difference between the voltage going into and out of the device. Now I knew I was getting somewhere.
The home theater equipment was rated in watts of power and the breaker in amps of current, so all I was missing was the voltage. I knew that for devices plugged into an electrical outlet, the voltage difference through the device was equal to the voltage difference between the two slots of the outlet. Using my electrical meter, I carefully probed the electrical outlet and measured 120 volts.
With all the pieces of the puzzle, I multiplied 120 volts by the maximum current of the circuit breaker, 15 amps, and came up with 1800 watts of power. This meant that I had to select a TV and surround sound system that together used no more than 1800 watts of power.
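The breaker-limit calculation above is just one multiplication; as a quick sketch, here it is in a few lines of Python using the 120 volts and 15 amps measured in the story:

```python
# Maximum power available on the circuit, from P = V * I.
voltage = 120.0   # volts, measured at the outlet
current = 15.0    # amps, the circuit breaker's rating

max_power = voltage * current  # watts
print(max_power)               # prints 1800.0
```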
Power, Current and Resistance
Later that day, my electrician friend came over to see how I was coming along with picking out equipment. Excitedly, I showed him the gigantic 80-inch flat screen TV and mega-bass surround sound system I had selected from the catalog. I assured him that I had done my research and figured out that this equipment was just under the 1800-watt limit that I had calculated. 'Not so fast,' he said. 'You're forgetting something! You're not accounting for the power lost in the house wiring before it even gets to your equipment.' He was right. I hadn't even thought of this.
Due to the resistance of the wires running from the circuit breaker to the home theater room, some of the electrical energy would be converted to heat, reducing the amount of power available at the outlet. I needed to figure out how much power would be lost, but there was a problem. I couldn't measure the voltage difference from one end of the wire to the other unless there were 15 amps of current flowing through it. Unfortunately, I didn't have any way to do that. Desperate for answers, I went back to my physics book.
Perhaps I shouldn't have stopped reading when I thought I had all the information the first time. Further down the page, I learned that electrical power can also be calculated using current and resistance instead of the voltage. Now, instead of using the voltage at the outlet, I needed to measure the resistance of the wire between the breaker and my home theater room. Enlisting the help of my friend, we used my electrical meter and measured a resistance of 1 ohm in the wire. Using the equation from the book, I multiplied the maximum current of 15 amps by itself and then multiplied that number by the resistance.
The result told me that I would lose 225 watts of power in the house wiring before it even got to the room! Wow, I was surprised at how much of a difference this made. Subtracting the power loss from my previous calculation of 1800 watts left me with only 1575 watts to power my home theater. My friend, being the cautious type, said I should probably round that number down to 1500 watts to be on the safe side. Going back to the catalog, my 80-inch flat screen TV had to shrink down to 60 inches to stay within the power limits. In the end, I guess it was better that I learned about electrical power and bought the right TV rather than tripping the breaker every time I tried to watch a movie!
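The wiring-loss calculation works the same way; here's a minimal Python sketch using the 15 amps, 1 ohm, and 1800 watts from the story:

```python
# Power lost as heat in the wiring, from P = I^2 * R,
# and what's left over for the equipment.
current = 15.0         # amps, the breaker's rating
resistance = 1.0       # ohms, measured along the house wiring
supply_power = 1800.0  # watts, from the earlier P = V * I calculation

loss = current ** 2 * resistance  # 225.0 W converted to heat
available = supply_power - loss   # 1575.0 W left at the outlet
print(loss, available)            # prints 225.0 1575.0
```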
Power is a measure of how much energy is used over a period of time. Electrical power is almost always measured in units of watts, where one watt is equal to one joule per second. The electrical power consumed by a device is equal to the current multiplied by the difference between the voltage going into and going out of the device. If the voltage difference isn't known, power can also be calculated by multiplying the current by itself and then multiplying the result by the resistance.
After watching this lesson, you'll be able to:
- Define power and tell how it's typically expressed
- Calculate how much electrical power a device consumes
- Find power when the voltage difference is not known
Electric power is the time rate of change of electric energy. The unit of power is the watt (W), which is equal to one joule per second. The power (P) is calculated by multiplying I (the electric current in amperes) by V (the potential difference in volts). When Ohm's law holds, V = I * R, where R is the resistance in ohms, and the power can be expressed as P = I^2 * R. Power can also be expressed in terms of voltage and resistance as P = V^2 / R.
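To show that the three forms agree, here's a small Python sketch; the 2 A and 5 ohm values are made-up example numbers, not from the lesson:

```python
# Three equivalent forms of the power formula, assuming Ohm's law
# (V = I * R) holds.
I = 2.0             # current in amperes (example value)
R = 5.0             # resistance in ohms (example value)
V = I * R           # 10 V by Ohm's law

p_iv = I * V        # P = I * V
p_i2r = I ** 2 * R  # P = I^2 * R
p_v2r = V ** 2 / R  # P = V^2 / R
# All three forms give the same 20 W.
```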
Calculate the resistance of a 40-watt automobile headlight designed for 12 volts.
Electric companies charge for energy used. The total energy is the power multiplied by the time it is used. One kilowatt-hour equals 1,000 watts times 3,600 seconds, which is 3,600,000 joules.
An electric heater draws 15 A on a 120-volt line. How much power does it require, and how much will it cost to run the heater for three hours a day for 30 days if the electric company charges $0.105 per kilowatt-hour?
Answer 1: Use the relation R = V^2 / P = (12)(12) / 40 = 3.6 ohms.
Answer 2: P = IV = (15 A)(120 V) = 1,800 watts. The cost = (1.8 kW)(90 hours)($0.105/kWh) ≈ $17.
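As a quick check of both answers, here's a minimal Python sketch using only the numbers given in the problems:

```python
# Problem 1: a 40 W headlight designed for 12 V, using R = V^2 / P.
headlight_r = 12.0 ** 2 / 40.0  # 3.6 ohms

# Problem 2: a heater drawing 15 A on a 120 V line.
heater_kw = 15.0 * 120.0 / 1000.0  # 1.8 kilowatts
hours = 30 * 3                     # 3 hours a day for 30 days
cost = heater_kw * hours * 0.105   # dollars, at $0.105 per kWh
print(round(cost, 2))              # prints 17.01
```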