berlinpose2
Computer
- Jul 15, 2020
I've done some research and the following is what I understand of the topic. Please correct me if I'm wrong.
When power is transmitted, electricity is sent to a transformer, which increases the voltage and decreases the current according to the relationship S = IV. The reason for doing this is to minimize power losses along the transmission line, which equal I²R. However, isn't power loss also equal to V²/R, so wouldn't a large voltage also cause a large power loss? Clearly there is a fault in my logic here, because the loss calculated from the current and the loss calculated from the voltage would not be equal, so I am looking for an explanation of this. Thanks!
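To make the apparent contradiction concrete, here is a quick numeric sketch of the scenario described above. All of the values (delivered power, line voltage, line resistance) are made-up illustrative assumptions, not figures from any real grid:

```python
# Illustrative numbers only (assumptions, not real grid values).
P_delivered = 100_000.0  # power sent down the line, in watts (S = IV)
V_line = 10_000.0        # voltage after the step-up transformer, in volts
R_line = 5.0             # total resistance of the transmission line, in ohms

# Current drawn through the line, from S = IV.
I = P_delivered / V_line             # 10 A

# Loss computed from the current: I²R.
loss_from_current = I**2 * R_line    # 500 W

# Naively plugging the *transmission* voltage into V²/R gives a
# wildly different number -- exactly the contradiction in the question.
naive_loss = V_line**2 / R_line      # 20,000,000 W

print(loss_from_current, naive_loss)
```

Running this prints 500 W for the I²R calculation versus 20 MW for the V²/R calculation with the transmission voltage plugged in, which shows the two formulas cannot be using the same V.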