kebo2K
Electrical
- Nov 28, 2007
Hi All,
Here is my situation...
I have a 10V, 10A supply with independent voltage and current control inputs. During a process, a load attached to the output increases in resistance over time. So at the beginning of the process I might set the power supply to 5V, 5A with a 0.5 ohm load, so the supply output is initially 2.5V, 5A (current control). As the load rises to 1 ohm, the output becomes 5V, 5A (the transition from current control to voltage control). After more time the load reaches 2 ohms and the output would be 5V, 2.5A (voltage control). This is a theoretical situation, but it exactly models what I need to control.
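The numbers above just fall out of Ohm's law and whichever setpoint binds first. A quick sketch (the function name is my own, not from any library) that reproduces them for an ideal CV/CC supply into a resistive load:

```python
# Sketch of the CV/CC behavior described above, assuming an ideal supply
# and a purely resistive load.
def supply_output(v_set, i_set, load_ohms):
    """Return (volts, amps, mode) for an ideal CV/CC supply into a resistive load."""
    # If the full voltage setpoint would demand more current than the
    # current setpoint allows, the supply is current-limited (CC mode).
    if v_set / load_ohms > i_set:
        return (i_set * load_ohms, i_set, "CC")
    return (v_set, v_set / load_ohms, "CV")

for r in (0.5, 1.0, 2.0):
    print(r, supply_output(5.0, 5.0, r))
# 0.5 ohm -> 2.5V, 5A (CC); 1 ohm -> 5V, 5A (CV boundary); 2 ohm -> 5V, 2.5A (CV)
```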
I can use a PID to control either the voltage OR the current without issue, but when I try to use one PID for voltage and another PID for current, I get a terrible transition from current control to voltage control. The reason for the bad transition is that while the supply is under current control, the voltage PID is trying to reach 5V, but the current limit and Ohm's law hold the output at 2.5V. This causes the voltage PID to eventually max out the voltage control signal on the power supply. So when the transition point comes, the PIDs' output signals have the supply set to 10V, 5A, and there is a period (which depends on the response of the voltage PID) where the output voltage exceeds 5V. This is not good.
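The behavior described above is integrator windup: the inactive loop keeps integrating error it cannot correct. A minimal simulation sketch (my own Python, not AB/Logix code; gains and timing are made-up values) showing the overshoot at the load step, and how conditionally halting integration while the current limit governs the output removes it:

```python
# Minimal windup demo: a voltage PI loop drives the supply's voltage
# control input while a current limit actually governs the output.
# All gains, setpoints, and timing are illustrative assumptions.
def simulate(anti_windup, steps=200, dt=0.1):
    kp, ki = 0.5, 1.0        # made-up PI gains
    v_sp, i_sp = 5.0, 5.0    # 5V, 5A setpoints from the example
    v_ctrl, integ, peak = 0.0, 0.0, 0.0
    for k in range(steps):
        r = 0.5 if k < steps // 2 else 2.0   # load steps from CC region into CV region
        v_out = min(v_ctrl, i_sp * r)        # current-limit clamp via Ohm's law
        err = v_sp - v_out
        # Conditional integration: only integrate when the voltage control
        # signal is actually governing the output (simple anti-windup).
        if not anti_windup or abs(v_ctrl - v_out) < 1e-9:
            integ += ki * err * dt
        v_ctrl = max(0.0, min(10.0, kp * err + integ))  # 10V supply ceiling
        peak = max(peak, v_out)
    return peak

print("peak voltage, no anti-windup:", simulate(False))  # well above the 5V setpoint
print("peak voltage, anti-windup:  ", simulate(True))    # stays near 5V
```

Without anti-windup the integrator climbs while the output is clamped at 2.5V, so the voltage control signal sits at the 10V rail when the load steps; with conditional integration it never accumulates unreachable error.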
I know enough about PID and controls to make simple PID loops very effectively, but the only method I can come up with to handle this situation is to run a single PID for voltage and assume a fixed output for the current, then switch to a different PID for current and assume a fixed output for voltage. I have tried this, but it has severe limitations caused by the 'assumptions'.
Is there a common method to control a system in this fashion? Or does anyone have any good advice?
I am using an AB CompactLogix system.
Thanks,
kevin