PrimalPete
Electrical
- Oct 18, 2018
Hi,
I work on renewable energy microgrids for commercial and industrial customers. Typically these include solar panels and battery storage.
I am exploring adding a renewable microgrid to an existing building that has a diesel generator backup. There is an ATS that switches to diesel during a grid outage. There is no emergency panel; the genset picks up the entire building load. The battery and solar would be added on a load-side tap, downstream of the ATS.
I am thinking of using the diesel generator during a utility grid outage to form the grid (voltage and frequency). The renewable inverters would then follow that voltage and provide up to 80-90% of the load, and the diesel would handle the difference. I am aware of the backfeeding concern, so the renewable output will follow the load in real time, which is why there is a 10-20% cushion between the renewable output and the diesel output. The idea is to explore how far we can stretch the diesel's fuel reserve when the renewable system is actively providing 80-90% of the load.
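To put rough numbers on that, here is a minimal sketch (Python) of the residual load the diesel would see while the renewables carry most of the building. The genset rating matches my sites; the average load and the 85% renewable share are assumed example values, not measurements:

```python
# Minimal sketch with assumed example numbers.
GENSET_RATING_KW = 150.0   # nameplate rating of the diesel genset
AVG_LOAD_KW = 20.0         # assumed average building load
RENEWABLE_SHARE = 0.85     # renewables cover ~80-90% of load; 85% used here

# Portion of the building load left on the diesel while renewables run.
diesel_load_kw = AVG_LOAD_KW * (1.0 - RENEWABLE_SHARE)
diesel_load_fraction = diesel_load_kw / GENSET_RATING_KW

print(f"Diesel carries {diesel_load_kw:.1f} kW "
      f"({diesel_load_fraction:.1%} of genset rating)")
# -> Diesel carries 3.0 kW (2.0% of genset rating)
```

So under these assumptions the genset ends up loitering at only a few percent of its rating, which is what drives the question below.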
Now here comes the question at hand. I am not too familiar with diesel gensets. Based on the data sheets I have looked at, they typically give you fuel consumption at 25%, 50%, 75% and 100% of rated load (output power). The problem is that I am looking at several sites with a 150 kW diesel generator and an average building load of 20 kW, so the average load is already less than 25% of the genset's rated output. I fear that anything below 25% will not reduce the gal/hr figure specified for the 25% point in the datasheet, which would ultimately mean the renewable power is almost completely wasted in such a system.
Is there any way to estimate fuel consumption at low levels of 2-5% of rated output for a diesel generator (given that I know the exact model number and can obtain the data sheet)? Or is this simply a terrible idea?
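For what it's worth, one rough approach I've considered is fitting a straight line (no-load offset plus a per-kW slope) through the published 25/50/75/100% fuel-burn points and extrapolating it downward. The sketch below uses made-up gal/hr numbers, not values from any specific genset datasheet, and I don't know whether the real curve stays linear at very low load:

```python
# Hedged sketch: linear fit (fuel = a + b * P) through published part-load
# points, then extrapolated to very low load. The gal/hr values below are
# placeholders; substitute the real datasheet numbers for the actual model.
import numpy as np

rating_kw = 150.0
load_fractions = np.array([0.25, 0.50, 0.75, 1.00])
fuel_gph = np.array([3.2, 5.6, 8.0, 10.9])   # placeholder datasheet values

loads_kw = load_fractions * rating_kw
b, a = np.polyfit(loads_kw, fuel_gph, 1)     # slope (gal/hr per kW), intercept

for frac in (0.02, 0.05, 0.13):              # 3 kW, 7.5 kW, ~20 kW loads
    p = frac * rating_kw
    print(f"{p:5.1f} kW ({frac:4.0%} load): ~{a + b * p:.2f} gal/hr (extrapolated)")
```

Is that kind of extrapolation reasonable for a real genset, or does consumption flatten out to something close to the 25% figure no matter how light the load is?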