Is there not a power correction factor that manufacturers apply when line testing diesel engines during manufacture, which requires them to measure the fuel density and viscosity at the fuel temperature? (J something or other.) That implies an effect of the fuel's kinematic viscosity on power output.
Large diesel engines (residual fuelled) have an inline viscometer which is used to control the fuel heater so that the viscosity is held at an optimum (a rough sketch of such a loop is given below).
If the fuel viscosity is too low, a fine mist is formed that does not penetrate or disperse into the air flow; the fuel mixes poorly, burns incompletely, and burns in the wrong place, close to the injectors.
If the viscosity is too high, the large droplets formed project too far across the combustion zone; again they do not mix well with the air flow, and incomplete combustion results, again in the wrong place.
In either case, optimising the viscosity makes the engine more fuel-efficient and less polluting.
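For illustration, here is a minimal sketch of the kind of closed loop I mean: the inline viscometer reading feeds a simple PI controller that trims the fuel-heater duty. Treat it as a sketch only; the ViscosityController name, the gains and the 13 cSt setpoint are my own illustrative assumptions (I believe typical injection viscosities for residual fuel are of the order of 10-15 cSt, but the engine maker's figure governs), not any particular manufacturer's implementation.

class ViscosityController:
    """Minimal PI loop: hold the measured fuel viscosity at a setpoint by
    trimming the fuel-heater duty (more heat -> thinner fuel, hence the
    sign convention below)."""

    def __init__(self, setpoint_cst, kp=2.0, ki=0.1, dt=1.0):
        self.setpoint = setpoint_cst          # target kinematic viscosity, cSt
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def update(self, measured_cst):
        error = measured_cst - self.setpoint  # positive error = fuel too thick
        self.integral += error * self.dt
        duty = self.kp * error + self.ki * self.integral
        return max(0.0, min(100.0, duty))     # clamp to a 0-100 % heater duty demand

controller = ViscosityController(setpoint_cst=13.0)
print(controller.update(measured_cst=18.5))   # fuel too viscous -> raise heater duty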
Theoretically, the same situation could apply to smaller diesels burning commercial diesel or turbines burning aviation fuel.
BUT:
How variable is fuel quality?
Commercial diesel of the grade approved for the engine shouldn't exhibit that much viscosity variation, nor should its viscosity vary much over the normal range of fuel temperatures with the engine operating normally.
Hence I would have expected the engine to be designed so that, at normal working conditions of temperature etc., the viscosity is pretty much at its optimum. Or is this wrong? How significantly does diesel viscosity vary according to source, grade and fuel temperature?
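I don't have a full answer, but on the temperature side, if you have two measured viscosity/temperature points for a given fuel, the Walther relation used in ASTM D341 (log log of viscosity linear in log of absolute temperature) lets you estimate the viscosity anywhere in between. Below is a rough sketch; the 3.0 cSt at 40 °C and 1.4 cSt at 100 °C figures are assumed, illustrative values for an EN 590-type diesel (from memory the EN 590 limits are about 2.0-4.5 cSt at 40 °C), not data for any particular batch.

import math

def walther_constants(t1_c, nu1_cst, t2_c, nu2_cst):
    """Fit the two-constant Walther / ASTM D341 form
    log10(log10(nu + 0.7)) = A - B*log10(T), T in kelvin,
    from two measured (temperature, kinematic viscosity) points."""
    T1, T2 = t1_c + 273.15, t2_c + 273.15
    Z1 = math.log10(math.log10(nu1_cst + 0.7))
    Z2 = math.log10(math.log10(nu2_cst + 0.7))
    B = (Z1 - Z2) / (math.log10(T2) - math.log10(T1))
    A = Z1 + B * math.log10(T1)
    return A, B

def viscosity_at(t_c, A, B):
    """Kinematic viscosity in cSt at temperature t_c (deg C)."""
    Z = A - B * math.log10(t_c + 273.15)
    return 10.0 ** (10.0 ** Z) - 0.7

A, B = walther_constants(40.0, 3.0, 100.0, 1.4)   # assumed test-certificate values
for t in (0, 20, 40, 60, 80):
    print(f"{t:3d} °C : {viscosity_at(t, A, B):.2f} cSt")

Running that sort of calculation against the actual test certificates for a few batches would show how much spread there really is between sources and grades.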
At what point does viscosity control become desirable? I'd certainly like to know.
Am I right in thinking that simply reducing the viscosity isn't automatically a good idea?
JMW