Let's make it a bit easier: what induction temp would be ideal?
A simple hot or cold.
dicer,
Once again, that's not a simple question.
In very general terms, combustion cycle efficiency is dependent on pressure ratio and heat release. SI engines are detonation-limited, so a lower T1 (intake charge temperature) allows the use of higher CRs (up to about 14:1), which in turn gives greater cycle efficiency. With GDI, injecting later in the compression phase gives a greater charge temperature reduction due to the fuel's latent heat of vaporization, thus reducing the knock tendency.
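To put rough numbers on those two effects, here's a quick sketch: the air-standard Otto cycle efficiency as a function of CR, plus a back-of-envelope estimate of how much in-cylinder charge cooling full fuel evaporation could provide. The latent heat, AFR, and specific heat values are typical textbook figures I'm assuming, not data from any particular engine.

```python
# Sketch: ideal Otto-cycle efficiency vs compression ratio, and a rough
# estimate of GDI charge cooling from fuel latent heat. Air-standard
# idealization; real engines fall well short of these numbers.

GAMMA = 1.4  # ratio of specific heats for air (assumed)

def otto_efficiency(cr, gamma=GAMMA):
    """Air-standard Otto cycle thermal efficiency: 1 - CR**(1 - gamma)."""
    return 1.0 - cr ** (1.0 - gamma)

def charge_cooling_dT(afr=14.7, h_vap=350e3, cp_air=1005.0):
    """Approximate charge temperature drop if all the fuel's latent heat
    (h_vap, J/kg fuel) is drawn from the trapped air (cp_air, J/(kg*K)),
    at the given air-fuel ratio. Illustrative gasoline-like values."""
    return h_vap / (afr * cp_air)

for cr in (10, 12, 14):
    print(f"CR {cr}:1 -> ideal efficiency {otto_efficiency(cr):.1%}")
print(f"Full-evaporation charge cooling: ~{charge_cooling_dT():.0f} K")
```

The ideal-cycle gain from CR 10:1 to 14:1 is about five percentage points, and the latent-heat cooling comes out in the tens of kelvin, which is why late GDI injection is worth real knock margin.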
CI engines are not detonation-limited, so a low T1 is less critical. T1 only needs to be high enough that T2 (the end-of-compression temperature) is sufficient to ignite the injected fuel, and the lower air density of hot intake air can be compensated for with increased turbocharger work. Indeed, the best efficiency in CI engines is obtained with a CR around 14:1 and very high levels of boost. Current production turbo CI engines all use charge air cooling, but that's for NOx reduction, and the intercooler's flow and thermal losses hurt BSFC.
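The T1/T2 relationship above can be sketched with the isentropic compression relation T2 = T1 * CR^(gamma - 1). The ~500 K autoignition figure for diesel is a rough assumed value for illustration:

```python
# Sketch: end-of-compression temperature for idealized adiabatic
# compression. Shows how intake temperature T1 sets T2, which must
# exceed the fuel's autoignition temperature (~500 K assumed for
# diesel) for compression ignition to work.

GAMMA = 1.4  # ratio of specific heats for air (assumed)

def t2_adiabatic(t1_k, cr, gamma=GAMMA):
    """Temperature after isentropic compression of an ideal gas."""
    return t1_k * cr ** (gamma - 1.0)

for t1 in (250.0, 300.0):  # cold vs warm intake air, K
    print(f"T1 = {t1:.0f} K, CR 14:1 -> T2 = {t2_adiabatic(t1, 14):.0f} K")
```

Even at CR 14:1 the ideal T2 clears autoignition with margin; in practice heat loss and cranking speed eat into that margin, which is why very cold starts need glow plugs, but it illustrates why a hot T1 is no handicap for a CI engine.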
Finally, there is the extreme example of turbine engines. Turbine engine efficiency is greatly improved with the use of thermal recuperators, which are heat exchangers that transfer heat from the exhaust flow to the compressed intake flow. In this case, hotter is better. To improve thermal efficiency even more, turbine engines also use fuel/oil heat exchangers that cool the engine oil and recover waste heat into the fuel prior to injection into the engine.