A large installation might be able to justify adding a heat pump. That would let you keep the cells at a lower temperature while still scavenging the heat.
Here's the catch: the locations and times that are good for solar energy are typically not the locations and times that need heat. When there's ~a kilowatt per square meter falling from the sky, air conditioning is usually what's wanted. There are exceptions, of course, but the economics often become foolish.
More generally, the problem with overly-complicated, overly-expensive renewable energy systems is that the payback period often stretches out toward infinity. That's not only bad economics; it's also a sign that the project is a waste of resources. Since manufacturing is one of the largest sources of CO2, the capital cost of an installation is roughly proportional to its environmental impact. If you're trying to save money while saving the planet, spending huge gobs of money at the outset is a bad first step.
Calculating the approximate energy captured by water cooling is trivial. Look up the local insolation data (something you need for any solar project anyway), subtract the energy taken away as electricity (~20%?), guesstimate how much of the remainder escapes through other loss mechanisms (assume half, plus or minus half, to start), and then use the well-known specific heat of water to calculate the approximate temperature rise.
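To make that concrete, here's a minimal Python sketch of the same steps. The panel area, flow rate, and both loss fractions below are illustrative assumptions, not measured values; swap in your own site's insolation data and plumbing numbers.

```python
SPECIFIC_HEAT_WATER = 4186.0  # J/(kg*K), well-known value

def water_temp_rise(insolation_w_m2, panel_area_m2, flow_kg_s,
                    electrical_fraction=0.20, other_loss_fraction=0.50):
    """Approximate temperature rise of the cooling water across the panel.

    insolation_w_m2     -- local solar irradiance (look this up for your site)
    electrical_fraction -- energy leaving as electricity (~20%, a guess)
    other_loss_fraction -- heat escaping by other paths (0.5 +/- 0.5 to start)
    """
    incident_power = insolation_w_m2 * panel_area_m2            # W hitting panel
    heat_available = incident_power * (1.0 - electrical_fraction)
    heat_captured = heat_available * (1.0 - other_loss_fraction)
    # Q = m_dot * c * dT  ->  dT = Q / (m_dot * c)
    return heat_captured / (flow_kg_s * SPECIFIC_HEAT_WATER)

# Example: 1 kW/m^2 on a 1.6 m^2 panel, 0.02 kg/s (~1.2 L/min) of water:
dT = water_temp_rise(1000.0, 1.6, 0.02)
print(f"Approximate water temperature rise: {dT:.1f} K")  # ~7.6 K
```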
Another approach would be to find data for a comparably sized solar water-heating panel. The missing ~20% of the energy going to electrical output is within the error bounds anyway.
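To see why the ~20% doesn't matter, here's a self-contained continuation of the same back-of-envelope arithmetic (same hypothetical panel and flow rate as above): the plus-or-minus-half guess at the loss fraction swamps the electrical correction entirely.

```python
SPECIFIC_HEAT_WATER = 4186.0   # J/(kg*K)
incident = 1000.0 * 1.6        # W on the example 1.6 m^2 panel
flow = 0.02                    # kg/s

# With the loss fraction anywhere in [0, 1], captured heat spans
# 0 to 0.8 * incident, so the dT uncertainty band is 0 to ~15 K.
dT_max = 0.80 * incident / (flow * SPECIFIC_HEAT_WATER)   # ~15.3 K
# The electrical output itself, expressed as a dT shift:
dT_elec = 0.20 * incident / (flow * SPECIFIC_HEAT_WATER)  # ~3.8 K

print(f"loss-guess uncertainty band: 0 to {dT_max:.1f} K")
print(f"electrical correction: {dT_elec:.1f} K -- well inside the band")
```

So a solar water heater's published output is a perfectly adequate stand-in until you have better loss numbers.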