Instant Heart Rate, from Azumio, is one such app.
Cover the camera lens with your finger, hold it there for a while, and you get credible min/avg/max pulse rates.
Amazing.
I believe there is more than one algorithm shown. The hue/saturation comparison that gives the pulse would not (by my estimate) ever produce the apparent motion amplification of the crane or the eyeball: one involves changing the color of a particular pixel based on its difference from an average value, whereas the other seems to involve changing the color of a remote pixel (from "sky" to "crane" colored) based on local changes.
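If the first of those guesses is right, the per-pixel colour version could be as simple as scaling each pixel's deviation from its own temporal average. A minimal sketch, assuming the frames are already loaded as a numpy array (the function name and the factor of 20 are my own invention):

```python
import numpy as np

def amplify_color(frames, alpha=20.0):
    """Exaggerate each pixel's deviation from its own temporal average.

    frames: float array of shape (T, H, W, C), values in [0, 1].
    alpha:  amplification factor (an arbitrary choice for illustration).
    """
    mean = frames.mean(axis=0, keepdims=True)  # per-pixel average over time
    boosted = mean + alpha * (frames - mean)   # scale the deviation
    return np.clip(boosted, 0.0, 1.0)
```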
For the app I just downloaded, they can't be doing image processing in any normal sense, because a finger placed on the lens can't produce a normal image. Contact with the lens, however, keeps the unfocused object stationary, so pixel-to-pixel correspondence over time is assured, or at least encouraged.
So, yeah, they seem to be looking at frame-by-frame variation/trends in the color bits of each pixel, and drawing inferences from that.
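In other words, collapse each (defocused) frame to an average colour, then look for a periodic component in that time series. A rough sketch of how the inference might go (my guess at the approach, not Azumio's actual code):

```python
import numpy as np

def estimate_bpm(red_means, fps):
    """Guess a pulse rate from a time series of per-frame mean red values.

    red_means: 1-D array, average red-channel value of each frame
               (a finger over the lens modulates this with each heartbeat).
    fps:       capture frame rate.
    """
    signal = red_means - red_means.mean()          # remove the DC offset
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs > 0.7) & (freqs < 3.5)           # ~40-210 bpm, plausible range
    peak = freqs[band][np.argmax(spectrum[band])]  # dominant frequency in band
    return peak * 60.0                             # Hz -> beats per minute
```

The band limits just exclude frequencies no human pulse could plausibly occupy, which keeps camera noise and slow drift from winning the argmax.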
ISTR reading about using frame-to-frame comparisons to extract useful 3D data from ordinary video, more than ten years ago, at which time it took some fairly fancy hardware to do such things in anywhere near real time. Coverage ebbed rather quickly, leading me to suspect that the technology may have become classified for its military potential.
Now, everybody and his brother has a phone that's certainly capable of real time image processing, so some cats are out of the bag...
Video encoders typically include motion tracking as a fundamental part of the compression algorithm. Perhaps the motion exaggeration is as simple as multiplying the motion vectors by N?
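Probably not quite, since an encoder's block-based motion vectors are coarse and noisy, but the idea is easy to prototype with dense optical flow standing in for them. A sketch with OpenCV (the factor and function name are mine):

```python
import cv2
import numpy as np

def exaggerate_motion(prev_bgr, next_bgr, n=5.0):
    """Re-render a frame as if all motion were n times larger.

    Dense optical flow stands in for the encoder's motion vectors.
    """
    prev_gray = cv2.cvtColor(prev_bgr, cv2.COLOR_BGR2GRAY)
    next_gray = cv2.cvtColor(next_bgr, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = prev_gray.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    # Backward warp: sample the previous frame n flow-steps "upstream",
    # so content appears displaced by n times its measured motion.
    # (Evaluating the flow at the destination pixel is an approximation.)
    map_x = (grid_x - n * flow[..., 0]).astype(np.float32)
    map_y = (grid_y - n * flow[..., 1]).astype(np.float32)
    return cv2.remap(prev_bgr, map_x, map_y, cv2.INTER_LINEAR)
```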
As shown in the NYT video, they're also using the other technique of exaggerating colour changes.
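That one is also easy to sketch. One published way to do it is a temporal band-pass per pixel: filter each pixel's time series, amplify what falls in the band, and add it back. Roughly (the band edges and gain here are my guesses):

```python
import numpy as np
from scipy.signal import butter, filtfilt

def magnify_colour(frames, fps, lo=0.8, hi=3.0, alpha=50.0):
    """Temporal band-pass colour magnification.

    frames: float array (T, H, W, C) in [0, 1].
    lo, hi: band edges in Hz (~0.8-3 Hz, a guess at the pulse band).
    """
    b, a = butter(1, [lo, hi], btype="bandpass", fs=fps)
    variation = filtfilt(b, a, frames, axis=0)  # band-passed per-pixel series
    return np.clip(frames + alpha * variation, 0.0, 1.0)
```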
Both techniques seem very simple, clever and effective. It won't be long before they're built into consumer cameras.