One claim that exists in digital photography is that we can do away with any need for special filters, simply by using software to modify or rearrange the colors within a photo or video after we have shot it. One problem with this claim is the fact that software can only change the contents of an image based on information already stored in its pixels. Hence, the color-vectors of the resulting pixels need to be derived from those of the captured pixels.
Thus, if we have taken a photo of a gray, hazy day scene, and we want the sky to look more blue while features in the scene look more yellow, then we can be creative in the coding of our software, so that it performs a per-channel gamma-correction: raising the blue channel to an exponent greater than one, while raising the red and green channels to an exponent less than one. We might then find that regions within the image which were already strongly blue continue to look blue, while regions which were not end up looking more yellow, as if sunlit.
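As a minimal sketch of what such a per-channel gamma-correction might look like, assuming a floating-point RGB image with values in [0, 1], and with the exponent values chosen purely for illustration:

```python
import numpy as np

def per_channel_gamma(image, gammas=(0.9, 0.9, 1.2)):
    """Apply a separate gamma exponent to each RGB channel.

    `image` is a float array in [0, 1] with shape (H, W, 3).
    The `gammas` tuple is ordered (R, G, B); the defaults here are
    illustrative guesses: an exponent > 1 on blue pulls weakly-blue
    regions down while leaving strongly-blue regions nearly unchanged,
    and exponents < 1 on red and green lift those channels.
    """
    out = np.empty_like(image)
    for channel, g in enumerate(gammas):
        out[..., channel] = np.clip(image[..., channel], 0.0, 1.0) ** g
    return out
```

Because any value of exactly 1.0 is a fixed point of exponentiation, fully saturated channels are unaffected, while mid-tones move the most.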
(I suppose that while we are at it, we would also want to normalize each color-vector first, storing its original luminance in a separate register, so that our effect only influences coloration in ways independent of luminance, and so that the original luminance can be restored to the pixel afterward.
At that stage of the game, a linear correction could also be computed, with the intent that purely gray pixels should remain gray. )
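A hypothetical sketch of this luminance-preserving variant follows. The function name and the choice of a simple channel-mean as the stand-in for luminance are my own assumptions, not a definitive implementation; a weighted luma would also work. One convenient side effect of normalizing by luminance is that a purely gray pixel has a normalized color-vector of exactly (1, 1, 1), which exponentiation leaves unchanged, so gray pixels remain gray without a separate linear correction:

```python
import numpy as np

def luminance_preserving_boost(image, blue_gamma=1.3):
    """image: float RGB in [0, 1], shape (H, W, 3)."""
    # 1. Store each pixel's original luminance (simple channel mean here).
    lum = image.mean(axis=-1, keepdims=True)
    safe_lum = np.where(lum > 0.0, lum, 1.0)

    # 2. Normalize the color-vector, so coloration becomes
    #    independent of luminance.
    chroma = image / safe_lum

    # 3. Apply the per-channel correction to the blue component only.
    #    A gray pixel has chroma == (1, 1, 1), and 1**g == 1,
    #    so gray stays gray automatically.
    chroma[..., 2] = chroma[..., 2] ** blue_gamma

    # 4. Restore the stored luminance to each pixel.
    out = chroma * lum
    new_lum = out.mean(axis=-1, keepdims=True)
    out *= lum / np.where(new_lum > 0.0, new_lum, 1.0)
    return np.clip(out, 0.0, 1.0)
```

With these assumptions, a bluish pixel becomes relatively bluer while its mean brightness is unchanged, and a gray pixel passes through untouched.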
(Edit 02/24/2018 :
Actually, such an effect plug-in might just as easily leave the other channels, Red and Green in this case, as they are. )
The problem remains that the colors of the entire image could be washed out, so that the sky looks gray and the subject does as well. Our software would then have nothing on which to base its differentiation.
But light that occurs naturally in scenes tends to be polarized. Light that came from the sky will have an angle of plane-polarization to it, while light which has been scattered by the scene will have more-randomized polarization. Hence, if we have a DSLR camera, we can mount a polarization filter which tends to absorb blue light more strongly when it is polarized along one plane, while absorbing yellow light more strongly when it is polarized at right angles to that same plane.
The idea is that the filter can be rotated on our camera-lens into whatever position gives the sky a blue appearance, and we can hope that the entire landscape-photo then also looks as if sunlit.
(Edit 02/24/2018 :
After actually giving it some thought, I’d suggest that light which comes from the sky is horizontally polarized, and that the use of this filter will make both the sky and horizontally-facing bodies of water look more blue, which both would on a sunny day. In comparison, the rest of the scene would end up looking ‘more yellow’, suggesting a sunlit appearance. )
Then, the actual pixels of the camera will have captured information in a way influenced by polarization, which they would normally not do, any more than human eyes would normally do so.
(Updated 02/23/2018 : )