One idea that circulates in digital photography is that we can remove any need for special filters, just by using software to modify or rearrange the colors within a photo or video after we have shot it. The problem with this claim is that software can only change the contents of an image based on information already stored in its pixels. Hence, the color-vectors of the resulting pixels need to be derived from those of the captured pixels.
Thus, if we have taken a photo of a gray, hazy day, and we want the sky to look more blue and features in the scene to look more yellow, then we could write our software so that it performs a per-channel gamma correction, raising the blue channel to an exponent greater than one while raising the red and green channels to an exponent less than one. We might then find that regions of the image which were already more blue will come out more strongly blue, while regions which were not will end up looking more yellow, as if sunlit.
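As a sketch, with channels scaled to [0, 1] floats, and with exponent values that are my own illustrative choices rather than any particular plug-in's, such a per-channel gamma correction is nearly a one-liner in NumPy:

```python
import numpy as np

def per_channel_gamma(img, gammas=(0.8, 0.8, 1.2)):
    """Raise R, G and B to separate exponents.

    img: float array of shape (H, W, 3) with values in [0, 1].
    gammas: (red, green, blue) exponents; on [0, 1], an exponent > 1
    suppresses a channel's mid-tones while an exponent < 1 lifts them.
    """
    return img ** np.asarray(gammas)
```

Because 0 and 1 are fixed points of `x ** g` on [0, 1], fully dark and fully saturated channel values are untouched; it is the mid-tones, which is where a hazy scene mostly lives, that get pushed apart.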
(I suppose that, while we are at it, we would also want to normalize each color-vector first and store its original luminance in a separate register, so that our effect influences coloration only in ways not dependent on luminance, and so that the original luminance can be restored to the pixel afterward.
At that stage, a linear correction could also be computed, with the intent that purely gray pixels should remain gray. )
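That whole refinement might be sketched as follows. The exponents, the use of a simple R+G+B sum as the stored luminance, and the (1/3, 1/3, 1/3) neutral chromaticity point are all my own illustrative assumptions, not a known plug-in's method:

```python
import numpy as np

def chromaticity_gamma(img, gammas=(0.8, 0.8, 1.2)):
    """Luminance-preserving, gray-preserving per-channel gamma.

    img: float array of shape (H, W, 3) with values in [0, 1].
    """
    gammas = np.asarray(gammas)
    # Store the original luminance (a simple channel sum here).
    lum = img.sum(axis=-1, keepdims=True)
    # Normalize each color-vector to a chromaticity, so the effect
    # does not depend on luminance; treat black pixels as neutral.
    chroma = np.where(lum > 0, img / np.where(lum == 0, 1.0, lum), 1 / 3)
    # Per-channel gamma on the chromaticity.
    chroma = chroma ** gammas
    # Linear correction: a purely gray pixel has chromaticity
    # (1/3, 1/3, 1/3); rescale each channel so that point maps to
    # itself, which keeps grays gray.
    chroma = chroma * (1 / 3) ** (1 - gammas)
    # Re-normalize, then restore the stored luminance.
    chroma = chroma / chroma.sum(axis=-1, keepdims=True)
    return np.clip(chroma * lum, 0.0, 1.0)
```

On a purely gray pixel the gamma and the linear correction cancel exactly, so the pixel passes through unchanged; on any other pixel only the hue balance shifts while the channel sum is preserved.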
(Edit 02/24/2018 :
Actually, such an effect plug-in might just as easily keep the other channels, Red and Green in this case, as they are. )
The problem remains that the entire image could have its colors washed out, so that the sky and the subject both look gray. Our software would then have nothing on which to base its differentiation.
But light that occurs naturally in scenes tends to be polarized. Light that came from the sky will have an angle of plane-polarization to it, while light that has been scattered by the scene will have more-randomized polarization. Hence, if we have a DSLR camera, we can mount a polarization filter which tends to absorb blue light more when it is polarized along one plane, while absorbing yellow light more when it is polarized at right angles to that plane.
The idea is that the filter could be mounted on our camera lens in whatever position gives the sky a blue appearance, in the hope that the entire landscape photo also ends up looking as if sunlit.
(Edit 02/24/2018 :
After actually giving it some thought, I’d suggest that light which comes from the sky is horizontally polarized, and that the use of this filter will make both the sky and horizontally-facing bodies of water look more blue, as both would on a sunny day. In comparison, the rest of the scene would end up looking ‘more yellow’, suggesting a sunlit appearance. )
Then, the actual pixels of the camera will have captured information in a way influenced by polarization, which they would normally not do, any more than human eyes would.
(Updated 02/23/2018 : )
I suppose that one question which can remain is why, since this type of filter is based on plane-polarization, it does not interfere with the autofocus features of modern cameras. The simple answer is that it never plane-polarizes the light 100%. Whatever the mixture of wavelengths incident to the filter, those rays of light will, to some extent, continue to be polarized not along one plane but along two planes at right angles. Only a hint of bias has been inserted, according to which some of the incident light is polarized more strongly in one direction. The autofocus can still find components of any visual edge in the scene that are polarized along both axes.
A more-real problem with these selective polarizing filters is that they give an overall tint to all the light that passes through them, even light that is randomly polarized. Thus, a Blue-Yellow Polarizer typically tends to be deficient in green light, so that if we use it on indoor scenes, its overall effect is to make the light look slightly purple. Similarly, a Blue-Lime Polarizer tries to make up for this lack of green light, but tends to give an excess of green if the incident light is just randomly polarized.
(Edit 04/26/2017 : In fact it is plausible that the Blue-Lime Polarizer was designed this way specifically so that any green primary-color light will be transmitted in both directions of plane-polarization, thereby providing a spectral overlap, and possibly addressing past customer complaints about the autofocus finding either blue light plane-polarized, or non-blue light plane-polarized in the opposing direction. )
And so in practical use, we would rely on outdoor lighting, which is strongly polarized by default, to avoid overall discoloration. And we could still decide to apply a white-balance correction to the photo in our software after shooting it, one that gives it back a neutral overall color, but which leaves it with differential coloration in its regions…
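As a crude stand-in for such a correction (gray-world scaling; GIMP's automatic correction is more sophisticated, but the principle is similar), we can scale each channel so that the image's average color becomes neutral, which removes the overall tint while leaving the regional color differences the filter created:

```python
import numpy as np

def gray_world_balance(img):
    """Gray-world white balance.

    Scale each channel so that all three channel means become equal,
    removing any overall color cast while preserving relative,
    region-to-region color differences.

    img: float array of shape (H, W, 3) with values in [0, 1].
    """
    means = img.reshape(-1, 3).mean(axis=0)
    # Per-channel gains that pull each channel mean to the gray mean.
    gains = means.mean() / means
    return np.clip(img * gains, 0.0, 1.0)
```

The assumption behind gray-world is that the scene averages out to neutral gray; that holds well enough for a broad landscape, but can over-correct a scene dominated by one color, such as a large body of water.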
(Edit 02/23/2018 : )
Below are two pictures, both of which resulted from using a Varicolor Blue / Yellow Polarizer on a severely grayed-out day. The first shows before, and the second after, (Automatic) White-Balance Correction was applied using GIMP:
The second picture looks remarkably as though it had been a clear day. And the gravel in the foreground really was yellow like that (not a result of processing).
Here’s another example, shot on the same day:
In practice, I found that if the scene contained large regions of horizontally-facing water – i.e., lakes, rivers, etc. – then it remained washed-out, or even became more washed-out:
IIRC, the real water in this situation was so polluted, in the year 2007, that it actually had a brownish or yellowish appearance when not processed.