Sorry for the stupid question, but a post in the last thread made me think about it:
Let's say you see a star 4 light years away. OK, that means your eye is receiving photons emitted by this star 4 years ago. Now move one inch to the left: you still see the star, so again you receive photons from it. And you must receive quite a lot of photons for your eye to be able to detect them at all.
Now, if we add up all the photons received at the same instant, at every microscopic point in space where the star is visible, would the result be realistic, or totally disproportionate given the size of the star and its distance?
Could it be a good way to determine whether stars really are as far away as supposed, or much closer? I tried to put rough numbers on it below.
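
To make the question concrete, here is a quick back-of-the-envelope sketch in Python. It assumes a Sun-like star (~3.8e26 W), visible light around 550 nm, and a ~7 mm dark-adapted pupil; those numbers are my assumptions, not anything established, and everything is rough:

```python
# Back-of-the-envelope: photons from a Sun-like star seen at 4 light years.
# Assumptions: Sun-like luminosity, ~550 nm light, 7 mm dark-adapted pupil.

import math

L_star = 3.8e26                          # assumed luminosity, watts
d = 4 * 9.461e15                         # 4 light years, in metres
E_photon = 6.626e-34 * 3.0e8 / 550e-9    # energy of a ~550 nm photon, joules

# Inverse-square law: the star's output spreads over a sphere of radius d,
# so the flux (power per unit area) here is L / (4*pi*d^2).
flux = L_star / (4 * math.pi * d**2)     # W / m^2
photon_flux = flux / E_photon            # photons / m^2 / s

pupil_area = math.pi * (3.5e-3)**2       # 7 mm pupil, in m^2
photons_into_eye = photon_flux * pupil_area

print(f"Flux at 4 ly:         {flux:.2e} W/m^2")
print(f"Photon flux:          {photon_flux:.2e} photons/m^2/s")
print(f"Into one pupil:       {photons_into_eye:.2e} photons/s")

# Adding up the photons at *every* point where the star is visible means
# integrating the flux over the whole sphere of radius d, which just gives
# back the star's total output:
recovered_luminosity = flux * 4 * math.pi * d**2
print(f"Recovered luminosity: {recovered_luminosity:.2e} W (= L_star)")
```

With these assumed numbers it comes out to a couple of million photons per second entering one pupil, and summing over the entire sphere simply returns the star's total luminosity, so the grand total depends only on how bright the star actually is, not on how far away you stand.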