Since we’re all trying to min-max our vision improvement gains, a solid question is how long one should spend outside. This is a bit of a shot in the dark, but would anyone care to have a go at hypothesising which of the two models below would be more accurate?
Does more time outside on any given day lead directly to more improvement, proportional to the amount of time spent, or do your eyes hit a certain ‘cap’ after getting enough stimulus and then need time to recover? Either seems plausible to me at the moment.
The diminishing-returns model would be analogous to the gym, where after a certain amount of training, additional effort no longer helps because your body needs time to rest and recover. Whenever I get the occasional flash of blue light in my eyes, I know it’s the vitreous gel harmlessly tugging on my retina as it retreats and shrinks a little, and this often happens long after I’ve stopped receiving stimulus, when I’m in a dark room doing close-up work! So this leads me to think the diminishing-returns model might be more accurate, but eyes and muscles work in totally different ways and probably shouldn’t be compared too literally.
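To make the two hypotheses concrete, here’s a rough back-of-the-envelope sketch of what each dose-response curve might look like. Everything in it is a made-up placeholder for illustration (the `RATE`, `CAP`, and `TAU` parameters are hypothetical, not measured values), and the saturating exponential is just one common way to model diminishing returns, not a claim about how eyes actually behave:

```python
import math

# All numbers below are hypothetical, chosen only to illustrate the shapes.
RATE = 1.0  # linear model: benefit "units" per hour outside
CAP = 4.0   # diminishing-returns model: maximum daily benefit
TAU = 2.0   # hours outside until benefit reaches ~63% of the cap

def linear(hours: float) -> float:
    """Benefit grows in direct proportion to time spent outside."""
    return RATE * hours

def diminishing(hours: float) -> float:
    """Benefit saturates toward CAP, like a muscle that needs recovery."""
    return CAP * (1 - math.exp(-hours / TAU))

print(f"{'hours':>5} {'linear':>8} {'diminishing':>12}")
for h in [0.5, 1, 2, 4, 8]:
    print(f"{h:>5} {linear(h):>8.2f} {diminishing(h):>12.2f}")
```

Under the linear model, 8 hours outside is worth 16 times as much as half an hour; under the saturating model, almost all of the benefit is already banked after the first couple of hours, and the rest of the day adds very little. That difference would matter a lot for how we schedule outdoor time.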