VR calibration.
I've been using VR lenses with my phones (first an n7100, then a 7910c, both made by the same manufacturer), and I must say the quality is not perfect.
The technology has improved over time, but one thing is still missing: calibration per user.
VR should:
- adjust per user, since everyone has their own perception
- interact with the computed content
- adjust to, for example, head tilt
- and more
But how can we calibrate the glasses?
In my opinion, the best, simplest, and most obvious way is to scan some real object (for example, the environment around the user) and then present the rendered version to one eye only, while the second eye looks at the real world. The user adjusts the rendering until it matches what the other eye sees. Then repeat for the second eye.
This way, 3D depth and possibly other effects could be adjusted per user, and even per time of day, or with even finer precision...
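To make the idea concrete, here is a minimal sketch of that one-eye-at-a-time calibration loop. Everything here is hypothetical: the parameter names (offsets, scale), the render callback, and the user-input callback are assumptions, since a real implementation would hook into the headset's actual rendering pipeline and controls.

```python
from dataclasses import dataclass

@dataclass
class EyeCalibration:
    # Hypothetical per-eye parameters the user would tune
    horizontal_offset: float = 0.0  # e.g. lens/IPD offset
    vertical_offset: float = 0.0
    scale: float = 1.0              # rendered image scale vs. real world

def calibrate_eye(render_fn, read_adjustment_fn, max_steps=20):
    """Show the rendered scan to one eye (the other eye sees reality)
    and nudge parameters until the user reports that the two views match.

    render_fn(cal): draws the scanned environment for this eye only.
    read_adjustment_fn(): returns (dx, dy, dscale, done) from user input.
    """
    cal = EyeCalibration()
    for _ in range(max_steps):
        render_fn(cal)
        dx, dy, ds, done = read_adjustment_fn()
        if done:
            break
        cal.horizontal_offset += dx
        cal.vertical_offset += dy
        cal.scale += ds
    return cal
```

Running this once per eye would give two independent calibration records, which matches the per-user (and even per-session) adjustment described above.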