{"id":357953,"date":"2025-11-05T17:56:19","date_gmt":"2025-11-05T17:56:19","guid":{"rendered":"https:\/\/www.europesays.com\/us\/357953\/"},"modified":"2025-11-05T17:56:19","modified_gmt":"2025-11-05T17:56:19","slug":"cambridge-meta-study-raises-the-bar-for-retinal-resolution-in-xr","status":"publish","type":"post","link":"https:\/\/www.europesays.com\/us\/357953\/","title":{"rendered":"Cambridge &#038; Meta Study Raises the Bar for &#8216;Retinal Resolution&#8217; in XR"},"content":{"rendered":"<p>It\u2019s been a long-held assumption that the human eye is capable of detecting a maximum of 60 pixels per degree (PPD), which is commonly called \u2018retinal\u2019 resolution. Any more than that, and you\u2019d be wasting pixels. Now, a recent University of Cambridge and Meta Reality Labs study <a href=\"https:\/\/www.nature.com\/articles\/s41467-025-64679-2\" target=\"_blank\" rel=\"noopener\">published in Nature<\/a> finds that the upper threshold is actually much higher than previously thought.<\/p>\n<p>The News<\/p>\n<p>As <a href=\"https:\/\/www.cam.ac.uk\/research\/news\/is-your-ultra-hd-tv-worth-it-scientists-measure-the-resolution-limit-of-the-human-eye\" target=\"_blank\" rel=\"noopener\">the University of Cambridge\u2019s news site<\/a> explains, the research team measured participants\u2019 ability to detect specific display features across a variety of scenarios: both in color and greyscale, looking at images straight on (aka \u2018foveal vision\u2019), through their peripheral vision, and from both close up and farther away.<\/p>\n<p>The team used a novel sliding-display device (seen below) to precisely measure the visual resolution limits of the human eye, results which appear to overturn the widely accepted 60 PPD benchmark commonly considered \u2018retinal resolution\u2019.<\/p>\n<p><a href=\"https:\/\/www.europesays.com\/us\/wp-content\/uploads\/2025\/11\/retinal-resolution-ppd.jpeg\"><img loading=\"lazy\" decoding=\"async\" class=\"lazy 
lazy-hidden size-full wp-image-125497\" data-lazy-type=\"image\" src=\"https:\/\/www.europesays.com\/us\/wp-content\/uploads\/2025\/11\/retinal-resolution-ppd.jpeg\" alt=\"\" width=\"1991\" height=\"1168\"  \/><\/a>Image courtesy University of Cambridge, Meta<\/p>\n<p>Essentially, PPD measures how many display pixels fall within one degree of a viewer\u2019s visual field; it\u2019s sometimes seen on XR headset spec sheets to communicate what the combination of field of view (FOV) and display resolution actually means for perceived sharpness.<\/p>\n<p>According to the researchers, foveal vision can actually perceive much more than 60 PPD\u2014more like up to 94 PPD for black-and-white patterns, 89 PPD for red-green, and 53 PPD for yellow-violet.\u00a0Notably, the study had a few outliers in the participant group, with some individuals capable of perceiving as high as 120 PPD\u2014double the upper bound for the previously assumed retinal resolution limit.<\/p>\n<p>The study also holds implications for foveated rendering, which is used with eye-tracking to reduce rendering quality in an XR headset user\u2019s peripheral vision. While foveated rendering has traditionally been tuned for black-and-white acuity, the study suggests it could further reduce bandwidth and computation by lowering resolution for specific color channels.<\/p>\n<p>So, for XR hardware engineers, the team\u2019s findings point to a new target for true retinal resolution. 
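<\/p>
<p>For a rough sense of these numbers (a back-of-the-envelope sketch of my own, not taken from the study): average PPD can be estimated by dividing a display\u2019s horizontal pixel count per eye by its horizontal FOV in degrees. The headset figures below are illustrative assumptions only.<\/p>

```python
# Back-of-the-envelope average PPD: horizontal pixels per eye divided by
# horizontal FOV in degrees. Real headset optics concentrate pixels toward
# the center of the lens, so peak PPD runs higher than this average.

def average_ppd(horizontal_pixels: int, horizontal_fov_degrees: float) -> float:
    return horizontal_pixels / horizontal_fov_degrees

# Illustrative, assumed numbers (roughly Quest-3-like: 2064 px per eye, ~104 deg FOV)
print(round(average_ppd(2064, 104), 1))
```

<p>That works out to roughly 20 PPD on average, the same ballpark as the low-20s peak figures cited below for consumer headsets, and it illustrates how far current panels sit from a 90+ PPD foveal limit.<\/p>
<p>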
For a more in-depth look, you can read <a href=\"https:\/\/www.nature.com\/articles\/s41467-025-64679-2\" target=\"_blank\" rel=\"noopener\">the full paper in Nature<\/a>.<\/p>\n<p>My Take<\/p>\n<p>While you\u2019ll be hard-pressed to find accurate info on each headset\u2019s PPD\u2014some manufacturers prefer touting pixels per inch (PPI), while others focus on raw resolution numbers\u2014it\u2019s clear that not many come close to reaching 60 PPD, let alone the revised retinal resolution suggested above.<\/p>\n<p>According to data obtained from XR spec comparison site\u00a0<a href=\"https:\/\/vr-compare.com\/\" target=\"_blank\" rel=\"noopener\">VRCompare<\/a>, consumer headsets like Quest 3, Pico 4, and Bigscreen Beyond 2 tend to have a peak PPD of around 22-25, which describes the most pixel-dense area at dead center.<\/p>\n<p><a href=\"https:\/\/www.europesays.com\/us\/wp-content\/uploads\/2025\/11\/meta-vr-prototypes.jpg\"><img loading=\"lazy\" decoding=\"async\" class=\"lazy lazy-hidden size-full wp-image-113568\" data-lazy-type=\"image\" src=\"https:\/\/www.europesays.com\/us\/wp-content\/uploads\/2025\/11\/meta-vr-prototypes.jpg\" alt=\"\" width=\"1920\" height=\"1080\"  \/><\/a>Meta \u2018Butterscotch\u2019 varifocal prototype (left), \u2018Flamera\u2019 passthrough prototype (right) | Image courtesy Meta<\/p>\n<p>Prosumer and enterprise headsets fare slightly better, but only just. Estimating from available data, Apple Vision Pro and Samsung Galaxy XR boast a peak PPD of between 32 and 36.<\/p>\n<p>Headsets like Shiftall MeganeX Superlight \u201c8K\u201d and Pimax Dream Air have around 35-40 peak PPD. 
On the top end of the range is Varjo, which claims its XR-4 ($8,000) enterprise headset can achieve 51 peak PPD through an aspheric lens.<\/p>\n<p>Then, there are prototypes like <a href=\"https:\/\/www.roadtovr.com\/meta-prototype-vr-retinal-resoltion-light-field-passthrough\/\" target=\"_blank\" rel=\"noopener\">Meta\u2019s \u2018Butterscotch\u2019 varifocal headset<\/a>, which the company showed off in 2023 and which is said to sport 56 PPD (unconfirmed whether that figure is average or peak).<\/p>\n<p>Still, there\u2019s a lot more that factors into \u2018perfect\u2019 visuals beyond PPD, peak or otherwise. Optical artifacts, refresh rate, subpixel layout, binocular overlap, and eye box size can all sour even the best displays. What is certain, though, is that there\u2019s still plenty of room to grow in the spec-sheet department before any manufacturer can confidently call their displays retinal.<\/p>\n","protected":false},"excerpt":{"rendered":"It\u2019s been a long-held assumption that the human eye is capable of detecting a maximum of 60 
pixels&hellip;\n","protected":false},"author":3,"featured_media":357954,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[20],"tags":[158,67,132,68,729,730],"class_list":{"0":"post-357953","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-virtual-reality","8":"tag-technology","9":"tag-united-states","10":"tag-unitedstates","11":"tag-us","12":"tag-virtual-reality","13":"tag-vr"},"share_on_mastodon":{"url":"https:\/\/pubeurope.com\/@us\/115498384761866522","error":""},"_links":{"self":[{"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/posts\/357953","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/comments?post=357953"}],"version-history":[{"count":0,"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/posts\/357953\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/media\/357954"}],"wp:attachment":[{"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/media?parent=357953"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/categories?post=357953"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/tags?post=357953"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}