{"id":105184,"date":"2025-05-16T03:09:09","date_gmt":"2025-05-16T03:09:09","guid":{"rendered":"https:\/\/www.europesays.com\/uk\/105184\/"},"modified":"2025-05-16T03:09:09","modified_gmt":"2025-05-16T03:09:09","slug":"meta-seeks-volunteers-for-codec-avatar-training-and-pays-50-per-hour","status":"publish","type":"post","link":"https:\/\/www.europesays.com\/uk\/105184\/","title":{"rendered":"Meta seeks volunteers for Codec Avatar training and pays $50 per hour"},"content":{"rendered":"<p>                        <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.europesays.com\/uk\/wp-content\/uploads\/2025\/05\/Friedman-Zuckerberg-Podcast-1200x675.jpg\" data-no-lazy=\"1\" alt=\"Meta seeks volunteers for Codec Avatar training and pays $50 per hour\" width=\"1200\" height=\"675\"\/><\/p>\n<p>Image: Lex Fridman<\/p>\n<p><strong>Meta is on the hunt for adult volunteers to help shape the future of its Codec Avatars \u2014 paying $50 an hour to record facial expressions, gestures, and conversations.<\/strong><\/p>\n<p>FACTS<\/p>\n<p>Meta is currently recruiting paid volunteers at $50 per hour to capture a range of facial expressions, physical movements, and conversational exchanges. This data will feed directly into the next phase of Meta\u2019s Codec Avatars project, a technology aimed at powering more lifelike avatars for <a class=\"mixed-keyword\" href=\"https:\/\/mixed-news.com\/en\/virtual-reality-guide\/\" target=\"_blank\" rel=\"noopener\">VR<\/a> and <a class=\"mixed-keyword\" href=\"https:\/\/mixed-news.com\/en\/augmented-reality-hardware-and-definitions\/\" target=\"_blank\" rel=\"noopener\">AR<\/a> \u2014 first demoed back in 2019. 
According to a report from Business Insider, the initiative, codenamed \u201cProject Warhol,\u201d is being run through the data company Appen, which is officially listed as a Meta partner in the project\u2019s consent forms.<\/p>\n<p>The research itself is split into two separate studies: \u201cHuman Motion\u201d and \u201cGroup Conversations.\u201d In the Human Motion study, participants are asked to mimic specific facial expressions, read sample sentences, and make hand gestures, all while being recorded. The setup involves cameras, headsets, and various sensors tracking participants\u2019 movements from all possible angles. The Group Conversations portion brings together small groups of two or three people, who engage in unscripted conversations and light improv activities. Both studies are scheduled to kick off this September at Meta\u2019s research lab in Pittsburgh.<\/p>\n<p>CONTEXT<\/p>\n<p>Codec Avatars: From research to real-world use<\/p>\n<p>Meta has been steadily inching closer to mainstreaming its Codec Avatar technology. Last summer, the company <a href=\"https:\/\/mixed-news.com\/en\/codec-avatars-job-postings\/\" target=\"_blank\" rel=\"noopener\">posted a string of job listings tied directly to the Codec Avatars project<\/a>, including roles for a design prototyper and an iOS developer. The goal: to build out an \u201cinternal <a class=\"mixed-keyword\" href=\"https:\/\/mixed-news.com\/en\/mixed-reality-xr-headset-definition-differences\/\" target=\"_blank\" rel=\"noopener\">XR<\/a> phone service\u201d described as defining \u201cthe future of human-to-human interaction through immersive telepresence with Codec Avatars.\u201d<\/p>\n<p>Meta\u2019s ambitions aren\u2019t just talk: back in September 2023, the tech made a splash in a high-profile podcast, where Lex Fridman and Mark Zuckerberg held a long-distance conversation using photorealistic Codec Avatars. 
What didn\u2019t get mentioned at the time: <a href=\"https:\/\/mixed-news.com\/en\/mark-zuckerberg-lex-fridman-codec-avatars-workstation\/\" target=\"_blank\" rel=\"noopener\">pulling off that level of realism required workstations loaded with four GeForce RTX 4090 GPUs per avatar.<\/a><\/p>\n<p>More recently, in March 2024, Meta\u2019s Head of Research Yaser Sheikh revealed that an earlier, less detailed version of the Codec Avatars already runs on standalone VR headsets \u2014 and that it\u2019s now possible to generate an avatar with just a one-minute face scan from a smartphone. Meta calls these quick-scan versions \u201cInstant Codec Avatars.\u201d<\/p>\n<p>Sources: <a target=\"_blank\" rel=\"noopener\" href=\"https:\/\/www.businessinsider.com\/meta-project-warhol-avatar-data-metaverse-smart-glasses-2025\">Business Insider<\/a><\/p>\n<p class=\"link-note\">Note: Links to online stores in articles can be so-called affiliate links. If you buy through this link, MIXED receives a commission from the provider. 
For you the price does not change.<\/p>\n","protected":false},"excerpt":{"rendered":"Image: Lex Fridman Meta is on the hunt for adult volunteers to help shape the future of its&hellip;\n","protected":false},"author":2,"featured_media":105185,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[3162],"tags":[48300,598,53,16,15,3243,3244],"class_list":{"0":"post-105184","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-virtual-reality","8":"tag-avatars","9":"tag-meta","10":"tag-technology","11":"tag-uk","12":"tag-united-kingdom","13":"tag-virtual-reality","14":"tag-vr"},"share_on_mastodon":{"url":"https:\/\/pubeurope.com\/@uk\/114515316839514617","error":""},"_links":{"self":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/posts\/105184","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/comments?post=105184"}],"version-history":[{"count":0,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/posts\/105184\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/media\/105185"}],"wp:attachment":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/media?parent=105184"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/categories?post=105184"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/tags?post=105184"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}