All-in-one VR software for creators. Record mixed reality, transform into your favorite avatar, see chat inside your headset, and more.
We've got some quality-of-life updates we’d love for you to try out in the beta branch! (Right-click LIV in your Steam library -> Properties -> Betas -> select the public beta from the drop-down and let LIV update.)
[h1]New LIV Menu Location[/h1]
This one’s for those of you who use avatars and the first-person stabilizer… [b]the LIV UI circle menu (in headset) has moved![/b]
No longer will it get in the way when you’re playing golf or trying to pick something up off the floor! We’re hoping this helps with those annoying ‘accidental’ toggles, which can be really jarring when unexpected.
You can now access the in-headset menu through the SteamVR dashboard. With the LIV desktop app running (and avatars or the first-person stabilizer selected), open the SteamVR dashboard and you should see a new button on the lower-left side. Click it to open the LIV menu! To close it, click the ‘x’ in the corner or the LIV circle on the ground.
[img]https://i.ibb.co/5kccHMg/ezgif-5-6c8bb87c25.gif[/img]
[h1]Another one for avatar users…[/h1]
Improved microphone lipsync, eye gaze tracking, and eye and face tracking blendshapes for .avatar and .VRM!
[img]https://i.ibb.co/5h6X7gM/ezgif-1-db134b48a4.gif[/img]
It's Prisma Sinclair!
[i].avatar eye and mouth tracking changes: [/i]
[list]
[*].avatar now supports Microphone (lipsync).
[*].avatar now supports eye tracking blendshapes and face tracking blendshapes using SRAnipal-styled blendshape names. Note: we also support ARKit blendshapes, but for .avatar we strongly suggest choosing one convention or the other, never both (there’s a rough naming-consistency sketch after this list). See the VRM section below.
[*].avatar now supports eye gaze tracking through humanoid eye bone rotation. We will try to use the eye bones specified in the humanoid configuration of the avatar asset (this can be set in the import settings of your avatar asset, usually an FBX file).
[*]Use the new option "Use Experimental Eye Gaze" located in the VR LIV avatar selection menu to enable it (eye tracking is required).
[/list]
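For avatar authors wondering what “one convention, never both” means in practice, here’s a rough Python sketch. It’s purely our illustration (not part of LIV, and the example names and heuristic are not LIV’s exact matching rules): it scans a list of blendshape names and warns if SRAnipal-style and ARKit-style names appear mixed together.
[code]
# Rough illustration only -- LIV's actual matching rules may differ.
# SRAnipal-style blendshape names usually look like "Eye_Left_Blink" or
# "Jaw_Open" (underscored, capitalised), while ARKit-style names look like
# "eyeBlinkLeft" or "jawOpen" (camelCase). This heuristic classifies each
# name by that surface style and flags a mix.

def naming_style(name: str) -> str:
    """Classify a blendshape name as 'sranipal-like', 'arkit-like', or 'other'."""
    if "_" in name and name[:1].isupper():
        return "sranipal-like"
    if "_" not in name and name[:1].islower() and any(c.isupper() for c in name):
        return "arkit-like"
    return "other"

def check_blendshape_names(names: list[str]) -> None:
    """Warn if an avatar mixes both naming conventions."""
    styles = {naming_style(n) for n in names} - {"other"}
    if {"sranipal-like", "arkit-like"} <= styles:
        print("Warning: this avatar mixes SRAnipal-style and ARKit-style names;")
        print("for .avatar, pick one convention and remove the other.")
    else:
        print("Blendshape naming looks consistent:", styles or {"other"})

# Example call -- the names are illustrative, not a required set:
check_blendshape_names(["Eye_Left_Blink", "Jaw_Open", "eyeBlinkLeft"])
[/code]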
[i].VRM eye and mouth tracking changes:[/i]
[list]
[*].VRM now supports eye tracking blendshapes and face tracking blendshapes using ARKit-styled BlendShape Clip names.
[*]Added a new checkbox option "Use Experimental Eye Gaze" located in the VR LIV avatar selection menu (eye tracking is required). When enabled for VRM avatars, it will try to rotate the eyes further so they match your eye gaze more precisely (see the rough sketch after this list for the general idea).
[/list]
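For the curious, here’s a rough, generic sketch of what “rotate the eyes to match the gaze” can look like: turn a gaze direction into yaw/pitch angles for an eye bone, scale them for stronger tracking, and clamp them to a plausible range. Everything in it (the function name, strength factor, and clamp limits) is our own illustration, not LIV’s actual implementation.
[code]
import math

# Generic illustration of gaze-driven eye bone rotation. The function name,
# the strength factor, and the clamp limits are all hypothetical -- this is
# not LIV's implementation.

def gaze_to_eye_angles(gaze_dir, strength=1.0, max_angle_deg=30.0):
    """Convert a gaze direction (x=right, y=up, z=forward) into clamped
    yaw/pitch angles in degrees for an eye bone. strength > 1.0 rotates
    the eyes more aggressively toward the gaze target."""
    x, y, z = gaze_dir
    yaw = math.degrees(math.atan2(x, z)) * strength                    # left/right
    pitch = math.degrees(math.atan2(y, math.hypot(x, z))) * strength   # up/down

    def clamp(angle):
        return max(-max_angle_deg, min(max_angle_deg, angle))

    return clamp(yaw), clamp(pitch)

# Example: gaze slightly right of and above straight ahead, tracked more strongly.
print(gaze_to_eye_angles((0.2, 0.1, 1.0), strength=1.5))
[/code]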
You can find full documentation and more details on these new avatar updates [url=https://help.liv.tv/hc/en-us/articles/7617322253458]here[/url] (for the .avatar format) and [url=https://help.liv.tv/hc/en-us/articles/7735955349906]here[/url] (for .VRM)!
[h1]Minor Beat Saber Improvements![/h1]
In Beat Saber, opaque objects are now actually opaque, the way they are meant to be!
[img]https://i.ibb.co/Xx75dL0/image.jpg[/img]
Give all these new features a test drive in the beta branch and let us know what you think, we'd love to hear your feedback on our [url=https://discord.com/invite/liv]Discord[/url]!