Meta Avatar
This addon shows how to integrate Meta avatars with Fusion.
The main focus is to:
- synchronize avatars using Fusion networked variables
- integrate lipsync with Fusion Voice
Meta XR SDK
Instead of the OpenXR plug-in used in other samples, the Oculus XR plug-in is used here.
The Meta XR SDK has been added through their scoped registry https://npm.developer.oculus.com/ (see the Meta documentation for more details).
The main installed packages from Meta registries are:
- Meta XR Core SDK
- Meta XR Platform SDK: it is required to access the Oculus user id and to load Meta avatars.
Oculus rig and building blocks
Because this add-on is based on the Oculus XR plug-in instead of the OpenXR plug-in, a specific rig has been created to capture the headset and hand positions.
This hardware rig has been created through the Meta building blocks:
- the prefab resulting from this step is available in the /Prefabs/Rig/BaseBuildingBlocks/[BuildingBlock] BaseRig prefab of the MetaOVRHandsSynchronization add-on.
- the prefab actually used in the add-on, with the synchronization components added to the previous one, is available in the /Prefabs/HardwareRig/[BuildingBlock] HardwareRigForMetaAvatar prefab.
The MetaAvatar game object groups together all the components needed for the Meta avatar to function properly.
- The LipSync game object has the OvrAvatarLipSyncContext component: it is provided by the Oculus SDK to set up the lipsync feature.
- The BodyTracking game object has the SampleInputManager component: it is the one coming from the Oculus SDK (Asset/Avatar2/Example/Common/Scripts). It is a class derived from the OvrAvatarInputManager base class, and it refers to the OVR Hardware Rig for setting tracking input on an avatar entity.
- The AvatarManager game object has the OvrAvatarManager component: it is used to load Meta avatars.
Runner
The ConnectionManager, located on the Runner game object, handles the connection to the Photon Fusion server and spawns the user network prefab when the OnPlayerJoined callback is called.
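In shared mode (the topology used by this add-on), each client spawns its own user prefab when it joins. A minimal sketch of this logic, assuming a userPrefab field referencing the user network prefab, could look like this:
C#
// Minimal sketch: "userPrefab" is an assumed field name, not the actual add-on code
public void OnPlayerJoined(NetworkRunner runner, PlayerRef player)
{
    // In shared mode, each client is responsible for spawning its own user prefab
    if (player == runner.LocalPlayer)
    {
        runner.Spawn(userPrefab, Vector3.zero, Quaternion.identity, player);
    }
}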
In order to stream the voice over the network, we need the Fusion Voice Client. Its primary recorder field refers to the Recorder game object located under the Runner.
For more details on Photon Voice integration with Fusion, see this page: Voice - Fusion Integration.
The Runner game object is also in charge of requesting the microphone authorization, thanks to the MetaAvatarMicrophoneAuthorization component. It enables the Recorder object when microphone access is granted.
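A minimal sketch of such an authorization component, assuming a recorderGameObject field referencing the Recorder object, could look like this:
C#
using System.Collections;
using UnityEngine;
#if UNITY_ANDROID
using UnityEngine.Android;
#endif

// Minimal sketch: "recorderGameObject" is an illustrative field name
public class MicrophoneAuthorizationSketch : MonoBehaviour
{
    public GameObject recorderGameObject;

    IEnumerator Start()
    {
#if UNITY_ANDROID
        // On Quest, the microphone permission is requested through the Android API
        if (!Permission.HasUserAuthorizedPermission(Permission.Microphone))
            Permission.RequestUserPermission(Permission.Microphone);
        while (!Permission.HasUserAuthorizedPermission(Permission.Microphone))
            yield return null;
#else
        yield return Application.RequestUserAuthorization(UserAuthorization.Microphone);
        if (!Application.HasUserAuthorization(UserAuthorization.Microphone))
            yield break;
#endif
        // Enable the Recorder only once microphone access is granted
        recorderGameObject.SetActive(true);
    }
}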
The Recorder subobject is in charge of connecting to the microphone. It contains the AudioLipSyncConnector component, which receives the audio stream from the Recorder and forwards it to the OvrAvatarLipSyncContext.
User Spawned Network Prefab
The ConnectionManager spawns the user network prefab NetworkRigWithOVRHandsMetaAvatar Variant when the OnPlayerJoined callback is called.
This prefab contains:
- MetaAvatarSync: it is in charge of selecting a random avatar at start and of streaming the avatar over the network.
- NetworkedAvatarEntity: it is derived from the OculusOvrAvatarEntity class. It is used to configure the avatar entity depending on whether the network rig represents the local user or a remote user.
Avatar Synchronization
Overview
The MetaAvatarSync class orchestrates the avatar synchronization.
When the user network prefab is spawned for the local user, its ConfigureAsLocalAvatar() method is called, and the associated NetworkedAvatarEntity receives data from:
- OvrAvatarLipSyncContext for lipsync
- SampleInputManager for body tracking
This data is streamed over the network thanks to networked variables.
When a user network prefab is spawned for a remote user, MetaAvatarSync ConfigureAsRemoteAvatar() is called instead, and the associated NetworkedAvatarEntity builds and animates the avatar thanks to the streamed data.
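The following sketch illustrates these two configuration paths. The SetInputManager and SetLipSync calls are assumptions based on the Meta Avatars SDK entity API, and the field names are illustrative, not the actual add-on code:
C#
public void ConfigureAsLocalAvatar()
{
    avatarConfigured = true;
    // Drive the avatar entity with the hardware rig trackers
    avatarEntity.SetInputManager(hardwareRigInputManager); // SampleInputManager
    avatarEntity.SetLipSync(hardwareRigLipSyncContext);    // OvrAvatarLipSyncContext
}

public void ConfigureAsRemoteAvatar()
{
    avatarConfigured = true;
    // No local trackers here: the avatar will be animated by the streamed avatar data
}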
Avatar modes
The MetaAvatarSync class supports two modes:
- UserAvatar: to load the user's Meta avatar
- RandomAvatar: to load a random Meta avatar
So, when the local user networked prefab is spawned, the avatar is selected according to this configuration:
C#
public override void Spawned()
{
    base.Spawned();
    if (Object.HasInputAuthority)
    {
        LoadLocalAvatar();
    }
    else
    {
        if (!avatarConfigured)
        {
            ConfigureAsRemoteAvatar();
        }
    }
    changeDetector = GetChangeDetector(ChangeDetector.Source.SnapshotFrom);
    // Trigger initial change if any
    OnUserIdChanged();
    ChangeAvatarIndex();
}
async void LoadLocalAvatar()
{
    if (avatarMode == AvatarMode.UserAvatar)
    {
        // Make sure to download the user id
        ConfigureAsLocalAvatar();
        UserId = await avatarEntity.LoadUserAvatar();
    }
    else
    {
        ConfigureAsLocalAvatar();
        AvatarIndex = UnityEngine.Random.Range(0, 31);
        avatarEntity.LoadZipAvatar(AvatarIndex);
    }
}
Because AvatarIndex is a networked variable, all players will be updated when this value changes, thanks to the ChangeDetector.
C#
[Networked]
public int AvatarIndex { get; set; } = -1;
ChangeDetector changeDetector;
C#
public override void Render()
{
    base.Render();
    foreach (var changedPropertyName in changeDetector.DetectChanges(this))
    {
        if (changedPropertyName == nameof(UserId)) OnUserIdChanged();
        ...
    }
}
Avatar data
The SampleInputManager component on the hardware rig tracks the user's movements. It is referenced by the NetworkedAvatarEntity if the player network rig represents the local user. This setting is done by the MetaAvatarSync (ConfigureAsLocalAvatar()).
At each LateUpdate(), MetaAvatarSync captures the avatar data for the local player.
C#
private void LateUpdate()
{
    // Local avatar has fully updated this frame and can send data to the network
    if (Object.HasInputAuthority)
    {
        CaptureAvatarData();
    }
}
The CaptureLODAvatar method gets the avatar entity stream buffer and copies it into a networked variable called AvatarData.
The capacity is limited to 1200 bytes, as it is enough to stream Meta avatars in medium or high LOD (for an actual application, this number should match the amount of data actually needed, to avoid wasting memory).
Please note that, for simplification, only the medium LOD is streamed in this sample.
The buffer size AvatarDataCount is also synchronized over the network.
C#
[Networked, Capacity(1200)]
public NetworkArray<byte> AvatarData { get; }
[Networked]
public uint AvatarDataCount { get; set; }
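As a reference, the capture step could be sketched as below. The add-on relies on RecordStreamData_AutoBuffer; this sketch uses the simpler RecordStreamData call as an assumption:
C#
void CaptureAvatarData()
{
    // Serialize the avatar state at medium LOD (the only LOD streamed in this sample)
    byte[] buffer = avatarEntity.RecordStreamData(OvrAvatarEntity.StreamLOD.Medium);
    AvatarDataCount = (uint)buffer.Length;
    // Copy the serialized bytes into the networked array
    for (int i = 0; i < buffer.Length; i++)
    {
        AvatarData.Set(i, buffer[i]);
    }
}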
So, when the avatar stream buffer is updated, remote users are informed and apply the received data to the network rig representing the remote player.
C#
public override void Render()
{
    base.Render();
    foreach (var changedPropertyName in changeDetector.DetectChanges(this))
    {
        ...
        if (changedPropertyName == nameof(AvatarData)) ApplyAvatarData();
    }
}
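The receiving side can be sketched as follows, assuming ApplyStreamData accepts the byte buffer recorded on the other client:
C#
void ApplyAvatarData()
{
    // Copy the networked bytes back into a local buffer
    byte[] buffer = new byte[AvatarDataCount];
    for (int i = 0; i < buffer.Length; i++)
    {
        buffer[i] = AvatarData[i];
    }
    // Apply the received stream data to animate the remote avatar
    avatarEntity.ApplyStreamData(buffer);
}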
Loading personalized Meta avatar
Loading the local user avatar
As seen previously, to load the user's Meta avatar, the avatar mode must be set to AvatarMode.UserAvatar.
So, during the Spawned callback, the NetworkedAvatarEntity asks for the user's account avatar with LoadUserAvatar().
C#
/// <summary>
/// Load the user's Meta avatar based on their user id
/// Note: _deferLoading has to be set to true for this to work
/// </summary>
public async Task<ulong> LoadUserAvatar()
{
    // Initializes the OVR Platform, then gets the user id
    await FinOculusUserId();
    if (_userId != 0)
    {
        // Load the actual avatar
        StartCoroutine(Retry_HasAvatarRequest());
    }
    else
    {
        Debug.LogError("Unable to find UserId.");
    }
    return _userId;
}
This method returns the Meta account user id, which is stored in the UserId [Networked] variable, so that the user id will be synchronized on all clients.
C#
[Networked]
public ulong UserId { get; set; } = 0;
Please note that, to be able to load the user avatar, Defer Loading should be set to true on the NetworkedAvatarEntity component of the player's prefab: it will prevent the avatar from being automatically loaded at start.
Loading the remote users' avatar
Upon reception of UserId, remote users will trigger the download of the avatar associated with this id:
C#
public override void Render()
{
    base.Render();
    foreach (var changedPropertyName in changeDetector.DetectChanges(this))
    {
        if (changedPropertyName == nameof(UserId)) OnUserIdChanged();
        ...
    }
}

void OnUserIdChanged()
{
    if (Object.HasStateAuthority == false && UserId != 0)
    {
        Debug.Log("Loading remote avatar: " + UserId);
        avatarEntity.LoadRemoteUserCdnAvatar(UserId);
    }
}
Access to Meta avatar
To be able to load Meta avatars, you have to add the App Id of your application in the Oculus > Platform > Edit Settings Unity settings menu.
This App Id can be found in the API > App Id field on your Meta dashboard.
Also, this application must have a completed Data use checkup section, with the required User Id, User profile and Avatars accesses granted.
Testing users' avatars
To see, during development, the avatar associated with the local Meta account, the local user account must be a member of the organization associated with the App Id provided in the Oculus Platform settings.
To see your avatars in a cross-platform setup (between a Quest and a desktop build), you will need to have your Quest and Rift applications grouped, as specified in the Group App IDs Together chapter on this page: Configuring Apps for Meta Avatars SDK.
LipSync
The microphone initialization is done by the Photon Voice Recorder.
The OvrAvatarLipSyncContext on the OVRHardwareRig is configured to expect direct calls feeding it with the audio buffer. A class hooks into the recorder audio read in order to forward the audio to the OvrAvatarLipSyncContext, as detailed below.
The Recorder class can forward the read audio buffers to classes implementing the IProcessor interface.
For more details on how to create a custom audio processor see this page: Photon Voice - FAQ.
To register such a processor in the voice connection, a VoiceComponent subclass, AudioLipSyncConnector, is added on the same game object as the Recorder.
This leads to the reception of the PhotonVoiceCreated and PhotonVoiceRemoved callbacks, allowing the component to add a post-processor to the connected voice.
The connected post-processor is an AvatarAudioProcessor, implementing IProcessor<float> or IProcessor<short>.
During a player connection, the MetaAvatarSync component searches for the AudioLipSyncConnector located on the Recorder to set the lipSyncContext field of this processor.
This way, each time the AvatarAudioProcessor Process callback is called by the Recorder, ProcessAudioSamples is called on the OvrAvatarLipSyncContext with the received audio buffer, ensuring that the lip synchronization is computed on the avatar model.
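The float variant of this processor could be sketched as below; the mono channel count passed to ProcessAudioSamples is an assumption:
C#
using Photon.Voice;
using Oculus.Avatar2;

// Minimal sketch of the float post-processor
public class AvatarAudioProcessor : IProcessor<float>
{
    public OvrAvatarLipSyncContext lipSyncContext;

    public float[] Process(float[] buf)
    {
        if (lipSyncContext != null)
        {
            // Feed the microphone samples to the lipsync analysis (mono assumed)
            lipSyncContext.ProcessAudioSamples(buf, 1);
        }
        // Return the buffer unchanged: the voice stream itself is not modified
        return buf;
    }

    public void Dispose() { }
}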
The lip sync data is then streamed along with the other avatar body info when it is captured with RecordStreamData_AutoBuffer on the avatar entity, during the late update of MetaAvatarSync.
Dependencies
- Meta Avatars SDK (com.meta.xr.sdk.avatars) 24.1.1 + Sample scene
- Meta Avatars SDK Sample Assets (com.meta.xr.sdk.avatars.sample.assets) 24.1.1
- Meta XR Core SDK (com.meta.xr.sdk.core) 62.0.0
- Meta XR Platform SDK (com.meta.xr.sdk.platform) 62.0.0
- Photon Voice SDK
- MetaOVRHandsSynchronization addon
Demo
A demo scene can be found in the Assets\Photon\FusionAddons\MetaAvatar\Demo\Scenes\ folder.
Download
The latest version of this addon is included in the Industries addon project.
Supported topologies
- shared mode
Changelog
- Version 2.0.1:
- Compatibility with the packaged Meta Avatars SDK (com.meta.xr.sdk.avatars) 24.1.1
- Version 2.0.0: First release