Oculus Avatar

This guide shows you how to use the Oculus Avatar SDK with PUN. Start with a new Unity project and import the required packages: the Oculus Integration (which contains the Avatar SDK) and PUN 2.

Getting Started

When importing is complete, we can start extending existing components. First step is to navigate to "Assets/Oculus/Avatar/Content/Prefabs" where you find two prefabs: the "LocalAvatar" and the "RemoteAvatar". For the next step you can either use these two prefabs or create copies of them.

Important: both prefabs need to be placed inside a "Resources" folder.
In our case, a copy of each prefab is placed in "Assets/Resources".


Synchronizing The Avatar

The next step requires implementing a script that will be observed by the PhotonView component (we will attach it later), which handles the synchronization across multiple clients. We therefore create a new MonoBehaviour, name it PhotonAvatarView, and add the following three references to its code. The script needs using Photon.Pun;, using System.Collections.Generic; and using System.IO; at the top, since it uses PhotonView, List<T> and MemoryStream later on:

    private PhotonView photonView;
    private OvrAvatar ovrAvatar;
    private OvrAvatarRemoteDriver remoteDriver;

Additionally, we need a list of byte-arrays in order to store data from our Avatar before actually sending it to other clients.

    private List<byte[]> packetData;

Using Unity's Start method, we can set up all of the references and objects declared above.

    public void Start()
    {
        photonView = GetComponent<PhotonView>();

        if (photonView.IsMine)
        {
            ovrAvatar = GetComponent<OvrAvatar>();
            ovrAvatar.RecordPackets = true;
            ovrAvatar.PacketRecorded += OnLocalAvatarPacketRecorded;

            packetData = new List<byte[]>();
        }
        else
        {
            remoteDriver = GetComponent<OvrAvatarRemoteDriver>();
        }
    }

After getting the reference to our PhotonView component, we can use its IsMine property to make a clear cut between the 'Local' and the 'Remote' Avatar. If the instantiated object is ours, we get the reference to the OvrAvatar component, enable packet recording, attach an event handler that fires whenever a new packet is recorded, and instantiate the list of byte-arrays which stores all Avatar-related input data before it is sent across the network. If the object belongs to another client, we get the reference to the OvrAvatarRemoteDriver component, which is later used to replay the owner's input so that other clients see their gestures.

Next we use Unity's OnDisable method to stop recording packets containing our gestures.

    public void OnDisable()
    {
        if (photonView.IsMine)
        {
            ovrAvatar.RecordPackets = false;
            ovrAvatar.PacketRecorded -= OnLocalAvatarPacketRecorded;
        }
    }

Whenever the OvrAvatar component records a packet, our event handler serializes it into a byte-array, a type that PUN supports out of the box. The result is added to our previously created list and is then ready to be sent across the network. To avoid sending unnecessary data and to prevent disconnects caused by exceeding the maximum message size, we first check whether the rest of the method needs to run at all. See below:

    private int localSequence;

    public void OnLocalAvatarPacketRecorded(object sender, OvrAvatar.PacketEventArgs args)
    {
        if (!PhotonNetwork.InRoom || (PhotonNetwork.CurrentRoom.PlayerCount < 2))
        {
            return;
        }

        using (MemoryStream outputStream = new MemoryStream())
        {
            BinaryWriter writer = new BinaryWriter(outputStream);

            var size = Oculus.Avatar.CAPI.ovrAvatarPacket_GetSize(args.Packet.ovrNativePacket);
            byte[] data = new byte[size];
            Oculus.Avatar.CAPI.ovrAvatarPacket_Write(args.Packet.ovrNativePacket, size, data);

            writer.Write(localSequence++);
            writer.Write(size);
            writer.Write(data);

            packetData.Add(outputStream.ToArray());
        }
    }

Since we now have a serializer, we also need a deserializer to handle the received packets. Our next task is to implement this method. In our example, it is called DeserializeAndQueuePacketData:

    private void DeserializeAndQueuePacketData(byte[] data)
    {
        using (MemoryStream inputStream = new MemoryStream(data))
        {
            BinaryReader reader = new BinaryReader(inputStream);
            int remoteSequence = reader.ReadInt32();

            int size = reader.ReadInt32();
            byte[] sdkData = reader.ReadBytes(size);

            // The sdkData buffer contains 'size' bytes, so we pass 'size' here
            // (data.Length would also count the two header ints written before it).
            System.IntPtr packet = Oculus.Avatar.CAPI.ovrAvatarPacket_Read((System.UInt32)size, sdkData);
            remoteDriver.QueuePacket(remoteSequence, new OvrAvatarPacket { ovrNativePacket = packet });
        }
    }

This method deserializes the incoming byte-arrays and recreates the packet data, which is then queued on the OvrAvatarRemoteDriver component in order to replay the gesture. The last coding task for this part is the actual exchange of the recorded packets. We use OnPhotonSerializeView because it is called automatically at regular intervals, which results in regular updates and smooth-looking movement. To add and use this method correctly, it is mandatory to implement the IPunObservable interface, so the class declaration becomes: public class PhotonAvatarView : MonoBehaviour, IPunObservable. This interface requires you to implement the OnPhotonSerializeView method.

    public void OnPhotonSerializeView(PhotonStream stream, PhotonMessageInfo info)
    {
        if (stream.IsWriting)
        {
            if (packetData.Count == 0)
            {
                return;
            }

            stream.SendNext(packetData.Count);

            foreach (byte[] b in packetData)
            {
                stream.SendNext(b);
            }

            packetData.Clear();
        }

        if (stream.IsReading)
        {
            int num = (int)stream.ReceiveNext();

            for (int counter = 0; counter < num; ++counter)
            {
                byte[] data = (byte[])stream.ReceiveNext();

                DeserializeAndQueuePacketData(data);
            }
        }
    }

For a better understanding, we subdivide this method into two parts. The first is the IsWriting branch, which is executed by the owner of the GameObject. It first checks whether there is any data to send at all. If there is, the first value sent is the number of packets; the receiving side needs this information to know how many packets to read. Then we send all recorded and serialized packets and clear the list, because we no longer need this data.

The IsReading branch, however, is only executed on the remote clients, those who don't own the object. It first reads how many packets need to be processed and then calls our previously implemented method to deserialize and queue each packet, one by one.

The last step is to attach a PhotonView component and our PhotonAvatarView script to both prefabs. Don't forget to add the PhotonAvatarView component to the PhotonView's list of observed components.


Instantiating Avatars

To instantiate our networked Avatars, it is unfortunately not enough to simply call PhotonNetwork.Instantiate. The reason is that two different prefabs have to be instantiated: the "LocalAvatar" for the instantiating player and the "RemoteAvatar" for everybody else. We therefore have to use Manual Instantiation.

Since this code cannot be part of the PhotonAvatarView we created earlier, the following code introduces a new class called NetworkManager that deals with the network logic. Here we use the OnJoinedRoom callback.

To make sure the callbacks are actually called, the class needs to be registered, which can be done in Unity's OnEnable method; we also deregister it in OnDisable when we no longer need it.

    using System.Collections.Generic;
    using ExitGames.Client.Photon;
    using UnityEngine;
    using Photon.Pun;
    using Photon.Realtime;

    public class NetworkManager : MonoBehaviour, IMatchmakingCallbacks
    {
        public const byte InstantiateVrAvatarEventCode = 1; // example code, change to any value between 1 and 199

        private void OnEnable()
        {
            PhotonNetwork.AddCallbackTarget(this);
        }

        private void OnDisable()
        {
            PhotonNetwork.RemoveCallbackTarget(this);
        }

        #region IMatchmakingCallbacks

        public void OnJoinedRoom()
        {
            GameObject localAvatar = Instantiate(Resources.Load("LocalAvatar")) as GameObject;
            PhotonView photonView = localAvatar.GetComponent<PhotonView>();

            if (PhotonNetwork.AllocateViewID(photonView))
            {
                RaiseEventOptions raiseEventOptions = new RaiseEventOptions
                {
                    CachingOption = EventCaching.AddToRoomCache,
                    Receivers = ReceiverGroup.Others
                };

                PhotonNetwork.RaiseEvent(InstantiateVrAvatarEventCode, photonView.ViewID, raiseEventOptions, SendOptions.SendReliable);
            }
            else
            {
                Debug.LogError("Failed to allocate a ViewId.");

                Destroy(localAvatar);
            }
        }

        public void OnFriendListUpdate(List<FriendInfo> friendList)
        {
        }

        public void OnCreatedRoom()
        {
        }

        public void OnCreateRoomFailed(short returnCode, string message)
        {
        }

        public void OnJoinRoomFailed(short returnCode, string message)
        {
        }

        public void OnJoinRandomFailed(short returnCode, string message)
        {
        }

        public void OnLeftRoom()
        {
        }

        #endregion
    }

In this example we instantiate the "LocalAvatar" locally first. If we successfully allocate a ViewID for the PhotonView, we raise the custom event; otherwise we log an error and destroy the locally instantiated prefab. Through the RaiseEventOptions we make sure that our custom manual instantiation event is stored in the room's cache (so that later-joining clients receive it, too) and is only sent to the other clients, because we have already instantiated our object locally. With SendOptions.SendReliable we make sure that the event is sent reliably. The RaiseEvent call itself uses InstantiateVrAvatarEventCode, a byte value that identifies this particular event.

In order to receive and handle custom events, we have two different possibilities. In this example we demonstrate just one of them and continue by implementing the IOnEventCallback interface. To see the other option, take a look at the RPCs and RaiseEvent page of the documentation. Having implemented the IOnEventCallback interface, our class looks similar to the following code snippet.

    public class NetworkManager : MonoBehaviour, IMatchmakingCallbacks, IOnEventCallback
    {
        // the rest of the class body, see above

        public void OnEvent(EventData photonEvent)
        {
        }
    }

We now have to add some logic to the OnEvent callback handler which makes sure that the Remote Avatar prefab is instantiated and set up correctly.

    public void OnEvent(EventData photonEvent)
    {
        if (photonEvent.Code == InstantiateVrAvatarEventCode)
        {
            GameObject remoteAvatar = Instantiate(Resources.Load("RemoteAvatar")) as GameObject;
            PhotonView photonView = remoteAvatar.GetComponent<PhotonView>();
            photonView.ViewID = (int) photonEvent.CustomData;
        }
    }

Here we simply check if the received event is our custom manual instantiation event. If so, we instantiate the "RemoteAvatar" prefab, get a reference to the object's PhotonView component, and assign the ViewID we received.

This instantiates the correct avatar on each connected client, regardless of whether the client has already joined the room or joins it afterwards.


Testing

You have two options to test the synchronization of the avatars.

The first one requires at least two separate computers, each having an Oculus device connected (Rift and Touch Controller). You can either start the game from the Unity Editor or create a build first and run it on both machines afterwards to see if the synchronization of the gesture looks fine.

Another approach is to build a second testing application. This can be a more or less 'blank' project (you still need the packages mentioned at the beginning of this page) where you simply place and rotate the main camera so that it looks at the point where the "RemoteAvatar" will be instantiated. In our case the camera should look at (0/0/0). Make sure to use the same AppId and AppVersion when connecting to Photon, and to have the OnEvent callback implemented and registered in order to handle the manual instantiation event. The advantage of this approach is that you only need one computer with one Oculus device (Rift and Touch Controllers).
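The network logic of such a test client can be a trimmed-down version of the NetworkManager shown earlier. A minimal sketch could look like the following; it assumes the same InstantiateVrAvatarEventCode value, a "RemoteAvatar" prefab in a Resources folder, and that the PhotonServerSettings asset carries the matching AppId and AppVersion (the class name TestClient is our own choice):

    using ExitGames.Client.Photon;
    using Photon.Pun;
    using Photon.Realtime;
    using UnityEngine;

    public class TestClient : MonoBehaviourPunCallbacks, IOnEventCallback
    {
        public const byte InstantiateVrAvatarEventCode = 1; // must match the main application

        public void Start()
        {
            // Connects using the AppId and AppVersion from the PhotonServerSettings asset.
            PhotonNetwork.ConnectUsingSettings();
        }

        public override void OnConnectedToMaster()
        {
            PhotonNetwork.JoinRandomRoom();
        }

        public override void OnJoinRandomFailed(short returnCode, string message)
        {
            // No open room yet: create one so the main application can join us instead.
            PhotonNetwork.CreateRoom(null);
        }

        public void OnEvent(EventData photonEvent)
        {
            if (photonEvent.Code == InstantiateVrAvatarEventCode)
            {
                // Same handling as in the main application's NetworkManager.
                GameObject remoteAvatar = Instantiate(Resources.Load("RemoteAvatar")) as GameObject;
                PhotonView photonView = remoteAvatar.GetComponent<PhotonView>();
                photonView.ViewID = (int)photonEvent.CustomData;
            }
        }
    }

Deriving from MonoBehaviourPunCallbacks takes care of registering and deregistering the class as a callback target in OnEnable and OnDisable, so the IOnEventCallback implementation is picked up automatically.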


Known Issues

InvalidCastException

If you run into issues with InvalidCastExceptions in OnPhotonSerializeView or DeserializeAndQueuePacketData(byte[] data), you can check out a workaround provided by our forum user cloud_canvas.

To apply this workaround, you have to open and modify the OvrAvatar class. First, add public bool Initialized = false;, for example at the end of the public field definitions. Then navigate to the CombinedMeshLoadedCallback(IntPtr assetPtr) method and add Initialized = true; at its end. This flag tells us about the state of the object's initialization. We will use it, together with the state of the Oculus Platform itself, to determine whether we are ready to send and receive packets with Avatar poses.
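As a sketch, the two changes inside OvrAvatar look like this; the exact surrounding code depends on your Oculus Integration version, so treat the placement as an example rather than a verbatim diff:

    // Inside the OvrAvatar class (Oculus Integration):

    // 1) Added at the end of the public field definitions.
    //    Tracks whether the combined mesh has finished loading.
    public bool Initialized = false;

    // 2) Added at the end of the existing callback
    //    (its original body is omitted here, shown for placement only).
    void CombinedMeshLoadedCallback(IntPtr assetPtr)
    {
        // ... existing mesh loading code ...

        Initialized = true;
    }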

We also have to make a few adjustments to our PhotonAvatarView class. In the OnLocalAvatarPacketRecorded method we currently only check whether we are in a room and whether there are at least two players in it. We will add the previously mentioned conditions to this check. Since we will use this condition in more than one place, we create a property for it and add it to the PhotonAvatarView class.

    private bool notReadyForSerialization
    {
        get
        {
            return (!PhotonNetwork.InRoom || (PhotonNetwork.CurrentRoom.PlayerCount < 2) || 
                    !Oculus.Platform.Core.IsInitialized() || !ovrAvatar.Initialized);
        }
    }

This property returns true if we are not ready to send or receive Avatar poses: when we are not in a room with at least two clients in it, or when either the Oculus Platform or our Avatar is not yet initialized.

To make use of this property, we navigate to the OnLocalAvatarPacketRecorded method again and replace the if (!PhotonNetwork.InRoom || (PhotonNetwork.CurrentRoom.PlayerCount < 2)) condition with the new if (notReadyForSerialization) condition.

Additionally, we add the same if (notReadyForSerialization) { return; } condition at the beginning of the DeserializeAndQueuePacketData(byte[] data) method.
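Applied to our PhotonAvatarView, the guarded methods might then look like this (a sketch based on the code above, with the unchanged bodies omitted):

    public void OnLocalAvatarPacketRecorded(object sender, OvrAvatar.PacketEventArgs args)
    {
        if (notReadyForSerialization)
        {
            return;
        }

        // ... serialization code as before ...
    }

    private void DeserializeAndQueuePacketData(byte[] data)
    {
        if (notReadyForSerialization)
        {
            return;
        }

        // ... deserialization code as before ...
    }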

You can now check whether this already solves the problem. If the issue persists, you can also try adding the same condition at the beginning of the OnPhotonSerializeView(PhotonStream stream, PhotonMessageInfo info) method.

Hint: since we have modified the OvrAvatar class, we have to re-apply these changes each time we update the Oculus Integration in our project (at least if this particular file has changed).
