Oculus Avatar

This guide shows you how to use the Oculus Avatar SDK with PUN. To get started, create a new Unity project and import the required packages: PUN 2 and the Oculus Integration.

Getting Started

When importing is complete, we can start extending the existing components. The first step is to navigate to 'Assets/Oculus/Avatar/Content/Prefabs', where you will find two prefabs: the 'LocalAvatar' and the 'RemoteAvatar'. For the next steps you can either use these two prefabs directly or create copies of them.

Important: both prefabs need to be placed inside a 'Resources' folder.
In our case a copy of each prefab is placed in 'Assets/Resources'.

Synchronizing The Avatar

The next step is to implement a script that will be observed by the PhotonView component (we will attach it later), which handles the synchronization across multiple clients. Therefore we create a new script, name it PhotonAvatarView, and add the following three references to its code:

    private PhotonView photonView;
    private OvrAvatar ovrAvatar;
    private OvrAvatarRemoteDriver remoteDriver;

Additionally we need a list of byte arrays in order to store data from our Avatar before actually sending it to other clients.

    private List<byte[]> packetData;

Using Unity's Start function, we can set up all of the previous references and objects.

    public void Start()
    {
        photonView = GetComponent<PhotonView>();

        if (photonView.IsMine)
        {
            ovrAvatar = GetComponent<OvrAvatar>();
            ovrAvatar.RecordPackets = true;
            ovrAvatar.PacketRecorded += OnLocalAvatarPacketRecorded;

            packetData = new List<byte[]>();
        }
        else
        {
            remoteDriver = GetComponent<OvrAvatarRemoteDriver>();
        }
    }

After getting the reference to our PhotonView component, we can use its IsMine property to draw a clear line between the 'Local' and the 'Remote' Avatar. If the instantiated object is ours, we get the reference to the OvrAvatar component, register an event handler that fires whenever a new packet is recorded, and instantiate the list of byte arrays that stores all Avatar-related input events before this data is sent across the network. If the object belongs to another client, we get the reference to the OvrAvatarRemoteDriver component, which is later used to replay the owner's input so that other clients see their gestures. Next we use Unity's OnDisable method to stop recording gesture packets.

    public void OnDisable()
    {
        if (photonView.IsMine)
        {
            ovrAvatar.RecordPackets = false;
            ovrAvatar.PacketRecorded -= OnLocalAvatarPacketRecorded;
        }
    }

Whenever a new packet is recorded, it gets serialized into a byte array, which PUN supports out of the box. Afterwards it is added to our previously created list and is then ready to be sent across the network. In order to avoid sending unnecessary data and to prevent a disconnect caused by exceeding the maximum message size, we first check whether the rest of the function needs to run at all. See below:

    private int localSequence;

    public void OnLocalAvatarPacketRecorded(object sender, OvrAvatar.PacketEventArgs args)
    {
        if (!PhotonNetwork.InRoom || (PhotonNetwork.CurrentRoom.PlayerCount < 2))
        {
            return;
        }

        using (MemoryStream outputStream = new MemoryStream())
        {
            BinaryWriter writer = new BinaryWriter(outputStream);

            var size = Oculus.Avatar.CAPI.ovrAvatarPacket_GetSize(args.Packet.ovrNativePacket);
            byte[] data = new byte[size];
            Oculus.Avatar.CAPI.ovrAvatarPacket_Write(args.Packet.ovrNativePacket, size, data);

            writer.Write(localSequence++);
            writer.Write(size);
            writer.Write(data);

            packetData.Add(outputStream.ToArray());
        }
    }

Since we now have a serializer, we also need a deserializer for the received packets. Thus our next task is to implement this function; in our example it is called DeserializeAndQueuePacketData:

    private void DeserializeAndQueuePacketData(byte[] data)
    {
        using (MemoryStream inputStream = new MemoryStream(data))
        {
            BinaryReader reader = new BinaryReader(inputStream);
            int remoteSequence = reader.ReadInt32();

            int size = reader.ReadInt32();
            byte[] sdkData = reader.ReadBytes(size);

            // Pass the SDK packet size here, not data.Length, which also counts the two header fields.
            System.IntPtr packet = Oculus.Avatar.CAPI.ovrAvatarPacket_Read((System.UInt32)size, sdkData);
            remoteDriver.QueuePacket(remoteSequence, new OvrAvatarPacket { ovrNativePacket = packet });
        }
    }

This function deserializes the incoming byte arrays and recreates the packet data, which is then queued on the OvrAvatarRemoteDriver component in order to replay the owner's gestures. The last coding task for this part is to add the exchange of the recorded packets. In this case we use OnPhotonSerializeView because it is called automatically at regular intervals, which results in regular updates and smooth-looking movement. To add and use this function correctly, it is mandatory to implement the IPunObservable interface; in our case the class declaration looks like this: public class PhotonAvatarView : MonoBehaviour, IPunObservable. This interface forces you to implement the OnPhotonSerializeView function.
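
For orientation, the skeleton of the class assembled from the snippets so far could look roughly like this (method bodies omitted; this is a sketch, not additional required code):

```csharp
// Sketch: skeleton of PhotonAvatarView, assembled from the snippets in this guide.
using System.Collections.Generic;
using Photon.Pun;
using UnityEngine;

public class PhotonAvatarView : MonoBehaviour, IPunObservable
{
    private PhotonView photonView;
    private OvrAvatar ovrAvatar;
    private OvrAvatarRemoteDriver remoteDriver;
    private List<byte[]> packetData;
    private int localSequence;

    public void Start() { /* shown earlier */ }
    public void OnDisable() { /* shown earlier */ }

    public void OnLocalAvatarPacketRecorded(object sender, OvrAvatar.PacketEventArgs args) { /* shown earlier */ }
    private void DeserializeAndQueuePacketData(byte[] data) { /* shown earlier */ }

    // Required by IPunObservable.
    public void OnPhotonSerializeView(PhotonStream stream, PhotonMessageInfo info) { }
}
```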

    public void OnPhotonSerializeView(PhotonStream stream, PhotonMessageInfo info)
    {
        if (stream.IsWriting)
        {
            if (packetData.Count == 0)
            {
                return;
            }

            stream.SendNext(packetData.Count);

            foreach (byte[] b in packetData)
            {
                stream.SendNext(b);
            }

            packetData.Clear();
        }

        if (stream.IsReading)
        {
            int num = (int)stream.ReceiveNext();

            for (int counter = 0; counter < num; ++counter)
            {
                byte[] data = (byte[])stream.ReceiveNext();

                DeserializeAndQueuePacketData(data);
            }
        }
    }

For a better understanding, we subdivide this function into two parts. The first one is the IsWriting branch, which is executed by the owner of the GameObject. It first checks whether there is any data to send at all. If there is, the first value sent is the number of packets; this information is important on the receiving side. Finally, we send all recorded and serialized packets and clear the list, because we no longer need this data.

The IsReading branch, however, is only executed on the remote clients, meaning those that don't own the object. It first checks how many packets have to be processed and then calls our previously implemented function to deserialize and queue each packet.

The last step is to attach a PhotonView component and our PhotonAvatarView script to both prefabs. Don't forget to add the PhotonAvatarView component to the PhotonView's list of observed components.

Instantiating Avatars

In order to instantiate our network Avatars, it is unfortunately not enough to simply call PhotonNetwork.Instantiate. The reason is that we have to instantiate two different Avatars: the 'LocalAvatar' for the instantiating player and the 'RemoteAvatar' for everybody else. Therefore we have to use Manual Instantiation.

The following code can be placed, for example, in your already existing Network Manager or any other script that deals with network logic; it does not belong to the PhotonAvatarView script we created earlier. Here we use the OnJoinedRoom callback.

    public override void OnJoinedRoom()
    {
        GameObject localAvatar = Instantiate(Resources.Load("LocalAvatar")) as GameObject;
        PhotonView photonView = localAvatar.GetComponent<PhotonView>();

        if (PhotonNetwork.AllocateViewID(photonView))
        {
            RaiseEventOptions raiseEventOptions = new RaiseEventOptions
            {
                CachingOption = EventCaching.AddToRoomCache,
                Receivers = ReceiverGroup.Others
            };

            SendOptions sendOptions = new SendOptions
            {
                Reliability = true
            };

            PhotonNetwork.RaiseEvent(InstantiateVrAvatarEventCode, photonView.ViewID, raiseEventOptions, sendOptions);
        }
        else
        {
            Debug.LogError("Failed to allocate a ViewId.");

            Destroy(localAvatar);
        }
    }

In this example we instantiate the Local Avatar locally first. If we successfully allocate an ID for the PhotonView, we define the RaiseEventOptions and the SendOptions. The RaiseEventOptions make sure that our custom Manual Instantiation event is stored in the room's cache (clients joining later will receive this event, too) and is only sent to the other clients, because we have already instantiated our object locally. The SendOptions make sure that our custom event is sent reliably. Afterwards we use the RaiseEvent function to send our custom Manual Instantiation event to the server. In this case we are using InstantiateVrAvatarEventCode, which is simply a byte value representing this particular event. If allocating an ID for the PhotonView fails, we log an error message and destroy the previously instantiated object.
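
Note that the snippets do not show a definition for InstantiateVrAvatarEventCode. A minimal definition could look like the following; the concrete value 1 is an arbitrary choice, it only has to be unique among your custom events and below 200, since Photon reserves the higher codes for internal events:

```csharp
// Arbitrary but unique identifier for our custom Manual Instantiation event.
// Photon reserves event codes 200 and above for internal use.
private const byte InstantiateVrAvatarEventCode = 1;
```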

In order to receive and handle custom events, we have two different possibilities. In this example we demonstrate just one of them and continue with implementing the IOnEventCallback interface. To see what the other option is, you can take a look at the RPCs and RaiseEvent page of the documentation. Having implemented the IOnEventCallback interface, our class looks similar to the following code snippet.

    public class MyClass : MonoBehaviourPunCallbacks, IOnEventCallback
    {
        public void OnEvent(EventData photonEvent) { }
    }

We now have to add some logic to the OnEvent callback handler which makes sure that the Remote Avatar prefab is instantiated and set up correctly.

    public void OnEvent(EventData photonEvent)
    {
        if (photonEvent.Code == InstantiateVrAvatarEventCode)
        {
            GameObject remoteAvatar = Instantiate(Resources.Load("RemoteAvatar")) as GameObject;
            PhotonView photonView = remoteAvatar.GetComponent<PhotonView>();
            photonView.ViewID = (int) photonEvent.CustomData;
        }
    }

Here we simply check if the received event is our custom Manual Instantiation event. If it is, we instantiate the Remote Avatar prefab, get a reference to the object's PhotonView component, and assign the ViewID we have received.

This will instantiate the correct Avatar on each connected client, regardless of whether the client has already joined the room or joins it afterwards. We now have to make sure that our custom events are processed at all. To do so, we register our previously implemented OnEvent callback with the help of Unity's OnEnable and OnDisable functions (the latter so that we clean up afterwards).

    public void OnEnable()
    {
        PhotonNetwork.AddCallbackTarget(this);
    }

    public void OnDisable()
    {
        PhotonNetwork.RemoveCallbackTarget(this);
    }

One last thing: make sure that the Avatar gets destroyed on each client when a player leaves the game. Also don't forget to remove the stored event from the room's cache.
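
How you implement this cleanup depends on your project. A possible sketch, assuming your manager class keeps its own dictionary of instantiated Remote Avatars keyed by ActorNumber (this dictionary is our own addition and not part of the code above), could look like this:

```csharp
// Sketch: cleaning up Avatars and the cached instantiation event.
// 'remoteAvatars' is an assumed Dictionary<int, GameObject> that maps a
// player's ActorNumber to the RemoteAvatar instantiated in OnEvent.
public override void OnPlayerLeftRoom(Photon.Realtime.Player otherPlayer)
{
    GameObject avatar;

    // Destroy the Remote Avatar that belonged to the leaving player.
    if (remoteAvatars.TryGetValue(otherPlayer.ActorNumber, out avatar))
    {
        Destroy(avatar);
        remoteAvatars.Remove(otherPlayer.ActorNumber);
    }
}

// Called before leaving the room ourselves: removes our cached
// instantiation event so that clients joining later no longer receive it.
public void LeaveGame()
{
    RaiseEventOptions raiseEventOptions = new RaiseEventOptions
    {
        CachingOption = EventCaching.RemoveFromRoomCache
    };

    PhotonNetwork.RaiseEvent(InstantiateVrAvatarEventCode, null, raiseEventOptions, SendOptions.SendReliable);
    PhotonNetwork.LeaveRoom();
}
```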

Testing

You have two options to test the synchronization of the Avatars.

The first one requires at least two separate computers, each with an Oculus device connected (Rift and Touch Controllers). You can either start the game from the Unity Editor or create a build first and run it on both machines afterwards to see if the synchronization of the gestures looks fine.

Another approach is to build a second testing application. This can be a more or less 'blank' project (you still need the packages mentioned at the beginning of this page) where you simply place and rotate the main camera so that it looks at the point where the RemoteAvatar will be instantiated; in our case the camera should look at (0,0,0). Make sure to use the same AppId and AppVersion when connecting to Photon, and to have the OnEvent callback implemented and registered in order to handle the Instantiation event. The advantage of this approach is that you only need one computer with one Oculus device (Rift and Touch Controllers).

Known Issues

InvalidCastException

If you run into issues with InvalidCastExceptions in OnPhotonSerializeView or DeserializeAndQueuePacketData(byte[] data), you can check out a workaround provided by our forum user cloud_canvas.

To apply this workaround, you have to open and modify the OvrAvatar class. The first thing to do is to add public bool Initialized = false;. You can add this line at the end of the public field definitions, for example. Afterwards, navigate to the CombinedMeshLoadedCallback(IntPtr assetPtr) function and add Initialized = true; at its end. This flag tells us whether the object's initialization has completed. We will use it, together with the state of the Oculus Platform itself, to determine whether we are ready to send and receive packets with Avatar poses.
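
In condensed form, the modification described above looks roughly like this; everything except the two added lines is existing Oculus Integration code and abbreviated here:

```csharp
public class OvrAvatar : MonoBehaviour
{
    // ... existing public fields ...
    public bool Initialized = false;    // added for the workaround

    void CombinedMeshLoadedCallback(IntPtr assetPtr)
    {
        // ... existing mesh loading logic ...
        Initialized = true;             // added for the workaround
    }
}
```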

To do this, we have to make a few adjustments to our PhotonAvatarView class. In the OnLocalAvatarPacketRecorded function we currently only check if we are in a room and if there are at least two players in it; we will add the previously mentioned conditions as well. Since we will use this combined condition more often, we create a property for it and add it to the PhotonAvatarView class.

    private bool notReadyForSerialization
    {
        get
        {
            return (!PhotonNetwork.InRoom || (PhotonNetwork.CurrentRoom.PlayerCount < 2) || 
                    !Oculus.Platform.Core.IsInitialized() || !ovrAvatar.Initialized);
        }
    }

This property returns false if we are in a room with at least two clients in it and both the Oculus Platform and our Avatar are initialized; only in this case are we ready to send or receive Avatar poses. Otherwise it returns true and serialization is skipped.

To make use of this property, we navigate to the OnLocalAvatarPacketRecorded function again and replace the if (!PhotonNetwork.InRoom || (PhotonNetwork.CurrentRoom.PlayerCount < 2)) condition with our new if (notReadyForSerialization) condition.

Additionally we add the same if (notReadyForSerialization) { return; } condition at the beginning of the DeserializeAndQueuePacketData(byte[] data) function.
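
After both changes, the guards in PhotonAvatarView look like this (function bodies abbreviated):

```csharp
public void OnLocalAvatarPacketRecorded(object sender, OvrAvatar.PacketEventArgs args)
{
    if (notReadyForSerialization)
    {
        return;
    }

    // ... serialize and store the packet as before ...
}

private void DeserializeAndQueuePacketData(byte[] data)
{
    if (notReadyForSerialization)
    {
        return;
    }

    // ... deserialize and queue the packet as before ...
}
```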

You can now check, if this already solves the problem. If the issue persists, you can also try to add the above mentioned condition at the beginning of the OnPhotonSerializeView(PhotonStream stream, PhotonMessageInfo info) function.

Hint: since we have modified the OvrAvatar class, we have to reapply these changes each time we update the Oculus Integration in our project (at least if this particular file has changed).