Oculus Avatar

This guide shows you how to use the Oculus Avatar SDK with PUN. Start with a new Unity project and import two packages: the Oculus Avatar SDK and PUN.

Getting started

When importing is complete, we can start extending the existing components. The first step is to navigate to 'Assets/OvrAvatar/Content/Prefabs', where you will find two prefabs: the 'LocalAvatar' and the 'RemoteAvatar'. For the next steps you can either use these two prefabs directly or create copies of them.

Important: both prefabs need to be placed inside a 'Resources' folder.
In our case a copy of each prefab is placed in 'Assets/Resources'.

Synchronizing the Avatar

The next step is to implement a script that will be observed by a PhotonView component (we will attach it later), which handles the synchronization across multiple clients. Create a new script, name it PhotonAvatarView, and add the following three references to its code:
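A minimal sketch of these fields, assuming the component names from the Oculus Avatar SDK's Unity integration:

```csharp
// Inside the PhotonAvatarView class:
private PhotonView photonView;              // handles network synchronization
private OvrAvatar ovrAvatar;                // local avatar, records input packets
private OvrAvatarRemoteDriver remoteDriver; // remote avatar, replays received packets
```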

Additionally we need a list of byte-arrays in order to store data from our Avatar before actually sending it to other clients.
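For example (this requires using System.Collections.Generic):

```csharp
// Buffers serialized avatar packets until they are sent via OnPhotonSerializeView.
private List<byte[]> packetData;
```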

By using Unity's Awake function we can set up all previous references and objects.
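A sketch of this Awake function, matching the branching explained in the next paragraph:

```csharp
public void Awake()
{
    photonView = GetComponent<PhotonView>();

    if (photonView.isMine)
    {
        // Our own avatar: grab the OvrAvatar component and prepare the packet buffer.
        ovrAvatar = GetComponent<OvrAvatar>();
        packetData = new List<byte[]>();
    }
    else
    {
        // Another client's avatar: grab the driver that replays received packets.
        remoteDriver = GetComponent<OvrAvatarRemoteDriver>();
    }
}
```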

Hint: if you are running into NullReferenceExceptions with this code, try moving the content of the Awake function shown above into a new Start function. Afterwards, move the content of the following OnEnable function into the newly created Start function as well. The following OnDisable function is not affected by this.

After getting the reference to our PhotonView component, we can use its isMine condition to cleanly distinguish between the 'LocalAvatar' and the 'RemoteAvatar'. If the instantiated object is ours, we get the reference to the OvrAvatar component (we use this in the next step) and also instantiate the list of byte-arrays, which stores all Avatar-related input events before this data is sent across the network. If the object belongs to another client, we get the reference to the OvrAvatarRemoteDriver component, which is later used to replay our input so that other clients see our gestures. Next we need Unity's OnEnable and OnDisable methods, which we use to start and stop recording the packets that contain our gestures.
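A sketch of these two methods, assuming the RecordPackets flag and the PacketRecorded event exposed by the SDK's OvrAvatar component; OnLocalAvatarPacketRecorded is the handler implemented in the next step:

```csharp
public void OnEnable()
{
    if (photonView.isMine)
    {
        // Start recording avatar packets and get notified about each new one.
        ovrAvatar.RecordPackets = true;
        ovrAvatar.PacketRecorded += OnLocalAvatarPacketRecorded;
    }
}

public void OnDisable()
{
    if (photonView.isMine)
    {
        // Stop recording and unsubscribe again.
        ovrAvatar.RecordPackets = false;
        ovrAvatar.PacketRecorded -= OnLocalAvatarPacketRecorded;
    }
}
```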

We also set an event handler which is fired whenever a new packet has been recorded. In this handler, the packet gets serialized into a byte-array, a type PUN supports out of the box. Afterwards it is added to our previously created list and is then ready to be sent across the network. See below:
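A sketch of this handler, modeled on the RemoteLoopbackManager sample that ships with the Oculus Avatar SDK; it requires using System.IO, and localSequence is a helper field we introduce to keep the packets in order:

```csharp
private int localSequence; // helper: sequence number prefixed to every packet

public void OnLocalAvatarPacketRecorded(object sender, OvrAvatar.PacketEventArgs args)
{
    using (MemoryStream outputStream = new MemoryStream())
    {
        BinaryWriter writer = new BinaryWriter(outputStream);

        // Prefix the packet with its sequence number, then serialize the packet itself.
        writer.Write(localSequence++);
        args.Packet.Write(outputStream);

        // Store the byte-array until OnPhotonSerializeView sends it.
        packetData.Add(outputStream.ToArray());
    }
}
```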

Since we have a serializer now, we also need a deserializer for the received packets. Thus our next task is to implement this function. In our example it is called DeserializeAndQueuePacketData:
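A sketch of this function, again following the SDK's loopback sample:

```csharp
private void DeserializeAndQueuePacketData(byte[] data)
{
    using (MemoryStream inputStream = new MemoryStream(data))
    {
        BinaryReader reader = new BinaryReader(inputStream);

        // Read the sequence number first, then reconstruct the packet.
        int sequence = reader.ReadInt32();
        OvrAvatarPacket packet = OvrAvatarPacket.Read(inputStream);

        // Queue the packet on the remote driver so the gesture gets replayed.
        remoteDriver.QueuePacket(sequence, packet);
    }
}
```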

This function deserializes the incoming byte-arrays and recreates the packet data, which is then queued on the OvrAvatarRemoteDriver component in order to replay the gesture. The last coding task for this part is the exchange of the recorded packets. In this case we use OnPhotonSerializeView because it is called automatically on a regular basis, which results in regular updates and smooth-looking gestures.

For a better understanding we subdivide this into two parts. The first one is the isWriting condition, which is executed by the owner of the GameObject. It first checks if there is any data to send at all. If there is, the first value sent is the number of packets; this information is important on the receiving side. Afterwards we send all the recorded and serialized packets and clear the list, because the previous packet data is no longer needed.

The isReading condition, however, is only executed on the remote clients, those who don't own the object. It first checks how many packets have to be processed and then calls our previously implemented function to deserialize and queue all packet data step by step. Both conditions are shown in the sketch below.
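A sketch of the complete function:

```csharp
public void OnPhotonSerializeView(PhotonStream stream, PhotonMessageInfo info)
{
    if (stream.isWriting)
    {
        // Nothing recorded since the last update: nothing to send.
        if (packetData.Count == 0)
        {
            return;
        }

        // Send the number of packets first so the receiver knows how many to read.
        stream.SendNext(packetData.Count);

        foreach (byte[] data in packetData)
        {
            stream.SendNext(data);
        }

        // The sent packets are no longer needed locally.
        packetData.Clear();
    }

    if (stream.isReading)
    {
        int numberOfPackets = (int)stream.ReceiveNext();

        for (int i = 0; i < numberOfPackets; ++i)
        {
            DeserializeAndQueuePacketData((byte[])stream.ReceiveNext());
        }
    }
}
```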

The last step is to attach a PhotonView component and our PhotonAvatarView script to both prefabs. Don't forget to add the PhotonAvatarView component to the observed components of the PhotonView.

Instantiating Avatars

In order to instantiate our network Avatars it is unfortunately not enough to simply call PhotonNetwork.Instantiate, because we have to instantiate two different Avatars: the 'LocalAvatar' for the instantiating player and the 'RemoteAvatar' for everybody else. Therefore we have to use Manual Instantiation. The following code can be placed, for example, in your existing Network Manager or any other script that already deals with network logic; it does not belong to the PhotonAvatarView script we created earlier. In our example we first allocate a ViewId for our Avatar and then use the RaiseEvent function with caching enabled in order to notify other clients.
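A sketch of this step; the event code value and the function name are hypothetical and up to you (PUN reserves event codes of 200 and above for internal use):

```csharp
// Hypothetical custom event code; any value below 200 works in PUN.
private const byte InstantiateVrAvatarEventCode = 123;

public void InstantiateAvatar()
{
    // Manually allocate the ViewId that will identify the avatar on all clients.
    int viewId = PhotonNetwork.AllocateViewID();

    // Cache the event in the room so that late-joining clients receive it, too.
    RaiseEventOptions raiseEventOptions = new RaiseEventOptions
    {
        CachingOption = EventCaching.AddToRoomCache,
        Receivers = ReceiverGroup.All
    };

    PhotonNetwork.RaiseEvent(InstantiateVrAvatarEventCode, viewId, true, raiseEventOptions);
}
```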

We now need an OnEvent callback handler which makes sure that the correct prefab is instantiated. We therefore compare the ID of the sender with the local client's ID: if they match, this client raised the event itself and needs to instantiate the 'LocalAvatar' prefab; otherwise the client has to instantiate the 'RemoteAvatar' prefab. To learn more about the usage of the RaiseEvent function and the corresponding OnEvent callback, you can check out the RPCs and RaiseEvent page of the documentation.
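A sketch of this callback; it assumes the prefab copies are located directly in 'Assets/Resources' as described above:

```csharp
public void OnEvent(byte eventCode, object content, int senderId)
{
    if (eventCode != InstantiateVrAvatarEventCode)
    {
        return;
    }

    GameObject avatar;

    if (PhotonNetwork.player.ID == senderId)
    {
        // We raised this event ourselves: instantiate the 'LocalAvatar'.
        avatar = Instantiate(Resources.Load<GameObject>("LocalAvatar"));
    }
    else
    {
        // Another client raised it: instantiate its 'RemoteAvatar'.
        avatar = Instantiate(Resources.Load<GameObject>("RemoteAvatar"));
    }

    // Apply the ViewId that was allocated before raising the event.
    PhotonView photonView = avatar.GetComponent<PhotonView>();
    photonView.viewID = (int)content;
}
```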

This will instantiate the correct Avatar on each connected client, regardless of whether the client has already joined the room or joins it afterwards. We now have to make sure that our custom event actually gets handled. To do so we register our previously implemented OnEvent callback, which can be done with the help of Unity's OnEnable and OnDisable functions (we want to clean up afterwards).
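A sketch of this registration, placed in the same script as the OnEvent callback:

```csharp
public void OnEnable()
{
    PhotonNetwork.OnEventCall += this.OnEvent;
}

public void OnDisable()
{
    PhotonNetwork.OnEventCall -= this.OnEvent;
}
```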

One last thing: make sure that the Avatar gets destroyed on each client when a player leaves the game. A possible approach is sketched below.
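How you do this depends on your project; one possible sketch, assuming the script derives from Photon.PunBehaviour (so it receives PUN's callbacks) and that you additionally store each instantiated avatar in a dictionary keyed by the owner's player ID, for example inside the OnEvent callback above:

```csharp
// Hypothetical bookkeeping, filled when handling the instantiation event:
// avatars[senderId] = avatar;
private Dictionary<int, GameObject> avatars = new Dictionary<int, GameObject>();

// Called by PUN on all remaining clients when a player leaves the room.
public override void OnPhotonPlayerDisconnected(PhotonPlayer otherPlayer)
{
    GameObject avatar;

    if (avatars.TryGetValue(otherPlayer.ID, out avatar))
    {
        Destroy(avatar);
        avatars.Remove(otherPlayer.ID);
    }
}
```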

Testing

You have two options to test the synchronization of the Avatars.

The first one requires at least two separate computers, each with an Oculus device connected (Rift and Touch Controllers). You can either start the game from the Unity Editor or create a build first and run it on both machines to see if the synchronization of the gestures looks fine.

Another approach is to build a second testing application. This can be a more or less 'blank' project (you still need the packages mentioned at the beginning of this page) where you simply place and rotate the main camera so that it looks at the point where the 'RemoteAvatar' will be instantiated. In our case the camera should look at (0, 0, 0). Make sure to use the same AppId and AppVersion when connecting to Photon, and to have the OnEvent callback implemented and registered in order to handle the instantiation event. The advantage of this approach is that you only need one computer with one Oculus device (Rift and Touch Controllers).
