Developer Guide on Synchronising Transform in AR (Vuforia) using Fusion

Augmented reality has been around for years, with games like Pokemon Go putting AR in the hands of millions of players worldwide. However, the variety of ways AR can be applied is still being explored, and multiplayer support remains a relatively new and under-documented aspect of AR.

Synchronising the Transform, meaning the position, rotation, and scale of an object, can be a challenge in multiplayer AR because of the need to work with local values instead of global ones. If global values are synchronised instead, two players may see the same rotation of an AR object regardless of where they are standing, rather than seeing different angles depending on their position relative to the object.

This tutorial will go through three potential solutions to synchronising the Transform in multiplayer AR. It is worth mentioning that when testing these three solutions, we were using the Unity game engine v2022.3.2, with Photon Fusion and Vuforia as external packages (libraries). To find out more about Fusion follow the link here, and to find out more about Vuforia follow this link.

Synchronising Transform in AR (Vuforia) using Fusion

In all three approaches, the user who intends to move an object must have State Authority over it. This means that either the user must have spawned the object in the first place, or they need to request State Authority on the network object by invoking the “RequestStateAuthority” method. To enable this, the network object must be configured to allow State Authority transfer by ticking the “Allow State Authority Override” property.

A screenshot of the Network Object settings in Unity

_networkObject.RequestStateAuthority();
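As a hedged sketch of how this might be used (assuming Fusion's Shared Mode, where state authority can be transferred at runtime; the method name `BeginMove` is our own), a move interaction can check for authority before modifying the object. Note that the transfer is asynchronous, so authority is not guaranteed to be granted on the same tick it is requested:

```
// Sketch only: assumes Fusion Shared Mode. "_networkObject" is the
// NetworkObject the user is about to move.
public void BeginMove()
{
   if (!_networkObject.HasStateAuthority)
   {
      // Asynchronous: authority may not take effect until a later tick,
      // so writes to networked state should wait until it is confirmed.
      _networkObject.RequestStateAuthority();
   }
}
```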


Option No.1: Using Global Values

This solution only works if there is a single active image target. If you are using multiple active image targets, you will need to look at using either option 2 or 3.

If there is only one active image target in the scene at any given time, we can utilise global values with the default “Network Transform” or “Network Position Rotation” on our Network Objects to synchronise their positions and rotations. However, it’s important to note that this method does not support scale synchronisation. To implement this, we must ensure that the “World Center Mode” value of the “Vuforia Behaviour” component is set to “SPECIFIC_TARGET” and assign our target to the “World Center” variable. It’s worth mentioning that if we require interpolation, it is necessary to use “Network Transform” instead of “Network Position Rotation”.

3 screenshots of the Network Transform and Network Position Rotation settings in Unity

Option No.2: Inherit Fusion Components and Override required methods

If multiple simultaneous image targets are required in the scene, certain steps need to be followed. Firstly, the maximum number of simultaneously tracked targets should be specified in the Vuforia configuration (Assets\Resources\VuforiaConfiguration).

A screenshot of various settings in Unity

Next, in the “Network Transform” or “Network Position Rotation” components, specific methods must be overridden to track local values instead of global ones.


public class NetworkLocalTransform : NetworkTransform
{
   private Vector3 _targetStartingLocalPosition;
   private Quaternion _targetStartingLocalRotation;


   protected override void Awake()
   {
      // Cache the interpolation target's starting local values so they can
      // be restored in Spawned(), and disable interpolation, since Fusion's
      // interpolation only applies to global values.
      _targetStartingLocalPosition = InterpolationTarget.localPosition;
      _targetStartingLocalRotation = InterpolationTarget.localRotation;
      InterpolationDataSource = InterpolationDataSources.NoInterpolation;
      base.Awake();
   }


   protected override void SetEnginePosition(Vector3 pos)
   {
      this.Transform.localPosition = pos;
   }


   protected override Vector3 GetEnginePosition()
   {
      return this.Transform.localPosition;
   }


   protected override void SetEngineRotation(Quaternion rot)
   {
      this.Transform.localRotation = rot;
   }


   protected override Quaternion GetEngineRotation()
   {
      return this.Transform.localRotation;
   }


   public override void Spawned()
   {
      base.Spawned();
      InterpolationTarget.localPosition = _targetStartingLocalPosition;
      InterpolationTarget.localRotation = _targetStartingLocalRotation;
   }
}



public class NetworkLocalPositionRotation : NetworkPositionRotation
{ 
   protected override void SetEnginePosition(Vector3 pos)
   {
      this.Transform.localPosition = pos;
   }


   protected override Vector3 GetEnginePosition()
   {
      return this.Transform.localPosition;
   }


   protected override void SetEngineRotation(Quaternion rot)
   {
      this.Transform.localRotation = rot;
   }


   protected override Quaternion GetEngineRotation()
   {
      return this.Transform.localRotation;
   }
}


It is advisable to designate one of the targets as the world centre. If “Network Transform” is used, the interpolation data source on the components should be set to “No Interpolation”, since interpolation only applies to global values. Additionally, the starting position and rotation of the “Interpolation Target” should be restored in the “Spawned” method.

Option No.3: Use Network Variables

In order to incorporate local scale and establish a comprehensive solution, we must employ networked variables to synchronise the local position, rotation, and scale of objects.


public class NetworkedObjectLocalTransform : NetworkBehaviour
{
   protected NetworkObject NetworkObject;


   protected Vector3 StartingLocalPosition;
   protected Quaternion StartingLocalRotation;
   protected Vector3 StartingLocalScale;


   [Networked(OnChanged = nameof(NetworkedLocalPositionChanged), Default = nameof(StartingLocalPosition))]
   protected Vector3 NetworkedLocalPosition { get; set; }


   [Networked(OnChanged = nameof(NetworkedLocalRotationChanged), Default = nameof(StartingLocalRotation))]
   protected Quaternion NetworkedLocalRotation { get; set; }


   [Networked(OnChanged = nameof(NetworkedLocalScaleChanged), Default = nameof(StartingLocalScale))]
   protected Vector3 NetworkedLocalScale { get; set; }


   protected virtual void Awake()
   {
      var t = transform;
      StartingLocalPosition = t.localPosition;
      StartingLocalRotation = t.localRotation;
      StartingLocalScale = t.localScale;
   }


   public override void Spawned()
   {
      base.Spawned();
      NetworkObject = GetComponent<NetworkObject>();
      SetAllNetworkVariables();
   }


   private static void NetworkedLocalPositionChanged(Changed<NetworkedObjectLocalTransform> changed)
   {
      changed.Behaviour.NetworkedLocalPositionChangedBehaviour();
   }


   private static void NetworkedLocalRotationChanged(Changed<NetworkedObjectLocalTransform> changed)
   {
      changed.Behaviour.NetworkedLocalRotationChangedBehaviour();
   }


   private static void NetworkedLocalScaleChanged(Changed<NetworkedObjectLocalTransform> changed)
   {
      changed.Behaviour.NetworkedLocalScaleChangedBehaviour();
   }


   private void NetworkedLocalPositionChangedBehaviour()
   {
      transform.localPosition = NetworkedLocalPosition;
   }


   private void NetworkedLocalRotationChangedBehaviour()
   {
      transform.localRotation = NetworkedLocalRotation;
   }


   private void NetworkedLocalScaleChangedBehaviour()
   {
      transform.localScale = NetworkedLocalScale;
   }


   private void SetAllNetworkVariables()
   {
      NetworkedLocalPositionChangedBehaviour();
      NetworkedLocalRotationChangedBehaviour();
      NetworkedLocalScaleChangedBehaviour();
   }
}


Each user is required to request State Authority when they intend to modify these values. Rather than directly altering the Transform itself, users modify the networked variables; when these variables change, the changes are replicated to all users.
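For example, a public helper could be added to the NetworkedObjectLocalTransform class above so that callers always go through the networked variables. This is a sketch under assumptions: the method name is ours, and because Fusion's authority transfer is asynchronous, a production version may need to defer the writes until authority is confirmed:

```
// Hypothetical helper on NetworkedObjectLocalTransform: request state
// authority, then write the networked variables instead of the Transform.
public void ApplyLocalTransform(Vector3 position, Quaternion rotation, Vector3 scale)
{
   if (!NetworkObject.HasStateAuthority)
      NetworkObject.RequestStateAuthority();

   NetworkedLocalPosition = position;
   NetworkedLocalRotation = rotation;
   NetworkedLocalScale = scale;
}
```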

Further Reading

Synchronising the position and rotation of an object isn’t the only aspect that needs syncing across different devices. Animation, rigidbodies, and physics also need to be considered. Luckily, these are also problems that we have found solutions for.

Synchronising Animations in AR Using Fusion

For achieving synchronisation of animations in augmented reality (AR), we employ the same approach of utilising networked variables. Here, it is necessary to store the animation state and any animation parameters within these network variables.


public class NetworkedAnimator : NetworkBehaviour
{
   private Animator _animator;
   private NetworkObject _networkObject;
   private bool _blockStateDefaultValue;
   private static readonly int Block = Animator.StringToHash("Block");


   [Networked(OnChanged = nameof(BlockStateChanged), Default = nameof(_blockStateDefaultValue))]
   private bool BlockState { get; set; }


   protected void Awake()
   {
      _animator = GetComponentInChildren<Animator>();
      _networkObject = GetComponent<NetworkObject>();
      _blockStateDefaultValue = false;
   }


   private void TriggerAnimation()
   {
      _networkObject.RequestStateAuthority();
      BlockState = !BlockState;
   }


   private static void BlockStateChanged(Changed<NetworkedAnimator> changed)
   {
      changed.Behaviour.BlockStateChangedBehaviour();
   }


   private void BlockStateChangedBehaviour()
   {
      _animator.SetBool(Block, BlockState);
   }


   private void SetAllNetworkVariables()
   {
      BlockStateChangedBehaviour();
   }
}


Synchronising Rigidbody and Physics in AR Using Fusion

In order to ensure synchronisation between rigidbody and physics, we delegate control of the rigidbody to the user with state authority. Then, by monitoring changes in position, rotation, and scale, we apply one of the three previously described methods to synchronise other user objects. Additionally, we must address any modifications to the state authority to properly handle the situation.


public class NetworkLocalPositionRotationRigidBody : NetworkPositionRotation, IStateAuthorityChanged
{
   [SerializeField] private DefaultObserverEventHandler defaultObserverEventHandler;


   private Rigidbody _rigidbody;
   private bool _targetLost;


   protected override void Awake()
   {
      base.Awake();
      _rigidbody = GetComponent<Rigidbody>();
      if (defaultObserverEventHandler)
      {
         defaultObserverEventHandler.OnTargetFound.AddListener(SetKinematicFound);
         defaultObserverEventHandler.OnTargetLost.AddListener(SetKinematicLost);
      }
   }


   private void OnDestroy()
   {
      if (defaultObserverEventHandler)
      {
         defaultObserverEventHandler.OnTargetFound.RemoveListener(SetKinematicFound);
         defaultObserverEventHandler.OnTargetLost.RemoveListener(SetKinematicLost);
      }
   }


   private void SetKinematicFound()
   {
      // Only the user with State Authority simulates physics locally.
      if (_rigidbody)
         _rigidbody.isKinematic = !HasStateAuthority;


      _targetLost = false;
   }


   private void SetKinematicLost()
   {
      if (_rigidbody)
         _rigidbody.isKinematic = true;


      _targetLost = true;
   }


   protected override void SetEnginePosition(Vector3 pos)
   {
      Transform.localPosition = pos;
   }


   protected override Vector3 GetEnginePosition()
   {
      return Transform.localPosition;
   }


   protected override void SetEngineRotation(Quaternion rot)
   {
      Transform.localRotation = rot;
   }


   protected override Quaternion GetEngineRotation()
   {
      return Transform.localRotation;
   }


   public void StateAuthorityChanged()
   {
      if (_targetLost)
         SetKinematicLost();
      else
         SetKinematicFound();
   }
}


We should remember to assign the “DefaultObserverEventHandler” of our Image Target object to this component inside the editor.

A screenshot of Network Local Position Rotation settings in Unity

Hopefully this tutorial has helped with your own AR project. If you’d like to find out more about our experience with AR applications, follow the link here.

Are you thinking about getting a bespoke AR application made for your business? Here at Sliced Bread we have a huge range of experience in extended reality projects, contact us to find out more about how we can help you with your AR and VR needs.
