MultiRoom PointerEvents do not trigger

Brief description

In an MRS setup, while using the Coherence SceneLoader to load scenes, Unity pointer events such as IPointerDownHandler and IPointerEnterHandler (e.g. OnPointerEnter(PointerEventData eventData)) do not fire.

I have a sample project illustrating this issue here:

Details and error message(s)

Reproduction steps:

  1. Play with SampleMenuScene loaded
  2. Create and join a room using the Coherence UI – this will automatically create a sim, which in turn instantiates a test cube that SHOULD change color when highlighted
  3. Note there are 2 cubes in the scene at this point. One is part of the menu scene and always exists (slightly off to the left of the screen); this cube changes color as expected when highlighted. The cube in the center, which is instantiated by the sim upon gaining authority, does not change color, because the EventSystem does not seem to be registering it as an object
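For reference, the highlight behaviour on the cubes is the standard uGUI pointer-handler pattern. This is a hypothetical sketch, not the exact script from the linked project; it assumes an EventSystem in the scene, a PhysicsRaycaster on the camera, and a Collider on the cube:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Hypothetical sketch of a cube highlight script. Pointer events on 3D
// objects require an EventSystem plus a PhysicsRaycaster on the camera.
public class HighlightOnHover : MonoBehaviour, IPointerEnterHandler, IPointerExitHandler
{
    [SerializeField] private Color highlightColor = Color.yellow;
    private Color originalColor;
    private Renderer cubeRenderer;

    private void Awake()
    {
        cubeRenderer = GetComponent<Renderer>();
        originalColor = cubeRenderer.material.color;
    }

    public void OnPointerEnter(PointerEventData eventData) =>
        cubeRenderer.material.color = highlightColor;

    public void OnPointerExit(PointerEventData eventData) =>
        cubeRenderer.material.color = originalColor;
}
```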

Expected behaviour

I also have a button in the scene labelled “Manual Scene Load”. You can look in SceneLoadButton.cs to see exactly what it does, but it basically mimics how Coherence loads scenes additively.

  • Play with SampleMenuScene loaded
  • Click the “Manual Scene Load” button; this will load the game scene and make it active
  • There are now 2 cubes, same as before, but both change color when highlighted (both are triggering pointer events)
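The manual load described above presumably amounts to a plain additive scene load. A hedged sketch of what SceneLoadButton.cs might look like (scene name and method name are assumptions, not taken from the project):

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Hypothetical sketch of SceneLoadButton.cs: a plain additive load that
// bypasses CoherenceSceneLoader, then makes the loaded scene active.
public class SceneLoadButton : MonoBehaviour
{
    [SerializeField] private string gameSceneName = "SampleGameScene";

    // Wired to the "Manual Scene Load" button's OnClick.
    public void LoadGameScene()
    {
        var op = SceneManager.LoadSceneAsync(gameSceneName, LoadSceneMode.Additive);
        op.completed += _ =>
            SceneManager.SetActiveScene(SceneManager.GetSceneByName(gameSceneName));
    }
}
```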


SDK: 0.10.4 (this is the version in the repo I linked, but I also upgraded to 0.10.9 locally afterward, and it had no effect on the above)
Unity: 2022.2.1f1
OS/Platform: Win10


That’s a very good report, thanks for all the detail!
I’ll add a ticket for this.


Hey, can you check on the scene that you load through the CoherenceSceneLoader, that there’s a CoherenceScene component and that it’s not disabling your EventSystem? (check on CoherenceScene’s inspector).


Thanks for your response! Just so we’re on the same page: the initial scene is the Menu scene, and the scene being loaded is the Game scene.

  • I’ve tried letting CoherenceScene disable the EventSystem automatically (this seems to be the default behavior, since Smart Select will select the EventSystem). This was my initial setup.
  • To work around it, I tried keeping the EventSystem in the Game scene enabled (this causes the “2 Event Systems are active” error, so I manually disable the EventSystem in the Menu scene via the Unity Editor), but this did not fix the issue.
  • I also tried scripting the hand-off: disabling the Menu scene EventSystem and enabling the Game scene EventSystem during the CoherenceSceneLoader.onLoaded event, but this did not fix the issue either.
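The scripted hand-off in the last bullet could look roughly like this. Treat it as a sketch of the approach only: the exact signature of CoherenceSceneLoader.onLoaded may differ between SDK versions, so the handler below just receives the loaded scene as an assumption:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.SceneManagement;

// Sketch of the EventSystem hand-off attempted on scene load: keep only the
// EventSystem that lives in the freshly loaded Game scene enabled.
public static class EventSystemSwitcher
{
    public static void OnGameSceneLoaded(Scene gameScene)
    {
        foreach (var es in Object.FindObjectsOfType<EventSystem>(true))
            es.gameObject.SetActive(es.gameObject.scene == gameScene);
    }
}
```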

Hey! Sorry that it took me this long to get back to you.

I will check the project you linked in the OP and come back with something (hopefully!)


Without using MRS or coherence, can you interact with your object?

As far as I can see, even though you implement IPointerDownHandler, the event won’t get delivered since it’s not handled by a raycaster. Your prefabs (in the example case, TestPrefab) would need to be inside a Canvas plus a PhysicsRaycaster (not a GraphicRaycaster).

Yes, it can be interacted with. The PhysicsRaycaster is on the Main Camera in both scenes. After playing the scene, you can create and join a room and it will change colors on pointer enter.

Alternatively, you can enter Play mode and then disable the coherence UI in the editor without connecting, and see that the highlight works that way too; no connection is necessary. (Otherwise the UI blocks raycasts from reaching the cube.)

The reason I discovered this in the first place is because 95% of the time I’m developing in the game scene, and everything works as expected there. It wasn’t until I switched from the menu to the game scene using the Coherence SceneLoader that the events stopped working in the game scene. That’s when I tested loading the Game scene additively myself to see if that broke the events, and the events still work that way.

Obviously, though, I need to use the Coherence SceneLoader here to fit the Coherence flow, so this workaround was only a test and not something I can actually ship in production.

This project is the simplest form I could give illustrating this issue and shows that using a standard Unity additive scene load does not break the events, but using the Coherence SceneLoader does.

Hope that helps.

> Yes, it can be interacted with. The PhysicsRaycaster is on the Main Camera in both scenes. After playing the scene, you can create and join a room and it will change colors on pointer enter.

Ahh I see, I missed that. Will recheck today. Thanks for the detailed answer.

Got to the root of the issue.

MRS allows for per-scene physics. And that comes with some limitations.

Any raycasts or shapecasts using Physics.Raycast, etc., will not interact with objects in a local physics Scene. Instead, update your code to use the casting methods on the physicsScene object. This will also work for a Scene that was not loaded with an independent physics Scene.
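Concretely, the difference is whether you go through the static Physics API or through the scene's own PhysicsScene. A minimal sketch (the helper name is mine, not from the SDK):

```csharp
using UnityEngine;

public static class LocalPhysicsHelper
{
    // Raycast against the physics scene a given object belongs to. The static
    // Physics.Raycast only queries the default physics scene, so objects in a
    // scene loaded with LocalPhysicsMode.Physics3D are invisible to it.
    public static bool RaycastInScene(Ray ray, GameObject anyObjectInScene, out RaycastHit hit)
    {
        PhysicsScene physicsScene = anyObjectInScene.scene.GetPhysicsScene();
        return physicsScene.Raycast(ray.origin, ray.direction, out hit, Mathf.Infinity);
    }
}
```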

CoherenceScene.FixedUpdate is in charge of ticking the local physics scene right now. I’m not sure if there’s a way for a PhysicsRaycaster to target a local scene, though. Might have to look through the implementation and see what options we have.

For the time being, you can set the Multi-Room Simulator component (and/or any other CoherenceSceneLoader you’d like to use) to use Local Physics Mode: None, and you’ll see your events going through. Note, however, that this means every object will collide/raycast with every other object across scenes, which you don’t want in a real setup.


Thanks a lot, this did fix the immediate issue.

It does create another issue, though it might only affect development: when I have both a client and a sim open in the same editor, the physics of the objects overlap. I’m guessing this is the reason per-scene physics is an option in the first place.

I can disable the colliders on the sim, though, which fixes that. I also don’t think it will be an issue in production, because a client will only ever have one scene with physics (the game scene), as the menu scene has no physics objects.

Ideally, when loading with the SceneLoader and local physics, there would also be a way to have a local raycaster that only interacts with objects in the client’s game scene. I haven’t looked into it beyond this, but this is sufficient for my purposes, and I’m glad we got to the root! :smile:

Yeah, so my proposed fix is not really a fix but a way for you to keep working. With the change above, every physics object becomes interactable across scenes, which defeats the purpose of MRS completely.

The way forward would be to make PhysicsRaycaster use scene-specific physics.

Take a look at uGUI/PhysicsRaycaster.cs at 2019.1 · Unity-Technologies/uGUI.

You can subclass PhysicsRaycaster and handle physics on a per-scene basis.

We might consider providing a helper CoherencePhysicsRaycaster, but it’s not a priority right now. So IMO your best chance at getting this to work without giving up on uGUI is to create your own custom PhysicsRaycaster.

CoherenceScene has a reference to the physics scene, and ticks it. Let us know if there’s anything we could do to improve this API to allow for your implementation to work!
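A hedged sketch of what such a subclass could look like. This is not the suggested CoherencePhysicsRaycaster helper (which does not exist yet); it is deliberately simplified (single hit, no result sorting, no multi-display handling) and assumes the raycaster queries the physics scene of the Unity scene it lives in:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;

// Hypothetical PhysicsRaycaster variant that raycasts against the local
// physics scene of the scene this component belongs to, instead of the
// default physics scene that the stock PhysicsRaycaster uses.
public class ScenePhysicsRaycaster : PhysicsRaycaster
{
    public override void Raycast(PointerEventData eventData, List<RaycastResult> resultAppendList)
    {
        if (eventCamera == null)
            return;

        Ray ray = eventCamera.ScreenPointToRay(eventData.position);
        PhysicsScene physicsScene = gameObject.scene.GetPhysicsScene();

        if (physicsScene.Raycast(ray.origin, ray.direction, out RaycastHit hit,
                                 eventCamera.farClipPlane, finalEventMask))
        {
            resultAppendList.Add(new RaycastResult
            {
                gameObject = hit.collider.gameObject,
                module = this,
                distance = hit.distance,
                worldPosition = hit.point,
                worldNormal = hit.normal,
                screenPosition = eventData.position,
                index = resultAppendList.Count
            });
        }
    }
}
```

Replacing the stock PhysicsRaycaster on the Game scene's camera with a component like this should make pointer events reach objects in that scene's local physics scene, while keeping Local Physics Mode enabled.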


Appreciate the response, and I will definitely keep all that in mind.

Your suggestion, though, actually is a solution for my case, because physics is only needed for mouse interactions coming from the client, i.e. clicking on units. The client only sees one scene, so it can only interact with one scene. There is no other physics, so there will be no collisions between scenes.

It also doesn’t completely defeat the purpose of MRS in my case, because the reason I chose MRS in the first place was to run more game instances/rooms in a single Unity application and conserve server resources where possible.
