Integrating PhysX WASM into a Babylon.js project

I'm trying to integrate this into a multiplayer WebRTC project but don't know where to begin… Do I have to create a new plugin for this, or can I utilise what already exists?

I found the other physics libs lacking!


Pinging @syntheticmagus to see if he has a little guidance to offer on this.


Hi bozworth,

The direct answer is that I think you would need to add your own physics engine to use this, though that might end up being even trickier than anticipated for this particular use case. (If WebRTC implies P2P, it might be hard to get multiplayer physics collisions working as expected without some sort of authoritative server.)

The other thing to note about this particular repo is that it doesn’t appear to have any sort of license whatsoever, so it’s not clear under what parameters that code is shared or what kind of usage is okay. Do you have a way to reach out to the repo owners for clarification about that?


Isn't PhysX a hardware API for Nvidia GPUs? Not sure that makes sense to do.

WebRTC is just how you use UDP in the browser. It's painful to use, but the data channel gives you unordered, encrypted, UDP-like delivery. QUIC (HTTP/3) / WebTransport is doable now, but getting the server deployment set up isn't so easy.
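To make the "unordered encrypted UDP" point concrete, here is a minimal sketch of the data channel options involved. `RTCPeerConnection` is a browser-only API, so the actual calls are shown commented out; the options object is the interesting part.

```javascript
// Unordered delivery plus zero retransmits gives UDP-like semantics over
// WebRTC's DTLS-encrypted SCTP data channel: lost packets stay lost, and
// newer packets are never held back waiting for older ones.
const dcOptions = {
  ordered: false,    // deliver messages as they arrive, out of order is fine
  maxRetransmits: 0, // never retransmit a lost message
}

// Browser-only usage (sketch, channel label is arbitrary):
// const pc = new RTCPeerConnection()
// const channel = pc.createDataChannel('game-state', dcOptions)
// channel.binaryType = 'arraybuffer'
// channel.onmessage = e => applyRemoteState(e.data)
```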


Right, I was referring to the network architecture. If you’re doing physics on a networked app (in my limited understanding), there are challenges associated with the fact that not every client believes everything to be in the same place at the same time. Because of this, if multiple clients are running the simulation, they may get different answers because their states don’t match, which can be tricky to reconcile. I think there are a lot of ways to solve this, but the one I’ve heard most about is the “authoritative server” model where the server runs a physics simulation that can override the results from the client (if they even have local simulations at all). A P2P architecture based on (my again limited understanding of) WebRTC wouldn’t have an active server, so it might need to use some other approach to figuring out whose physics simulation was “right.”
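To make the "server can override the client" idea concrete, here is a minimal reconciliation sketch (all names hypothetical, not from any library): the client keeps running its smooth local prediction, but snaps to the server's authoritative state whenever the two diverge beyond a threshold.

```javascript
// Hypothetical client-side reconciliation: small disagreements are tolerated
// (local prediction stays smooth), large ones mean the server wins.
const SNAP_THRESHOLD = 0.5 // metres of divergence tolerated before snapping

function reconcile(localPos, serverPos) {
  const dx = localPos[0] - serverPos[0]
  const dy = localPos[1] - serverPos[1]
  const dz = localPos[2] - serverPos[2]
  const divergence = Math.sqrt(dx * dx + dy * dy + dz * dz)
  // Below threshold: keep the local prediction. Above it: snap to the
  // authoritative server state.
  return divergence > SNAP_THRESHOLD ? serverPos.slice() : localPos
}
```

Real engines usually blend toward the server state instead of hard-snapping, but the shape of the decision is the same.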


WebRTC doesn't have to mean P2P clients in the sense of a mesh topology; you can still be server-authoritative in the sense of a single source of truth for game state. It's not ideal, but it's really the only option. Chromium had a web channel API or something along those lines, but they removed it. The next thing now is WebTransport, which is pretty cool because you can use it in a worker. It requires an HTTP/3 connection though, and there are not very many examples out there. I've always thought an alternative to WebRTC could be to encode game state into a video or audio stream, but I haven't seen anything doing that. Hoping WebTransport gets some adoption so I can copy-paste things.


@bozworth just curious, did you ever figure out the integration? We'd love to know! And feel free to ask if you have any further questions.


I saw this recently and thought about this thread. It seems people are indeed actively developing with it. This is a link to where the WASM is; if you back out into the main repo, there is a link to web demos too.

I skimmed over the PhysX docs, which say the default is to run on the CPU, with an optional extension to run on the GPU depending on the complexity of the scene. I am not a CG expert by any means, but from my understanding, running stuff on the GPU when many objects are the same shape (thin instances?) is very performant. That is not a very realistic expectation though, so when the shapes of collision objects vary a lot, it'll be a lot faster on the CPU. This is why some physics demos, like "how many blocks can this falling tower hold", are not really very good benchmarks: they favor GPU physics over CPU physics, which isn't representative of game perf. However, if there were a physics engine that could split static objects from dynamic ones, putting the static on the GPU and the dynamic on the CPU, that seems like it'd be good, if reconciliation time is acceptably low. I don't know if it's feasible or even makes sense in practice, though.

One can assign static/dynamic properties to objects as some metadata to pass to the physics engine and split processing between CPU and GPU. Would be interesting to see an implementation; if there are GPU particles there should be some kind of GPU physics, why not :slight_smile:
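The metadata split described above is easy to sketch in plain JS (all names hypothetical, no particular engine assumed): tag each body static or dynamic, then partition so each group could be handed to a different solver.

```javascript
// Hypothetical static/dynamic partition: each body carries an isStatic flag
// as metadata, and the two groups could then be routed to different backends
// (e.g. a GPU broadphase for static geometry, a CPU solver for dynamics).
function partitionBodies(bodies) {
  const staticBodies = []
  const dynamicBodies = []
  for (const body of bodies) {
    if (body.isStatic) staticBodies.push(body)
    else dynamicBodies.push(body)
  }
  return { staticBodies, dynamicBodies }
}
```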


Here is a GLSL implementation from the cannon.js author: gpu-physics.js/src/shaders at master · schteppe/gpu-physics.js · GitHub. At the time of its authoring, I think the browser APIs were too restricting, but now it's a different story. Maybe shader god x aka Mr. Popov aka my compute hero has insight on the feasibility of a hybrid system using compute shaders. Not gonna @ him on a Saturday morning though ;)


Yeah, essentially there is already an implementation in THREE. From reading it, it's simply about passing the objects through into the simulator and getting back values to then update in Babylon / JS.
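That read-back step can be sketched in a few lines (names hypothetical; with Babylon.js the target would be `mesh.position` / `mesh.rotationQuaternion`, here a plain object stands in for the mesh):

```javascript
// Hypothetical post-step sync: copy one simulated body's pose onto the
// corresponding render-side object. PhysX global poses carry a translation
// vector and a rotation quaternion.
function syncPose(mesh, pose) {
  mesh.position.x = pose.translation.x
  mesh.position.y = pose.translation.y
  mesh.position.z = pose.translation.z
  mesh.rotationQuaternion = { ...pose.rotation }
}
```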

In the example it shows you can install physx-js, an npm package, and in the node_modules folder you get a distributed WASM build:

import physxModule from 'physx-js/dist/physx.release.wasm'

Then you can initialise it (a sketch following the physx-js examples; the foundation and error-callback setup is the usual boilerplate, and the last argument to getDefaultSceneDesc is an optional simulation callback, null here):

  const version = PhysX.PX_PHYSICS_VERSION
  const allocator = new PhysX.PxDefaultAllocator()
  const errorCb = new PhysX.PxDefaultErrorCallback()
  const foundation = PhysX.PxCreateFoundation(version, allocator, errorCb)
  physics = PhysX.PxCreatePhysics(version, foundation, new PhysX.PxTolerancesScale(), false, null)
  PhysX.PxInitExtensions(physics, null)
  const sceneDesc = PhysX.getDefaultSceneDesc(physics.getTolerancesScale(), 0, null)
  scene = physics.createScene(sceneDesc)

Then to create a body :

  geometry = new PhysX.PxBoxGeometry(
    // PhysX uses half-extents
    entity.body.size[0] / 2,
    entity.body.size[1] / 2,
    entity.body.size[2] / 2
  )

Then it's just a case of calling the update.

physics update

  export const update = entities => {
    scene.simulate(1 / 60, true)
    scene.fetchResults(true) // block until this step's results are ready
    entities.forEach(entity => {
      const body = bodies[entity.id] // however you key your entity-to-body map
      const transform = body.getGlobalPose()
      entity.transform.position[0] = transform.translation.x
      entity.transform.position[1] = transform.translation.y
      entity.transform.position[2] = transform.translation.z
    })
  }

main app loop

  const update = () => {
    physicsUpdate(entities) // the physics update exported above, renamed on import
    requestAnimationFrame(update)
  }
I will post a Babylon.js example.


Cool. Looking forward to trying it out.


Here are the license terms; it is open source: PhysX License — NVIDIA PhysX SDK 4.0 Documentation
