First-person golden path

So… This is not perfect, not by a long shot, but it might take me some time to get it to the level of polish I want to leave it with, so I thought I might as well share a draft now that I have a working flow.

Note: The experience is very audio-heavy and won’t make much sense if the sound is disabled. I haven’t figured out exactly how I want to message getting audio allowed upfront, but I’ll do that later. Also, this experience is specifically for mouse/touchpad and keyboard controls. Expanding support to things like phones and gamepads is an eventual goal of the First-Person Player, but that’s definitely not within the scope of the current demo.

Now, with all those disclaimers out of the way, check out the first working flow of the Metaverse Acclimation Guide!

This is eventually supposed to be the basis for a golden path, but it might take some time for me to actually write the dev story. This demo shows patterns for…frankly quite a lot of different things, some of the most important of which are as follows:

  • Physics-enabled first-person navigation. This builds on my old Playground from last year demonstrating first-person character control integrated with an actual physics engine. I built this into a Babylon Utility, so it should be easy to integrate into other things as well. For a quick demo of just this Utility, here’s the demo page (recreating the old Playground). A minimal sketch of the core idea follows this list.
  • Physics-enabled environment creation. A physics-enabled player needs a physics-enabled environment to move around in, so I created another Babylon Utility which establishes patterns for creating physics-enabled environments in a DCC tool (in my case, Blender). For a (fairly crude) demo of just this Utility, you can drag-and-drop level.glb into the Physics Post-Loader Demo Scene; this will load the level, process it to turn the physics geometry into the appropriate impostors, then add in a first-person player so you can explore the level. The post-loader sketch below shows the processing step in miniature.
  • Fast trigger-based game logic. There’s actually a lot of game logic for such a short experience, but much of it comes down to triggering mechanisms: looking at a thing, being in a place, and so on. Canonically this would be done using Babylon.js’s built-in picking mechanism, but since that does mesh comparison math in JavaScript, it can get pretty slow, especially for complex scenes. As an alternative, this experience explores using raycasts from the physics system (which we’re already paying the cost of for movement, so we might as well get the speed benefits) and other positional information alongside scaled unit cubes placed in the DCC tool, which allows each trigger volume to be tested with a single matrix multiplication. See the trigger-volume sketch after this list.
  • Game logic flow. The sequencing of events in this experience is fairly complex for something so short, and coroutines really came into their own on this one. Basically the entire flow of the game (playing sounds, enabling features, and even a few animations) is powered by a combination of events and coroutines, showing ways to make the game react to the player (including pausing). A coroutine sketch follows below.
  • Independent effect-powered transitions. This is something of a specialized technique, but I did the scene transitions using scene-independent effects for blur and compositing. This gets pretty low-level, but it gives you a tremendous amount of power and freedom to take the rendered output of your scenes and reshape it in pretty much any way you want. The last sketch below shows the compositing idea.
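
To give a flavor of the first bullet, here’s a minimal sketch of the technique (not the actual Utility’s code): a capsule impostor owns the player’s position, and WASD input just sets its horizontal velocity relative to the camera’s yaw. It assumes a `scene`, a `canvas`, and an already-enabled physics engine.

```ts
// Minimal sketch, not the first-person player Utility itself: a capsule impostor
// moved by setting its linear velocity from WASD input and the camera's yaw.
const capsule = BABYLON.MeshBuilder.CreateCapsule("player", { height: 1.8, radius: 0.3 }, scene);
capsule.position.y = 2;
capsule.physicsImpostor = new BABYLON.PhysicsImpostor(
    capsule, BABYLON.PhysicsImpostor.CapsuleImpostor, { mass: 1, friction: 0 }, scene);

// Mouse-look camera riding on the capsule; movement is handled by the physics loop
// below, so remove the camera's own keyboard input.
const camera = new BABYLON.UniversalCamera("fpCamera", new BABYLON.Vector3(0, 0.6, 0), scene);
camera.parent = capsule;
camera.attachControl(canvas, true);
camera.inputs.removeByType("FreeCameraKeyboardMoveInput");

const keys: { [key: string]: boolean } = {};
scene.onKeyboardObservable.add((kbInfo) => {
    keys[kbInfo.event.key.toLowerCase()] = kbInfo.type === BABYLON.KeyboardEventTypes.KEYDOWN;
});

scene.onBeforeRenderObservable.add(() => {
    const impostor = capsule.physicsImpostor!;
    impostor.setAngularVelocity(BABYLON.Vector3.Zero()); // keep the capsule upright

    const forward = (keys["w"] ? 1 : 0) - (keys["s"] ? 1 : 0);
    const right = (keys["d"] ? 1 : 0) - (keys["a"] ? 1 : 0);
    const yaw = camera.rotation.y;
    const walkSpeed = 4;

    // Walk in the camera's yaw direction, but keep the physics engine's vertical
    // velocity so gravity and falling still behave normally.
    const vertical = impostor.getLinearVelocity()?.y ?? 0;
    impostor.setLinearVelocity(new BABYLON.Vector3(
        (forward * Math.sin(yaw) + right * Math.cos(yaw)) * walkSpeed,
        vertical,
        (forward * Math.cos(yaw) - right * Math.sin(yaw)) * walkSpeed));
});
```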
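
For the environment bullet, here’s roughly what a physics post-loader can look like. The `physics_` naming convention is something I made up for the sketch rather than the Utility’s actual format, and real glTF content also brings a node hierarchy that this glosses over.

```ts
// Sketch of a physics post-loader (illustrative naming convention only): after
// importing the level, meshes marked as physics geometry become invisible static
// colliders. The real flow also picks impostor types appropriate to each shape.
async function loadPhysicsLevel(scene: BABYLON.Scene): Promise<void> {
    const result = await BABYLON.SceneLoader.ImportMeshAsync("", "./", "level.glb", scene);
    for (const mesh of result.meshes) {
        if (mesh instanceof BABYLON.Mesh && mesh.name.toLowerCase().startsWith("physics_")) {
            mesh.isVisible = false; // physics geometry is authoring data, not render geometry
            mesh.physicsImpostor = new BABYLON.PhysicsImpostor(
                mesh, BABYLON.PhysicsImpostor.BoxImpostor, { mass: 0, friction: 0.5 }, scene);
        }
    }
}
```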
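
The trigger-volume test from the third bullet boils down to this sketch: transform the query point into the trigger cube’s local space and compare against the unit cube’s half-extents. (Adjust the 0.5 if your DCC’s default cube isn’t 1 unit on a side; Blender’s default cube, for instance, is 2 meters.)

```ts
// One matrix multiplication per trigger: bring the query point into the trigger
// cube's local space and test it against the unit cube's half-extents.
function isPointInTrigger(point: BABYLON.Vector3, triggerCube: BABYLON.AbstractMesh): boolean {
    const worldToLocal = BABYLON.Matrix.Invert(triggerCube.getWorldMatrix());
    const localPoint = BABYLON.Vector3.TransformCoordinates(point, worldToLocal);
    return Math.abs(localPoint.x) <= 0.5 &&
           Math.abs(localPoint.y) <= 0.5 &&
           Math.abs(localPoint.z) <= 0.5;
}
```

For the “looking at a thing” triggers, the point being tested can come from a physics raycast (e.g. `scene.getPhysicsEngine().raycast(from, to)`) rather than from Babylon’s mesh picking.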
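
And here’s the shape of the event + coroutine flow from the fourth bullet. The helpers (`playVoiceLineAsync`, `enableNextMechanic`, etc.) are placeholders invented for the sketch, not names from the demo; the coroutine machinery itself is Babylon.js 5’s built-in `runCoroutineAsync`, and `isPointInTrigger` is the function from the trigger sketch above.

```ts
// Placeholder hooks so the sketch compiles; the real demo wires these into its
// own audio and game systems.
declare function playVoiceLineAsync(name: string): Promise<void>;
declare function enableNextMechanic(): void;
declare function getPlayerPosition(): BABYLON.Vector3;
declare const lookUpTrigger: BABYLON.AbstractMesh;

function* acclimationFlow(): Generator<void | Promise<void>, void, void> {
    // Yielding a promise suspends the coroutine until it resolves.
    yield playVoiceLineAsync("welcome");

    // Yielding nothing waits for the next notification of the observable running
    // the coroutine (here, one frame), so this is a cheap per-frame poll.
    while (!isPointInTrigger(getPlayerPosition(), lookUpTrigger)) {
        yield;
    }

    yield playVoiceLineAsync("nicely_done");
    enableNextMechanic();
}

// One coroutine step per onBeforeRender notification; gating or pausing that
// observable effectively pauses the game flow.
scene.onBeforeRenderObservable.runCoroutineAsync(acclimationFlow());
```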
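
Finally, a sketch of the compositing side of the transitions using `EffectRenderer`/`EffectWrapper`. It assumes an `engine` and that the two scenes are already being rendered into render target textures (`rttA`, `rttB`), which is the part I’m glossing over; the actual transition also blurs, which would just be another `EffectWrapper` pass.

```ts
// Assumed to already exist: each scene rendered into its own render target texture.
declare const rttA: BABYLON.RenderTargetTexture;
declare const rttB: BABYLON.RenderTargetTexture;
declare let transitionAmount: number; // 0 = scene A, 1 = scene B

const effectRenderer = new BABYLON.EffectRenderer(engine);
const crossFade = new BABYLON.EffectWrapper({
    engine,
    name: "crossFade",
    samplerNames: ["sceneA", "sceneB"],
    uniformNames: ["transitionAmount"],
    fragmentShader: `
        varying vec2 vUV;
        uniform sampler2D sceneA;
        uniform sampler2D sceneB;
        uniform float transitionAmount;
        void main(void) {
            gl_FragColor = mix(texture2D(sceneA, vUV), texture2D(sceneB, vUV), transitionAmount);
        }`,
});

crossFade.onApplyObservable.add(() => {
    crossFade.effect.setTexture("sceneA", rttA);
    crossFade.effect.setTexture("sceneB", rttB);
    crossFade.effect.setFloat("transitionAmount", transitionAmount);
});

engine.runRenderLoop(() => {
    // The scenes are assumed to be rendered into rttA/rttB elsewhere in this loop;
    // this call composites the two render targets straight to the canvas.
    effectRenderer.render(crossFade);
});
```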

There’s probably more stuff worth calling out, but quite frankly it’s lunchtime, and mainly I just wanted to share my progress so far. :upside_down_face:

14 Likes

OMAGAD!!! This is really good man! Thanks a lot

2 Likes

I need invert y :upside_down_face:

1 Like

@syntheticmagus, is the plan still to add it to the docs?

This is amazing, @syntheticmagus! Great job! It reminds me of the Portal game series!

Yes it is!

I have been staring at the floor for quite some time, thinking how amazing it is :slight_smile:

1 Like

I’m getting a grey screen with an error on Chrome 101.0.4951.64, macOS Big Sur 11.4.

That’s a good point for the first-person player, and perhaps even for the settings, but I’m not sure I’ll want to have the experience specifically guide the player to it. The idea of this experience is to teach common defaults so that people unfamiliar with first-person mechanics won’t be totally at a loss in a typical experience, so it’s pretty tightly focused on the most common case.

Yes, I still want to write a dev story for it, but that may take a bit more time because (1) this needs a bit more polish as it is—a bit more environment detail, more settings in the menu, a better site page to host it, and some code refactoring—and (2) I do have some other writing that I really need to prioritize in the immediate term.

Interesting. I don’t have the hardware to repro this, but it sounds like it has something to do with the render target textures I use for the effect-powered transitions, which I wouldn’t have expected to have browser-specific problems. @sebavan, do you know if this is a bug or is this just one of those “Safari broke a very specific thing for unknown reasons” situations?

3 Likes

Are all collisions physics-based?

I think it was fixed :frowning: Could you try on Chrome Canary?

1 Like

Hello. Cool start!

Thanks for the updated version of the physics; you solved my problem from this post!

I’m currently using the old version of this physics for the first-person camera, to control the player mesh when I go into third person, and also for the WebXR camera.

Concerning WebXR: if you remember, we discussed this in this post.

I never found a smart way to mimic the physics for the WebXR camera, so I tied it to a mesh and synthesized WASD input artificially when controlling it from the joystick. That solved the physics problem and preserved 6DOF.
But I would like something more elegant :slight_smile:

I hope you can solve this problem in this golden path.

I would really like to get some kind of simple API and a new section in the documentation!

1 Like

I thought you already did it when the voice asked me to look up from the floor. I, of course, moved my mouse down to look up. You could detect that preference right there. Anyway, this experience was hard for me to navigate because I’m used to inverted Y.

Yep! Everything is done using Ammo.js, allowing all the mechanisms to operate in the same physics context.
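
For reference, enabling Ammo.js-backed physics in Babylon.js looks roughly like this (an illustrative sketch, not the demo’s exact setup code):

```ts
// The Ammo global comes from the ammo.js script/module loaded alongside Babylon.
declare const Ammo: () => Promise<unknown>;

async function enableAmmoPhysics(scene: BABYLON.Scene): Promise<void> {
    const ammoModule = await Ammo(); // ammo.js initializes asynchronously (WASM)
    scene.enablePhysics(
        new BABYLON.Vector3(0, -9.81, 0),
        new BABYLON.AmmoJSPlugin(true, ammoModule));
}
```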

Me too. :slight_smile: It won’t be in this golden path, though. Golden paths by their nature need to stay focused on constrained goals, and the goal for this one was to set up patterns and infrastructure for the most base-case first-person experience on PC. These foundations can then be built upon in future golden paths, just as pretty much everything I’m doing now builds on the workflows codified in the Fruit Fallin’ golden path.

My specific hope is to extend the first-person player Babylon Utility to support other inputs, most prominently mobile and XR. (Gamepad support is also a possibility, but not something I’m planning to do myself in the near future.) I’m not fully sure how I’ll do either yet, and like I said neither will be part of this golden path, but mobile will probably come first because it’s almost certainly easier. Ideally, the first-person player will reach a point where you can just plop it into a scene and immediately have first-person movement controls for PC, mobile, and XR, but that’s still several steps down the road. All that will be in a separate Utility to start with, though; whether and how it goes beyond that will be up to the Babylon Team and Community. :slight_smile:

1 Like