![Dilmer Valecillos](/img/default-banner.jpg)
- 950
- 8 524 026
Dilmer Valecillos
United States
Joined Dec 5, 2012
XR Programmer 🥽 with expertise in Unity, C#, C++, Python, and a passion for coding. I am the founder of LearnXR.io, where I teach XR Development by offering AR, VR, and programming courses. In addition, I have worked with large enterprise clients on a variety of XR projects.
📣 For 1 Hour A Month Consultation To Help With Your Project:
👉 www.patreon.com/dilmerv
📢 Consider Subscribing to:
👉 www.youtube.com/@dilmerv to avoid missing weekly videos.
GIANT Rockets In Mixed Reality With OpenXR ML2 Spatial Anchors Features!
Today, I am excited to announce that Persistent Spatial Anchors with OpenXR are now available for Magic Leap 2.
❤️ Support on Patreon: www.patreon.com/dilmerv
🔔 Subscribe for more XR Tutorials : www.youtube.com/@dilmerv?sub_confirmation=1
🐦 Twitter X: dilmerv
👥 Discord : discord.gg/dNMHBc8KdP
📸 Instagram : tiktok.com/@dilmerval
📚 Video Chapters:
00:00 - Introduction To Spatial Anchors API & Spatial Anchors Storage API
02:13 - Unity Project Setup And Resources
02:26 - Camera Near Clipping Configuration & Spatial Anchor Permission
03:17 - AR Foundation (AR Anchor Manager Setup)
04:26 - C# Script With Spatial Anchors API (Anchor Creator Component)
17:05 - Anchor Creator Component Demo From The Headset
18:17 - Getting Additional Anchor Info From ML XR Anchor Subsystem
20:42 - ML XR Anchor Subsystem Demo
21:11 - Adding Spatial Anchors Storage API Capabilities
50:24 - Spatial Anchors Storage API Demo
51:00 - Adding Restore of Anchors From Storage With New UI
55:53 - Restoring of Anchors With UI Demo
56:52 - Outro
Spatial Anchors are fully compatible with, and built on top of, Unity’s AR Foundation. These new API additions allow you to perform asynchronous calls for publishing, creating, and deleting anchors, as well as updating expiration dates for anchors stored with the Spatial Anchors Storage API.
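The asynchronous anchor calls mentioned above go through AR Foundation's ARAnchorManager. Here is a rough sketch, assuming AR Foundation 5.1+ where TryAddAnchorAsync is available — the exact result-handling types vary by version, and the Magic Leap publish/expiration step goes through ML's OpenXR storage feature, which is not shown:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Minimal sketch: create an AR Foundation anchor asynchronously.
// Assumes an ARAnchorManager exists on the XR rig and the required
// spatial anchor permission has been granted on the device.
public class SimpleAnchorCreator : MonoBehaviour
{
    [SerializeField] private ARAnchorManager anchorManager;

    // Called with the pose where the anchor should be placed
    // (e.g., the controller pointer pose at button press).
    public async void CreateAnchorAt(Pose pose)
    {
        // TryAddAnchorAsync was introduced in AR Foundation 5.1;
        // it returns a result wrapping the created ARAnchor.
        var result = await anchorManager.TryAddAnchorAsync(pose);
        if (result.status.IsSuccess())
        {
            ARAnchor anchor = result.value;
            Debug.Log($"Anchor created: {anchor.trackableId}");
            // Publishing the anchor to Magic Leap's Spatial Anchors
            // Storage (and setting its expiration) is done through the
            // ML-specific OpenXR storage feature, covered in the video.
        }
        else
        {
            Debug.LogWarning("Anchor creation failed.");
        }
    }
}
```

This only covers local anchor creation; persistence across sessions requires the device to be localized into a mapped space before the storage calls succeed.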
Thanks to @MagicLeap @MagicLeapDevs for sponsoring this video.
🥽 Learn & Get my XR Courses from:
www.learnxr.io
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news)
blog.learnxr.io
#openxr #unity #ml2 #programming
Views: 908
Videos
WebXR Tutorial: BUILD A Mixed Reality Game In Mattercraft Using Realtime Physics!
1.7K views · 21 days ago
In today's video, we're going to create a Mixed Reality (MR) game from the ground up using WebXR in Mattercraft. ❤️ Support on Patreon: www.patreon.com/dilmerv 🔔 Subscribe for more XR Tutorials : www.youtube.com/@dilmerv?sub_confirmation=1 🐦 Twitter X: dilmerv 👥 Discord : discord.gg/dNMHBc8KdP 📸 Instagram : dilmerv Mattercraft allows us to maintain a single codebase an...
Building A Mixed Reality Tabletop Game FAST - Quest 3 & Presence Platform!
3.7K views · a month ago
Today, I am excited to announce a new YouTube video series featuring Meta's Presence Platform. In this series, we will prototype small tabletop games and productivity apps with mixed reality. ❤️ Support on Patreon: www.patreon.com/dilmerv 🔔 Subscribe for more XR Tutorials : www.youtube.com/@dilmerv?sub_confirmation=1 🐦 Twitter X: dilmerv 👥 Discord : discord.gg/dNMHBc8KdP 📸 Instagram ...
WebXR in Mattercraft: Rapid AR/VR creation for Apple Vision Pro, Quest 3 & ML2
3.1K views · 2 months ago
Today, I would like to invite you to join me as I introduce you to a new WebXR tool and build a SpaceX fan prototype that runs on multiple platforms. ❤️ Support on Patreon: www.patreon.com/dilmerv 🔔 Subscribe for more XR Tutorials : www.youtube.com/@dilmerv?sub_confirmation=1 🐦 Twitter X: dilmerv 👥 Discord : discord.gg/dNMHBc8KdP 📸 Instagram : dilmerv 📢 To register for...
Unity visionOS 2D Windows and FULLY Immersive VR! (Apple Vision Pro Development)
4K views · 2 months ago
In this video, we'll take a look at how to create visionOS experiences in Unity using 2D Windowed and Fully Immersive app modes. ❤️ Support on Patreon: www.patreon.com/dilmerv 🔔 Subscribe for more XR Tutorials : www.youtube.com/@dilmerv?sub_confirmation=1 🐦 Twitter X: dilmerv 👥 Discord : discord.gg/dNMHBc8KdP 📸 Instagram : dilmerv Additionally, I will go over creating ...
Diving Into Unity OpenXR ML2 Gaze Features - Eye Tracking!
1.1K views · 2 months ago
Today, I'd like to go over a few Eye Tracking/Gaze Features that are now part of the new Magic Leap OpenXR support released recently. ❤️ Support on Patreon: www.patreon.com/dilmerv 🔔 Subscribe for more XR Tutorials : www.youtube.com/@dilmerv?sub_confirmation=1 🐦 Twitter X: dilmerv 👥 Discord : discord.gg/dNMHBc8KdP 📸 Instagram : dilmerv We will cover the gaze interactio...
How To GET Advanced VR & AR Player Data With Cognitive3D Analytics
2.2K views · 3 months ago
Today, I'd like to share my experience using an advanced analytics tool for VR/AR that allows you to track and collect player data. ❤️ Support on Patreon: www.patreon.com/dilmerv 🔔 Subscribe for more XR Tutorials : www.youtube.com/@dilmerv?sub_confirmation=1 🐦 Twitter X: dilmerv 👥 Discord : discord.gg/dNMHBc8KdP 📸 Instagram : dilmerv The visualizations offered by Cognit...
OpenXR With Magic Leap 2 NOW Available - Unity Setup & Plane Detection!
1.5K views · 3 months ago
Today, I am really excited to announce that Magic Leap is now moving from their custom MLSDK to OpenXR (Open standards for XR). ❤️ Support on Patreon: www.patreon.com/dilmerv 🔔 Subscribe for more XR Tutorials : www.youtube.com/@dilmerv?sub_confirmation=1 🐦 Twitter X: dilmerv 👥 Discord : discord.gg/dNMHBc8KdP 📸 Instagram : dilmerv This is a very good move because it now...
NEW Interaction SDK Features Are HERE! Hands Locomotion & MORE!
3.6K views · 3 months ago
Today, I am excited to provide you with an overview of Meta's New Interaction SDK features released with Meta XR version 62.0.0 & Interaction SDK OVR Samples package. ❤️ Support on Patreon: www.patreon.com/dilmerv 🔔 Subscribe for more XR Tutorials : www.youtube.com/@dilmerv?sub_confirmation=1 🐦 Twitter X: dilmerv 👥 Discord : discord.gg/dNMHBc8KdP 📸 Instagram : dilmerv ...
Step-By-Step Guide: Mapping My YouTube Studio With Immersal AR
1.5K views · 3 months ago
Today, I would like to continue my AR location-based video series, where I use Immersal to scan my YouTube studio and generate an Augmented Reality map. ❤️ Support on Patreon: www.patreon.com/dilmerv 🔔 Subscribe for more XR Tutorials : www.youtube.com/@dilmerv?sub_confirmation=1 🐦 Twitter X: dilmerv 👥 Discord : discord.gg/dNMHBc8KdP 📸 Instagram : dilmerv We will then us...
How To Build Immersive AR Environments With Immersal SDK?
3K views · 4 months ago
Apple Vision Pro Developer Strap Is Here! BUT DO You Need It?
14K views · 4 months ago
Apple Vision Pro First Impressions From A DEV Perspective!
4.1K views · 4 months ago
Unity XR Hands Custom Gestures Tools Are Here!
3.9K views · 4 months ago
HOW To Get Started With ML2 Hand Tracking Features In Unity (XR Toolkit & ML2 SDK)
1.5K views · 5 months ago
I Built A visionOS Prototype With ShapesXR And Unity!
2.8K views · 5 months ago
Meta Haptics Studio and Haptics SDK: Full Walkthrough NOW Available!
1.9K views · 6 months ago
VERY FAST Iteration With Unity visionOS PolySpatial Play To Device!
4.3K views · 7 months ago
Get Started With Unity visionOS PolySpatial Tools! (Apple Vision Pro Development)
28K views · 7 months ago
Powerful Magic Leap 2 INPUT Features Are HERE! (Controller & Head Pose)
1.5K views · 7 months ago
Transitioning Between Realities: ML2 Dynamic Dimmer And Unity
1.4K views · 7 months ago
Quest 3: Powerful Mixed Reality Features with The NEW Meta Depth API
11K views · 7 months ago
Is The QUEST 3 Good For MR Development? (First Impressions & Dev Tools)
17K views · 8 months ago
visionOS Development Fundamentals - RealityView Attachments, Systems, & More!
5K views · 8 months ago
Unreal Engine 5 For Magic Leap 2 Is Here - Build UE5 From Source!
2.3K views · 8 months ago
PICO Dev Jam 2023 - PICO 4 Review And Big Announcement!
2.5K views · 9 months ago
visionOS Development Fundamentals - Build A VisionOS App From Scratch!
26K views · 9 months ago
Unity Interaction Toolkit 2.4.0 & 2.5.0 Versions Are Here - Huge Improvements!
6K views · 9 months ago
How Easy Is It To Develop With Tilt Five AR Glasses? (Unity & Unreal)
2K views · 9 months ago
Brilliant Labs Monocle: The World's Smallest AR Glass Is Here!
16K views · 10 months ago
I wantttttt
Is this fake?
There's an earth there
Hello, first of all, I apologize for using a translator. This is because I am not familiar with English. After struggling for several days to create an AR Map project, I watched your video and successfully built it. (It's the best.) However, after building, no camera video is shown in the app. I traced the log and found that the following error occurred when building or running it on a PC. "No active ARCoreCameraSubsystem is available in this session, Please ensure that a valid loader configuration exists in the XR project settings. UnityEngine.Debug:LogError (object) Google.XR.ARCoreExtensions.ARCoreExtensions:OnEnable () (at ./Library/PackageCache/com.google.ar.core.arfoundation.extensions@f995c62e2b92/Runtime/Scripts/ARCoreExtensions.cs:213" I thought there was something wrong with my code, so I downloaded the code directly from Git and ran it, but the same symptom was repeated. Do you know anything about this issue? I would really appreciate it if you could give me a little advice. PS. I used Unity version 2022.3.32, and the remaining packages used the versions in the video.
Solved! The problem was fixed by installing the latest versions of the packages. For reference, the working setup: Unity 2022.3.32, AR Foundation 5.1.4, ARCore XR Plugin 5.1.4, Cesium 1.10.1, and ARCore Extensions 1.44.0. The error that occurred before no longer happens, and we will keep working from there. Thank you!!
Excuse me, but I have a few questions. I tracked the log, and the camera keeps showing a log saying 'waiting for ar session to become stable'. Is this normal? (It's not an error message, it's just a log.) Also, in Korea, where I am, Geospatial is not available in 3D. It's not recorded, but I'd like to know if Geospatial can't be used in such a space. Sorry to bother you, but I've been trying to resolve this for several days now. I would appreciate your help.
Can you make a tutorial about connecting an API with Mattercraft? Maybe something like a gift card message.
I will put a sample demo together for this and reply here; if I don't, feel free to email me, but I will try to get to it tomorrow.
@@dilmerv any news for this?
Wow, that is unnecessarily complicated...
Pico 4 is better
Does this work with the XR Simulator in Editor?
can I down load this demo game?
You can download it from the video description through GitHub.
Does it work when max players is 5000? :D
222
quest 2 gang
I have always had a question about mlagents: they randomly select actions at the beginning of training. Can we incorporate human intervention into the training process of mlagents to make them train faster? Is there a corresponding method in mlagents? Looking forward to your answer.
Take a look at these two resources: - Video about imitation learning which should drastically help with faster training: ua-cam.com/video/kpb8ZkMBFYs/v-deo.htmlsi=Foe8FnbPabfj2K9A And also the docs here: github.com/Unity-Technologies/ml-agents/blob/develop/docs/ML-Agents-Overview.md#imitation-learning Thank you for your comment!
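To make the imitation learning suggestion above concrete: in ML-Agents, demonstrations are wired in through the trainer configuration file. A rough sketch of a config combining GAIL and behavioral cloning follows — the behavior name and demo paths are placeholders, and key names should be checked against your ml-agents version:

```yaml
behaviors:
  MyAgent:                         # must match the Behavior Name in Unity
    trainer_type: ppo
    hyperparameters:
      batch_size: 1024
      learning_rate: 3.0e-4
    reward_signals:
      extrinsic:
        strength: 1.0
        gamma: 0.99
      gail:                        # adversarial imitation from recorded demos
        strength: 0.5
        demo_path: Demos/ExpertDemo.demo
    behavioral_cloning:            # directly clones the demonstrated actions
      demo_path: Demos/ExpertDemo.demo
      strength: 0.5
      steps: 150000
    max_steps: 500000
```

Demos are recorded in Unity with the Demonstration Recorder component; the GAIL reward signal then biases early exploration toward the demonstrated behavior instead of purely random actions.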
@@dilmerv Thank you very much for your reply, but I have found that the imitation learning in ML-Agents is just step-by-step teaching by a human, without any feedback — it cannot "correct" the machine's behavior, only feed the desired actions to it step by step. Can ML-Agents achieve "real-time correction" of the machine's behavior, that is, with a feedback process?
I made it work somehow, but when I pass through the portal, everything inside the portal disappears.
Is that plugin a good place to put secret keys for encryption?
I love the pico 4 it’s one of my favorite headsets my fav headset is the rift s
why is your hand tracking better than mine
There's no difference man
Hi Dilmer, I have a project for HoloLens 2 made in Unity that uses the Microsoft Mixed Reality Feature Tool. I want that project to run on Quest 2 and 3, but the Oculus Integration is now deprecated and there's the Meta All-in-One SDK. Could you make a video guiding the conversion of such a project, or guide me on this if possible? It would be a great help.
That's a great idea, let me give this some thought but thank you for your feedback. As far as making it work, are you using MRTK 3? or MRTK 2?
@@dilmerv Hi Dilmer, thank you for your reply. I am using MRTK3 with Unity version 2022.3.3 and the Mixed Reality Toolkit in the project. I want to convert it to Meta Quest 2 and 3.
Hi Dilmer great video! This may be a silly question but I wanted to ask how to make the background of the app show my current room like how during your recording you see your studio. Currently it opens up the app with a black background/solid color. I am not sure if I need to use dynamic dimmer or something else but any help would be appreciated.
Hey, thanks for your question. Can you check your URP profile and make sure HDR is unchecked? Also, did you use the Magic Leap Setup Tool to configure your project?
Are you testing the project in Unity or are you using an external app?
If you use PolySpatial then there is an option to use Play To Device which works for the simulator and also a remote app on the device. Without PolySpatial which is what I did here, I tested it mostly by deploying to the actual device. Great question!
Now, one year later...Quest2 and Quest3 win hands down
quest2❤❤❤❤❤
Hi Dilmer, thanks for your useful videos! I'm using Unity 2022.3.32f1 and iOS 17.3, but I got a XCode Error: Thread 1: EXC_BAD_ACCESS (code=1, address=0x0) when launching the app. There are some people got the same problem but I didn't find a useful solution, do you have any idea? Thanks!
Hey thanks for your message. Could you verify that you have the following or greater versions? - Xcode version 13.0 or later - Cocoapods 1.4.0 or later if using Cocoapods
@@dilmerv Thanks for the reply! I'm using Xcode 15.4 and Cocoapods 1.12.1
Hi, it is a nice plugin for real-time body tracking. May I know if MARS supports body tracking with the mobile front camera?
Great question. MARS uses ARKit behind the scenes, which means only the rear camera is supported for body tracking. I recommend looking into something like MediaPipe: github.com/google-ai-edge/mediapipe/blob/master/docs/solutions/pose.md
Hi, I've been trying to use the Interaction SDK. I'm using the interaction camera rig but I'm not able to move using the left joystick. How do I set up joystick movement? Your help will be appreciated.
Does the strap allow you to use the mac as a virtual display even if on another apple id/account? The main account is not the same one I use on my mac, as this is a shared device, but putting it on to test without being able to connect to my mac is infuriating and I can't swap out the accounts every time I do this. This is mainly to use sensors and other features the simulators does not have. For now I just resetup the eye and hands but would be willing to spend the money if this allows me to keep a different account on while developing
To be honest I am not sure; normally I use the same account across devices, so I wouldn't be able to tell you. But as a dev, yes, that would be amazing if it works that way, since most of the time we don't use the same account during development for other companies.
Nice video, my friend. Although I do not have an ML2, it is good to see these tutorials and imagine what will come with the Magic Leap and Google partnership. We have to be prepared!!
Hey that’s a smart move, thanks for staying up to date and for your feedback Lair !
Hi Dilmer, great content as always! I just want to ask if you had to do the area scan one more time after closing the app and reopening it in order to restore the anchors?
Hey, this is a great question. This is what Magic Leap responded to your question, which matches how it worked when I tested it. From Magic Leap: “After scanning the space with Spaces, the map will be saved to the device. Once the device is localized into the space, it will try to re-localize to the space the next time it is rebooted. The device tries to gain localization in the background; if the space was unchanged, the device will be able to localize into it very quickly.” Let me know if you have other questions, thanks for your time.
Bruh, by “metaverse has been here for years” you’re essentially referring to VRCHAT
VRChat has contributed a ton to it, completely agree!
Many thanks for that! Would it also work for Xreal, iOS or android?
Great question and thanks for your comment. Currently, the Spatial Anchors will work with the devices mentioned through AR Foundation support, but the persistent aspect won’t unless you add other services such as Google Cloud Anchors or similar that will handle the native localization and storage part.
@@dilmerv many thanks! 🙏
Hey Dilmer, do you have any suggestions for the HandGrabPoseTool of the new Interaction SDK? Recording a Hand Grab Pose doesn't seem to work with Unity Editor 2022.
Hey how are you! Mmm what part of it is not working? Are you getting any errors? Let me know and I can try to test it this weekend.
@@dilmerv Unity 2022.3.32 + Meta XR Interaction SDK OVR Samples v65 + Scene HandGrabPoseTool + Meta Quest 3 (dev mode) connected via Oculus Link. The Unity Editor in Play mode cannot "record" grab poses; or rather, no hands are recognized at all.
Great tutorial! I appreciate you taking the time to walk through each line of code to help follow along. It really makes a difference. Thank you!
Hey, thanks for letting me know. I did try to keep a balance between regular-paced coding and fast-forwarding, and I'm glad you noticed.
Hi Dilmer! is this possible to achieve on the oculus quest 3?
Hey great question, and yes Meta provides Spatial Anchors as well and they are also persistent. You can find more info about them here: developer.oculus.com/documentation/unity/unity-spatial-anchors-persist-content/ Thanks and let me know if you've further questions.
@@dilmerv thanks for the info man! Do you plan to cover this topic on your channel and test it as well?
A great lesson! Very interesting and fascinating. Thank you very much for the work you have done. Your lessons are very inspiring!
I am glad you found this video useful, thank you for letting me know. If you ever seek specific XR content let me know!
@@dilmerv thank you, I will take this into account)
💻 Unity project & video resources: github.com/dilmerv/MagicLeapSpatialAnchors 👉 Also, consider subscribing to my channel www.youtube.com/@dilmerv as it honestly helps me in bringing you more FREE content. Thank YOU!
Hi Dilmer, in minute 1:56, you mentioned that you would show how to deploy to the device, but I couldn't find that part. The demo project works in the simulator but not when I build.
Hey Juan thanks for your comment and for watching this video. Take a look at this video which should be the first one of this series: ua-cam.com/video/KqH0zv3e2AY/v-deo.html Let me know if you've additional questions.
Nice video. Very helpful! Thanks as always, Dilmer.
Hey thanks again for your feedback Lair !