Stereo Quad Buffer Support #1
Thanks! Stereo quad buffer - you mean the Direct3D 11.1 native S3D that was implemented in Unity as "Non HMD"? I couldn't find it in the new XR Plugin Management system, but I was able to enable it via an editor script in Unity 2020. I have already made a version of my S3D system that works with "Non HMD" and reads its S3D settings correctly. But D3D 11.1 S3D still has problems: the GUI shows in only one eye (I solved this), and it does not work correctly with URP, HDRP, or Linear color space, which was also mentioned here - https://docs.unity3d.com/560/Documentation/Manual/StereoscopicRendering.html |
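For reference, the kind of editor script being discussed might look like the sketch below. It leans on the internal `UnityEditorInternal.VR.VREditor` API and the legacy "stereo" (Stereo Display, non head-mounted) device name; both are assumptions about what is still present in a given Unity 2020.x build, so treat this as a starting point rather than a confirmed recipe:

```csharp
// Editor/EnableStereoNonHMD.cs - a hedged sketch, assuming the internal
// UnityEditorInternal.VR.VREditor API still exists in your 2020.x build.
#if UNITY_EDITOR
using UnityEditor;
using UnityEditorInternal.VR;

public static class EnableStereoNonHMD
{
    [MenuItem("Vr SDK/Build with Stereo3D")]
    static void Enable()
    {
        // Re-enable the legacy (pre-XR-Plugin-Management) VR path for
        // standalone builds, then select the non-HMD stereo device.
        VREditor.SetVREnabledOnTargetGroup(BuildTargetGroup.Standalone, true);
        // "stereo" is assumed to be the device name of the
        // "Stereo Display (non head-mounted)" SDK.
        VREditor.SetVREnabledDevicesOnTargetGroup(
            BuildTargetGroup.Standalone, new[] { "stereo" });
    }
}
#endif
```

As noted in the thread, the effect is only visible in a built player on Windows 8.1+ with S3D enabled in the video driver, not in the editor itself.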
Thanks for your quick reply. I would be curious to understand how to restore S3D Direct3D 11.1 support via an editor script. I agree with you on the advantages of your system, but for certain types of installations it makes sense to use active stereo, so if you could create a branch with this functionality, I would be grateful. |
Ok, I will add the branch. |
Easier than I thought, thanks! |
I built the demo with D3D11.1 - https://drive.google.com/file/d/1GcSBspZIISI0UMH745li9e0Xp2iZbxyQ/view?usp=sharing |
Hi Vital, thank you for the excellent support; yesterday I was able to test the two demos you sent me on two different systems. System A: NVIDIA Quadro RTX 4000 - DirectX 12 - Stereo - Display Mode: nView Clone mode. The two demos work correctly on both systems, and I confirm that the linear version is washed out. After that, I created a tiny demo project from scratch to check how it behaves with native stereo support in Unity 2020.3.0f1, simply by re-enabling stereo 3D support with the script you kindly passed me and nothing else. Result: the build goes into stereo mode only on System B. I suspect the problem may be the Nvidia Clone Mode, but then I wonder why your build is OK on that system. Maybe I'm missing something? N.B. Unity 2019 with Stereo (Non HMD) still works on both systems |
Hi XR-Jaco, |
Hi Vital-Volkov, thanks for sharing the D3D11.1 branch. I know this issue is not related to your library, but I am reporting it for completeness. |
Hi XR-Jaco, |
I wonder if switching between the D3D stereo and Non HMD modes at runtime is possible? |
|
Hi there, I am new to Stereo3D, so apologies if this is a repeat question. I'm currently using an LTS version of Unity 2020 (2020.3.16f1). The editor script enables Stereo3D, since there is no option in XR Plugin Management in Player Settings like there was for 2019 and earlier. Would the Stereo3D script then be attached to the camera? Are there any other steps required to set up active stereo? Thanks in advance. |
Hi, |
Great! Thanks for the fast response and for producing this very useful repo! |
Hi Vital, First of all, thank you for publishing this great repo. I am a Robotics Ph.D. student, and I have 2 questions for you: 1) Can I use your work for a scientific publication? How do I credit your work - do you have preferences on how I should cite it? 2) I have tested Stereo3D on Windows with a GeForce GTX 980 Ti with Unity 2019 and a Sony 3D TV with active glasses. However, I need to run it with the same hardware on Unity 5.6.2 on Ubuntu 16.04 Linux; the reason is that I have a robot that interfaces with Unity. Thanks |
Hi naveed1366,
|
Dear Vital, Thank you so much for this very useful repo. It worked perfectly for an academic app I developed some time ago. My current project involves Unity's Cinemachine, an asset that manages multiple cams and eases working with them. The problem is that it does not use conventional cameras, but "virtual" ones. Thus, if I attach your script to all virtual cams it does not work, and if I attach it to the CinemachineBrain (the object that manages all virtual cameras), it works for a single cam but does not apply any transition to the other virtual cams. Have you tried your script with Cinemachine, or do you know a workaround? All the best! |
Hi gui-vasconcelos, I'm glad my repo is useful, thanks. I haven't used Cinemachine, but I can look at how it works and try to find a workaround. P.S. I found that Cinemachine works if you change the camera culling mask from Nothing (I set this for a 7% FPS boost in S3D mode) to Everything or Default |
Hi Vital. I've managed to 'bypass' using Cinemachine and now it is working fine. But thank you anyway. I'd like to ask you another question: when I close the Stereo3D window (pressing Tab) I lose my mouse pointer. Is there a way to avoid this? Thanks! |
Find "Cursor.visible = false;" and comment out (or remove) that line. Do the same with "Cursor.lockState = CursorLockMode.Locked;" to avoid locking the cursor. |
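As a sketch of the suggested change (the hosting class and method are hypothetical, not part of the repo): rather than deleting the two lines outright, they can be gated so the OS cursor is restored when the S3D settings window closes:

```csharp
using UnityEngine;

// Hypothetical helper: restore the cursor when the S3D settings window
// closes, instead of leaving it permanently hidden and locked.
public class CursorToggleSketch : MonoBehaviour
{
    public void OnSettingsWindowToggled(bool windowOpen)
    {
        // The two lines Vital mentions, made conditional: hide and lock
        // only while the settings window is open.
        Cursor.visible = !windowOpen ? true : false;
        Cursor.lockState = windowOpen ? CursorLockMode.Locked : CursorLockMode.None;
    }
}
```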
Thank you so much. Done! ;) |
Dear @Vital-Volkov, I'm using Unity 2021.14 LTS (locked to this version). I have confirmed that quad buffer stereo is working on my system with older applications; however, following your steps, I cannot get my project to activate quad buffer stereo (QBS) when loading. The enable-stereo boolean seems to be false in this version of Unity, and overriding it still doesn't render in quad buffer. Your demos above work perfectly well on the system (AMD quad buffer with 2x 4K displays). Would it be possible to grab the whole project so I can test a build out fully on my end? |
I don't even have the XR kit installed. Unity's built-in DirectX 11 S3D plugin (what you call QBS, I guess) is activated via the editor script file "VR_SDK_Enable_EditorMenu" in the "Editor" folder (as a result, a new "Vr SDK" menu must appear in the Unity Editor) by selecting "Vr SDK->Build with Stereo3D". And you can see DirectX 11 S3D working only in the built project (not in the Unity Editor), on Windows 8.1+ with S3D enabled in the Windows video driver settings. Do you see the "Vr SDK->Build with Stereo3D" menu in Unity 2021.14 LTS while you have the "VR_SDK_Enable_EditorMenu" file in the "Editor" folder? |
Hey, I do, yes. I select the Build With Stereo option, then I do a normal build and run it, but stereo doesn't seem to turn on. I have Windows S3D enabled, and other S3D applications like the demo work too. I can confirm that in Unity 2019 this works perfectly fine and builds for stereo; however, it doesn't work in Unity 2021, nor in Unity 2020 (we need C# 8). It seems it was removed entirely in 2020.2, which runs C# 8; 2020.1 still supports it but runs C# 7.3. This would explain it. |
Hi VRS3DGuru2! No problem, I'll make it ;) I've already made my S3D system for UE5, and I am now making the UI as a 3D widget attached to the camera, so I have an S3D UI with adjustable depth. After the release for UE5, I'll make the same update for Unity. When I tested the one-pass S3D method implemented in Unity, it did not work with the required camera separation (view difference) and only engaged at a specific low separation (too small a view difference), and I didn't see a performance boost anyway, so it needs more testing. If you need a sequential method, I can add it too. Best Regards |
Hi there, please reach out to me if someone would like to discuss contract work to make this happen. Our company has built novel hardware to bring active 3D back to displays and TVs. Please visit www.athanos.com to see what we are doing (there are plenty of videos that go over the details in the MEDIA section of the page). I have started looking over the Unity XR SDK and I believe it's possible to get things working again, but I would prefer to first validate and then possibly contract out the implementation to an expert in the field. You can contact me through the Contact page or reach out to me directly at peter@athanos.com Thank you! |
Hi everyone, thanks for all the interesting discussions, and thanks @Vital-Volkov for this great repository! If I had to re-enable a frame-sequential S3D method, with the aim of getting shutter glasses synced directly from Unity, this is where I would start my investigation: this plugin is for Nvidia 3D Vision, but it still gives us an overview of how to implement S3D using a native plug-in. I would try to implement a native plugin using D3D code, and then I would try to implement a working Unity XR SDK version. What do you think? Emmanuel |
Hi Emmanuel, this makes sense to me. I need to look deeper into the implementation of the Nvidia 3D Vision plug-in. My 'ideal' XR plug-in would do the following: it would integrate perfectly into any Unity project pipeline, and when turned on, it would handle the buffer presentations at the frequency of the display. Much like time-warp requirements in VR, there must be intervention at the display frequency to make a decision: do I show the previous L/R pair because the new L/R pair hasn't come in yet due to a CPU/GPU stall? It would also require some extra data attached to submitted buffers that can be passed down from Unity, mainly left or right eye as well as a frame number (so something like L1, R1, L2, R2, etc.). Having a solid framework where experimentation can take place at display frequency, along with optimized blits of buffers (if required; it would be better to do the target buffer flips in GPU memory if possible), would be a really good start. I'm assuming full-frame submission here at this time, where I am rendering left on one frame and right on the following frame at full resolution. Down the road, using FFSBS or TAB modes and converting them to the proper output (for active, passive, or glasses-free) should be considered. Best, |
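The L1/R1/L2/R2 tagging idea above could be modeled as plain metadata attached to each submitted buffer. A hedged sketch (all names are hypothetical, not part of any real XR SDK):

```csharp
public enum Eye { Left, Right }

// Hypothetical metadata attached to each submitted buffer, so a presenter
// running at display frequency can decide what to show on each refresh.
public struct SubmittedFrame
{
    public Eye eye;
    public int pairNumber;   // L1/R1 share pairNumber 1, L2/R2 share 2, etc.
}

public static class PairLogic
{
    // The newest *complete* pair is bounded by whichever eye is behind:
    // if left frame 3 but only right frame 2 have arrived, pair 2 is the
    // newest one that can be shown without mismatching the eyes.
    public static int NewestCompletePair(int newestLeft, int newestRight)
        => newestLeft < newestRight ? newestLeft : newestRight;
}
```

On a stall, the presenter would keep repeating the last complete pair rather than present a half-updated one.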
Hi Peter! I explored your videos. I also made tracking for my S3D system in 2015 with TrackIR5, to be able to freely tilt the head (circular polarized glasses allow this, but give maximum purple ghosting at 90 degrees and are clear again at 180 degrees) and move relative to the screen - https://youtu.be/zYCOV7fKqrI TrackIR5 is not precise and has lag, but now, with HMD tracking quality, such a function is a must-have)) So you can play non-VR games in S3D on a virtual screen in VR and see the 3D world like the real world through a window, with respect to movement relative to the window. As I understand from your media, regular OLED displays are good for sequential output, but black frame insertion is required to minimize crosstalk ghosting. But what is required from the software beyond switching the left-right image each frame? Ideally, active glasses should receive info about which frame is left, but why do you need the frame number? |
Hi ebadier, thanks! Yes, desktop S3D must live on, and it has absolutely precise geometry, unlike the VR lens distortions I can see in the Quest 2)). |
Hi Vital, thank you for the message. I will admit that I don't know the best approach to implementing a quad buffer in a software-only solution. My guess is that a brute-force method of tagging buffers as they come in can help differentiate which ones should be used at any time. If there is major lag from rendering and the frame rate drops from, say, 60 to 15, there may be logic that indicates that the previous (back buffer) frames are 2 frames behind. I do see value in adding extra data to each buffer that allows passing info between the main thread and the graphics thread. This would help in debugging as well. Quick question for everyone: using the Unity XR SDK, does the graphics thread get a chance to run code at the refresh rate of the monitor, or is there a chance that a processing frame can be skipped due to a GPU stall? When I was at Oculus (early days), we were just starting to explore and implement time warp for dropped frames, but an active stereo display has the requirement that it must be fed an L or R frame on every single frame. A stall on a monolithic L/R frame in VR is subjectively OK since it won't cause the eyes to cross, but it will cause judder. Finally, I don't know if the 'compositor' concept in VR is a separate app (like a Spout receiver, for example), a separate dll that communicates within the app alongside the XR plugin, or really just an XR plugin itself. I would hope that 'compositor'-like functionality can be built into an XR plugin, provided the GPU thread keeps up with the display frequency. |
Thanks Peter! Thanks for the info, VRS3DGuru2! So we need to make a universal S3D plugin that developers can build into projects, as independent as possible from 3D driver software - this is the best and highest-performance way to do S3D. |
Hi VRS3DGuru and Vital, thank you both for the detailed responses! Currently, I encode the information about which frame is left and which is right directly onto the display buffer (in Unity, I render it in OnGUI on the main camera to ensure that it's the last thing written into the buffer). I use 2 cameras, one for left and one for right, and every other frame I shut down the previous camera; I also ensure that I step the main code logic through only once per pair of cameras. I cannot guarantee that anything time-related which should run for both left and right at the same time is synced 100% (i.e. physics, animation, etc.), but I have enough checks and failsafes in place with my demos to ensure camera movement is perfectly synced. The SYNC device reads directly off the display and keeps the active glasses in sync, even if there are frame drops (the glasses eventually self-correct). However, the issue of frame dropping is the first problem I want to tackle, so I propose a quick solution to ensure that we can feed the SYNC device a steady signal to mitigate the frame drops at the expense of temporal fidelity. Instead of fully developing a quad buffer, I would like to ensure I have a 'previous' left or right buffer in place that can be switched to, even if it's not up to date. The outcome is that there should never be a frame drop that de-syncs the glasses, at the expense of a temporal shift due to the fact that both eyes are potentially not seeing in sync (e.g. the left eye is on frame 2 while the right eye is on frame 3). However, utilizing the reprojection capabilities of the XR SDK and building a simple solution like this should validate whether true quad buffering can be done. Also, this solution is built upon an agnostic hardware synchronization platform being developed at Athanos, and should stay as detached as possible from anything tied to a specific API or GPU architecture.
I am compiling the Unity demos with DX11 at this time, which I feel is a stable and generic enough Windows-based graphics API, and would like to continue using it. What do you think? |
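The scheme described above (two cameras alternating each frame, with an L/R tag drawn last via OnGUI so the SYNC hardware can read eye identity straight off the display) might be sketched like this; class, field, and marker details are hypothetical, not the author's actual code:

```csharp
using UnityEngine;

// Sketch of the frame-sequential scheme: two cameras, only one enabled
// per rendered frame, plus an L/R tag drawn in OnGUI so it is the last
// thing written into the back buffer each frame.
public class SequentialStereoSketch : MonoBehaviour
{
    public Camera leftEye;
    public Camera rightEye;
    bool showLeft = true;

    void Update()
    {
        // Enable exactly one eye camera for this frame, then flip the
        // parity for the next frame.
        leftEye.enabled = showLeft;
        rightEye.enabled = !showLeft;
        showLeft = !showLeft;
    }

    void OnGUI()
    {
        // Drawn after scene rendering; a hardware sync reader can key off
        // this marker to tell which eye the current frame belongs to.
        GUI.Label(new Rect(0, 0, 40, 20), leftEye.enabled ? "L" : "R");
    }
}
```

Stepping game logic once per L/R pair (so both eyes see the same simulation state) would still need separate handling, as the comment above notes.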
> So I need to get an active S3D setup for testing all this before adding it to my plugin.
Hello Vital, because I am very interested in the development, I could provide free components for an active stereo setup. Are you interested? |
Hi Adrian, of course I am interested. :) What are the components, and where are you from? |
Hi Peter, thank you too for the info! |
Hi, I would have intuitively chosen OpenGL, as it was the working solution until recently and it could also work on Linux.
I could not find any indication that URP/HDRP are incompatible with OpenGL - or are you referring to DX-specific raytracing features? |
@VRS3DGuru2 do you have a source for the DX12 stereo support? I could not find anything about it. And what do you mean by "Nvidia has not leveraged it"? Does it mean you can somehow force Nvidia cards to work with it, or that it does not work at all? |
I searched Google, Bing, and even Bing's AI for DX12 quad buffer, and the only thing any of the search engines finds is this exact thread.
What kind of genlock would you use for this? Are you talking about G-Sync / FreeSync? And would that work with multiple monitors / projectors? |
Does anyone have any other information about how stereo quad buffers work with DX12, or with Unity? I'm trying to investigate the viability of accomplishing something similar with renderer features for URP/HDRP, but would vastly prefer to hook into a lower-level method if possible. There seems to be very little public knowledge on this topic. |
Hi! As I understand it, quad buffer is supported only on certain devices, like Quadro GPUs and certain monitors. We should avoid any such restriction if possible, so now I am thinking about how to manually draw render textures (a custom quad buffer) on each refresh of the monitor, independent of GPU FPS; if we find a solution, then active shutter stereoscopy will work on any 3D API, any GPU, and any monitor. I am now searching the net for code examples of shutter output methods, to see how this was implemented before NVIDIA 3D Vision and quad buffer existed at all. |
The first 3D shutter glasses only worked with a graphics card driver suitable for stereo output. We used Elsa shutter glasses back in 1999. However, those glasses already ran on NVidia Quadro drivers. Have a look at this page: http://www.stereo3d.com/revelator.htm |
I also have edimensional 3D glasses somewhere :) https://www.guru3d.com/page/edimensional-e-d-3d-glasses/ |
Hi Peter! I found that with offscreen rendering to a texture in Unity via a coroutine, even when the main camera is disabled, the FPS of a black screen (rendering nothing) in the Unity player drops low because of the heavy offscreen rendering to texture. The problem is that the render process in Unity cannot be parallelized or detached from the main camera so as to present something every screen refresh: only when all cameras in the scene finish rendering can anything be shown on screen - even just clearing the screen with a color waits for all cameras to complete. So I see the solution as lightweight rendering in the main thread of only a screen quad with an S3D shader, which just composes the output methods from already-rendered textures. This lightest operation guarantees showing the left or right image from the last completed textures (2 buffers of 4) at each screen refresh, independent of the rendering process into the current textures (the other 2 buffers of 4) on a separate offscreen thread. When rendering is complete, just switch the pointer to the 2 newly completed textures in memory and begin rendering into the old 2 - or even an additional 2, to avoid rendering and presenting waiting on each other. So you have a custom-made quad (or six) buffer like this. :) |
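A hedged sketch of this "custom quad buffer" idea in Unity terms (all names hypothetical; a real version would need the offscreen rendering and the pointer swap to be properly synchronized, which this sketch does not attempt):

```csharp
using UnityEngine;

// Sketch of the custom quad buffer: cameras render offscreen into a back
// pair of RenderTextures while the cheap per-refresh present step only
// blits from the last *completed* pair, switched by a pointer swap.
public class QuadBufferSketch : MonoBehaviour
{
    public RenderTexture[] left = new RenderTexture[2];   // [front, back]
    public RenderTexture[] right = new RenderTexture[2];
    int front;                 // index of the completed, presentable pair
    bool showLeft = true;

    // Called by the offscreen render loop when the back pair is finished:
    // the freshly rendered pair becomes the front (presentable) pair.
    public void OnPairComplete() => front = 1 - front;

    void OnRenderImage(RenderTexture src, RenderTexture dst)
    {
        // Lightest possible present: one blit from an already-finished
        // texture, so the screen can flip eyes every refresh even while
        // the next pair is still being rendered offscreen.
        Graphics.Blit(showLeft ? left[front] : right[front], dst);
        showLeft = !showLeft;
    }
}
```

Adding a third pair, as suggested above, would turn this into a triple-buffered variant where the renderer never waits on the presenter.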
Hello Vital, thank you for the informative message. I have not yet solved this task and am still wanting to. Your solution sounds very promising. Is it possible to code this up quickly to verify that it works? I can test on my end, and if it works, I would be happy to send you a SYNC device so you could get it working on your end. I'm currently eyeing the following display monitor: This monitor apparently has black frame insertion in firmware for 120Hz refresh. Since it's a 240Hz monitor, it basically slots the black frame in on its own, which is great news since it will deterministically show a black frame even if the frame rate slows down on the compute side. It's supposed to go on sale tomorrow (fingers crossed); I am going to purchase it to test it out. Between the SYNC device, firmware BFI, and finally solving the quad buffer problem in Unity, we may see S3D become a thing again soon! Please let me know if it's possible to test the solution you have outlined above. Thank you!! |
Hello Peter, thank you too! |
Hi! I found this "To continue using quad-buffered stereo, developers must switch to the Microsoft native (DXGI) stereo APIs." and I'll look into it. |
Hi! I am now trying to get active shutter S3D working on my passive monitor, using an EDID override from the active LG W2363D monitor, for testing and further development. |
Hi @Vital-Volkov, and thanks for the amazing plugin! We are trying to port our HMD app to also work with stereo 3D displays (such as powerwalls). We have confirmed that your build (as seen here) works:
Here is my problem: whenever we try to make even a simple app with your system enabled, the checkbox for native stereo ("DirectX 11.1 S3D") can't be checked, just as if the driver didn't support it. I am sure this is just a configuration problem in our project, but the settings have changed compared to when you wrote the readme, and I can't find what I'm missing. We are currently on Unity 2021.3 and can't move backwards. Here's what we have checked:
Can you nudge me in the right direction? |
Hi Boris, thanks! I just built 2 old branches with the D3D11 S3D Unity native plugin and confirmed that D3D11 S3D is not working. I just added the default App Package to https://github.com/Vital-Volkov/Remake-of-Direct3D11-native-stereoscopic-sample so you can install it (first uninstall the remake) and run it. I think you'll get the message "Stereo 3D is not enabled on your system" in the top left corner of the app, but it should work with driver 425.31. My remake works with 452.06 because it ignores the buggy checks for D3D11 S3D availability and just forces use of the S3D functionality of the NV driver. Anyway, I'll soon add D3D11 S3D directly to my Unity S3D system as a native plugin, and it will work directly, the same as the UWP app remake. |
I'm hesitant to chime in here because I don't fully understand the conversation, but I think I can possibly add some info that may help you all. Our modding group has used NVidia 3D Vision for years, and we continue to use it today, even though it's a canceled project. (HelixModBlog) Apologies in advance if this is not relevant. The 3D Vision driver can still run on current drivers, but is not normally installed now because the product is defunct. However, we know that the DX9 code path still works in modern drivers if we force-install the 3D Vision driver. We can still play DX9 3D games and use stereo photo viewers on 3D Vision hardware. The DX11 code path was destroyed when they released the 3xxx series, and the last working driver for that path is 452.06. I don't understand the relationship of quad-buffered OpenGL to the drivers. I know it still exists for their professional-series Quadro cards, but I don't know how that translates to consumer drivers. We know that some old games like Doom still work, which strongly suggests quad-buffered is still available. Also, I believe they want to maintain HDMI compatibility for 3D. It's possible to force-install the 3D Vision driver itself, and I wrote a tool to do it fully automatically after driver updates. See: https://github.com/bo3b/3DV_Installer. You can also use 3DFM or HelixVision as suggested above, but I thought the source code could potentially be helpful. That tool changes the versions and calls the 3DV driver to install on any driver. It will then show up in the NVidia control panel as well, and allow you to enable/disable 3D Vision directly. As I note above, this may not be helpful, because you may not care about 3D Vision specifically. Once installed, the 3D Vision driver will show 3D TV Play if it finds hardware on its whitelist of old 3D TVs or projectors. Spoofing the EDID can let it see whatever you want.
Without any EDID, you can always get CRT mode, which allows red/blue 3D Vision Discover. |
I added the D3D11 output method as a native rendering plugin for quick testing. |
Thanks bo3b! |
Thanks @Vital-Volkov. We'll try that as soon as someone has access to the stereo setup again! |
Hey @Vital-Volkov I just wanted to confirm that your recent update does indeed work, both on our setup and the client's! Thank you again! |
This package is fantastic; it would be great if it could support stereo quad buffer for active stereoscopic systems (CAVE, virtual wall, etc.). Unfortunately, Unity has removed support for Stereo (Non HMD) systems in 2020+, and it appears that it is required to go through the XR Plugin Management system in order to rebuild that functionality. What do you think about the matter?