
Windowless operation / DirectX integration #20

nnemkin opened this issue May 31, 2019 · 16 comments


nnemkin commented May 31, 2019

Please provide an API that doesn't require an HWND parent, something that integrates well with Direct2D, Direct3D and DirectComposition.

That is, provide (or consume) a DXGI surface that I can display with the above-mentioned APIs, plus a way to feed input to the WebView and a notification mechanism for visual updates.

AB#28491736
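
To make the ask concrete, here is a purely hypothetical sketch of the requested shape. None of these interface or method names exist in WebView2; they only illustrate "surface out, input in, invalidation callback".

```cpp
// Purely hypothetical interfaces, for illustration only (not WebView2 API).
// The idea: the host gets a DXGI surface to compose, pushes input in,
// and is notified when content changes.
#include <windows.h>
#include <dxgi.h>

struct IWebViewWindowlessHost   // hypothetical: implemented by the host app
{
    // Called by the WebView when part of the surface needs repainting.
    virtual void OnInvalidate(const RECT& dirtyRect) = 0;
};

struct IWebViewWindowless       // hypothetical: implemented by the WebView
{
    // Surface the WebView renders into; the host displays it via D2D/D3D/DComp.
    virtual HRESULT GetSurface(IDXGISurface** surface) = 0;
    // Forward mouse/keyboard input without a real HWND.
    virtual HRESULT SendInput(UINT msg, WPARAM wParam, LPARAM lParam) = 0;
};
```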

@david-risney reopened this May 31, 2019
@david-risney added the feature request label May 31, 2019
@david-risney (Contributor)

(Accidentally closed. Reopening)
Thanks, great feature suggestion! We're working on this. Do you have any specific ideas about how you would most like the API to look and why? Thanks!


nnemkin commented Jun 12, 2019

RichEdit has a windowless D2D mode; something similar would be nice: a combination of ITextServices2::TxDrawD2D, ITextServices::TxSendMessage and ITextHost::TxInvalidateRect (plus whatever other host methods you find necessary).

Instead of (or in addition to) drawing to a D2D context, provide a method to draw into a user-specified DXGI surface, which can be shared with both D2D and D3D.

A simple but limited option is to provide a windowless swap chain that can be plugged into DirectComposition.
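
For context, the windowless RichEdit pattern referenced above looks roughly like this. It is a sketch only: it assumes an ITextServices2* already obtained via CreateTextServices against a custom ITextHost implementation, plus a live ID2D1RenderTarget.

```cpp
// Sketch of the windowless RichEdit host pattern (not a WebView2 API).
// Assumes `services` came from CreateTextServices() against a custom
// ITextHost implementation, and `target` is a live Direct2D render target.
#include <windows.h>
#include <d2d1.h>
#include <richedit.h>
#include <textserv.h>

void PaintAndClick(ITextServices2* services, ID2D1RenderTarget* target,
                   const RECTL& bounds, POINT click)
{
    // The host draws the control into its own Direct2D target...
    services->TxDrawD2D(target, &bounds, nullptr, TXTVIEW_ACTIVE);

    // ...and forwards input as window messages, with no HWND involved.
    LRESULT result = 0;
    LPARAM pos = MAKELPARAM(click.x, click.y);
    services->TxSendMessage(WM_LBUTTONDOWN, MK_LBUTTON, pos, &result);
    services->TxSendMessage(WM_LBUTTONUP, 0, pos, &result);
}
// In the other direction, RichEdit calls the host's ITextHost::TxInvalidateRect
// implementation to say "this region needs repainting".
```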


pabloko commented Sep 25, 2019

Looking into this one: an offscreen rendering mode should be a must-have feature of the component.
Real offscreen support, in addition to reliable framebuffer access, should also include access to raw audio and interfaces for proxying input events, as in the CEF project (which shares the Chromium codebase); it's worth reading the issue where Chromium switched rendering to Viz.
As @nnemkin points out, hooking into the swap chain would allow a GPU-only solution, but I'm afraid this isn't so easy since that lives in another process.

@david-risney (Contributor)

Thanks for the comment!

We're currently working on a DirectComposition solution where the host app can provide a visual that the WebView content will be rendered into and with explicit APIs for input.

We don't plan on a D2D, DXGI surface, or DXGI swap chain based solution, but let us know if there are scenarios a visual-based DirectComposition solution doesn't cover for you that would be solved by D2D/DXGI.

Can you elaborate on the raw audio comment? Our current plan is for the WebView to produce audio directly without help from the host app.
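
For readers arriving later: the visual-hosting approach described here is what eventually shipped as the WebView2 CompositionController. A minimal sketch, assuming an ICoreWebView2Environment3* (env), a host HWND used for input purposes, and an IDCompositionVisual* the app composes itself (lifetime and error handling omitted):

```cpp
// Sketch of WebView2 visual hosting via the CompositionController (WRL).
// Assumes `env` is ICoreWebView2Environment3*, `hwnd` is the host window,
// and `visual` is an IDCompositionVisual* the app composes itself.
#include <windows.h>
#include <dcomp.h>
#include <wrl.h>
#include <WebView2.h>
using Microsoft::WRL::Callback;

void CreateVisualHostedWebView(ICoreWebView2Environment3* env,
                               HWND hwnd, IDCompositionVisual* visual)
{
    env->CreateCoreWebView2CompositionController(hwnd,
        Callback<ICoreWebView2CreateCoreWebView2CompositionControllerCompletedHandler>(
            [visual](HRESULT hr, ICoreWebView2CompositionController* controller) -> HRESULT
            {
                if (FAILED(hr)) return hr;

                // WebView2 renders into the visual tree rooted here; the app
                // decides where that visual sits in its own composition.
                controller->put_RootVisualTarget(visual);

                // Input is forwarded explicitly, e.g. a mouse move:
                POINT pt = {100, 100};
                controller->SendMouseInput(COREWEBVIEW2_MOUSE_EVENT_KIND_MOVE,
                                           COREWEBVIEW2_MOUSE_EVENT_VIRTUAL_KEYS_NONE,
                                           0, pt);
                return S_OK;
            }).Get());
}
```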


pabloko commented Sep 28, 2019

Well, our use case is to render the WebView over a live video feed for graphics overlay production.
I think that explains why I would need access to the raw audio/video in some manner, instead of drawing to a window and playing through the default device. Many thanks for your comments.


codecat commented Jun 11, 2020

Has there been any significant progress in this area since the last few comments? I'd love to get a WebView rendered on top of a game for its UI. Plugging it directly into D3D would be a really good solution.

@llcepick

For the API surface: the ability to ask for a 32-bit RGBA-like surface to render to would allow post-render composition, so pages rendered with transparent backgrounds could be composed onto any other surface type. Support for higher bit depths with HDR rendering would also be beneficial going forward, as more and more screens get better-than-10-bit color support. Even extending the CapturePreview API with a flag for PNG output would be good enough for many of these cases, since the rendering typically doesn't need to be 'realtime'.

Ideally, this should also work if the control is only showing an SVG (instead of a page), with the alpha channel rendering preserved.
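
For reference, CapturePreview already takes an image-format argument (PNG or JPEG). A minimal sketch of that path, assuming `webview` is a valid ICoreWebView2*:

```cpp
// Sketch: capture the current WebView2 view as a PNG into a memory stream.
// Assumes `webview` is a valid ICoreWebView2*. Not a real-time path.
#include <windows.h>
#include <shlwapi.h>   // SHCreateMemStream
#include <wrl.h>
#include <WebView2.h>
using Microsoft::WRL::Callback;
using Microsoft::WRL::ComPtr;

void CapturePng(ICoreWebView2* webview)
{
    ComPtr<IStream> stream;
    stream.Attach(SHCreateMemStream(nullptr, 0));

    webview->CapturePreview(
        COREWEBVIEW2_CAPTURE_PREVIEW_IMAGE_FORMAT_PNG, stream.Get(),
        Callback<ICoreWebView2CapturePreviewCompletedHandler>(
            [stream](HRESULT hr) -> HRESULT
            {
                // On success, `stream` now holds the encoded PNG bytes.
                return hr;
            }).Get());
}
```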

@douglas-jordan

Rendering into a DC would be great for my usage.

@philomathikus

Do you guys have an ETA on this? Btw, could we make it work with a swap chain already?

@jnschulze

Any update on this?
The only workaround I currently see is using GraphicsCaptureItem.CreateFromVisual(my_webview_visual) to obtain frames on recent Windows 10 versions. While this works, I don't think the capture API is intended for that, and it keeps producing frames even if nothing is actually drawn.
So please provide a way to directly obtain a DXGI surface handle, as CEF's CefRenderHandler.OnAcceleratedPaint does.
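
For reference, the capture workaround looks roughly like this in C++/WinRT. The sketch assumes `visual` is the Windows.UI.Composition visual hosting the WebView content and `device` is an IDirect3DDevice wrapper created with CreateDirect3D11DeviceFromDXGIDevice.

```cpp
// Sketch of the Windows.Graphics.Capture workaround (C++/WinRT).
// Assumes `visual` hosts the WebView content and `device` is an
// IDirect3DDevice created via CreateDirect3D11DeviceFromDXGIDevice.
#include <winrt/Windows.Foundation.h>
#include <winrt/Windows.Graphics.Capture.h>
#include <winrt/Windows.Graphics.DirectX.h>
#include <winrt/Windows.Graphics.DirectX.Direct3D11.h>
#include <winrt/Windows.UI.Composition.h>
using namespace winrt;
using namespace Windows::Graphics::Capture;
using namespace Windows::Graphics::DirectX;

GraphicsCaptureSession StartCapturingVisual(
    Windows::UI::Composition::Visual const& visual,
    Direct3D11::IDirect3DDevice const& device)
{
    auto item = GraphicsCaptureItem::CreateFromVisual(visual);
    auto framePool = Direct3D11CaptureFramePool::CreateFreeThreaded(
        device, DirectXPixelFormat::B8G8R8A8UIntNormalized,
        2 /* buffers */, item.Size());

    framePool.FrameArrived([](Direct3D11CaptureFramePool const& pool, auto&&)
    {
        // frame.Surface() is backed by a D3D11 texture; grab the native
        // interface through IDirect3DDxgiInterfaceAccess to keep it on the GPU.
        auto frame = pool.TryGetNextFrame();
    });

    auto session = framePool.CreateCaptureSession(item);
    session.StartCapture();
    return session;
}
```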


blaind commented Feb 5, 2022

I've been experimenting with this (no success). Learnings so far:

  • The CreateFromVisual method requires a WPF application and does not work with HWNDs (based on the documentation).
  • BitBlt and AlphaBlend do not work when using any of the WebView2 windows as a source. This is because WebView2 is rendered using GPU composition, and may be because the WebView window has the WS_EX_NOREDIRECTIONBITMAP style set.
  • It is not possible to toggle WS_EX_NOREDIRECTIONBITMAP off with SetWindowLongPtr after the window has been created. The flag also doesn't appear to be toggleable in Chromium (see https://chromium.googlesource.com/chromium/src/+/refs/heads/main/ui/gl/child_window_win.cc).
  • Reparenting the WebView window to the main window does not work.
  • Disabling Chromium compositing or the GPU does not change the situation (--disable-gpu, --disable-composition).
  • WebView2.CapturePreview works, but it produces PNG/JPEG instead of raw bitmaps and is very slow.
  • According to some sources it is possible to capture the whole screen and crop the WebView's area, but that doesn't work if the WebView is rendered off-screen (or to a hidden window).
  • GraphicsCaptureItem.CreateForWindow crashes if given any of the WebView2 HWND handles, and using it on the main window (into which WebView2 is initialized) does not produce anything valid.

I'm not sure whether some valid combination of WS_EX_COMPOSITED / WS_EX_TRANSPARENT / WS_EX_LAYERED and a BitBlt could work (e.g. is it possible to get the parent window's redirection bitmap to receive the output of the underlying composited WebView2 window?).

(I had no experience with the Win32 APIs before this, so there's a reasonable chance I made a mistake somewhere and some of the above could in fact have worked... anyway, hopefully this helps anyone else investigating.)
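
For anyone repeating these experiments, the style check mentioned above amounts to something like this plain-Win32 sketch, where `webviewHwnd` is assumed to be one of the WebView2/Chromium child HWNDs (found with EnumChildWindows or Spy++):

```cpp
// Sketch: check whether a window was created with WS_EX_NOREDIRECTIONBITMAP,
// which the list above suggests is why BitBlt/AlphaBlend readback fails.
// `webviewHwnd` is assumed to be one of the WebView2/Chromium child HWNDs.
#include <windows.h>
#include <cstdio>

void ReportRedirectionBitmap(HWND webviewHwnd)
{
    LONG_PTR exStyle = GetWindowLongPtrW(webviewHwnd, GWL_EXSTYLE);
    if (exStyle & WS_EX_NOREDIRECTIONBITMAP)
    {
        // Per the findings above, clearing this bit with SetWindowLongPtr
        // after creation does not help; the window simply has no
        // redirection surface for GDI to read back.
        std::printf("WS_EX_NOREDIRECTIONBITMAP is set on %p\n",
                    static_cast<void*>(webviewHwnd));
    }
}
```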

@jnschulze

  • The CreateFromVisual method requires a WPF application and does not work with HWNDs (based on the documentation).

I use GraphicsCaptureItem.CreateFromVisual in the Flutter webview_windows plugin and it works fine with low overhead. However, it doesn't support DRM'ed content.

@champnic added the tracked label May 10, 2022
@honzapatCZ

A DXGI surface might be too limiting. A bitmap, as Electron provides, would be preferable in my opinion.


fredemmott commented Mar 8, 2023

Please provide an API that doesn't require a HWND parent

Passing HWND_MESSAGE gets this part, as long as you don't want to render live at all (perhaps an HTML -> PDF converter would work with this?), but...

something that integrates well with Direct2D, Direct3D and DirectComposition.

... not this.
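
A minimal sketch of the HWND_MESSAGE approach, assuming the standard WebView2 Win32 loader entry point CreateCoreWebView2EnvironmentWithOptions; the WebView has no on-screen presence, so this only suits non-interactive work such as navigating and printing/capturing:

```cpp
// Sketch: host WebView2 under a message-only window (no on-screen rendering).
// Uses the standard WebView2 Win32 loader; error handling omitted.
#include <windows.h>
#include <wrl.h>
#include <WebView2.h>
using Microsoft::WRL::Callback;

void CreateHeadlessWebView()
{
    // A message-only window: it has an HWND but is never shown or composited.
    HWND msgWindow = CreateWindowExW(0, L"STATIC", L"wv2-host", 0,
                                     0, 0, 0, 0, HWND_MESSAGE,
                                     nullptr, nullptr, nullptr);

    CreateCoreWebView2EnvironmentWithOptions(nullptr, nullptr, nullptr,
        Callback<ICoreWebView2CreateCoreWebView2EnvironmentCompletedHandler>(
            [msgWindow](HRESULT hr, ICoreWebView2Environment* env) -> HRESULT
            {
                if (FAILED(hr)) return hr;
                env->CreateCoreWebView2Controller(msgWindow,
                    Callback<ICoreWebView2CreateCoreWebView2ControllerCompletedHandler>(
                        [](HRESULT hr, ICoreWebView2Controller* controller) -> HRESULT
                        {
                            // From here you can get the ICoreWebView2, Navigate(),
                            // print or capture, but nothing reaches the screen.
                            return hr;
                        }).Get());
                return S_OK;
            }).Get());
}
```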

To add another use case, https://github.com/OpenKneeboard/OpenKneeboard renders in VR as a compositor layer; SteamVR, OpenXR, Oculus, and Windows Holographic compositor layers (e.g. Windows.Graphics.Holographic.HolographicQuadLayer) do not have an HWND or any support for DirectComposition - just a Direct3D texture (or OpenGL or Vulkan, but there is existing interop there).


wegylexy commented Jun 8, 2023

My use case is similar to https://streamlabs.com/content-hub/post/introducing-browser-source-interaction-for-streamlabs-desktop in that it redirects video and audio for offscreen composition and supports virtual mouse/keyboard/touch/stylus/voice input in the background. Support for a transparent background is especially important for adding an overlay to a video composition using web technology. I want a way to enable DXGI capture and, if possible, an NDI source (with ARGB video and audio).
Audio redirection would enable further processing with custom effects, e.g. a custom ducking filter during a live stream, or spatial positioning in VR.

@fredemmott

  • The CreateFromVisual method requires a WPF application and does not work with HWNDs (based on the documentation).

I use GraphicsCaptureItem.CreateFromVisual in the Flutter webview_windows plugin and it works fine with low overhead. However, it doesn't support DRM'ed content.

I ended up taking the same approach but with C++/WinRT; outlined at #547 (comment)
