macos - How to use Electron sharedTextureHandle returned by offscreen rendering - Stack Overflow


I want to take an Electron window's updated (painted) texture and send it to the Syphon framework via a Node addon (my library: node-syphon).

TL;DR

Electron 33 offers an option for getting a shared texture from a window rendered offscreen:

// Javascript side.

const window = new BrowserWindow({
  // ...
  webPreferences: {
    offscreen: {
      useSharedTexture: true
    }
  }
});

// Then...

window.webContents.on('paint', (event: any) => {
  const tex = event.texture;

  if (tex) {
    const handle = tex.textureInfo.sharedTextureHandle; // What is it really and how to use it?
    let handleNumber;

    if (os.endianness() == 'LE') {
      handleNumber = handle.readInt32LE();
    } else {
      handleNumber = handle.readInt32BE();
    }
    
    // ... send to node addon...

    tex?.release();
  }
});
// NAPI side.
GLuint tex = info[0].As<Napi::Number>().Uint32Value(); // Is it really a texture?

What actually is this sharedTextureHandle, and how can it be used with OpenGL (or Metal)?

I tried to pass it directly to the Syphon server, but it resulted in a blank (white) texture, since the contexts are different, I guess.

I also tried, for testing, to get a uint8_t * buffer from it and write this buffer to a new texture, with no luck.

Resources:

  • Offscreen Rendering
  • OffscreenSharedTexture Object
  • 'paint' event
  • Discussion about this feature in the new version

Update 2025-02-03

The returned handle points to an IOSurfaceRef, as stated here, but I can't find a way to retrieve it correctly.

auto buffer = info[0].As<Napi::Buffer<uintptr_t>>();

// Crashes as it is not an IOSurfaceRef.
IOSurfaceRef surface = *reinterpret_cast<IOSurfaceRef*>(buffer.Data()); 

// Weird tinkering: appears to work — recognizes an IOSurface from... the Napi::Buffer itself, but with a bogus size (e.g. height is 1).
IOSurfaceRef surface = *reinterpret_cast<IOSurfaceRef*>(&buffer);

Previous attempt with getNativeWindowHandle() and NSView *

My first attempt was to get my window's native handle as an NSView * and render its layer into a pixel buffer. It 'kind of works', in that it correctly renders the background... but nothing more, as if the view is never updated.

// Javascript side.
const handle = win.getNativeWindowHandle();

// ... Send handle to node addon...
// NAPI side.
auto buffer = info[0].As<Napi::Buffer<uint8_t>>();
NSView * view = *reinterpret_cast<NSView **>(buffer.Data());

uint8_t * pixel_buffer = (uint8_t *)malloc(4 * view.bounds.size.width * view.bounds.size.height);

CGColorSpaceRef colourSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pixel_buffer, view.bounds.size.width, view.bounds.size.height, 8, 4 * view.bounds.size.width, colourSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);

CGColorSpaceRelease(colourSpace);

[view.layer renderInContext:context];

CGLSetCurrentContext([m_server context]);
CGLLockContext(CGLGetCurrentContext());

glGenTextures(1, &m_texture);

glEnable(GL_TEXTURE_RECTANGLE_ARB);
glBindTexture(GL_TEXTURE_RECTANGLE_ARB, m_texture);

glTexImage2D(GL_TEXTURE_RECTANGLE_ARB, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, pixel_buffer);

glTexParameteri(GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

glBindTexture(GL_TEXTURE_RECTANGLE_ARB, 0);
glDisable(GL_TEXTURE_RECTANGLE_ARB);

[m_server publishFrameTexture: m_texture 
                textureTarget: texture_target 
                  imageRegion: imageRegion 
            textureDimensions: texture_size 
                      flipped: flipped
];

glDeleteTextures(1, &m_texture);
CGContextRelease(context);
free(pixel_buffer);

CGLUnlockContext(CGLGetCurrentContext());
CGLSetCurrentContext(NULL);

Asked Feb 2 at 18:06 by Benoît Lahoz. Edited Feb 3 at 12:22.

1 Answer

After I posted this issue on Electron's GitHub, the creator of the feature wrote an example (this PR) that I was not far off from.

My problem was that I was passing the handle (a Buffer) to a worker, which clones the buffer into a Uint8Array; I was then converting it back to a Buffer, losing data in the process, I guess.

A piece of code that works, slightly adapted from the example:

auto buffer = info[0].As<Napi::Buffer<void**>>();

IOSurfaceRef io_surface = *reinterpret_cast<IOSurfaceRef*>(buffer.Data());
GLsizei width = (GLsizei)IOSurfaceGetWidth(io_surface);
GLsizei height = (GLsizei)IOSurfaceGetHeight(io_surface);

CGLSetCurrentContext(/** YOUR CGL CONTEXT **/);
CGLLockContext(CGLGetCurrentContext());

GLuint m_texture;
glGenTextures(1, &m_texture);
glEnable(GL_TEXTURE_RECTANGLE_ARB);
glBindTexture(GL_TEXTURE_RECTANGLE_ARB, m_texture);

CGLTexImageIOSurface2D(CGLGetCurrentContext(), GL_TEXTURE_RECTANGLE_ARB, GL_RGBA8, width, height, GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, io_surface, 0);

glTexParameteri(GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glBindTexture(GL_TEXTURE_RECTANGLE_ARB, 0);
  
// Here do something with the texture.

glDeleteTextures(1, &m_texture);
CGLUnlockContext(CGLGetCurrentContext());
CGLSetCurrentContext(NULL);

ThreeJs in Electron renderer to Syphon framework
