How can I create a live video stream in a SvelteKit app using a websocket as a source - Stack Overflow


I am working on a web-based robotics control panel, and one of the desired features is the ability to stream video data from an onboard camera. The problem I have found is that the data I receive from the rosbridge node for the camera is JPEGs in string form over a websocket, and I'm having trouble finding a good way of converting this data into a video.
The current solution that I have tested, to make sure the camera is working, is to update the src property of an <img /> tag with the string.

An example of what I'm trying to achieve is the Foxglove Studio image panel, but I can't find any open source code for it.
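For reference, a minimal sketch of the `<img>`-src approach described above, assuming rosbridge publishes `sensor_msgs/CompressedImage` messages with the JPEG bytes base64-encoded in the JSON `data` field. The topic name and websocket URL are examples only; adjust them to your setup.

```typescript
// Build a data URL for an <img> src from a base64-encoded JPEG string.
function toDataUrl(base64Jpeg: string): string {
  return `data:image/jpeg;base64,${base64Jpeg}`;
}

// Subscribe to an example camera topic via the rosbridge websocket and
// update the given <img> on every frame. Topic and URL are assumptions.
function streamToImg(
  img: HTMLImageElement,
  url = "ws://localhost:9090",
): WebSocket {
  const ws = new WebSocket(url);
  ws.onopen = () =>
    ws.send(JSON.stringify({
      op: "subscribe",
      topic: "/camera/image_raw/compressed", // example topic name
      type: "sensor_msgs/CompressedImage",
    }));
  ws.onmessage = (ev) => {
    const msg = JSON.parse(ev.data);
    if (msg.op === "publish") img.src = toDataUrl(msg.msg.data);
  };
  return ws;
}
```

This works for a quick check, but as the comments below note, it is really an MJPEG stream with no compression across frames, so bandwidth and latency suffer.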


asked Jan 15 at 7:24 by Huon Swales
  • See my answer here: stackoverflow.com/a/37475943/362536 – Brad Commented Jan 16 at 2:59
  • In any case, what you're getting right now is an MJPEG stream. If you must use this stream as-is, you're going to have to decode it and either output a MediaStream (developer.mozilla.org/en-US/docs/Web/API/VideoFrame) or render it to a canvas yourself. – Brad Commented Jan 16 at 3:02
  • Thanks for the response. I just found a solution that works well enough for what I need, which involves rendering to a canvas as you say. – Huon Swales Commented Jan 16 at 7:13
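A sketch of the canvas approach mentioned in the comments: decode each base64 JPEG into a Blob, then paint it with `createImageBitmap` and `drawImage`. The helper names here are illustrative, not from the original post.

```typescript
// Decode a base64 string into a JPEG Blob.
function base64ToJpegBlob(base64: string): Blob {
  const bytes = atob(base64);
  const buf = new Uint8Array(bytes.length);
  for (let i = 0; i < bytes.length; i++) buf[i] = bytes.charCodeAt(i);
  return new Blob([buf], { type: "image/jpeg" });
}

// Paint one frame onto a canvas, scaling to the canvas size.
async function drawFrame(
  canvas: HTMLCanvasElement,
  base64: string,
): Promise<void> {
  const bitmap = await createImageBitmap(base64ToJpegBlob(base64));
  const ctx = canvas.getContext("2d");
  if (!ctx) return;
  ctx.drawImage(bitmap, 0, 0, canvas.width, canvas.height);
  bitmap.close(); // release decoder resources between frames
}
```

Rendering to a canvas rather than swapping `<img>.src` also makes it easy to overlay annotations (crosshairs, bounding boxes) on top of the video later.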

1 Answer


You will want to use WebRTC for that. We wrote a long blog post explaining why it is the right choice for robotics. The key points: it gives you UDP transport (so when the network is bad, old frames can be dropped), H.264 compression, congestion control (reducing image quality when the network is bad), packet-loss mitigation, and no need for a VPN; and most of all, low latency (~200 ms). On the robot you will most likely want to use hardware-accelerated encoding, assuming your computer or SoC supports it (it had better).
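A hedged sketch of the browser receive side this answer recommends, assuming the robot runs some WebRTC sender (e.g. GStreamer's webrtcbin or aiortc) and that signaling (the offer/answer exchange) is handled by an app-specific callback, shown here as a placeholder parameter:

```typescript
// Receive a remote video track and attach it to a <video> element.
// `signal` is a hypothetical app-specific function that delivers our offer
// SDP to the robot and returns its answer SDP.
async function attachRemoteVideo(
  video: HTMLVideoElement,
  signal: (offerSdp: string) => Promise<string>,
): Promise<RTCPeerConnection> {
  const pc = new RTCPeerConnection();
  pc.addTransceiver("video", { direction: "recvonly" }); // receive-only
  pc.ontrack = (ev) => {
    video.srcObject = ev.streams[0];
  };
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  const answerSdp = await signal(offer.sdp ?? "");
  await pc.setRemoteDescription({ type: "answer", sdp: answerSdp });
  return pc;
}
```

ICE candidate exchange and TURN configuration are omitted here; in practice you will need both for connections that cross NATs.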

Here's the post: 5 Ways to Stream Video from Robots and Why You Should Use WebRTC
