Stream bitmaps from PC to RPi for display

As my project has grown in complexity, I am now considering switching from drawing and rendering my panel output directly on the Raspberry Pi to rendering a sequence of bitmaps on a server and streaming them to the Pi for display.

Possible benefits of this are:

  • The Pi is now only driving the display and its CPU burden is reduced.
  • I can use the C++ library for rendering instead of Python, and gain its performance benefits.
  • I can write the bitmap-drawing code in whichever language I like, rather than being limited to the languages the API supports.

My display is 64x128 and I'm aiming for 15-30 fps with low latency.

Before I start trying to mangle the image-viewer example into something that receives and decodes bitmap streams from the network, I wondered whether there is an easier, pre-tested way of doing this. My feeling is that the image-sequence pre-processor would be a good fit, so I imagine the dataflow looking something like this:

[Server] MyCanvasDrawingProg -> [pipe] -> RPI-RGB-LED-Matrix pre-processor -> [network] -> [Raspberry Pi] API image-viewer
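To make the dataflow concrete, here is a minimal sketch of the network leg: the server pushes raw 64x128 RGB frames over TCP with a 4-byte length prefix, and the Pi side reads them back. The function names (`send_frame`, `recv_frame`) and the framing scheme are my own illustration, not part of the rpi-rgb-led-matrix API.

```python
import socket
import struct

WIDTH, HEIGHT = 64, 128
FRAME_BYTES = WIDTH * HEIGHT * 3  # 24-bit RGB, one byte per channel

def send_frame(sock: socket.socket, rgb: bytes) -> None:
    """Send one frame: 4-byte big-endian length prefix, then raw RGB bytes."""
    assert len(rgb) == FRAME_BYTES, "frame must match the panel geometry"
    sock.sendall(struct.pack(">I", len(rgb)) + rgb)

def recv_exact(sock: socket.socket, n: int) -> bytes:
    """Read exactly n bytes, since TCP recv() may return short reads."""
    buf = bytearray()
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("stream closed mid-frame")
        buf.extend(chunk)
    return bytes(buf)

def recv_frame(sock: socket.socket) -> bytes:
    """Read one length-prefixed frame from the socket."""
    (length,) = struct.unpack(">I", recv_exact(sock, 4))
    return recv_exact(sock, length)
```

On the Pi side, each received buffer could then be copied into the matrix canvas (e.g. via the library's image APIs); at 64x128x3 bytes per frame, 30 fps is under 1 MB/s, which a wired Pi handles comfortably.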

I’m not looking for anyone to do my homework for me, just any suggestions or pointers in the right direction.

Thank you.

Have you already looked at GitHub - hzeller/flaschen-taschen: Noisebridge Flaschen Taschen display? It may be close to what you described.
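As I understand the flaschen-taschen wire format, each UDP packet carries one frame as a binary PPM (P6) image, and the server listens on UDP port 1337 by default. A hedged sketch of a sender under those assumptions (check the flaschen-taschen README for the authoritative protocol details, e.g. the optional offset footer):

```python
import socket

def make_ppm(width: int, height: int, rgb: bytes) -> bytes:
    """Wrap raw RGB bytes in a binary PPM (P6) header."""
    assert len(rgb) == width * height * 3
    header = f"P6\n{width} {height}\n255\n".encode("ascii")
    return header + rgb

def send_to_flaschen(host: str, rgb: bytes,
                     width: int = 64, height: int = 128,
                     port: int = 1337) -> None:
    """Fire one frame at a flaschen-taschen server as a single UDP datagram."""
    payload = make_ppm(width, height, rgb)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.sendto(payload, (host, port))
```

UDP means lost frames are simply skipped rather than stalling the stream, which suits a low-latency 15-30 fps display better than retransmission would.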

Hi

For a similar need, I split an image over the network and sent the pieces to multiple RPis to make one large LED display.

This is my blog post about the topic; maybe it can help you with your homework.
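The slicing idea above can be sketched as cropping one large framebuffer into per-Pi regions, so each Pi only receives the rectangle it drives. This helper is a hypothetical illustration of that step, not code from the blog post:

```python
def slice_frame(rgb: bytes, full_w: int, full_h: int,
                x: int, y: int, w: int, h: int) -> bytes:
    """Extract a w x h sub-rectangle from a full_w x full_h RGB buffer.

    Pixels are assumed row-major, 3 bytes (R, G, B) each.
    """
    assert 0 <= x and x + w <= full_w and 0 <= y and y + h <= full_h
    rows = []
    for row in range(y, y + h):
        start = (row * full_w + x) * 3
        rows.append(rgb[start:start + w * 3])
    return b"".join(rows)
```

The server would call this once per Pi per frame and send each slice to the matching address, so no single Pi ever sees the full-resolution image.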


@ccoenen
This is exactly the kind of thing I was looking for.

Thank you.
