How can I duplicate my phone's display onto a TFT screen using Processing and a Raspberry Pi?

edited May 2017 in Raspberry PI

Hi all!

I'm new around here. I have previous programming and electronics experience, but I'm new to Processing. I'm trying to find a way to display my phone's screen on my Raspberry Pi's display (I've installed Processing on the Pi), mainly so I can have a duplicate of what the phone's camera is showing. I've seen this done somewhere but don't know where to start. Can anyone give me some pointers? Any ideas? It can be wired or wireless; I don't think that matters too much at this stage.

Thank you very much in advance for any help. I'm excited to explore Processing in depth.

Cheers!

Emmanuel


Answers

  • Hi Emmanuel,

    How are the phone and the Raspberry Pi related to each other in what you are trying to do?

    You could add a camera to the Raspberry Pi itself and display its image - but I guess that you want the phone to be mobile and untethered, is that right?

    Best, gohai

  • Hi Gohai,

    That is correct.

    I'm using a Raspberry Pi since it's what I have available that offers a portable and relatively small solution. What I'm trying to do is have a real-time duplicate of what my phone is seeing on its camera and/or screen, shown on my Raspberry Pi's display.

    Hope that made sense. :D

  • btw... found this on YouTube; it might illustrate what I'm saying:

  • edited May 2017 Answer ✓

    For this you'll likely need to write two separate programs: one that runs on the phone, captures the camera data, and sends it over the network; and one on the Raspberry Pi that receives the data and displays it on its screen.
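    The two programs need to agree on a wire format for frames. A minimal sketch of one workable convention - a 4-byte length prefix followed by the compressed frame bytes over TCP - is shown below in plain Java; the class and method names are placeholders, not part of any library mentioned here:

    ```java
    import java.io.DataInputStream;
    import java.io.DataOutputStream;
    import java.net.ServerSocket;
    import java.net.Socket;
    import java.util.Arrays;

    // Minimal sketch of a length-prefixed frame protocol the two programs
    // could agree on: the phone writes [int length][frame bytes] per frame,
    // and the Pi reads frames back out. Demonstrated here over loopback.
    public class FrameProtocolDemo {

        // Sender side: write one frame (e.g. a JPEG-compressed camera image).
        static void sendFrame(DataOutputStream out, byte[] frame) throws Exception {
            out.writeInt(frame.length);  // 4-byte length prefix
            out.write(frame);            // frame payload
            out.flush();
        }

        // Receiver side: read one frame back.
        static byte[] receiveFrame(DataInputStream in) throws Exception {
            int len = in.readInt();
            byte[] frame = new byte[len];
            in.readFully(frame);
            return frame;
        }

        public static void main(String[] args) throws Exception {
            byte[] fakeFrame = {(byte) 0xFF, (byte) 0xD8, 1, 2, 3, (byte) 0xFF, (byte) 0xD9};

            try (ServerSocket server = new ServerSocket(0)) {  // the "Pi" listens
                Thread phone = new Thread(() -> {
                    try (Socket s = new Socket("127.0.0.1", server.getLocalPort())) {
                        sendFrame(new DataOutputStream(s.getOutputStream()), fakeFrame);
                    } catch (Exception e) {
                        throw new RuntimeException(e);
                    }
                });
                phone.start();

                try (Socket s = server.accept()) {
                    byte[] received = receiveFrame(new DataInputStream(s.getInputStream()));
                    System.out.println(Arrays.equals(received, fakeFrame));
                }
                phone.join();
            }
        }
    }
    ```

    The length prefix matters because TCP is a byte stream with no message boundaries; without it the receiver can't tell where one frame ends and the next begins.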

    I wasn't even aware that this is possible, but... the video you posted appears to make use of a technology called MHL, which is supported by some (but not all) Android phones (it isn't on mine). I am not sure whether it is necessary to install any additional software on the Raspberry Pi side, but the phone seems to show up as a capture device in GLCapture.list(), at which point it can be used like a regular camera in the library.
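    If the phone does show up there, using it from a sketch would look roughly like the GLVideo library's own capture examples. This is a sketch, not tested against an MHL phone, and it assumes the phone is the first device in the list:

    ```java
    // Processing sketch for the Raspberry Pi, using the GLVideo library.
    import gohai.glvideo.*;

    GLCapture video;

    void setup() {
      size(640, 480, P2D);
      String[] devices = GLCapture.list();
      println(devices);                      // check which device is the phone
      video = new GLCapture(this, devices[0]);  // assumption: phone is first
      video.start();
    }

    void draw() {
      if (video.available()) {
        video.read();
      }
      image(video, 0, 0, width, height);
    }
    ```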

    Be aware that the USB interface on all Raspberry Pis is particularly slow, which is most likely what caused the poor performance the author saw.

    If you have an MHL-compatible phone you might want to go the same route... if not, and you have an Android phone, you could try Processing's Android mode to write an application that captures the camera input and sends it on.

    On the Raspberry Pi you could make use of the TCP or UDP libraries for receiving data over the network (e.g. WiFi), or configure the GLVideo library to interpret the stream of data directly (see the GoProCapture example for how something like this was done for a GoPro).

    Hope this helps.

  • Thanks Gohai, it does help a lot!
