Open a camera preview inside a custom Layout (Android)

edited January 15 in Android Mode

Hi all, I'm creating a real-time app where the user can freely draw on the camera preview. But... I don't know how to open a camera preview in my app (for example, inside a rect).

Please help.

Answers

  • We need to see your code to understand your approach. Can you output your camera stream through your application?

    Kf

  • edited January 15

    Well, I don't have any approach yet. My idea is something like this:

    void settings(){
        fullScreen();
    }
    
    void draw(){
        // Draw a rect to contain my preview
        rect(50, 50, width*0.95, height*0.75);
    
        // Now I need to put an Android camera preview inside it.
        // But... HOW TO?
    }
    

    I tried something in Android Studio, but I'm forced to follow Google's programming model and create a SurfaceView object. And the result is a stretched preview image....

    So I need a hybrid... Google's native hardware camera controls plus Processing graphics (drawing shapes).

    Sorry for my bad English.

  • You should try the ketai library. There is an example that will work for you: http://ketai.org/examples/cameragettingstarted/

    You can install the library from Processing's library manager. However, make sure your Android mode is set up before you proceed any further: you should be able to run simple sketches on your Android device. You can also check previous posts about Android code here: https://forum.processing.org/two/categories/android or using the search engine: https://forum.processing.org/two/search?Page=p3&Search=ketai or https://forum.processing.org/two/search?Page=p3&Search=android
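
    For reference, the linked example boils down to roughly this minimal sketch (written from memory of the ketai API, so check it against the page above):

        import ketai.camera.*;

        KetaiCamera cam;

        void setup() {
          fullScreen();
          // request 640x480 frames at 24 fps from the default camera
          cam = new KetaiCamera(this, 640, 480, 24);
          cam.start();
        }

        void draw() {
          image(cam, 0, 0, width, height);  // draw the latest frame
        }

        // called by ketai whenever a new preview frame is available
        void onCameraPreviewEvent() {
          cam.read();
        }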

    Kf

  • I saw this library while googling, but on Xubuntu I can't import it from the library manager...

  • You can install the library manually. I am curious why you can't import it in Xubuntu; there are Linux users in the community and I don't recall people having this issue.

    Kf

  • edited January 15

    Oh, so how can I install it manually? Thanks

  • Try these instructions: https://forum.processing.org/one/topic/how-to-install-a-contributed-library.html

    Basically, you download the library folder from their website and place it in the libraries folder under your sketchbook path. You can have a look at the other installed libraries as a reference. However, I believe the instructions provided are clear.

    Kf

  • Ok, thank you very much. I can confirm the issue: I renamed the folder from Ketai to ketai and now I see it in the "Import Library" menu.

    Now... I have to figure out how to use it to implement what I want.

  • @the_real_doc=== As for pure Android coding, that is simple: you create a class extending SurfaceView and implementing SurfaceHolder.Callback. Yet, as a Processing sketch extends PApplet, I think this way is not the right one; perhaps creating another fragment? Or an interface? I'll take a look and tell you the result....

  • @the_real_doc=== OK, I have created a fragment and a VideoView; it works.

  • Do you mind sharing your code @akenaton?

    Kf

  • @kfrager=== Yes, I can share; the problem is that for the video view I don't use ketai but the Android API, so I don't think it really helps @the_real_doc... until he decides to try it himself.

  • edited January 17

    If you use the native Android API, maybe it will be better in terms of speed. I have created a little preview with ketai, but it is REEEAAALLLY slow: low resolution, low fps...

    I don't know...

    Obviously sharing ideas is great for the community.

    I tried to use the following code in Android Studio:

    import java.util.List;
    
    import android.content.Intent;
    import android.hardware.Camera;
    import android.os.Bundle;
    import android.support.v7.app.AppCompatActivity;
    import android.util.Log;
    import android.view.SurfaceHolder;
    import android.view.SurfaceView;
    import android.view.View;
    import android.view.Window;
    import android.widget.Button;
    import android.widget.TextView;
    
    public class cameraPreview extends AppCompatActivity implements SurfaceHolder.Callback {
    
        private static final int REQUEST_CAMERA = 0;
        public static final String TAG = "cameraPreview";
        Camera camera;
        SurfaceView surfaceView;
        SurfaceHolder surfaceHolder;
        private int width;
        private int height;
        private TextView t;
    
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            requestWindowFeature(Window.FEATURE_NO_TITLE);
            setContentView(R.layout.activity_camera_preview);
            Intent intent = getIntent();
            t = (TextView)findViewById(R.id.textView4);
            Button start = (Button)findViewById(R.id.button4);
            start.setOnClickListener(new Button.OnClickListener() {
                public void onClick(View arg0) {
                    start_camera();
                }
            });
            Button data = (Button)findViewById(R.id.button5);
            data.setOnClickListener(new Button.OnClickListener() {
                public void onClick(View arg0) {
                    collect_data();
                }
            });
    
            surfaceView = (SurfaceView)findViewById(R.id.surfaceView2);
            surfaceHolder = surfaceView.getHolder();
            surfaceHolder.addCallback(this);
            surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS); // deprecated; only needed on Android < 3.0
            width = surfaceView.getHolder().getSurfaceFrame().width();   // note: likely 0 here, the surface is not created yet
            height = surfaceView.getHolder().getSurfaceFrame().height();
    
        }
    
        private void collect_data() {
            if(camera!=null){
                    float f[] = new float[3];
                    Camera.Parameters param= camera.getParameters();
                    param.getFocusDistances(f);
                    t.setText("Distanza Rilevata : " +f[0] + f[1] + f[2]);
            }
        }
    
    
        void start_camera(){
            try{
                camera = Camera.open();
            }catch(RuntimeException e){
                Log.e(TAG, "init_camera: " + e);
                return;
            }
            //modify parameter
            camera.setDisplayOrientation(270);
            Camera.Parameters params= camera.getParameters();
            width = surfaceView.getLayoutParams().width;
            height = surfaceView.getLayoutParams().height;
            List<String> focusModes = params.getSupportedFocusModes();
            // safer: only set FOCUS_MODE_FIXED if focusModes contains it
            params.setFocusMode(Camera.Parameters.FOCUS_MODE_FIXED);
            camera.setParameters(params);
    
            try {
                camera.setPreviewDisplay(surfaceHolder);
                camera.startPreview();
            } catch (Exception e) {
                Log.e(TAG, "init_camera: " + e);
                return;
            }
        }
    
        private void stop_camera() {
            camera.stopPreview();
            camera.release();
        }
    
        @Override
        public void surfaceCreated(SurfaceHolder surfaceHolder) {
            // left empty: the preview is started from the Start button instead
        }
    
        @Override
        public void surfaceChanged(SurfaceHolder surfaceHolder, int format, int w, int h) {
            // should restart the preview here when the surface size or orientation changes
        }
    
        @Override
        public void surfaceDestroyed(SurfaceHolder surfaceHolder) {
            // should call stop_camera() here so the camera is released with the surface
        }
    }
    
  • edited January 17

    @the_real_doc===

    Happy to know you speak Italian...

    Your code is standard, and mine is much the same for Eclipse or Android Studio; it runs fine, even with 800x600 or 640x480 video (better). Yet as for Processing without ketai: since Java does not allow extending two classes, you cannot use it as it is. So I have tried another, more complicated way: in the PApplet fragment I create another fragment, and in this second fragment I create a VideoView in the standard Android way. It runs very well.

    PS: if you finally decide to work with your code in Android Studio without Processing, don't forget to add some code to your callbacks, mainly surfaceChanged (orientation changes) and surfaceDestroyed (here you must release your camera).
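
    A minimal sketch of what those two callbacks might contain, using the old Camera API (it assumes a camera field opened elsewhere, as in the code above):

        public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
          if (holder.getSurface() == null || camera == null) return;
          try {
            camera.stopPreview();  // must stop before reconfiguring
          } catch (Exception e) {
            // preview was not running; ignore
          }
          try {
            camera.setPreviewDisplay(holder);  // re-attach and restart at the new size
            camera.startPreview();
          } catch (Exception e) {
            e.printStackTrace();
          }
        }

        public void surfaceDestroyed(SurfaceHolder holder) {
          if (camera != null) {
            camera.stopPreview();
            camera.release();  // free the camera for other apps
            camera = null;
          }
        }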

  • @the_real_doc Thanks for sharing

    @akenaton Thanks for the comment; I wouldn't mind seeing your approach!

    So far I have done most of my cam work with Ketai, and I have to say it is slow. So I am definitely interested in trying new ideas.

    Kf

  • @kfrager===

    Here is the code I used for that (tested; don't forget the camera permission).
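
    For reference, that permission is the standard manifest entry:

        <uses-permission android:name="android.permission.CAMERA" />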

    First fragment (the PApplet):

        import android.widget.FrameLayout;
        import java.io.IOException;
        import android.view.LayoutInflater;
        import android.view.SurfaceView;
        import android.view.View;
        import android.view.ViewGroup;
        import android.os.Build;
        import android.os.Bundle;
        import android.os.Looper;
        import android.app.Activity;
        import android.app.Fragment;
        import android.app.FragmentManager;
        import android.app.FragmentTransaction;
        import android.widget.VideoView;
        import android.content.Context;
        import android.content.Intent;
        import android.view.ViewGroup.LayoutParams;
        import android.widget.LinearLayout;
        import android.content.pm.ActivityInfo;
        import android.content.pm.ConfigurationInfo;
        import android.util.*;
        import android.view.SurfaceHolder;
        import android.hardware.Camera;//better to use the new api camera2
    
    
        ////////////////////////////////////////////////////////////////////////////////////
    
          Context c;
          VideoView vv;
          Activity act;
          View v;
          FrameLayout fragment_two;
         // LinearLayout layout;
          View rootview;
          int r;
          FrameLayout fl;
          Fragment fr;
          Camera camera;
    
    
           public void OnCreate(Bundle savedInstanceState){ // note: capital 'O', so this is never called automatically
    
             act = getActivity();
             println(act);
             c = act.getApplicationContext();
    
           }
    
    
            public void settings() {
    
                size(displayWidth, displayHeight, P2D);
            }
    
            //@SuppressWarnings("deprecation")
          @Override
            public void setup() {
    
    
              orientation(LANDSCAPE);
              Looper.prepare();
    
              fill(255,0,0);
    
    
            }
    
    
            public void draw() {
              // nothing to draw here: the preview lives in the second fragment
            }
    
    
    
    
        public void selectFrag(View view) {
    
          Fragment fr = newInstance();
          fl = (FrameLayout) act.findViewById(0x1000);  // note: looked up but never used below
          FragmentManager fm = act.getFragmentManager();
          FragmentTransaction fragmentTransaction = fm.beginTransaction();
          Fragment fragment = act.getFragmentManager().findFragmentByTag("main_fragment");
          r = fragment.getId();  // container id of the sketch's main fragment
          fragmentTransaction.replace(r, fr);  // swap the sketch fragment for the video fragment
          fragmentTransaction.commit();
    
        }
    
    
    
        @Override
        public void onViewCreated(View view, Bundle savedInstanceState) {
          act = getActivity();
          selectFrag(view);  // attach the video fragment as soon as the view exists
        }
    
        public  Fragment2 newInstance() {
            Fragment2 myFragment = this.new Fragment2();
    
            return myFragment;
        }
    

    ////////////////////////////////////////////////////////////// Fragment for the video view (some imports may be redundant; I have not had time to verify)

            import android.app.Fragment;
            import android.os.Bundle;
            import android.view.LayoutInflater;
            import android.view.View;
            import android.view.ViewGroup;
            import android.widget.VideoView;
            import android.app.Activity;
            import android.hardware.Camera.CameraInfo;
            import android.content.Context;
    
            //import android.view.SurfaceHolder;
            //import android.view.SurfaceView;
            import java.io.IOException;
    
    
            public static Camera getCameraInstance(){// use this if not choosing a camera
    
                Camera camera = null;
                try {
                    camera = Camera.open(); // attempt to get a Camera instance
                }
                catch (Exception e){
                    // Camera is not available (in use or does not exist)
                }
                return camera; // returns null if camera is unavailable
            }
    
    
    
            public  class Fragment2 extends Fragment implements SurfaceHolder.Callback {
    
    
            public Context c;
            VideoView vv;
            SurfaceHolder holder = null;
            public Camera camera;
            public Activity act ;
            public Bundle savedState;
            //public CameraPreview camPrev; // for using surface view
            int cameraId ;
    
    
    
          public void OnCreate(Bundle savedInstanceState) { // note: capital 'O', never called automatically
            // in case there are args to read
          }
    
    
    
              @Override
                public View onCreateView(LayoutInflater inflater, ViewGroup container, 
                Bundle savedInstanceState) {
                act = this.getActivity();
    
             println("je suis dans le fragment2 de camera");
    
              cameraId = findFrontFacingCamera();
              println("-----------------------------je suis dans le fragment 2 et je cherche l'id de la camera" + cameraId);
                vv = new VideoView(this.getActivity());
    
                holder = vv.getHolder();
                 holder.addCallback(this);
                try {
                  camera = Camera.open(cameraId);
    
                    println("---------------------initialize the Camera");
                    camera.lock();
                } catch(RuntimeException re) {
                    println("Could not initialize the Camera");
                    re.printStackTrace();
                }
    
            return vv;
            }
    
            private int findFrontFacingCamera() { // choose the front camera (note: Camera.open() with no id opens the first back-facing camera)
                      int cameraId = -1;
    
    
                      int numberOfCameras = Camera.getNumberOfCameras();
                      for (int i = 0; i < numberOfCameras; i++) {
                          CameraInfo info = new CameraInfo();
                          Camera.getCameraInfo(i, info);
    
                          if (info.facing == CameraInfo.CAMERA_FACING_FRONT) {
                              cameraId = i;
                              //cameraFront = true;
                              break;
                          }
    
                      }
                  return cameraId;
    
                  }
    
    
    
             public void surfaceChanged(SurfaceHolder arg0, int arg1, int arg2, int arg3) {
               // add code here if the orientation is not locked!
             }
    
    
    
                  public void surfaceCreated(SurfaceHolder arg0) {
              // now that the surface is created you can start the preview
    
                  try {
    
                    camera.setPreviewDisplay(holder);
                    camera.startPreview();
                    println("starting the preview");
                  } 
                  catch(IOException e) {
                    println("Could not start the preview");
                    e.printStackTrace();
                  }
                }
    
            public void surfaceDestroyed(SurfaceHolder holder) {
    
              if (camera != null) {
    
                        camera.stopPreview();
                        camera.release();
                        camera = null;  // the surface is gone; give the camera back to the system
    
                    }
            }
                //////////////////////////////////////////////////////////////////////////////////
    
            // UNCOMMENT and call this class IN CASE YOU WANT TO USE A SURFACE VIEW
    
                //class CameraPreview extends SurfaceView implements SurfaceHolder.Callback {
    
    
                //   public CameraPreview(Context context, Camera camera) {
                //       super(context);
    
                //       holder = getHolder();
                //       holder.addCallback(this);
                //       // Android versions < 3.0
                //       holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
                //   };
    
    
    
                //    public void surfaceCreated(SurfaceHolder holder) {
                //        // The Surface has been created, now tell the camera where to draw the preview.
                //        try {
                //            camera.setPreviewDisplay(holder);
                //            camera.startPreview();
                //        } catch (IOException e) {
                //            e.printStackTrace();
                //        }
                //    }
    
                //    public void surfaceDestroyed(SurfaceHolder holder) {
                //    //    // empty. Take care of releasing the Camera preview in your activity.
                //    }
    
                //    public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
                //        // If your preview can change or rotate, take care of those events here.
                //        // Make sure to stop the preview before resizing or reformatting it.
    
                //        if (holder.getSurface() == null){
                //            // preview surface does not exist
                //            return;
                //        }
    
                //        // stop preview before making changes
                //        try {
                //            camera.stopPreview();
                //        } catch (Exception e){
                //            // ignore: tried to stop a non-existent preview
                //        }
    
                //        // set preview size and make any resize, rotate or
                //        // reformatting changes here
    
                //        // start preview with new settings
                //        try {
                //            camera.setPreviewDisplay(holder);
                //            camera.startPreview();
    
                //        } catch (Exception e){
                //            e.printStackTrace();
                //        }
                //    }
                //}
            //////////////////////////////////////////////////////////////////////////////////
    
    
    
    
            public void onBackPressed() { // perhaps unnecessary?
                  act.onBackPressed();
                }
    
    
    
            @Override
            public void onPause() {
              super.onPause();
              releaseCamera();
            }

            @Override
            public void onStop() {
              super.onStop();
              releaseCamera();
            }

            // stop and release exactly once; safe to call repeatedly
            private void releaseCamera() {
              if (camera != null) {
                try {
                  camera.stopPreview();
                  camera.release();
                } catch (Exception e) {
                  // already stopped or released
                }
                camera = null;
              }
            }
    

    }

  • @akenaton

    Thank you! I will have a look at it today. By the way, I am kfrajer, not kfraGer lol

    Kf

  • edited January 19

    Just another little thing (if you know): how can I determine the distance between the smartphone camera and the subject using autofocus data? It seems that Google's camera2 API doesn't allow getting specific parameters.

  • Hmmm... if we forget about autofocus and zoom for a moment, getting the distance would require a calibration, and the measurement would only work based on your calibration file. You should consider exploring some apps in the Play Store that already do this, to see the capabilities and limitations. However, if you know exactly what you want, you should elaborate more on what you want to measure.

    I don't know about camera parameters. I do know ketai offers some functions to access certain parameters (zoom, I believe, for example), but I've never tested it myself... yet.

    Kf

  • I had a look and.... no info about focus values or other useful functions...

  • @kfrajer, @ the_real_doc===

    The Android camera API has a method which is supposed to return this distance:

        Camera.Parameters parametres = camera.getParameters();
        float fDistances[] = new float[3]; // three values, in meters: near, optimal, far
        parametres.getFocusDistances(fDistances);
        println(fDistances[0]); // index 0 = FOCUS_DISTANCE_NEAR_INDEX
    

    Yet I say "supposed" because this method is not really precise... and moreover does not work on a lot of devices. According to the Android docs: "The precision depends on the camera hardware, autofocus algorithm, the focus area, and the scene. The error can be large and it should be only used as a reference."

    I have tested it myself and seen that in a lot of cases I get "infinity" for these values. So finally I used the old hard way: putting a real object at a known distance and mapping that to pixels on the screen. Not very elegant, I know!

  • Yes, I know. I've already tried the getFocusDistances() method, but I discovered that the basic Android focus modes are strings, per Google's API design: for example "fixed", "macro", "infinity", "auto".... But I need numbers! ;) I know the precision depends on the camera hardware, but I suppose my Nexus 5X with laser focus can do the (dirty :)) job.
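
    For reference, a short sketch of the two kinds of values the old Camera API exposes (method names are from android.hardware.Camera; the index constants are from memory, so verify against the docs):

        Camera.Parameters p = camera.getParameters();
        println(p.getFocusMode());            // a String: "auto", "macro", "fixed", ...
        println(p.getSupportedFocusModes());  // the mode strings this device supports

        float[] d = new float[3];
        p.getFocusDistances(d);               // floats, in meters: near, optimal, far
        println(d[1]);                        // Camera.Parameters.FOCUS_DISTANCE_OPTIMAL_INDEX == 1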

  • It would be interesting to know what this distance actually means. When you use a real object (as @akenaton mentioned in his previous post, and exactly what I had in mind), you need to do a calibration, since the measurement depends on the object's size - I am assuming the object is spherical, so its orientation in space doesn't shorten any of its dimensions. The distance used for focusing, however, must come from some other algorithm, one used to reduce blurring of the actual image. When an object is far enough away, it makes sense for the distance to be infinite, since the object's projected image falls at the focal point of the lens. I assume the distance obtained from a focus operation can vary greatly from device to device, and I doubt all devices have a laser to perform this calculation - otherwise any Android device could be used for surveying and would make surveying tasks accessible to anybody (I know, I know, it is probably not a strong laser...)

    Some post that have some detail but not full answers:
    http://stackoverflow.com/questions/6401370/camera-focus-distances
    http://android-er.blogspot.ca/2012/04/gets-distances-from-camera-to-focus.html

    @The_real_doc The second link has code that might be useful to you. Can you elaborate a bit on the goal of your project, or the purpose of your distance measurement?

    Kf

  • Ok, so.... basically what I want to do for now is create a telemeter app:

    example here: https://play.google.com/store/apps/details?id=kr.sira.measure

    Then I will extend the app's logic using Processing's graphics functions (I hope directly on the preview).

  • Yes, similar to this other app https://play.google.com/store/apps/details?id=kr.sira.distance&hl=en

    So the concept they use is based on a calibration. I might be wrong, but I don't think they use a laser to determine the range. Instead, their instructions recommend pointing the camera at the person's feet and then entering the person's height. From standard camera settings and basic optics, they translate pixel distances into the real distance between subject and camera - in other words, a calibration on the fly. I suggest you download the app and become familiar with it to understand the concept. I don't mean to say a focus-distance approach is wrong, just that I am unfamiliar with it and don't know whether it is viable nowadays. Further research required...
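
    To make the optics concrete, here is a minimal sketch of that idea under a simple pinhole-camera model (the focal length in pixels would come from a one-time calibration; the function name is hypothetical):

        // distance = realSize * focalLengthPx / projectedSizePx (pinhole model)
        float estimateDistance(float realHeightMeters, float focalLengthPx, float projectedHeightPx) {
          return realHeightMeters * focalLengthPx / projectedHeightPx;
        }

        // e.g. a 1.75 m person spanning 500 px with a focal length of 1500 px:
        // estimateDistance(1.75, 1500, 500) -> 5.25 m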

    Kf

  • Hi @akenaton. Your code works great for displaying the camera - much, much faster than Ketai. But I am at a complete loss on how to use the stream of images as if they were PImages, or how to save photos to storage, etc. Where could I start looking?

    By the way, it still only uses Camera, not Camera2, correct? I've read Camera2 is more complex to implement.

    Thanks

  • @Kajuna===

    Answering here for others too. You are right, this is not the Camera2 API, but it is easy to make that change if you want. As for saving to external or internal storage, I think I have answered this kind of question on the forum before; the code must be added AFTER the surface is created.
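
    As a starting point for the PImage question, here is a hedged sketch of one way to grab frames with the old Camera API: register a preview callback after the preview has started (NV21 is the default preview format; the conversion below is simple but not fast):

        import android.graphics.Bitmap;
        import android.graphics.BitmapFactory;
        import android.graphics.ImageFormat;
        import android.graphics.Rect;
        import android.graphics.YuvImage;
        import java.io.ByteArrayOutputStream;

        // assumes 'camera' is open and previewW/previewH match the configured preview size
        void grabFrames(Camera camera, final int previewW, final int previewH) {
          camera.setPreviewCallback(new Camera.PreviewCallback() {
            public void onPreviewFrame(byte[] data, Camera cam) {
              // NV21 bytes -> JPEG bytes
              YuvImage yuv = new YuvImage(data, ImageFormat.NV21, previewW, previewH, null);
              ByteArrayOutputStream out = new ByteArrayOutputStream();
              yuv.compressToJpeg(new Rect(0, 0, previewW, previewH), 80, out);
              byte[] jpeg = out.toByteArray();
              // JPEG bytes -> Bitmap -> PImage
              Bitmap bmp = BitmapFactory.decodeByteArray(jpeg, 0, jpeg.length);
              PImage frame = new PImage(bmp.getWidth(), bmp.getHeight(), ARGB);
              bmp.getPixels(frame.pixels, 0, frame.width, 0, 0, frame.width, frame.height);
              frame.updatePixels();
              // 'frame' can now be drawn with image(frame, ...) or saved with frame.save(...)
            }
          });
        }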

  • pzqpzq
    edited March 31

    Hello. This is something I'm also looking into at the moment. I have played with ketai and BlobDetection, but now I would like to replace ketai with this raw Android API functionality.

    I got the @akenaton example to work, but I need help passing the camera preview into the blob-detection function so that it draws blobs on top of the android.hardware.Camera image.

    I'm also planning to test the OpenCV4Android approach found here: stackoverflow.com/questions/7991621/opencv-on-android-using-eclipse - but it's harder since I'm still using the PDE and not Eclipse...

    But for now, I want to see what can be achieved with the current Processing IDE.

    BlobDetection is used with the ketai camera like this (it draws green blobs around bright areas and shows the coordinates at the center of each blob as text). I think the draw() function would now need something around these calls:

        image(cam, width/2, height/2, width, height);
        img.copy(cam, 0, 0, cam.width, cam.height, 0, 0, img.width, img.height);

    Maybe I need to convert the camera preview into a PImage somehow, but how? :)
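
    One hedged possibility: if each raw preview frame is first converted to a PImage (here called apiFrame, hypothetically filled by a preview callback like the one sketched earlier in the thread), the draw() in the main code below could become:

        PImage apiFrame;  // hypothetical: set by the camera preview callback

        void draw() {
          if (apiFrame != null) {
            image(apiFrame, width/2, height/2, width, height);
            img.copy(apiFrame, 0, 0, apiFrame.width, apiFrame.height,
                     0, 0, img.width, img.height);
            fastblur(img, 2);
            theBlobDetection.computeBlobs(img.pixels);
            drawBlobsAndEdges(false, true);
          }
        }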

    The main code:


    import ketai.camera.*;
    import blobDetection.*;
    
    BlobDetection theBlobDetection;
    Blob theBlob;
    PImage img;
    KetaiCamera cam;
    boolean newFrame = false;
    float xCoord, yCoord;
    
    void setup() {
      fullScreen();
      orientation(LANDSCAPE);
      imageMode(CENTER);
      textSize(45);
      int x = 2;
      img = new PImage(80*x, 40*x);
      theBlobDetection = new BlobDetection(img.width, img.height);
      theBlobDetection.setPosDiscrimination(true); // false will detect dark areas
      theBlobDetection.setThreshold(0.9f); // detect bright areas (luminosity > 0.9f)
      cam = new KetaiCamera(this, 640, 480, 60);
      cam.start();
    }
    
    void draw() {
      if (newFrame) {
        newFrame = false;
        image(cam, width/2, height/2, width, height);
        img.copy(cam, 0, 0, cam.width, cam.height, 0, 0, img.width, img.height);
        fastblur(img, 2);
        theBlobDetection.computeBlobs(img.pixels);
        drawBlobsAndEdges(false, true);
      }
    }
    
    void onCameraPreviewEvent() {
      cam.read();
      newFrame = true;
    }
    
    void mousePressed() {
      if (cam.isStarted()) {
        cam.stop();
      } else cam.start();
    }
    
    void keyPressed() {
      if (cam == null) return;
      if (key == CODED) {
        if (keyCode == MENU) {
          if (cam.isFlashEnabled()) cam.disableFlash();
          else cam.enableFlash();
        }
      }
    }
    
    void drawBlobsAndEdges(boolean drawBlobs, boolean drawEdges) {
      noFill();
      Blob b;
      EdgeVertex eA, eB;
      for (int n = 0; n < theBlobDetection.getBlobNb(); n++) {
        b = theBlobDetection.getBlob(n);
        if (b != null) {
          // Edges
          if (drawEdges) {
            strokeWeight(3);
            stroke(0, 255, 0);
            for (int m = 0; m < b.getEdgeNb(); m++) {
              eA = b.getEdgeVertexA(m);
              eB = b.getEdgeVertexB(m);
              if (eA != null && eB != null)
                line(eA.x*width, eA.y*height, eB.x*width, eB.y*height);
            }
          }
          if (drawBlobs) {
            strokeWeight(1);
            stroke(255, 0, 0);
            rect(b.xMin*width, b.yMin*height, b.w*width, b.h*height);
          }
          text(b.x, b.x*width-50, b.y*height);
          text(b.y, b.x*width-50, b.y*height+40);
        }
      }
    }
    
    void fastblur(PImage img, int radius) {
      if (radius < 1) {
        return;
      }
      int w = img.width;
      int h = img.height;
      int wm = w-1;
      int hm = h-1;
      int wh = w*h;
      int div = radius+radius+1;
      int r[] = new int[wh];
      int g[] = new int[wh];
      int b[] = new int[wh];
      int rsum, gsum, bsum, x, y, i, p, p1, p2, yp, yi, yw;
      int vmin[] = new int[max(w, h)];
      int vmax[] = new int[max(w, h)];
      int[] pix = img.pixels;
      int dv[] = new int[256*div];
      for (i = 0; i < 256*div; i++) {
        dv[i] = (i/div);
      }
      yw = yi = 0;
      for (y = 0; y < h; y++) {
        rsum = gsum = bsum = 0;
        for (i = -radius; i <= radius; i++) {
          p = pix[yi + min(wm, max(i, 0))];
          rsum += (p & 0xff0000) >> 16;
          gsum += (p & 0x00ff00) >> 8;
          bsum += p & 0x0000ff;
        }
        for (x = 0; x < w; x++) {
          r[yi] = dv[rsum];
          g[yi] = dv[gsum];
          b[yi] = dv[bsum];
          if (y == 0) {
            vmin[x] = min(x+radius+1, wm);
            vmax[x] = max(x-radius, 0);
          }
          p1 = pix[yw+vmin[x]];
          p2 = pix[yw+vmax[x]];
          rsum += ((p1 & 0xff0000) - (p2 & 0xff0000)) >> 16;
          gsum += ((p1 & 0x00ff00) - (p2 & 0x00ff00)) >> 8;
          bsum += (p1 & 0x0000ff) - (p2 & 0x0000ff);
          yi++;
        }
        yw += w;
      }
      for (x = 0; x < w; x++) {
        rsum = gsum = bsum = 0;
        yp = -radius*w;
        for (i = -radius; i <= radius; i++) {
          yi = max(0, yp) + x;
          rsum += r[yi];
          gsum += g[yi];
          bsum += b[yi];
          yp += w;
        }
        yi = x;
        for (y = 0; y < h; y++) {
          pix[yi] = 0xff000000 | (dv[rsum] << 16) | (dv[gsum] << 8) | dv[bsum];
          if (x == 0) {
            vmin[y] = min(y+radius+1, hm)*w;
            vmax[y] = max(y-radius, 0)*w;
          }
          p1 = x+vmin[y];
          p2 = x+vmax[y];
          rsum += r[p1]-r[p2];
          gsum += g[p1]-g[p2];
          bsum += b[p1]-b[p2];
          yi += w;
        }
      }
    }
