ARCamera is missing Properties

Hi, I’m trying to use ARKit in my libgdx app with moe 1.4.+, but just as everything was starting to work fine I discovered that ARCamera doesn’t provide the transform matrix. When I print the camera object, I get all the necessary information as a string:

<ARCamera: 0x1c713bbc0 image-resolution=(1280, 720) focal-length=(1049.139, 1049.139) principal-point=(631.186, 359.982) transform=<translation=(0.004549 0.009288 0.000869) rotation=(-62.51° -10.67° -110.26°)>>
<ARCamera: 0x1c713d600 image-resolution=(1280, 720) focal-length=(1049.139, 1049.139) principal-point=(631.191, 359.937) transform=<translation=(0.004006 0.009897 0.001099) rotation=(-62.23° -10.99° -109.82°)>>
<ARCamera: 0x1c713c7a0 image-resolution=(1280, 720) focal-length=(1048.767, 1048.767) principal-point=(631.287, 360.020) transform=<translation=(0.003296 0.010702 0.001439) rotation=(-61.85° -11.37° -109.30°)>>

Any hint on how to get this data directly from the ARCamera object?

Hi Raimund,

Thanks for the report, we are looking into it.

Best Regards,

Good news: The ARKit feature is now supported in my Google Cardboard App.

For now I’m using the following (ugly) workaround to get all camera parameters:

private final static String FOCAL_LENGTH_KEY = "focal-length=(";
private final static String PRINCIPAL_POINT_KEY = ") principal-point=(";
private final static String TRANSFORM_KEY = ") transform=<translation=(";
private final static String ROTATION_KEY = ") rotation=(";

private void sendCameraData(ARCamera camera) {
    String cameraData = camera.toString();
    int focalLengthIndex = cameraData.indexOf(FOCAL_LENGTH_KEY);
    int focalLengthValueIndex = focalLengthIndex + FOCAL_LENGTH_KEY.length();
    int principalPointIndex = cameraData.indexOf(PRINCIPAL_POINT_KEY);
    int principalPointValueIndex = principalPointIndex + PRINCIPAL_POINT_KEY.length();
    int transformIndex = cameraData.indexOf(TRANSFORM_KEY);
    int transformValueIndex = transformIndex + TRANSFORM_KEY.length();
    int rotationIndex = cameraData.indexOf(ROTATION_KEY);
    int rotationValueIndex = rotationIndex + ROTATION_KEY.length();
    String[] focalLength = cameraData.substring(focalLengthValueIndex, principalPointIndex).split(", ");
    float focalLengthX = Float.valueOf(focalLength[0]);
    float focalLengthY = Float.valueOf(focalLength[1]);
    String[] principalPoint = cameraData.substring(principalPointValueIndex, transformIndex).split(", ");
    float principalPointX = Float.valueOf(principalPoint[0]);
    float principalPointY = Float.valueOf(principalPoint[1]);
    String[] transform = cameraData.substring(transformValueIndex, rotationIndex).split(" ");
    float posX = Float.valueOf(transform[0]);
    float posY = Float.valueOf(transform[1]);
    float posZ = Float.valueOf(transform[2]);
    String[] rotation = cameraData.substring(rotationValueIndex, cameraData.length() - 3).split(" ");
    float pitch = Float.valueOf(rotation[0].substring(0, rotation[0].length() - 1));
    float yaw = Float.valueOf(rotation[1].substring(0, rotation[1].length() - 1));
    float roll = Float.valueOf(rotation[2].substring(0, rotation[2].length() - 1));
    pos.set(posX, posY, posZ);
    rot.setEulerAngles(yaw, pitch, roll);
    float width = (float) camera.imageResolution().width();
    float height = (float) camera.imageResolution().height();
}

The mapping of the rotation array to yaw, pitch and roll is changed to match my rendering setup.
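For anyone wanting a slightly less brittle variant of the same workaround, the index-arithmetic above can be replaced with regular expressions. This is a sketch under the same assumption as the original code: that ARCamera’s (undocumented) toString() format stays as shown in the logs above, which Apple does not guarantee.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Regex-based parse of ARCamera.toString(), assuming the description format
// shown above: "translation=(x y z) rotation=(p° y° r°)". The format is
// undocumented and may change between iOS versions.
public class ARCameraParser {

    private static final Pattern TRANSLATION =
            Pattern.compile("translation=\\(([-\\d.]+) ([-\\d.]+) ([-\\d.]+)\\)");
    private static final Pattern ROTATION =
            Pattern.compile("rotation=\\(([-\\d.]+)° ([-\\d.]+)° ([-\\d.]+)°\\)");

    /** Returns {x, y, z} in meters, or throws if the pattern is absent. */
    public static float[] parseTranslation(String cameraDescription) {
        return parseTriple(TRANSLATION, cameraDescription);
    }

    /** Returns {pitch, yaw, roll} in degrees, or throws if the pattern is absent. */
    public static float[] parseRotation(String cameraDescription) {
        return parseTriple(ROTATION, cameraDescription);
    }

    private static float[] parseTriple(Pattern p, String s) {
        Matcher m = p.matcher(s);
        if (!m.find()) {
            throw new IllegalArgumentException("pattern not found in: " + s);
        }
        return new float[] {
                Float.parseFloat(m.group(1)),
                Float.parseFloat(m.group(2)),
                Float.parseFloat(m.group(3))
        };
    }
}
```

The upside over substring arithmetic is that the parse fails loudly (an exception) instead of silently producing garbage if Apple reorders or renames a field.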


Great, thanks for the update. Is your app (or even source code) accessible somewhere so we can try it out?

We have not forgotten this issue either. We need to add vector type support to Nat/J to get proper access to the matrices; we just have not gotten around to it yet.

A slightly less hacky workaround could be to add an ObjC category to ARCamera that converts the vector-typed matrix into a regular array matrix, which can already be accessed with Nat/J today.
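Once the transform is exposed as a flat float[16] (via such a category, or via future Nat/J vector support), mapping it into libgdx should be direct: ARKit’s simd_float4x4 and libgdx’s Matrix4 are both column-major, so the camera position sits in the fourth column. A minimal plain-Java sketch of that layout assumption (the helper class and method names are my own, not part of any existing API):

```java
// Assumes a 4x4 matrix flattened column-by-column into 16 floats, which is
// how both ARKit's simd_float4x4 and libgdx's Matrix4 store their values.
public class TransformLayout {

    /** Extracts the translation {x, y, z} from the fourth column (indices 12-14). */
    public static float[] translationOf(float[] colMajor16) {
        if (colMajor16.length != 16) {
            throw new IllegalArgumentException("expected 16 values, got " + colMajor16.length);
        }
        return new float[] { colMajor16[12], colMajor16[13], colMajor16[14] };
    }
}
```

With matching layouts, the full array could also be handed straight to libgdx’s Matrix4.set(float[]) without any reordering.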

The source code is currently part of my master’s thesis, which will be finished next month. After that I will make it accessible. The title of my thesis is “A Collaborative 3D Mixed-Reality Editor for Mobile Devices”. The app will be evaluated in two weeks; I can invite you to TestFlight if you like. Btw, I also wrote a network middleware with Google protobuf as a dependency in the libgdx core project, and it worked immediately on iOS :+1: