I am struggling to convert Java data to “native” iOS pointers.
I have an InputStream containing a JPEG image, and I'd like to load this stream into a UIImage.
I did this before with RoboVM using NSData. How do I create the right "pointer" to my byte array so that NSData.dataWithBytesLength() accepts it?
In general I struggle with all the pointer magic in MOE, for example when using Apple's Security framework. Is there a document that explains how to create and handle these pointers so they can be used with Apple's native libraries?
// convert InputStream to byte[] for use with NSData
ByteArrayOutputStream arrayStream = new ByteArrayOutputStream();
byte[] buf = new byte[16384];
int nRead;
while ((nRead = stream.read(buf, 0, buf.length)) != -1) {
    arrayStream.write(buf, 0, nRead);
}
arrayStream.flush();
byte[] imageData = arrayStream.toByteArray();
LOGGER.info("Stream > byte array");

BytePtr ptr = PtrFactory.newByteArray(imageData);
LOGGER.info("byte array > MOE pointer");

NSData data = NSData.dataWithBytesLength(ptr, imageData.length);
LOGGER.info("MOE pointer > NSData " + data.length()); // >> FAILS! length is 0

arrayStream.close();
stream.close();
LOGGER.info("Streams closed");

UIImage image = UIImage.imageWithData(data);
LOGGER.info("Image created: " + image);
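For what it's worth, the stream-draining half of the listing is plain Java and works fine in isolation, so the failure must be in the pointer/NSData step. A minimal sketch of that part as a reusable helper, independent of MOE:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class StreamUtil {

    // Drain an InputStream completely into a byte[], reading in fixed-size chunks.
    public static byte[] toByteArray(InputStream stream) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[16384];
        int nRead;
        while ((nRead = stream.read(buf, 0, buf.length)) != -1) {
            out.write(buf, 0, nRead);
        }
        return out.toByteArray();
    }
}
```

Feeding a ByteArrayInputStream of known content through this helper returns an identical array, so the Java side of the conversion is not the problem.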
Now, once the UIImage is hopefully loaded, it should become the contents of a CALayer via a CGImage. In RoboVM this worked, but in Multi-OS Engine the image doesn't show up in the layer.
CALayer layer = CALayer.alloc().init();
layer.setContents(image.CGImage()); // << Does not work
// layer.setBackgroundColor(UIColor.redColor().CGColor());
layer.setContentsGravity("resizeAspectFill"); // literal value of kCAGravityResizeAspectFill
layer.setMasksToBounds(true);
layer.setFrame(view().bounds());
view().layer().insertSublayerAtIndex(layer, 0);
Stack Overflow gives the answer for the native side, but how does this translate into MOE? Any advice?
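For reference, the pointer idiom I would expect to need, pieced together from the NatJ javadoc, is to allocate the native buffer explicitly and then copy the Java array into it. This is an untested sketch, and the method names PtrFactory.newByteArray(long) and BytePtr.copyFrom(byte[]) are my assumptions about the NatJ API, not verified calls:

// Untested sketch: allocate a native byte buffer of the right size,
// copy the Java array into it, then hand the pointer to NSData.
BytePtr ptr = PtrFactory.newByteArray(imageData.length); // assumed allocator
ptr.copyFrom(imageData);                                 // assumed copy helper
NSData data = NSData.dataWithBytesLength(ptr, imageData.length);

Is this the intended pattern, or is there a documented helper for wrapping a byte[] directly?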