I'm trying to build a native module for React Native that uses the native camera API in Objective-C, but I can't get the camera to render: I only get the background color that I set in the layoutSubviews method.
My RNCameraManager.m file:
#import "RNCameraManager.h"#import <React/RCTViewManager.h>#import "RNCamera.h"@interface RNCameraManager : RCTViewManager@end@implementation RNCameraManagerRCT_EXPORT_MODULE(RNCameraManager)- (UIView *)view{ return [[RNCamera alloc] initWithBridge:self.bridge];}@end
My RNCamera.m file:
#import "RNCamera.h"#import <React/RCTUtils.h>#import <React/UIView+React.h>@interface RNCamera ()@property (nonatomic, weak) RCTBridge *bridge;@property (nonatomic, strong) AVCaptureSession *session;@property (nonatomic, strong) AVCaptureVideoPreviewLayer *previewLayer;@end@implementation RNCamera- (id)initWithBridge:(RCTBridge *)bridge{ if((self = [super init])) { self.bridge = bridge; self.session = [AVCaptureSession new]; self.sessionQueue = dispatch_queue_create("cameraQueue", DISPATCH_QUEUE_SERIAL); } return self;}- (void)layoutSubviews{ [super layoutSubviews]; self.previewLayer.frame = self.bounds; [self setBackgroundColor:[UIColor redColor]]; [self.layer insertSublayer:self.previewLayer atIndex:0];}@end
With the two files above as they are, when I render this component in React Native I only get a red screen (because of the red background color I set in layoutSubviews).
The view size is set on the React Native side. (When I swap RNCamera for a Maps view, it renders perfectly, so the problem is actually rendering the camera, not sizing or mounting the view.)
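The only other cause of a blank preview I'm aware of is camera permission (NSCameraUsageDescription in Info.plist plus the runtime prompt). I assume it can be ruled out with a check like this sketch:

    // Sketch: rule out a permissions problem before blaming the view code.
    AVAuthorizationStatus status =
        [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
    if (status != AVAuthorizationStatusAuthorized) {
        [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo
                                 completionHandler:^(BOOL granted) {
            NSLog(@"Camera access granted: %d", granted);
        }];
    }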