react native stream ios screen to local server


I am building a React Native application to stream my phone's screen content to a local NodeJS server.

On Android this works great with MediaProjectionManager, but on iOS it is more complicated.

I tried to do it with RPScreenRecorder; this is my code:

#import "ScreenShare.h"@implementation ScreenShareRCT_EXPORT_MODULE();- (NSString *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer{  CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);  CIImage *ciImage = [CIImage imageWithCVPixelBuffer:imageBuffer];  CIContext *temporaryContext = [CIContext contextWithOptions:nil];  CGImageRef videoImage = [temporaryContext                           createCGImage:ciImage                           fromRect:CGRectMake(0, 0,                                               CVPixelBufferGetWidth(imageBuffer),                                               CVPixelBufferGetHeight(imageBuffer))];  UIImage *image = [[UIImage alloc] initWithCGImage:videoImage];  NSString *base64String = [UIImagePNGRepresentation(image) base64EncodedStringWithOptions:NSDataBase64Encoding64CharacterLineLength];  CGImageRelease(videoImage);  return (base64String);}- (NSArray<NSString *> *)supportedEvents{  return @[@"ImageCaptured"];}RCT_REMAP_METHOD(start,                 startWithResolver:(RCTPromiseResolveBlock)resolve                 rejecter:(RCTPromiseRejectBlock)reject){  NSMutableDictionary *result = [[NSMutableDictionary alloc] init];  [result setObject:@true forKey:@"success"];  if (@available(iOS 11.0, *)) {    if([RPScreenRecorder.sharedRecorder isRecording]) {      return resolve(result);    }    [RPScreenRecorder.sharedRecorder startCaptureWithHandler:^(CMSampleBufferRef  _Nonnull sampleBuffer, RPSampleBufferType bufferType, NSError * _Nullable error) {      dispatch_sync(dispatch_get_main_queue(), ^{        if(bufferType == RPSampleBufferTypeVideo) {          NSString *strEncoded = [self imageFromSampleBuffer:sampleBuffer];          [self sendEventWithName:@"ImageCaptured" body:@{@"image": strEncoded}];        }      });    } completionHandler:^(NSError * _Nullable error) {      if(error == NULL) return resolve(result);      // The user declined application recording      if([error code] == -5801) {        return reject(@"401", [error localizedDescription], error);      }      reject([NSString stringWithFormat:@"%ld", [error code]], [error localizedDescription], error);    }];  } else {    NSError * error = [NSError errorWithDomain:@"com.xxx.ConnectApp" code:426 userInfo:nil];    reject([NSString stringWithFormat:@"%ld", [error code]], @"Failed to start screen capture", error);  };}RCT_REMAP_METHOD(stop,                 stopWithResolver:(RCTPromiseResolveBlock)resolve                 rejecter:(RCTPromiseRejectBlock)reject){  NSMutableDictionary *result = [[NSMutableDictionary alloc] init];  [result setObject:@true forKey:@"success"];  if (@available(iOS 11.0, *)) {    [RPScreenRecorder.sharedRecorder stopCaptureWithHandler:^(NSError * _Nullable error) {      if(error == NULL) return resolve(result);      reject([NSString stringWithFormat:@"%ld", [error code]], [error localizedDescription], error);    }];  } else {    NSError * error = [NSError errorWithDomain:@"com.xxx.ConnectApp" code:426 userInfo:nil];    reject([NSString stringWithFormat:@"%ld", [error code]], @"Failed to stop screen capture", error);  }}@end

The video quality is not very good, and the capture stops as soon as I leave the app. The goal is to stream the whole device screen outside my app, not just my app's own content.
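
From what I understand, this limitation is by design: RPScreenRecorder's in-process capture only records your own app while it is in the foreground. System-wide capture on iOS 11+ requires a Broadcast Upload Extension, a separate extension target whose principal class subclasses RPBroadcastSampleHandler. Here is a minimal sketch of such a handler; the server IP, port, /frame endpoint, and JPEG-per-frame-over-HTTP transport are placeholders I made up, not a complete streaming design:

// SampleHandler.m — principal class of a Broadcast Upload Extension target.
#import <ReplayKit/ReplayKit.h>
#import <UIKit/UIKit.h>

@interface SampleHandler : RPBroadcastSampleHandler
@end

@implementation SampleHandler

- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer
                   withType:(RPSampleBufferType)sampleBufferType
{
  // Only video frames are forwarded in this sketch; app and mic audio are ignored.
  if (sampleBufferType != RPSampleBufferTypeVideo) return;

  CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
  CIImage *ciImage = [CIImage imageWithCVPixelBuffer:imageBuffer];
  CIContext *context = [CIContext contextWithOptions:nil];
  CGImageRef cgImage = [context createCGImage:ciImage fromRect:ciImage.extent];
  UIImage *image = [[UIImage alloc] initWithCGImage:cgImage];
  CGImageRelease(cgImage);

  // Placeholder endpoint on the local NodeJS server. A real implementation
  // should keep a persistent connection and drop frames under back-pressure
  // instead of firing one HTTP request per frame.
  NSURL *url = [NSURL URLWithString:@"http://192.168.0.10:3000/frame"];
  NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:url];
  request.HTTPMethod = @"POST";
  [request setValue:@"image/jpeg" forHTTPHeaderField:@"Content-Type"];
  request.HTTPBody = UIImageJPEGRepresentation(image, 0.5);
  [[[NSURLSession sharedSession] dataTaskWithRequest:request] resume];
}

@end

The broadcast can then be started from the React Native app via RPSystemBroadcastPickerView (iOS 12+) or from Control Center's screen-recording button, and it keeps running while other apps are in the foreground. Be aware that broadcast extensions run under a tight memory limit (around 50 MB), so heavy per-frame image work may need to be replaced with a hardware video encoder.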

I investigated another solution, running an AirPlay receiver on my local NodeJS server, but I cannot find any documentation; the only documentation and modules I found are old and no longer work.

