
How to capture audio from other applications while using GetDisplayMedia for screen sharing in a React Native app?


I've implemented screen sharing using WebRTC in my React Native app, but I'm encountering an issue where the audio from other apps, such as YouTube, is not being captured during screen broadcasting.

Below is the React Native code snippet I'm using for screen sharing:

import { useRef, useState } from 'react';
import { mediaDevices, MediaStream } from 'react-native-webrtc';

const screenShareStreamRef = useRef();
const [screenShareStream, setScreenShareStream] = useState(null);
const screenShareVideoProducer = useRef();
const screenShareAudioProducer = useRef();
let audioTrack, videoTrack, screenstream;

const startScreenStream = async () => {
  try {
    const displayStream = await mediaDevices.getDisplayMedia({ video: true });
    videoTrack = displayStream.getVideoTracks()[0];

    // getUserMedia only captures the microphone, not audio from other apps
    const audioStream = await mediaDevices.getUserMedia({ audio: true });
    audioTrack = audioStream.getAudioTracks()[0];

    // Combine video and audio tracks into a single MediaStream
    screenstream = new MediaStream([videoTrack, audioTrack]);
    screenShareStreamRef.current = screenstream;
    setScreenShareStream(screenstream);
  } catch (error) {
    console.error(error);
  }
};
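In a browser, the usual way to request tab or system audio is to pass an audio constraint to getDisplayMedia itself rather than mixing in a microphone track from getUserMedia. Below is a minimal sketch of that shape (startScreenStreamWithAudio is just an illustrative name; I'm assuming react-native-webrtc forwards the audio constraint on mobile, which may not be the case):

const startScreenStreamWithAudio = async () => {
  try {
    // Ask the display capture for audio as well as video.
    // Note: support for the audio constraint outside browsers is an assumption;
    // on iOS/Android it may be ignored by react-native-webrtc.
    const displayStream = await mediaDevices.getDisplayMedia({
      video: true,
      audio: true,
    });
    screenShareStreamRef.current = displayStream;
    setScreenShareStream(displayStream);
  } catch (error) {
    console.error(error);
  }
};

Even with this, I assume app audio on iOS still has to be forwarded by the broadcast upload extension, which is why I'm asking how others have wired this up.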

For both iOS and Android, I've followed the setup provided in this guide: Link to the setup guide.

Additionally, for iOS, I've added the necessary files into the project, as mentioned in the guide: Link to the files.

I'm looking for insights or solutions on how to capture audio from other apps during screen sharing. Any help would be appreciated. Thank you!

