I've implemented screen sharing using WebRTC in my React Native app, but I'm encountering an issue where the audio from other apps, such as YouTube, is not being captured during screen broadcasting.
Below is the React Native code snippet I'm using for screen sharing:
```js
import { useRef, useState } from 'react';
import { mediaDevices, MediaStream } from 'react-native-webrtc';

const screenShareStreamRef = useRef();
const [screenShareStream, setScreenShareStream] = useState(null);
const screenShareVideoProducer = useRef();
const screenShareAudioProducer = useRef();

let audioTrack, videoTrack, screenstream;

const startScreenStream = async () => {
  try {
    // Capture the screen (video only)
    const displayStream = await mediaDevices.getDisplayMedia({ video: true });
    videoTrack = displayStream.getVideoTracks()[0];

    // This only captures the microphone, not audio played by other apps
    const audioStream = await mediaDevices.getUserMedia({ audio: true });
    audioTrack = audioStream.getAudioTracks()[0];

    // Combine video and audio tracks into a single MediaStream
    screenstream = new MediaStream([videoTrack, audioTrack]);
    screenShareStreamRef.current = screenstream;
    setScreenShareStream(screenstream);
  } catch (error) {
    console.error(error);
  }
};
```
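One thing I've experimented with is asking the display capture for audio directly instead of falling back to the microphone, roughly as sketched below. I'm not sure whether react-native-webrtc actually honours an `audio` constraint in `getDisplayMedia` on mobile, and the helper name here is just my own, but this is the direction I've been trying:

```js
// Sketch, not confirmed working: request system/app audio together with the
// screen video. Whether react-native-webrtc supports the `audio` constraint
// in getDisplayMedia is exactly what I'm unsure about; on Android I'd expect
// it to need AudioPlaybackCapture (API 29+), and on iOS app audio seems to be
// handled by the broadcast extension rather than this call.
const startScreenStreamWithAppAudio = async () => {
  const displayStream = await mediaDevices.getDisplayMedia({
    video: true,
    audio: true,
  });
  // If the constraint were honoured, this should be > 0
  console.log('display audio tracks:', displayStream.getAudioTracks().length);
  return displayStream;
};
```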
For both iOS and Android, I've followed the setup provided in this guide: Link to the setup guide.
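As far as I can tell, capturing playback audio from other apps is only possible at all on Android 10 (API 29) and newer via AudioPlaybackCapture, so I've added a small guard of my own (not part of the guide) before attempting it:

```js
import { Platform } from 'react-native';

// My own helper, not from the setup guide: other apps' audio can only be
// captured on Android 10+ (API 29), where AudioPlaybackCapture exists.
const canCaptureAppAudio = () =>
  Platform.OS === 'android' && Number(Platform.Version) >= 29;
```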
Additionally, for iOS, I've added the necessary files to the project, as mentioned in the guide: Link to the files.
I'm looking for insights or solutions on how to capture audio from other apps during screen sharing. Any help would be appreciated. Thank you!