
12 Jul 2024 : Day 286
Still working on WebRTC today, after some success getting video to work, but less success with audio. I have two things explicitly on my to-do list. The first is the error when using GlobalMuteListener; the second is the AudioPlaybackParent.jsm error triggered when trying to grant access to the microphone.

Let's start with the GlobalMuteListener class then. This is a new class added to ESR 91 with the following description associated with it, taken from changeset D86719:
$ git log -1 -S "GlobalMuteListener" browser/actors/WebRTCChild.jsm
commit 01fa52e2f5946f511199b6879919e2547f63fa25
Author: Mike Conley <mconley@mozilla.com>
Date:   Thu Aug 27 15:46:26 2020 +0000

    Bug 1643027 - Add GlobalMuteListener to WebRTCChild for globally muting the 
    microphone and camera. r=pbz
    
    This listens to state changes set via SharedData. Those state changes are 
    executed by
    front-end code in a later patch in this series.
    
    Depends on D87129
    
    Differential Revision: https://phabricator.services.mozilla.com/D86719
The associated bug is 1643027 and there are two further changesets associated with this bug. The first is D86619 which turns the Firefox microphone and camera icons into global mute toggles. The second is D86762 which adds unit tests for the other changes.

At some point I should take a look at the first of these to see whether it relates to anything in the Sailfish user interface. This could have privacy implications, so is of no small importance.

Right now though, I'm just going to add in the missing GlobalMuteListener class so that we match at least this part of the upstream functionality, and so that we have somewhere to hook in the user interface changes later if we need them.

The new class isn't especially large; in fact it's small enough that I can justify sharing all of it here.
/**
 * GlobalMuteListener is a process-global object that listens for changes to
 * the global mute state of the camera and microphone. When it notices a
 * change in that state, it tells the underlying platform code to mute or
 * unmute those devices.
 */
const GlobalMuteListener = {
  _initted: false,

  /**
   * Initializes the listener if it hasn't been already. This will also
   * ensure that the microphone and camera are initially in the right
   * muting state.
   */
  init() {
    if (!this._initted) {
      Services.cpmm.sharedData.addEventListener("change", this);
      this._updateCameraMuteState();
      this._updateMicrophoneMuteState();
      this._initted = true;
    }
  },

  handleEvent(event) {
    if (event.changedKeys.includes("WebRTC:GlobalCameraMute")) {
      this._updateCameraMuteState();
    }
    if (event.changedKeys.includes("WebRTC:GlobalMicrophoneMute")) {
      this._updateMicrophoneMuteState();
    }
  },

  _updateCameraMuteState() {
    let shouldMute = Services.cpmm.sharedData.get("WebRTC:GlobalCameraMute");
    let topic = shouldMute
      ? "getUserMedia:muteVideo"
      : "getUserMedia:unmuteVideo";
    Services.obs.notifyObservers(null, topic);
  },

  _updateMicrophoneMuteState() {
    let shouldMute = Services.cpmm.sharedData.get(
      "WebRTC:GlobalMicrophoneMute"
    );
    let topic = shouldMute
      ? "getUserMedia:muteAudio"
      : "getUserMedia:unmuteAudio";

    Services.obs.notifyObservers(null, topic);
  },
};
All this is really doing is listening for a couple of events for muting the camera and microphone and, when one is received, broadcasting a notification to indicate the change (microphone/camera mute/unmute) that can be picked up by the user interface and acted upon.
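The consumer side of this pattern can be sketched as follows. The topic strings are the real ones GlobalMuteListener broadcasts, but the ObserverService class below is a minimal stand-in for Gecko's Services.obs, and the observer object is hypothetical, written just so the example runs on its own:

```javascript
// Minimal stand-in for Gecko's Services.obs: dispatches notifications to
// observers registered against a topic string.
class ObserverService {
  constructor() { this.observers = new Map(); }
  addObserver(observer, topic) {
    if (!this.observers.has(topic)) this.observers.set(topic, []);
    this.observers.get(topic).push(observer);
  }
  notifyObservers(subject, topic) {
    for (const o of this.observers.get(topic) || []) {
      o.observe(subject, topic);
    }
  }
}

const obs = new ObserverService();

// Hypothetical UI-side observer that records the camera mute state as the
// notifications arrive (true = muted).
const cameraStates = [];
const cameraMuteObserver = {
  observe(subject, topic) {
    cameraStates.push(topic === "getUserMedia:muteVideo");
  },
};
obs.addObserver(cameraMuteObserver, "getUserMedia:muteVideo");
obs.addObserver(cameraMuteObserver, "getUserMedia:unmuteVideo");

// Simulate GlobalMuteListener broadcasting a mute, then an unmute.
obs.notifyObservers(null, "getUserMedia:muteVideo");
obs.notifyObservers(null, "getUserMedia:unmuteVideo");
console.log(cameraStates); // [ true, false ]
```

In real code the user interface would register against all four topics and update its mute indicators accordingly.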

Crucially it doesn't appear to depend on anything that's not already available in our EmbedLiteWebrtcUI.js source file. So I've added it towards the top of the file and uncommented the line that calls it.

When I use this updated code I don't see any obvious changes, except that the error is no longer output to the console.

I'll need to create a task to check the status of global mute on Sailfish OS, but this change is going to be enough for what we need right now.

Let's move on to the second error, which was the following:
JavaScript error: resource://gre/actors/AudioPlaybackParent.jsm, line 20: 
    TypeError: browser is null
The AudioPlaybackParent.jsm source is available upstream for both ESR 78 and ESR 91. Both versions are tiny files (29 lines and 45 lines respectively) but there have been changes between the two.

The change that is triggering the error relates to the browser variable, which is initialised in two different ways. On ESR 78 we have this:
    let topBrowsingContext = this.browsingContext.top;
    let browser = topBrowsingContext.embedderElement;
Whereas on ESR 91 we have this:
    const browser = this.browsingContext.top.embedderElement;
Comparing the two carefully, the only practical difference is that on ESR 91 the variable is marked as const.
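Since the error is a TypeError complaining that browser is null, it looks like embedderElement can come back null here. As a sketch of one defensive option (not necessarily the upstream fix; the browsingContext objects and return values below are invented for illustration), an early return would avoid the exception:

```javascript
// Sketch of an early-return guard for the "browser is null" case. The
// receiveMessage shape and the AudioPlayback:Start message name follow the
// upstream actor; everything else here is a stand-in, since the real
// browsingContext comes from JSWindowActorParent.
function receiveMessage(browsingContext, aMessage) {
  const browser = browsingContext.top.embedderElement;
  if (!browser) {
    // The actor can outlive its browser element (for example if the content
    // process dispatches after the actor closes), so bail out rather than
    // dereference null.
    return "ignored";
  }
  switch (aMessage.name) {
    case "AudioPlayback:Start":
      return "started";
    default:
      return "unhandled";
  }
}

// An embedderElement can be null once the actor's browser has gone away.
const detachedContext = { top: { embedderElement: null } };
console.log(receiveMessage(detachedContext, { name: "AudioPlayback:Start" }));
// → "ignored"

const liveContext = { top: { embedderElement: {} } };
console.log(receiveMessage(liveContext, { name: "AudioPlayback:Start" }));
// → "started"
```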

The top level class doesn't include a browsingContext member, so in both cases this must be inherited from the parent class, which is JSWindowActorParent:
class AudioPlaybackParent extends JSWindowActorParent {
The file I'm checking in the ESR 78 codebase is the patched version with the Sailfish OS changes applied, so it's worth checking to see whether these lines in particular are from upstream or from us:
$ git blame toolkit/actors/AudioPlaybackParent.jsm -L 10,+4
5ce82c5c12dfa (Abdoulaye O. Ly 2019-08-19 21:17:21 +0000 10)   receiveMessage(aMessage) {
5ce82c5c12dfa (Abdoulaye O. Ly 2019-08-19 21:17:21 +0000 11)     let topBrowsingContext = this.browsingContext.top;
5ce82c5c12dfa (Abdoulaye O. Ly 2019-08-19 21:17:21 +0000 12)     let browser = topBrowsingContext.embedderElement;
5ce82c5c12dfa (Abdoulaye O. Ly 2019-08-19 21:17:21 +0000 13) 
They're from upstream. So why were they changed? Let's find out.
$ git blame toolkit/actors/AudioPlaybackParent.jsm -L 15,+4
Blaming lines:   8% (4/45), done.
5ce82c5c12dfa (Abdoulaye O. Ly 2019-08-19 21:17:21 +0000 15)   receiveMessage(aMessage) {
ebf5370d519e5 (alwu            2020-10-02 03:56:09 +0000 16)     const browser = this.browsingContext.top.embedderElement;
5ce82c5c12dfa (Abdoulaye O. Ly 2019-08-19 21:17:21 +0000 17)     switch (aMessage.name) {
5ce82c5c12dfa (Abdoulaye O. Ly 2019-08-19 21:17:21 +0000 18)       case "AudioPlayback:Start":
$ git log -1 ebf5370d519e5
commit ebf5370d519e5fe34c4a0b7aeb210de3c851c00a
Author: alwu <alwu@mozilla.com>
Date:   Fri Oct 2 03:56:09 2020 +0000

    Bug 1656414 - part1 : stop audio playback and media block if needed when 
    destroying an parent actor. r=farre
    
    If content process dispatches event after actor close, then we are not able 
    to clear the tab media indicator. Therefore, we should do corresponding 
    cleanup when actor destroys.
    
    Differential Revision: https://phabricator.services.mozilla.com/D91367
It's not quite clear to me what's going on with this, but it's beginning to feel like my brain needs to get some sleep. So I'll return to this fresh in the morning and take another look.

If you'd like to read any of my other gecko diary entries, they're all available on my Gecko-dev Diary page.
