12 Nov 2023 : Day 84
This is a blip, an anomaly, a perturbation in the space-time continuum. I'm not supposed to be writing this: my gecko development is paused for a couple of weeks while I attend to a Book Dash. But I'm not the only person developing the gecko build and the world doesn't wait, so here's a quick update on some of the activity that's been happening despite my best efforts!
Before I get onto that though, I just want to go on a slight diversion. The world outside my window is turning from green to a vibrant yellow with the occasional red berries poking through. Soon everything will be orange, and soon after that the colour will drain away to leave just the browns of the skeletons beneath. It's all just too wonderful not to share with you.
So what's the reason for this update? One of the key requirements for getting gecko into a state that others can test and develop, and that can ultimately be deployed to users' phones, is having an OBS build available.
OBS, which will be familiar to many Sailfish OS users and developers, is the Open Build Service. When I use this term it's usually in relation to the OBS provided by Jolla for use by the community, but there are of course other OBS deployments. Jolla has its own internal OBS for building the full Sailfish OS repositories, and the software that runs it was originally developed for SUSE (where the 'O' stands for 'openSUSE').
If you're familiar with runners on GitLab or GitHub Actions you'll already have a good idea about OBS: it's essentially a bunch of VMs running remotely that build software and are accessible through a Web interface. OBS offers an environment that's very similar to the scratchbox2 environment provided by the Sailfish SDK, which means it's the ideal place to build software for Sailfish OS. If you want to know more about OBS I strongly recommend the Sailfish OS Chum documentation, which does a great job of explaining what it is and how to use it. Sailfish OS Chum is built around OBS.
Many moons ago — back on Day 53 in fact — I configured a project on OBS to build gecko. Things got stuck quite early on in the gecko build and I left it in a semi-configured state. I planned to go back to it, but never quite got around to it.
Over the last couple of days mal, keto and direc85 have been hard at work getting it into a usable state.
There were a bunch of changes needed for this. On the gcc side, direc85 identified two bugs, 90139 and 96206, each of which caused an Internal Compiler Error (ICE) during the build. He and mal backported the fixes so that they're incorporated into the OBS backend and available for the build.
These issues don't affect the aarch64 target, which is why I'd not experienced them myself.
An additional problem arose in relation to the Python requirements. Back on Day 2 I hit an issue with this, the fix for which involved configuring the build process to create a Python virtual environment; the Python requirements are then pip-installed into this environment during the build.

This is fine for a local build, but on OBS there's no Internet access available after the prepare step completes. That means no network for most of the build, including at the point where this virtual environment is set up.
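As a rough sketch of that earlier approach (the directory and requirements file names here are hypothetical, not the actual ones from the build scripts), the idea was to create a virtual environment and pip-install the requirements into it, and it's the install step that needs network access:

```shell
# Create an isolated Python environment for the build (path hypothetical).
python3 -m venv gecko-venv

# The venv carries its own python and pip, separate from the system install.
./gecko-venv/bin/python -m pip --version

# During the build the requirements would then be installed into the venv.
# This is the step that needs Internet access, so it fails on OBS:
#   ./gecko-venv/bin/pip install -r requirements.txt
```

Everything up to the `pip install` works offline, which is why the problem only shows up on OBS and not in a local build.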
The result was that the build failed on OBS because it couldn't complete creation of a suitable virtual environment. Mal identified this and switched the build to a different approach: instead of running `create-mach-environment`, he set `MACH_USE_SYSTEM_PYTHON`:

```
$ git diff 4e8eeba521 5bd0410081
diff --git a/rpm/xulrunner-qt5.spec b/rpm/xulrunner-qt5.spec
index b3a7a4419142..5e397b2d11fe 100644
--- a/rpm/xulrunner-qt5.spec
+++ b/rpm/xulrunner-qt5.spec
@@ -470,7 +470,8 @@
 echo 'mk_add_options LDFLAGS="$FIX_LDFLAGS"' >> "${MOZCONFIG}"
 
 RPM_BUILD_NCPUS=`nproc`
-./mach create-mach-environment
+export MACH_USE_SYSTEM_PYTHON=1
+
 #./mach build -j$RPM_BUILD_NCPUS
 ./mach build -j1
 # This might be unnecessary but previously some files
```

The consequence of this is that the virtual environment is no longer created; the preexisting system Python install is used instead. Since all of the required packages are already available in the system install, this does the job and avoids the need to download anything during the build process.

Overnight mal triggered the OBS build to run, while at the same time I had the latest changes building locally on my machine.

Both went through successfully, which means the OBS build is nearly available for anyone to use.

There is still a little more to do: the packages that depend on gecko still need to be rebuilt. Since I'd not yet pushed any tags to my repository the gecko package was built with version number 60.9.1. That won't be accepted by any of the other packages, so I've pushed the tag `sailfishos/91.9.1+git1` to the repository and set it building again. Once that's complete (probably tomorrow) I can trigger the other packages to rebuild against it, which should be relatively swift.

In the meantime, a huge thank you goes to direc85, keto and mal for their efforts getting this working. It makes gecko that much more accessible to other developers and testers, and is therefore an essential step in getting more people involved and getting it installed onto users' phones.

I'll be back writing again in just over a week unless there's something else specific to share in the meantime. Until then feel free to catch up on all the previous entries on my Gecko-dev Diary page.