<h1>Bootstrapping Alpine Linux without root</h1> <p>By Martijn Braam, BrixIT Blog, Wed, 20 Mar 2024 (https://blog.brixit.nl/bootstrapping-alpine-linux-without-root/)</p> <p>Creating a chroot in Linux is pretty easy: put a rootfs in a folder and run the <code>sudo chroot /my/folder</code> command. But what if you don't want to use superuser privileges for this?</p> <p>This is not super simple to fix: not only does the <code>chroot</code> command itself require root permissions, but the steps for creating the rootfs in the first place and mounting the required filesystems like /proc and /sys require root as well.</p> <p>In pmbootstrap the process for creating an installable image for a phone requires setting up multiple chroots and executing many commands in those chroots. If you have the password timeout disabled in sudo you will notice that you have to enter your password tens to hundreds of times depending on the operation you're doing. An example of this is shown in the long-running "<a href="https://gitlab.com/postmarketOS/pmbootstrap/-/issues/2052#note_966447872">pmbootstrap requires sudo</a>" issue on Gitlab. In this example sudo was called 240 times!</p> <p>Now it is possible, with a lot of refactoring, to move batches of superuser-requiring commands into scripts and elevate the permissions of those with a single sudo call, but getting this down to a single sudo call per pmbootstrap command would be really hard.</p> <h2>Another approach</h2> <p>So instead of building a chroot the "traditional" way, what are the alternatives?</p> <p>The magic trick to get this working is user namespaces.
From the Linux documentation:</p> <blockquote>User namespaces isolate security-related identifiers and attributes, in particular, user IDs and group IDs (see <a href="https://man7.org/linux/man-pages/man7/credentials.7.html">credentials(7)</a>), the root directory, keys (see <a href="https://man7.org/linux/man-pages/man7/keyrings.7.html">keyrings(7)</a>), and capabilities (see <a href="https://man7.org/linux/man-pages/man7/capabilities.7.html">capabilities(7)</a>). A process's user and group IDs can be different inside and outside a user namespace. In particular, a process can have a normal unprivileged user ID outside a user namespace while at the same time having a user ID of 0 inside the namespace; in other words, the process has full privileges for operations inside the user namespace, but is unprivileged for operations outside the namespace. </blockquote> <p>It basically allows running commands in a namespace where you have UID 0 on the inside, without needing to elevate any of the commands. This does have a lot of limitations though, and I somehow managed to hit all of them with this project.</p> <p>One of the tools that makes it relatively easy to work with the various namespaces in Linux is <code>unshare</code>. Conveniently this is also part of <code>util-linux</code> so it's a pretty clean dependency to have.</p> <h2>Building a rootfs</h2> <p>There are enough examples of using <code>unshare</code> to create a chroot without sudo, but those all assume you already have a rootfs somewhere to chroot into. Creating the rootfs itself has a few difficulties already though.</p> <p>Since I'm building an Alpine Linux rootfs the utility I'm going to use is <code>apk.static</code>. This is a statically compiled version of the package manager in Alpine which allows building a new installation from an online repository.
This is similar to <code>debootstrap</code>, for example, if you're more used to Debian than Alpine.</p> <p>There's a wiki page on running <a href="https://wiki.alpinelinux.org/wiki/Alpine_Linux_in_a_chroot">Alpine Linux in a chroot</a> that documents the steps required for setting up a chroot the traditional way with this. The initial commands to acquire the <code>apk.static</code> binary don't require superuser at all, but after that the problems start:</p> <div class="highlight"><pre><span></span><span class="gp">$ </span>./apk.static -X <span class="si">${</span><span class="nv">mirror</span><span class="si">}</span>/latest-stable/main -U --allow-untrusted -p <span class="si">${</span><span class="nv">chroot_dir</span><span class="si">}</span> --initdb add alpine-base </pre></div> <p>This creates the Alpine installation in <code>${chroot_dir}</code>. This requires superuser privileges to set the correct permissions on the files of this new rootfs. After this there are two options for populating /dev inside this rootfs, both of which are problematic:</p> <div class="highlight"><pre><span></span><span class="gp">$ </span>mount -o <span class="nb">bind</span> /dev <span class="si">${</span><span class="nv">chroot_dir</span><span class="si">}</span>/dev <span class="go">mounting requires superuser privileges and this exposes all your hardware in the chroot</span> <span class="gp">$ </span>mknod -m <span class="m">666</span> <span class="si">${</span><span class="nv">chroot_dir</span><span class="si">}</span>/dev/full c <span class="m">1</span> <span class="m">7</span> <span class="gp">$ </span>mknod -m <span class="m">644</span> <span class="si">${</span><span class="nv">chroot_dir</span><span class="si">}</span>/dev/random c <span class="m">1</span> <span class="m">8</span> <span class="go">... 
etcetera, the mknod command also requires superuser privileges</span> </pre></div> <p>The steps after this have similar issues, most of them because of <code>mount</code> or <code>chown</code> requirements.</p> <p>There are a few namespace options from <code>unshare</code> that can be used to work around these issues. The command used to run <code>apk.static</code> in my test implementation is this:</p> <div class="highlight"><pre><span></span><span class="gp">$ </span>unshare <span class="se">\</span> --user <span class="se">\</span> --map-users<span class="o">=</span><span class="m">10000</span>,0,10000 <span class="se">\</span> --map-groups<span class="o">=</span><span class="m">10000</span>,0,10000 <span class="se">\</span> --setuid <span class="m">0</span> <span class="se">\</span> --setgid <span class="m">0</span> <span class="se">\</span> --wd <span class="s2">&quot;</span><span class="si">${</span><span class="nv">chroot_dir</span><span class="si">}</span><span class="s2">&quot;</span> <span class="se">\</span> ./apk-tools-static -X...etc </pre></div> <p>This will use <code>unshare</code> to create a new userns and change the uid/gid inside that to 0. This effectively grants root privileges inside this namespace. But that's not enough.</p> <p>If <code>chown</code> is used inside the namespace it will still fail, because my unprivileged user still can't change the permissions of those files. The solution to that is the uid remapping with <code>--map-users</code> and <code>--map-groups</code>. In the example above it sets up the namespace so that files created with uid 0 inside it will end up with uid 100000 on the actual filesystem. uid 1 becomes 100001, and this continues for 10000 uids. </p> <p>This again does not completely solve the issue though, because my unprivileged user still can't chown those files, no matter whether it's chowning to uid 0 or 100000.
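<p>This failure mode is easy to reproduce in isolation. A minimal sketch, assuming util-linux's <code>unshare</code> with the <code>--map-root-user</code> shorthand and a kernel that allows unprivileged user namespaces:</p>

```shell
# Become uid 0 inside a fresh user namespace, without sudo.
unshare --user --map-root-user sh -c '
    echo "uid inside the namespace: $(id -u)"
    touch /tmp/userns-demo
    # Only uid 0 is mapped in this namespace, so chowning to any
    # other uid fails, even though we look like root here:
    chown 1000 /tmp/userns-demo 2>/dev/null \
        || echo "chown fails without a subuid mapping"
'
```

The `id -u` inside prints 0, while the same shell outside the namespace still runs as the unprivileged user.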
To give my unprivileged user this permission, the <code>/etc/subuid</code> and <code>/etc/subgid</code> files on the host system have to be modified to add a rule. This sadly requires root privileges <i>once</i> to set up. To make the command above work I had to add this line to those two files:</p> <pre><code>martijn:100000:10000</code></pre> <p>This grants the user with the name <code>martijn</code> the permission to use 10.000 uids starting at uid 100.000 for the purpose of userns mapping.</p> <p>The result of this is that the <code>apk.static</code> command will seem to Just Work(tm) and the resulting files in <code>${chroot_dir}</code> will have all the right permissions, only offset by 100.000.</p> <h2>One more catch</h2> <p>There is one more complication with remapped uids and <code>unshare</code> that I've skipped over in the example above to keep it clearer: without a workaround the command inside the namespace most likely cannot even start.</p> <p>If you remap the uid with <code>unshare</code> you get more freedom inside the namespace, but it limits your privileges outside the namespace even further. Most likely the <code>unshare</code> command above was run somewhere in your own home directory. After changing your uid to 0 inside the namespace, your access to the outside world will be as if you're uid 100.000, and that uid most likely does not have any privileges there. If any of the folders in the path to the executable you want <code>unshare</code> to run for you inside the namespace don't have the read and execute bits set for "other" in the unix permissions, the command will simply fail with "Permission denied".</p> <p>The workaround used in my test implementation is to first copy the executable over to <code>/tmp</code> and hope you at least still have permissions to read there.</p> <h2>Completing the rootfs</h2> <p>So after all that, the first command from the Alpine guide is done.
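<p>As a recap of the uid arithmetic: with the <code>martijn:100000:10000</code> subuid entry above, every uid inside the namespace lands on a host uid at a fixed offset. A small sketch of that mapping (the helper name is made up for illustration, the numbers are the ones from this article):</p>

```shell
# Map a namespace uid to the host uid it lands on, given the subuid
# range used above: start 100000, count 10000.
ns_to_host() {
    start=100000
    count=10000
    if [ "$1" -ge "$count" ]; then
        echo "uid $1 is outside the mapped range" >&2
        return 1
    fi
    echo $((start + $1))
}

ns_to_host 0   # root inside the chroot -> 100000 on the host
ns_to_host 1   # -> 100001
```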
Now only the problems of mounting filesystems and creating files are left.</p> <p>While <code>/etc/subuid</code> does give permission to use a range of uids as an unprivileged user with a user namespace, it does not give you permission to create files with those uids outside the namespace. So the way those files are created is basically the complicated version of <code>echo "value" | sudo tee /root/file</code>: </p> <div class="highlight"><pre><span></span><span class="gp">$ </span><span class="nb">echo</span> <span class="s2">&quot;nameserver a.b.c.d&quot;</span> <span class="p">|</span> unshare <span class="se">\</span> --user <span class="se">\</span> --map-users<span class="o">=</span><span class="m">10000</span>,0,10000 <span class="se">\</span> --map-groups<span class="o">=</span><span class="m">10000</span>,0,10000 <span class="se">\</span> --setuid <span class="m">0</span> <span class="se">\</span> --setgid <span class="m">0</span> <span class="se">\</span> --wd <span class="s2">&quot;</span><span class="si">${</span><span class="nv">chroot_dir</span><span class="si">}</span><span class="s2">&quot;</span> <span class="se">\</span> sh -c <span class="s1">&#39;cat &gt; /etc/resolv.conf&#39;</span> </pre></div> <p>This sets up and tears down the entire namespace for every file change or creation, which is a bit inefficient, but inefficient is still better than impossible. Changing file permissions is done in a similar way.</p> <p>To fix the mounting issue there's the mount namespace functionality in Linux. This allows creating new mounts inside the namespace as long as you still have permissions on the source file as your unprivileged user.
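<p>A minimal way to see a mount namespace in action, again assuming util-linux's <code>unshare</code> and unprivileged user namespaces being enabled (the tmpfs on <code>/mnt</code> is just an example target):</p>

```shell
# Mount a tmpfs inside a new user+mount namespace, without sudo.
# The mount only exists inside that namespace and vanishes with it;
# /mnt on the host is untouched.
unshare --user --map-root-user --mount sh -c '
    mount -t tmpfs none /mnt
    grep " /mnt " /proc/self/mounts
'
```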
This effectively means you can't use this to mount random block devices, but it works great for things like <code>/proc</code> and loop mounts.</p> <p>There is a <code>--mount-proc</code> parameter that tells <code>unshare</code> to set up a mount namespace and then mount <code>/proc</code> inside the namespace at the right place, so that's what I'm using. But I still need other things mounted. This mounting is done as a small inline shell script right before executing the commands inside the chroot:</p> <div class="highlight"><pre><span></span><span class="gp">$ </span>unshare <span class="se">\</span> --user <span class="se">\</span> --fork <span class="se">\</span> --pid <span class="se">\</span> --mount <span class="se">\</span> --mount-proc <span class="se">\</span> --map-users<span class="o">=</span><span class="m">10000</span>,0,10000 <span class="se">\</span> --map-groups<span class="o">=</span><span class="m">10000</span>,0,10000 <span class="se">\</span> --setuid <span class="m">0</span> <span class="se">\</span> --setgid <span class="m">0</span> <span class="se">\</span> --wd <span class="s2">&quot;</span><span class="si">${</span><span class="nv">chroot_dir</span><span class="si">}</span><span class="s2">&quot;</span> <span class="se">\</span> -- <span class="se">\</span> sh -c <span class="s2">&quot; \</span> <span class="s2"> mount -t proc none proc ; \</span> <span class="s2"> touch dev/zero ; \</span> <span class="s2"> mount -o rw,bind /dev/zero dev/zero ;\</span> <span class="s2"> touch dev/null ; \</span> <span class="s2"> mount -o rw,bind /dev/null dev/null ;\</span> <span class="s2"> ...</span> <span class="go"> chroot . bin/sh \</span> <span class="go"> &quot;</span> </pre></div> <p>The mounts are created right after setting up the namespaces but before the chroot is started, so the host filesystem can still be accessed.
The working directory is set to the root of the rootfs using the <code>--wd</code> parameter of <code>unshare</code>, and then bind mounts are made from <code>/dev/zero</code> to <code>dev/zero</code> to create those devices inside the rootfs.</p> <p>This combines the two impossible options to make it work. <code>mknod</code> still cannot work inside namespaces because it is a bit of a security risk, and <code>mount</code>'ing /dev gives access to way too many devices that are not needed, but the mount namespace does allow bind-mounting the existing device nodes one by one, which lets me filter them.</p> <p>Then finally... the <code>chroot</code> command to complete the journey. This has to refer to the rootfs with a relative path, and it also depends on the working directory being set by <code>unshare</code>, since host paths break with uid remapping.</p> <h2>What's next?</h2> <p>So this creates a full chroot without superuser privileges (after the initial setup), and this whole setup even works perfectly with cross-architecture chroots in combination with <code>binfmt_misc</code>. </p> <p>Compared to <code>pmbootstrap</code> this codebase does very little and there are more problems to solve. For one, all the filesystem manipulation to copy the contents of the chroot into a filesystem image that can be flashed still has to be figured out. This is further complicated by the mangling of the uids in the host filesystem, so they have to be remapped again while writing into the filesystem.</p> <p>Flashing the image to a fastboot-capable device should be pretty easy without root privileges; it only requires a udev rule that is usually already installed by the android-tools package on various Linux distributions.
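<p>For reference, such a udev rule typically looks something like this. The file name and vendor id here are assumptions for illustration (18d1 is the id Google uses for fastboot devices); check what the android-tools package on your distribution actually ships:</p>

```
# /etc/udev/rules.d/51-android.rules (example; installing it needs root once)
SUBSYSTEM=="usb", ATTR{idVendor}=="18d1", MODE="0660", TAG+="uaccess"
```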
For the PinePhone flashing happens on a mass-storage device, and as far as I know it is impossible to write to that without actual superuser privileges.</p> <p>The code for this is in the <a href="https://git.sr.ht/~martijnbraam/ambootstrap">~martijnbraam/ambootstrap</a> repository; hopefully in some time I'll get this to actually write a plain Alpine Linux image to a phone :D</p> <h1>Megapixels 2.0: DNG exporting</h1> <p>By Martijn Braam, Sat, 18 Nov 2023 (https://blog.brixit.nl/megapixels-2-0-dng-exporting/)</p> <p>It seems overkill to make a whole separate library dedicated to replacing 177 lines of code in Megapixels that touch libtiff, but this small section of code causes significant issues for distribution packaging and compatibility with external photo editing software. Most importantly, the adjusted version in Millipixels for the Librem 5 does not output DNG files that are close enough to the Adobe specifications to be loaded into the calibration software.</p> <p>Making this a separate library would make it easier to test. In the Adobe DNG SDK there is a test utility that can verify if a TIFF file is up to DNG spec, and it can (with a lot of complications) be built for Linux.</p> <h2>The spec</h2> <p>The first thing after copying over the code block from Megapixels to a separate project is reading the Adobe DNG specification.</p> <p>When I wrote the original export code in Megapixels it was based around some example code for using libtiff that I found on GitHub and can no longer find, and it results in something that's close enough to a valid DNG file for the <code>dcraw</code> utility. The generated file is also only a DNG 1.0 file.</p> <p>I spent the next day reading the <a href="https://www.kronometric.org/phot/processing/DNG/dng_spec_1.4.0.0.pdf">DNG 1.4 specification</a> from Adobe to understand what a valid DNG file is absolutely minimally required to have.
These are my notes from that:</p> <div class="highlight"><pre><span></span><span class="gu">## Inside a DNG file</span> <span class="k">*</span> SubIFDType 0 is the original raw data <span class="k">*</span> SubIFDType 1 is the thumbnail data <span class="k">*</span> The recommendation is to store the thumbnail as the first IFD <span class="k">*</span> TIFF metadata goes in the first IFD <span class="k">*</span> EXIF tags are preferred <span class="k">*</span> Camera profiles are stored in the first IFD <span class="gu">## Required tags</span> <span class="k">*</span> DNGVersion <span class="k">*</span> UniqueCameraModel </pre></div> <h2>Validation</h2> <p>I also spent a long time building the official Adobe DNG SDK. This is mostly useless for developing any open source software due to licensing, but it does provide a nice <code>dng_validate</code> utility that can be used to actually test the DNG files. Building this utility is pretty horrifying since it requires some specific versions of dependencies and some patches to work on modern compilers.</p> <p>The libdng codebase now has the <a href="https://gitlab.com/megapixels-org/libdng/-/blob/master/adobe_dng_sdk.sh">adobe_dng_sdk.sh</a> script that will build the required libraries and the validation binary.</p> <p>With the Megapixels code adjusted using the info from the documentation above, I fed some random noise as data to the library to generate a DNG file and ran it through the validator.</p> <div class="highlight"><pre><span></span><span class="gp">$ </span>dng_validate out.dng <span class="go">Validating &quot;out.dng&quot;...</span> <span class="go">*** Warning: This file has Chained IFDs, which will be ignored by DNG readers ***</span> <span class="go">*** Error: Unable to find main image IFD ***</span> </pre></div> <p>Well that's not a great start...
There's also a <code>-v</code> option to get some more verbose info</p> <div class="highlight"><pre><span></span><span class="gp">$ </span>dng_validate -v out.dng <span class="go">Validating &quot;out.dng&quot;...</span> <span class="go">Uses little-endian byte order</span> <span class="go">Magic number = 42</span> <span class="go">IFD 0: Offset = 308, Entries = 10</span> <span class="go">NewSubFileType: Preview Image</span> <span class="go">ImageWidth: 20</span> <span class="go">ImageLength: 15</span> <span class="go">BitsPerSample: 8</span> <span class="go">Compression: Uncompressed</span> <span class="go">PhotometricInterpretation: RGB</span> <span class="go">StripOffsets: Offset = 8</span> <span class="go">StripByteCounts: Count = 300</span> <span class="go">DNGVersion: 1.4.0.0</span> <span class="go">UniqueCameraModel: &quot;LibDNG&quot;</span> <span class="go">NextIFD = 10042</span> <span class="go">Chained IFD 1: Offset = 10042, Entries = 6</span> <span class="go">NewSubFileType: Main Image</span> <span class="go">ImageWidth: 320</span> <span class="go">ImageLength: 240</span> <span class="go">Compression: Uncompressed</span> <span class="go">StripOffsets: Offset = 441</span> <span class="go">StripByteCounts: Count = 9600</span> <span class="go">NextIFD = 0</span> <span class="go">*** Warning: This file has Chained IFDs, which will be ignored by DNG readers ***</span> <span class="go">*** Error: Unable to find main image IFD ***</span> </pre></div> <p>Let's have a look at what the DNG spec says about this:</p> <blockquote>DNG recommends the use of SubIFD trees, as described in the TIFF-EP specification. SubIFD chains are not supported.<br><br>The highest-resolution and quality IFD should use NewSubFileType equal to 0. Reduced resolution (or quality) thumbnails or previews, if any, should use NewSubFileType equal to 1 (for a primary preview) or 10001.H (for an alternate preview). 
<br><br>DNG recommends, but does not require, that the first IFD contain a low-resolution thumbnail, as described in the TIFF-EP specification.</blockquote> <p>So I have the right tags and the right IFDs, but I need to make an IFD tree instead of a chain in libtiff. I have no idea how IFD trees work, so on to the next specification!</p> <p>It seems like TIFF trees are defined in the Adobe PageMaker 6 tech notes from 1995. That document describes that the NextIFD tag that libtiff used for me is primarily meant for defining multi-page documents, not multiple encodings of the same document like what happens here with a thumbnail and the raw data. You know this is a 1995 spec because it gives a fax as an example of a multi-page document.</p> <p>In the examples provided in that specification the first image is the main image and the NextIFD tag is just replaced by a SubIFD tag. In the case of DNG the main image is the thumbnail, for compatibility with software that can't read the raw camera data.</p> <p>Switching over to a SubIFD tag is surprisingly simple, just badly documented.
Libtiff will create the NextIFD tag automatically for you, but if you create an empty SubIFD tag then libtiff will fill in the offset for the next IFD for you when closing the file:</p> <div class="highlight"><pre><span></span><span class="n">TIFF</span><span class="w"> </span><span class="o">*</span><span class="n">tif</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="n">TIFFOpen</span><span class="p">(</span><span class="n">path</span><span class="p">,</span><span class="w"> </span><span class="s">&quot;w&quot;</span><span class="p">);</span><span class="w"></span> <span class="c1">// Set the tags for IFD 0 like normal here</span> <span class="n">TIFFSetField</span><span class="p">(</span><span class="n">tif</span><span class="p">,</span><span class="w"> </span><span class="n">TIFFTAG_SUBFILETYPE</span><span class="p">,</span><span class="w"> </span><span class="n">DNG_SUBFILETYPE_THUMBNAIL</span><span class="p">);</span><span class="w"></span> <span class="c1">// Create a NULL reference for one SubIFD</span> <span class="kt">uint64_t</span><span class="w"> </span><span class="n">offsets</span><span class="p">[]</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="p">{</span><span class="w"> </span><span class="mf">0L</span><span class="w"> </span><span class="p">};</span><span class="w"></span> <span class="n">TIFFSetField</span><span class="p">(</span><span class="n">tif</span><span class="p">,</span><span class="w"> </span><span class="n">TIFFTAG_SUBIFD</span><span class="p">,</span><span class="w"> </span><span class="mi">1</span><span class="p">,</span><span class="w"> </span><span class="o">&amp;</span><span class="n">offsets</span><span class="p">);</span><span class="w"></span> <span class="c1">// Write the thumbnail image data here</span> <span class="c1">// Close the first IFD</span> <span class="n">TIFFWriteDirectory</span><span class="p">(</span><span
class="n">tif</span><span class="p">);</span><span class="w"></span> <span class="c1">// Start IFD1 describing the raw data</span> <span class="n">TIFFSetField</span><span class="p">(</span><span class="n">tif</span><span class="p">,</span><span class="w"> </span><span class="n">TIFFTAG_SUBFILETYPE</span><span class="p">,</span><span class="w"> </span><span class="n">DNG_SUBFILETYPE_ORIGINAL</span><span class="p">);</span><span class="w"></span> <span class="c1">// write raw data and close the directory again</span> <span class="n">TIFFWriteDirectory</span><span class="p">(</span><span class="n">tif</span><span class="p">);</span><span class="w"></span> <span class="c1">// Close the tiff, this will cause libtiff to patch up the references</span> <span class="n">TIFFClose</span><span class="p">(</span><span class="n">tif</span><span class="p">);</span><span class="w"></span> </pre></div> <p>So with the code updated the validation tool neatly shows the new SubIFD tags and finds actual errors in my DNG file data now:</p> <pre><code>Uses little-endian byte order Magic number = 42 IFD 0: Offset = 308, Entries = 11 NewSubFileType: Preview Image ImageWidth: 20 ImageLength: 15 BitsPerSample: 8 Compression: Uncompressed PhotometricInterpretation: RGB StripOffsets: Offset = 8 StripByteCounts: Count = 300 SubIFDs: IFD = 10054 DNGVersion: 1.4.0.0 UniqueCameraModel: &quot;LibDNG&quot; NextIFD = 0 SubIFD 1: Offset = 10054, Entries = 6 NewSubFileType: Main Image ImageWidth: 320 ImageLength: 240 Compression: Uncompressed StripOffsets: Offset = 453 StripByteCounts: Count = 9600 NextIFD = 0 *** Error: Missing or invalid SamplesPerPixel (IFD 0) *** *** Error: Missing or invalid PhotometricInterpretation (SubIFD 1) ***</code></pre> <p>Ah, so these two tags are actually required but not described as such in the DNG specification, since these are TIFF tags instead of DNG tags (even though the spec does explicitly list other required TIFF data).</p> <p>Patching up these errors is easy, just slightly
annoying since the validation tool seemingly gives only a single error per IFD, requiring me to iterate on the code a bit more. After a whole lot of iterating on the exporting code I managed to get the first valid DNG file:</p> <pre><code>Raw image read time: 0.000 sec Linearization time: 0.002 sec Interpolate time: 0.006 sec Validation complete</code></pre> <p>Now the next step is adding all the plumbing to make this usable as a library and making an actually nice command line utility.</p> <h2>First actual test</h2> <p>Now that I have written the first iterations of libmegapixels and libdng, it should be possible to actually load a picture in some editing software. So let's try some end-to-end testing with this.</p> <p>With the <code>megapixels-getframe</code> utility from libmegapixels I can get a frame from the sensor (in this case the rear camera of the Librem 5) and then feed that raw data to the <code>makedng</code> utility from libdng.</p> <div class="highlight"><pre><span></span><span class="gp">$ </span>getframe -o test.raw <span class="go">Using config: /usr/share/megapixels/config/purism,librem5.conf</span> <span class="go">received frame</span> <span class="go">received frame</span> <span class="go">received frame</span> <span class="go">received frame</span> <span class="go">received frame</span> <span class="go">Stored frame to: test.raw</span> <span class="go">Format: 4208x3120</span> <span class="go">Pixfmt: GRBG</span> <span class="gp">$ </span>makedng -w <span class="m">4208</span> -h <span class="m">3120</span> -p GRBG test.raw test.dng <span class="go">Reading test.raw...</span> <span class="go">Writing test.dng...</span> </pre></div> <p>No errors, and the file passes the DNG validation; let's load it into RawTherapee :)</p> <figure class="kg-card kg-image-card kg-width-wide"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1700184535/image.png" class="kg-image"><figcaption>The first frame loaded into 
RawTherapee</figcaption></figure> <p>I had to boost the exposure a bit since the <code>megapixels-getframe</code> tool does not actually control any of the sensor parameters like the exposure time, so the resulting picture is very dark. There's also no white balance or autofocus happening, so the colors look horrible. </p> <p>But... </p> <figure class="kg-card kg-image-card kg-width-wide"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1700184873/compare-checker.jpg" class="kg-image"></figure> <p>The colors are correct! The interpretation of the CFA pattern of the sensor and the orientation of the data is all correct.</p> <h2>Integration testing</h2> <p>The nice thing about having the separate library is that testing it becomes a lot easier than testing a GTK4 application. I have added the first simple end-to-end test to the codebase now that feeds some data to makedng and checks if the result is a valid DNG file using the official Adobe tool.</p> <div class="highlight"><pre><span></span><span class="ch">#!/bin/bash</span> <span class="nb">set</span> -e <span class="k">if</span> <span class="o">[</span> <span class="nv">$#</span> -ne <span class="m">1</span> <span class="o">]</span><span class="p">;</span> <span class="k">then</span> <span class="nb">echo</span> <span class="s2">&quot;Missing tool argument&quot;</span> <span class="nb">exit</span> <span class="m">1</span> <span class="k">fi</span> <span class="nv">makedng</span><span class="o">=</span><span class="s2">&quot;</span><span class="nv">$1</span><span class="s2">&quot;</span> <span class="nb">echo</span> <span class="s2">&quot;Running tests with &#39;</span><span class="nv">$makedng</span><span class="s2">&#39;&quot;</span> <span class="c1"># This testsuite runs raw data through the makedng utility and validates the</span> <span class="c1"># result using the dng_validate tool from the Adobe DNG SDK. 
This tool needs</span> <span class="c1"># to be manually installed for these tests to run.</span> <span class="c1"># Create test raw data</span> mkdir -p scratch magick -size 1280x720 gradient: -colorspace RGB scratch/data.rgb <span class="c1"># Generate DNG</span> <span class="nv">$makedng</span> -w <span class="m">1280</span> -h <span class="m">720</span> -p RG10 scratch/data.rgb scratch/RG10.dng <span class="c1"># Validate DNG</span> dng_validate scratch/RG10.dng </pre></div> <p>This is launched from ctest in my cmake files for now, since I'm developing most of this stuff using CLion, which only properly supports cmake projects. This is why a lot of my C projects have both meson and cmake files to build them but only the meson project file has install commands in it.</p> <p>For more advanced testing it would be neat to have raw sensor dumps of several sensors in different formats which are all pictures of a colorchecker like the picture above, and then have some (probably opencv) utility that can validate that a colorchecker is present in the picture with the right colors.</p> <p>There also needs to be a non-Adobe-proprietary validation tool that can easily be run as a testsuite for distribution packaging, so at build time it's possible to validate that the combination of libdng and the distribution version of libtiff produces sane output. This has caused several issues in Megapixels before, after all.</p> <h2>Overall architecture</h2> <figure class="kg-card kg-image-card"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1700232871/path4862-1-4.png" class="kg-image"><figcaption>I&#x27;ve spent too much time drawing this</figcaption></figure> <p>With the addition of libdng the architecture for Megapixels 2.0 starts to look like this.
Megapixels no longer has any pipeline manipulation code; that is all handled by the library, which after configuration just passes the file descriptor for the sensor node to Megapixels to handle the realtime control of the sensor parameters.</p> <p>The libdng code replaces the plain libtiff exporting done in Megapixels and generates the DNG files that will be read by postprocessd. Postprocessd reads the DNG files with the help of the dcraw library, which already has custom DNG reading code that does not use libtiff.</p> <p>The next step now is to flesh out the public interface of libdng so it can write all the DNG metadata that Megapixels requires, and then hook it up to Megapixels to actually use it.</p> <hr> <h3>Funding update</h3> <p>Since my <a href="https://blog.brixit.nl/adding-hardware-to-libmegapixels/">previous post</a> about the libmegapixels developments and the <a href="https://blog.brixit.nl/megapixels-2-0/">Megapixels 2.0 post</a> I wrote before that, I've almost doubled the funding for actually working on all the FOSS contributions. I'm immensely thankful for all the new patrons, and it also made me notice that the <a href="https://blog.brixit.nl/donations/">donations</a> page on this site was no longer being regenerated. That is fixed now.</p> <p>I'm also still trying to figure out if I can add some perks for patrons to all of this, but practically all options just amount to making things slightly worse for non-patrons. I hope just making the FOSS ecosystem better one line of code at a time is enough :)</p> <h1>Megapixels 2.0</h1> <p>By Martijn Braam, Thu, 09 Nov 2023 (https://blog.brixit.nl/megapixels-2-0/)</p> <p>The Megapixels camera application has long been the most performant camera application on the original PinePhone. I have not gotten the Megapixels application to that point alone: there have been several other contributors that have helped slowly improve the performance and features of this application. 
Benjamin especially has leaped it forward massively with the threaded processing code and GPU accelerated preview.</p> <p>All this code has made Megapixels very fast on the PinePhone but also has made it quite a lot harder to port the application to other hardware. The code is very much overfitted for the PinePhone hardware.</p> <h2>Finding a better design</h2> <p>To address the elephant in the room, yes libcamera exists and promises to abstract this all away. I just disagree with the design tradeoffs taken with libcamera and I think that any competition would only improve the ecosystem. It can't be that libcamera got this exactly right on the first try, right?</p> <p>Instead of libcamera's approach of writing abstraction code in C++ for every platform, I have decided to pick the method that libalsa uses for the audio abstraction in userspace.</p> <p>Alsa UCM config files are selected by soundcard name and contain a set of instructions to bring the audio pipeline into the correct state for your current use case. All the hardware-specific things are not described in code but instead in plain text configuration files. I think this scales way better since it massively lowers the skill floor needed to actually mess with the system to get hardware working.</p> <p>The first iteration of Megapixels has already somewhat done this. There's a config file, picked based on the hardware model, that describes the names of the device nodes in /dev so those paths don't have to be hardcoded, and it describes the resolution and mode to configure.
It also describes a few details about the optical path to later produce correct EXIF info for the pictures.</p> <div class="highlight"><pre><span></span><span class="k">[device]</span><span class="w"></span> <span class="na">make</span><span class="o">=</span><span class="s">PINE64</span><span class="w"></span> <span class="na">model</span><span class="o">=</span><span class="s">PinePhone</span><span class="w"></span> <span class="k">[rear]</span><span class="w"></span> <span class="na">driver</span><span class="o">=</span><span class="s">ov5640</span><span class="w"></span> <span class="na">media-driver</span><span class="o">=</span><span class="s">sun6i-csi</span><span class="w"></span> <span class="na">capture-width</span><span class="o">=</span><span class="s">2592</span><span class="w"></span> <span class="na">capture-height</span><span class="o">=</span><span class="s">1944</span><span class="w"></span> <span class="na">capture-rate</span><span class="o">=</span><span class="s">15</span><span class="w"></span> <span class="na">capture-fmt</span><span class="o">=</span><span class="s">BGGR8</span><span class="w"></span> <span class="na">preview-width</span><span class="o">=</span><span class="s">1280</span><span class="w"></span> <span class="na">preview-height</span><span class="o">=</span><span class="s">720</span><span class="w"></span> <span class="na">preview-rate</span><span class="o">=</span><span class="s">30</span><span class="w"></span> <span class="na">preview-fmt</span><span class="o">=</span><span class="s">BGGR8</span><span class="w"></span> <span class="na">rotate</span><span class="o">=</span><span class="s">270</span><span class="w"></span> <span class="na">colormatrix</span><span class="o">=</span><span class="s">1.384,-0.3203,-0.0124,-0.2728,1.049,0.1556,-0.0506,0.2577,0.8050</span><span class="w"></span> <span class="na">forwardmatrix</span><span class="o">=</span><span 
class="s">0.7331,0.1294,0.1018,0.3039,0.6698,0.0263,0.0002,0.0556,0.7693</span><span class="w"></span> <span class="na">blacklevel</span><span class="o">=</span><span class="s">3</span><span class="w"></span> <span class="na">whitelevel</span><span class="o">=</span><span class="s">255</span><span class="w"></span> <span class="na">focallength</span><span class="o">=</span><span class="s">3.33</span><span class="w"></span> <span class="na">cropfactor</span><span class="o">=</span><span class="s">10.81</span><span class="w"></span> <span class="na">fnumber</span><span class="o">=</span><span class="s">3.0</span><span class="w"></span> <span class="na">iso-min</span><span class="o">=</span><span class="s">100</span><span class="w"></span> <span class="na">iso-max</span><span class="o">=</span><span class="s">64000</span><span class="w"></span> <span class="na">flash-path</span><span class="o">=</span><span class="s">/sys/class/leds/white:flash</span><span class="w"></span> <span class="k">[front]</span><span class="w"></span> <span class="na">...</span><span class="w"></span> </pre></div> <p>This works great for the PinePhone but it has a significant drawback. Most mobile cameras require an elaborate graph of media nodes to be configured before video works; the PinePhone is the exception in that its media graph only has an input and an output node, so Megapixels just hardcodes that part of the hardware setup. This makes the config file practically useless for all other phones, and this is also one of the reasons why different devices have different forks to make Megapixels work.</p> <p>So a config file that only works for a single configuration is pretty useless. Instead of making this an .ini file I've switched the design over to libconfig so I don't have to create a whole new parser and it allows for nested configuration blocks.
The config file I have been using on the PinePhone with the new codebase is this:</p> <div class="highlight"><pre><span></span><span class="k">Version</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="m">1</span><span class="err">;</span><span class="w"></span> <span class="k">Make</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;PINE64&quot;</span><span class="err">;</span><span class="w"></span> <span class="k">Model</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;PinePhone&quot;</span><span class="err">;</span><span class="w"></span> <span class="k">Rear</span><span class="err">:</span><span class="w"> </span><span class="p">{</span><span class="w"></span> <span class="w"> </span><span class="k">SensorDriver</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;ov5640&quot;</span><span class="err">;</span><span class="w"></span> <span class="w"> </span><span class="k">BridgeDriver</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;sun6i-csi&quot;</span><span class="err">;</span><span class="w"></span> <span class="w"> </span><span class="k">FlashPath</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;/sys/class/leds/white:flash&quot;</span><span class="err">;</span><span class="w"></span> <span class="w"> </span><span class="k">IsoMin</span><span class="err">:</span><span class="w"> </span><span class="m">100</span><span class="err">;</span><span class="w"></span> <span class="w"> </span><span class="k">IsoMax</span><span class="err">:</span><span class="w"> </span><span class="m">64000</span><span class="err">;</span><span class="w"></span> <span class="w"> </span><span class="k">Modes</span><span class="err">:</span><span class="w"> </span><span class="p">(</span><span class="w"></span> <span class="w"> </span><span class="p">{</span><span class="w"></span> <span class="w"> 
</span><span class="k">Width</span><span class="err">:</span><span class="w"> </span><span class="m">2592</span><span class="err">;</span><span class="w"></span> <span class="w"> </span><span class="k">Height</span><span class="err">:</span><span class="w"> </span><span class="m">1944</span><span class="err">;</span><span class="w"></span> <span class="w"> </span><span class="k">Rate</span><span class="err">:</span><span class="w"> </span><span class="m">15</span><span class="err">;</span><span class="w"></span> <span class="w"> </span><span class="k">Format</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;BGGR8&quot;</span><span class="err">;</span><span class="w"></span> <span class="w"> </span><span class="k">Rotate</span><span class="err">:</span><span class="w"> </span><span class="m">270</span><span class="err">;</span><span class="w"></span> <span class="w"> </span><span class="k">FocalLength</span><span class="err">:</span><span class="w"> </span><span class="m">3</span><span class="k">.</span><span class="m">33</span><span class="err">;</span><span class="w"></span> <span class="w"> </span><span class="k">FNumber</span><span class="err">:</span><span class="w"> </span><span class="m">3</span><span class="k">.</span><span class="m">0</span><span class="err">;</span><span class="w"></span> <span class="w"> </span><span class="k">Pipeline</span><span class="err">:</span><span class="w"> </span><span class="p">(</span><span class="w"></span> <span class="w"> </span><span class="p">{</span><span class="k">Type</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;Link&quot;</span><span class="p">,</span><span class="w"> </span><span class="k">From</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;ov5640&quot;</span><span class="p">,</span><span class="w"> </span><span class="k">FromPad</span><span class="err">:</span><span class="w"> </span><span class="m">0</span><span 
class="p">,</span><span class="w"> </span><span class="k">To</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;sun6i-csi&quot;</span><span class="p">,</span><span class="w"> </span><span class="k">ToPad</span><span class="err">:</span><span class="w"> </span><span class="m">0</span><span class="p">},</span><span class="w"></span> <span class="w"> </span><span class="p">{</span><span class="k">Type</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;Mode&quot;</span><span class="p">,</span><span class="w"> </span><span class="k">Entity</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;ov5640&quot;</span><span class="p">,</span><span class="w"> </span><span class="k">Width</span><span class="err">:</span><span class="w"> </span><span class="m">2592</span><span class="p">,</span><span class="w"> </span><span class="k">Height</span><span class="err">:</span><span class="w"> </span><span class="m">1944</span><span class="p">,</span><span class="w"> </span><span class="k">Format</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;BGGR8&quot;</span><span class="p">},</span><span class="w"></span> <span class="w"> </span><span class="p">)</span><span class="err">;</span><span class="w"></span> <span class="w"> </span><span class="p">},</span><span class="w"></span> <span class="w"> </span><span class="p">{</span><span class="w"></span> <span class="w"> </span><span class="k">Width</span><span class="err">:</span><span class="w"> </span><span class="m">1280</span><span class="err">;</span><span class="w"></span> <span class="w"> </span><span class="k">Height</span><span class="err">:</span><span class="w"> </span><span class="m">720</span><span class="err">;</span><span class="w"></span> <span class="w"> </span><span class="k">Rate</span><span class="err">:</span><span class="w"> </span><span class="m">30</span><span class="err">;</span><span class="w"></span> 
<span class="w"> </span><span class="k">Format</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;BGGR8&quot;</span><span class="err">;</span><span class="w"></span> <span class="w"> </span><span class="k">Rotate</span><span class="err">:</span><span class="w"> </span><span class="m">270</span><span class="err">;</span><span class="w"></span> <span class="w"> </span><span class="k">FocalLength</span><span class="err">:</span><span class="w"> </span><span class="m">3</span><span class="k">.</span><span class="m">33</span><span class="err">;</span><span class="w"></span> <span class="w"> </span><span class="k">FNumber</span><span class="err">:</span><span class="w"> </span><span class="m">3</span><span class="k">.</span><span class="m">0</span><span class="err">;</span><span class="w"></span> <span class="w"> </span><span class="k">Pipeline</span><span class="err">:</span><span class="w"> </span><span class="p">(</span><span class="w"></span> <span class="w"> </span><span class="p">{</span><span class="k">Type</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;Link&quot;</span><span class="p">,</span><span class="w"> </span><span class="k">From</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;ov5640&quot;</span><span class="p">,</span><span class="w"> </span><span class="k">FromPad</span><span class="err">:</span><span class="w"> </span><span class="m">0</span><span class="p">,</span><span class="w"> </span><span class="k">To</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;sun6i-csi&quot;</span><span class="p">,</span><span class="w"> </span><span class="k">ToPad</span><span class="err">:</span><span class="w"> </span><span class="m">0</span><span class="p">},</span><span class="w"></span> <span class="w"> </span><span class="p">{</span><span class="k">Type</span><span class="err">:</span><span class="w"> </span><span 
class="s2">&quot;Mode&quot;</span><span class="p">,</span><span class="w"> </span><span class="k">Entity</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;ov5640&quot;</span><span class="p">},</span><span class="w"></span> <span class="w"> </span><span class="p">)</span><span class="err">;</span><span class="w"></span> <span class="w"> </span><span class="p">}</span><span class="w"></span> <span class="w"> </span><span class="p">)</span><span class="err">;</span><span class="w"></span> <span class="p">}</span><span class="err">;</span><span class="w"></span> <span class="k">Front</span><span class="err">:</span><span class="w"> </span><span class="p">{</span><span class="w"></span> <span class="w"> </span><span class="k">SensorDriver</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;gc2145&quot;</span><span class="err">;</span><span class="w"></span> <span class="w"> </span><span class="k">BridgeDriver</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;sun6i-csi&quot;</span><span class="err">;</span><span class="w"></span> <span class="w"> </span><span class="k">FlashDisplay</span><span class="err">:</span><span class="w"> </span><span class="k">true</span><span class="err">;</span><span class="w"></span> <span class="w"> </span><span class="k">Modes</span><span class="err">:</span><span class="w"> </span><span class="p">(</span><span class="w"></span> <span class="w"> </span><span class="p">{</span><span class="w"></span> <span class="w"> </span><span class="k">Width</span><span class="err">:</span><span class="w"> </span><span class="m">1280</span><span class="err">;</span><span class="w"></span> <span class="w"> </span><span class="k">Height</span><span class="err">:</span><span class="w"> </span><span class="m">960</span><span class="err">;</span><span class="w"></span> <span class="w"> </span><span class="k">Rate</span><span class="err">:</span><span class="w"> </span><span 
class="m">60</span><span class="err">;</span><span class="w"></span> <span class="w"> </span><span class="k">Format</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;BGGR8&quot;</span><span class="err">;</span><span class="w"></span> <span class="w"> </span><span class="k">Rotate</span><span class="err">:</span><span class="w"> </span><span class="m">90</span><span class="err">;</span><span class="w"></span> <span class="w"> </span><span class="k">Mirror</span><span class="err">:</span><span class="w"> </span><span class="k">true</span><span class="err">;</span><span class="w"></span> <span class="w"> </span><span class="k">Pipeline</span><span class="err">:</span><span class="w"> </span><span class="p">(</span><span class="w"></span> <span class="w"> </span><span class="p">{</span><span class="k">Type</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;Link&quot;</span><span class="p">,</span><span class="w"> </span><span class="k">From</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;gc2145&quot;</span><span class="p">,</span><span class="w"> </span><span class="k">FromPad</span><span class="err">:</span><span class="w"> </span><span class="m">0</span><span class="p">,</span><span class="w"> </span><span class="k">To</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;sun6i-csi&quot;</span><span class="p">,</span><span class="w"> </span><span class="k">ToPad</span><span class="err">:</span><span class="w"> </span><span class="m">0</span><span class="p">},</span><span class="w"></span> <span class="w"> </span><span class="p">{</span><span class="k">Type</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;Mode&quot;</span><span class="p">,</span><span class="w"> </span><span class="k">Entity</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;gc2145&quot;</span><span class="p">},</span><span 
class="w"></span> <span class="w"> </span><span class="p">)</span><span class="err">;</span><span class="w"></span> <span class="w"> </span><span class="p">}</span><span class="w"></span> <span class="w"> </span><span class="p">)</span><span class="err">;</span><span class="w"></span> </pre></div> <p>Instead of having a hardcoded preview mode and main mode for every sensor it's now possible to define many different resolution configs. This config recreates the 2 existing modes, and Megapixels now picks the faster mode for the preview automatically and uses the higher resolution modes for the actual picture. </p> <p>Every mode now also has a <code>Pipeline</code> block that describes the media graph as a series of commands; every line translates to one ioctl called on the right device node, just like Alsa UCM files describe the audio setup as a series of amixer commands. Megapixels no longer has the implicit PinePhone pipeline, so here it describes the one link it has to make between the sensor node and the csi node and it tells Megapixels to set the correct mode on the sensor node.</p> <p>This simple example of the PinePhone does not really show off most of the config options so let's look at a more complicated example:</p> <div class="highlight"><pre><span></span><span class="k">Pipeline</span><span class="err">:</span><span class="w"> </span><span class="p">(</span><span class="w"></span> <span class="w"> </span><span class="p">{</span><span class="k">Type</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;Link&quot;</span><span class="p">,</span><span class="w"> </span><span class="k">From</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;imx258&quot;</span><span class="p">,</span><span class="w"> </span><span class="k">FromPad</span><span class="err">:</span><span class="w"> </span><span class="m">0</span><span class="p">,</span><span class="w"> </span><span class="k">To</span><span class="err">:</span><span class="w"> </span><span
class="s2">&quot;rkisp1_csi&quot;</span><span class="p">,</span><span class="w"> </span><span class="k">ToPad</span><span class="err">:</span><span class="w"> </span><span class="m">0</span><span class="p">},</span><span class="w"></span> <span class="w"> </span><span class="p">{</span><span class="k">Type</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;Mode&quot;</span><span class="p">,</span><span class="w"> </span><span class="k">Entity</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;imx258&quot;</span><span class="p">,</span><span class="w"> </span><span class="k">Format</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;RGGB10P&quot;</span><span class="p">,</span><span class="w"> </span><span class="k">Width</span><span class="err">:</span><span class="w"> </span><span class="m">1048</span><span class="p">,</span><span class="w"> </span><span class="k">Height</span><span class="err">:</span><span class="w"> </span><span class="m">780</span><span class="p">},</span><span class="w"></span> <span class="w"> </span><span class="p">{</span><span class="k">Type</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;Mode&quot;</span><span class="p">,</span><span class="w"> </span><span class="k">Entity</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;rkisp1_csi&quot;</span><span class="p">},</span><span class="w"></span> <span class="w"> </span><span class="p">{</span><span class="k">Type</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;Mode&quot;</span><span class="p">,</span><span class="w"> </span><span class="k">Entity</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;rkisp1_isp&quot;</span><span class="p">},</span><span class="w"></span> <span class="w"> </span><span class="p">{</span><span class="k">Type</span><span class="err">:</span><span class="w"> 
</span><span class="s2">&quot;Mode&quot;</span><span class="p">,</span><span class="w"> </span><span class="k">Entity</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;rkisp1_isp&quot;</span><span class="p">,</span><span class="w"> </span><span class="k">Pad</span><span class="err">:</span><span class="w"> </span><span class="m">2</span><span class="p">,</span><span class="w"> </span><span class="k">Format</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;RGGB8&quot;</span><span class="p">},</span><span class="w"></span> <span class="w"> </span><span class="p">{</span><span class="k">Type</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;Crop&quot;</span><span class="p">,</span><span class="w"> </span><span class="k">Entity</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;rkisp1_isp&quot;</span><span class="p">},</span><span class="w"></span> <span class="w"> </span><span class="p">{</span><span class="k">Type</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;Crop&quot;</span><span class="p">,</span><span class="w"> </span><span class="k">Entity</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;rkisp1_isp&quot;</span><span class="p">,</span><span class="w"> </span><span class="k">Pad</span><span class="err">:</span><span class="w"> </span><span class="m">2</span><span class="p">},</span><span class="w"></span> <span class="w"> </span><span class="p">{</span><span class="k">Type</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;Mode&quot;</span><span class="p">,</span><span class="w"> </span><span class="k">Entity</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;rkisp1_resizer_mainpath&quot;</span><span class="p">},</span><span class="w"></span> <span class="w"> </span><span class="p">{</span><span class="k">Type</span><span 
class="err">:</span><span class="w"> </span><span class="s2">&quot;Mode&quot;</span><span class="p">,</span><span class="w"> </span><span class="k">Entity</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;rkisp1_resizer_mainpath&quot;</span><span class="p">,</span><span class="w"> </span><span class="k">Pad</span><span class="err">:</span><span class="w"> </span><span class="m">1</span><span class="p">},</span><span class="w"></span> <span class="w"> </span><span class="p">{</span><span class="k">Type</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;Crop&quot;</span><span class="p">,</span><span class="w"> </span><span class="k">Entity</span><span class="err">:</span><span class="w"> </span><span class="s2">&quot;rkisp1_resizer_mainpath&quot;</span><span class="p">,</span><span class="w"> </span><span class="k">Width</span><span class="err">:</span><span class="w"> </span><span class="m">1048</span><span class="p">,</span><span class="w"> </span><span class="k">Height</span><span class="err">:</span><span class="w"> </span><span class="m">768</span><span class="p">},</span><span class="w"></span> <span class="p">)</span><span class="err">;</span><span class="w"></span> </pre></div> <p>This is the preview pipeline for the PinePhone Pro. Most of the Links are already hardcoded by the kernel itself so here it only creates the link from the rear camera sensor to the csi and all the other commands are for configuring the various entities in the graph.</p> <p>The <code>Mode</code> commands are basically doing the <code>VIDIOC_SUBDEV_S_FMT</code> ioctl on the device node found by the entity name. To make configuring modes on the pipeline not extremely verbose it implicitly takes the resolution, pixelformat and framerate from the main information set by the configuration block itself. 
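</p> <p>As an illustration of the scheme described above, the translation could be modeled roughly like this. This is a hypothetical Python sketch, not the actual libmegapixels code: the command and key names follow the config format shown here, but the function and its output shape are made up for illustration, and the real implementation issues the ioctls directly instead of planning them.</p>

```python
# Hypothetical model of pipeline execution, not the libmegapixels code.
# Each config command becomes one planned ioctl on an entity's device node.

def plan_ioctls(mode, pipeline):
    """Return a list of (entity, ioctl, args) tuples for a pipeline.

    `mode` holds the implicit Width/Height/Format from the config block;
    an explicit Format/Width/Height on a Mode command overrides these and
    cascades to every command after it.
    """
    current = dict(mode)
    plan = []
    for cmd in pipeline:
        if cmd["Type"] == "Link":
            plan.append((cmd["From"], "MEDIA_IOC_SETUP_LINK",
                         {"To": cmd["To"],
                          "FromPad": cmd.get("FromPad", 0),
                          "ToPad": cmd.get("ToPad", 0)}))
        elif cmd["Type"] == "Mode":
            for key in ("Width", "Height", "Format"):
                if key in cmd:
                    current[key] = cmd[key]  # cascades to later commands
            plan.append((cmd["Entity"], "VIDIOC_SUBDEV_S_FMT",
                         {"Pad": cmd.get("Pad", 0), **current}))
        elif cmd["Type"] == "Crop":
            plan.append((cmd["Entity"], "VIDIOC_SUBDEV_S_SELECTION",
                         {"Pad": cmd.get("Pad", 0),
                          "Width": cmd.get("Width", current["Width"]),
                          "Height": cmd.get("Height", current["Height"])}))
    return plan
```

<p>Feeding the PinePhone Pro preview pipeline above through such a model would plan the <code>rkisp1_csi</code> mode in <code>RGGB10P</code> but the <code>rkisp1_resizer_mainpath</code> modes in <code>RGGB8</code>.</p> <p>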
Since several entities can convert the frames into another format or size, it automatically cascades the new mode to the lines below it.</p> <p>In the example above the 5th command sets the format to <code>RGGB8</code>, which means that the mode commands below it for <code>rkisp1_resizer_mainpath</code> will also use this mode, but the <code>rkisp1_csi</code> mode command above it will still be operating in <code>RGGB10P</code> mode.</p> <h2>Splitting off the device management code</h2> <p>Testing changes in Megapixels is pretty hard. To develop the Megapixels code I'm building it on the phone and launching it over SSH with a bunch of environment variables set so the GTK window shows up on the phone and I get realtime logs on my computer. If there's anything going on after the immediate setup code it is quite hard to debug because it happens in one of the three threads that process the image data.</p> <p>To implement the new pipeline configuration I started a new empty project that builds a shared library and a few command line utilities that help test a few specific things. This codebase is <code>libmegapixels</code> and with it I have split off all hardware access from Megapixels itself, making both these codebases a lot easier to understand.</p> <p>It has been a lot easier to debug complex camera pipelines using the commandline utilities and only working on the library code. It should also make it a lot easier to build Megapixels-like applications that are not based on GTK4, so the camera stack can integrate better with other environments.
One of the test applications for libmegapixels is <code>getframe</code>, which is now all you need to get a raw frame from the sensor.</p> <p>Since this codebase is now split into multiple parts I have put it into a separate GitLab organisation at <a href="https://gitlab.com/megapixels-org">https://gitlab.com/megapixels-org</a>, which hopefully keeps this a bit organized.</p> <p>This is also the codebase used for <a href="https://fosstodon.org/@martijnbraam/110775163438234897">https://fosstodon.org/@martijnbraam/110775163438234897</a> which shows off libmegapixels and Megapixels 2.0 running on the Librem 5.</p> <h2>Burnout</h2> <p>So now the worst part of this blog post. No, you can't use this stuff yet :(</p> <p>I've been working on this code for months, and now I've not been working on this code for months. I have completely burned out on all of this.</p> <p>The libmegapixels code is in a pretty good state but the Megapixels rewrite is still a large mess:</p> <ul><li>Saving pictures doesn&#x27;t really work and I intended to split that off to create libdng</li> <li>The QR code support is not hooked up at all at the moment</li> <li>Several pixelformats don&#x27;t work correctly in the GPU decoder and I can&#x27;t figure out why</li> <li>The Librem 5 and PinePhone Pro really need auto-exposure, auto-focus and auto-whitebalance to produce anything remotely looking like a picture. I have ported the auto-exposure from Millipixels, which works reasonably well for this, but got stuck on AWB and have not attempted autofocus yet.</li> </ul> <p>The mountain of work that's left to do to make this a superset of the functionality of Megapixels 1.x, and the expectations surrounding it, have made this pretty hard to work on.
On the original Megapixels releases nothing mattered because any application that could show a single frame of the camera was already a 100% improvement over the current state.</p> <p>Another issue is that whatever I do or figure out, it will always instantly be put down with "Why are you not using libcamera" and "libcamera probably fixes this". </p> <p>Something people really need to understand is that an application not using libcamera does <i>not</i> mean other software on the system can't support libcamera. If Firefox can use libcamera to do videocalls that's great; that's not the use case Megapixels is going for anyway.</p> <p>What also doesn't help is receiving bug reports for the PinePhone Pro while Megapixels does not support the PinePhone Pro. There's a patchset added on top to make it launch on the PinePhone Pro, but there's a reason this patchset is not in Megapixels. The product of the Megapixels source with the ppp.patch added on top probably shouldn't have been distributed as Megapixels...</p> <p>What also doesn't help is that if Megapixels 2.0 were finished and released it would also create a whole new wave of criticism and comparisons to libcamera. I would have to support Megapixels for the people complaining that it's not enough... You could've had no camera application at all...</p> <p>It also doesn't help that the libcamera developers are also the v4l2 subsystem maintainers in the kernel. During the development of libmegapixels I tried sending a simple patch for an issue I noticed that would massively improve the ease of debugging PinePhone Pro cameras. I've sent this 3 character patch upstream to the v4l2 mailing lists and it got a Reviewed-by in a few days.</p> <p>Then after 2 whole months of radio silence it got rejected by the lead developer of libcamera on debatable grounds. Now this is only a very small patch so I'm merely disappointed.
If I had put more work into the kernel side improving some sensor drivers I might have been mad, but at this point I'm just not feeling like contributing to the camera ecosystem anymore. </p> <hr> <p><b>Edit:</b> I've been convinced to actually try to do this full-time and push the codebase forward enough to make it usable. This is continued at <a href="https://blog.brixit.nl/adding-hardware-to-libmegapixels/">https://blog.brixit.nl/adding-hardware-to-libmegapixels/</a></p> NitroKey disappoints mehttps://blog.brixit.nl/nitrokey-dissapoints-me/72PhonesMartijn BraamTue, 25 Apr 2023 18:39:47 -0000<p>There's an article making the rounds from NitroKey named "<a href="https://www.nitrokey.com/news/2023/smartphones-popular-qualcomm-chip-secretly-share-private-information-us-chip-maker">Smartphones With Popular Qualcomm Chip Secretly Share Private Information With US Chip-Maker</a>".</p> <p>This article is a marketing piece for selling their rebadged Pixel phones by picking a random phone and pointing at network traffic. It takes a look at a Sony Xperia XA2 and for some reason calls out Fairphone in particular.</p> <p>The brand of the device should not really matter if this is a chipset issue as the article claims, but it goes even further than just calling out other brands: it also uses a custom ROM to check these things instead of the software supplied by those brands.</p> <p>The second thing the article does is point out that WiFi geolocation exists and is done by Google and Apple, by showing a screenshot from the WiGLE service that has nothing to do with that. Phones use cell network, WiFi and network geolocation to get a rough location of a device, not for evil but for saving power. This prevents the need to run the GPS receiver 24/7 since most features don't need an exact location. There are no claims made by NitroKey that their phone doesn't provide any of this.</p> <p>After this we get to the main claim in the title of the article.
The Qualcomm 630 chipset supposedly shares private information with its manufacturer. The author of the article has found that the device connects to izatcloud.net and instead of doing the logical thing and opening <a href="http://izatcloud.net/">izatcloud.net</a> in a browser they do a whois request and then figure out it's from Qualcomm. They then proceed to contact Qualcomm's lawyers instead of following the link on the page. The webpage hosted on this domain does conveniently explain who owns the domain, what its purpose is, and its associated privacy policy. But that doesn't sound nearly as spooky.</p> <p>The next section makes the claim that this traffic is HTTP traffic and is not encrypted. It proceeds to not show the contents of this HTTP request because it would show that it's not at all interesting. It does not contain any private data. It's just downloading a GPS almanac from Qualcomm for A-GPS.</p> <p>The A-GPS data is only there to make getting a GPS fix quicker and more reliable. GPS signals are pretty weak and getting a lock indoors from a cold start (the device has been off for some time) is hard. Inside the GPS signal sent by the satellites there's occasional almanac data that compensates for things like atmospheric distortions; without the almanac your GPS position wouldn't even get within a few hundred meters of your actual position. Since this signal is only occasionally broadcast and you need to listen to a GPS satellite for an extended amount of time (the broadcast takes around 10 minutes) it's easier for these modern devices to just fetch this information from the internet. Qualcomm provides this as a static file for their modems.</p> <p>This feature isn't even only in the Qualcomm 630 chipset, it's in practically all Qualcomm devices. Some third-party Android roms go as far as to obscure the IP address of your phone by proxying this HTTP request through another server. 
The rom they have tested obviously didn't.</p> <p>This feature is not even limited to Qualcomm devices; this practice happens in practically all devices that have both GPS and internet, because people don't like waiting very long for their position when launching their navigation software. The NitroPhone has its GPS provided by Broadcom chips instead of Qualcomm ones, so obviously it won't make the same HTTP requests; that doesn't make it any more or less secure though.</p> <p>Now the main issue: is this personal data? The thing that gets leaked is your IP address, which is required because that's how you connect to things on the internet. This system does not actually send any of your private information like the title of the article claims. </p> <h2>I'm disappointed</h2> <p>The reason for articles like this is pretty obvious. They want to sell more of their phones for a massive profit margin. The sad thing about making these "Oh no all your data is leaking!!!" articles is that when there are actual leaks they won't stand out between all the marketing bullshit. The painful part is that it's actually working. See the outrage about companies not having ethics and not following laws.</p> <p>This feature is not breaking laws, it's not unethical, it's not even made for eeeevill.</p> Mobile Linux camera pt6https://blog.brixit.nl/mobile-linux-camera-pt-6/70PhonesMartijn BraamWed, 08 Mar 2023 15:57:25 -0000<p>The processing with postprocessd has been working pretty well for me on the PinePhone. After I released it I had someone test it with the .dng files from a Librem 5 to see how it deals with a completely different input.</p> <p>To my surprise the answer was: not well. With the same postprocessing for the PinePhone and the Librem 5, the Librem 5 pictures turn out way too dark and contrasty. The postprocessd code is supposed to be generic and has no PinePhone-specific code in it.</p> <p>Fast forward some time: I now have a Librem 5, so I can do more camera development. 
The first thing to do is the sensor calibration process I did with the PinePhone in <a href="https://blog.brixit.nl/pinephone-camera-pt4/">part 4</a> of this blog series. This involves taking some pictures of a proper calibration target, which in my case is an X-Rite ColorChecker Passport, and feeding those into calibration software.</p> <p>Because aligning color charts and getting all the file format conversions right with the DCamProf calibration suite from RawTherapee is quite annoying, I got the paid graphical utility from the developers. By analyzing the pictures the software will generate a lot of calibration data. From that, currently only a small part is used by Megapixels: the ColorMatrix and the ForwardMatrix.</p> <figure class="kg-card kg-image-card"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1678280014/image.png" class="kg-image"><figcaption>Calibration output snippet</figcaption></figure> <p>These are 3x3 matrices that do the colorspace conversion for the sensor. I originally just added these two to Megapixels because they have the fewest values, so they fit in the camera config file, and they have a reasonable impact on image quality.</p> <p>The file contains two more important things though: the ToneCurve, which converts the brightness data from the sensor to linear space, and the HueSatMap, which contains three correction curves in a 3-dimensional space of hue, saturation and brightness; this is obviously by far the most data.</p> <h2>What is a raw photo?</h2> <p>The whole purpose of Megapixels and postprocessd is to take the raw sensor data and postprocess that with a lot of CPU power after taking the picture to produce the best picture possible. The processing is built on top of existing open source photo processing libraries like libraw.</p> <p>The expectation this software has for "raw" image data is that it's high-bit-depth linear-light sensor data that has not been debayered yet. 
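</p> <p>As an aside, applying these matrices is just a per-pixel matrix multiplication. Here is a minimal numpy sketch with made-up matrix values (the real numbers come from the calibration output); per the DNG specification the ForwardMatrix maps white-balanced camera RGB to XYZ with a D50 white point:</p> <pre><code>import numpy as np

# Made-up 3x3 ForwardMatrix for illustration; real values come from
# the calibration software. It maps camera RGB to XYZ (D50).
forward_matrix = np.array([
    [0.6328, 0.2395, 0.0919],
    [0.2395, 0.7304, 0.0301],
    [0.0077, 0.0017, 0.8157],
])

# Two linear camera-RGB pixels as rows, values in 0..1.
pixels = np.array([
    [0.8, 0.1, 0.1],
    [0.2, 0.9, 0.3],
])

# xyz[i] = forward_matrix @ pixels[i], done for all pixels at once.
xyz = pixels @ forward_matrix.T
print(xyz.shape)  # (2, 3)</code></pre> <p>Megapixels itself only stores the matrices in the DNG; the raw processor applies them later in the pipeline.</p> <p>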
The data from the Librem 5 is exactly this; the PinePhone sensor data is weirder.</p> <p>Unlike most phones, where the camera is connected over MIPI CSI (a nice high-speed serial connection to push image data), the PinePhone is connected over a parallel bus. </p> <figure class="kg-card kg-image-card"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1678281697/image.png" class="kg-image"><figcaption>Rear camera connection from the PinePhone 1.2 schematic</figcaption></figure> <p>This parallel bus provides hsync/vsync/clock and 8 data lines for the image data. The ov5640 sensor itself has a 10-bit interface though:</p> <figure class="kg-card kg-image-card"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1678281824/image.png" class="kg-image"><figcaption>D[9:0] are the 10 image data lines from the sensor</figcaption></figure> <p>Since only 8 of the 10 lines are available in the flatflex from the sensor module that has the ov5640 in it, the camera has to be configured to output 8-bit data. I made the assumption that the sensor just truncates two bits from the image data, but from the big difference in the brightness response I have the suspicion that the image data is no longer linear in this case. It might actually be outputting an image that's not debayered but <i>does</i> have an sRGB gamma curve.</p> <p>This is not really a case that raw image libraries deal with, and it would not traditionally be labelled "raw sensor data". But it's what we have. Instead of making assumptions again, let's just look at the data.</p> <p>I have pictures of the colorchecker for both cameras, and the colorchecker contains a strip of grayscale patches. With this it's possible to make a very rough estimation of the gamma curve of the picture. I cropped out that strip of patches from both calibration pictures and put them in the same image but with different colors. 
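</p> <p>A rough way to quantify such a transfer curve is to fit a single exponent against the known reflectances of the neutral patches. The following is a hedged sketch with illustrative numbers: both the reference reflectances and the "measured" values are made up, and the measured data is simulated as if the sensor applied an sRGB-like gamma instead of staying linear.</p> <pre><code>import numpy as np

# Approximate linear reflectances of six neutral ColorChecker patches
# (illustrative values, not actual measurements).
reference = np.array([0.031, 0.090, 0.198, 0.362, 0.591, 0.950])

# Hypothetical measured patch means: simulated here as if the sensor
# applied an sRGB-like gamma of 1/2.2 instead of staying linear.
measured = reference ** (1 / 2.2)

# Fit measured = reference ** g in log-log space; g close to 1 would
# mean the sensor response is linear.
g = np.polyfit(np.log(reference), np.log(measured), 1)[0]
print(f"fitted exponent: {g:.3f}")  # fitted exponent: 0.455</code></pre> <p>Running something like this on the real patch means from both phones puts a number on how far each sensor deviates from linear.</p> <p>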
I also made sure to rescale the data to hit 0% and 100% with the darkest and brightest patch.</p> <figure class="kg-card kg-image-card kg-width-wide"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1678284482/image.png" class="kg-image"><figcaption>Waveform for the neutral patches, green is the PinePhone and pink is the Librem 5</figcaption></figure> <p>The result clearly shows that the data from the PinePhone is not linear. It also shows that the Librem 5 data is not linear either, but deviates in the opposite direction.</p> <p>These issues can be fixed though with the tone curve calibration that's missing from the current Megapixels pictures.</p> <h2>postprocessd is not generic after all</h2> <p>So what happened is that I saw the output of postprocessd while developing it and saw that my resulting pictures were way too bright. I thought I must've had a gamma issue and added a gamma correction to the code.</p> <figure class="kg-card kg-image-card"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1678285132/image.png" class="kg-image"></figure> <p>With this code added it looks way better for the PinePhone, but way worse for the Librem 5. This is all a side effect of developing it with the input of only one camera. The correct solution is to not have this gamma correction and have the libraw step before it correct the raw data according to the tone curve that's stored in the file.</p> <h2>Storing more metadata</h2> <p>The issue with adding more calibration metadata to the files is that it doesn't really fit in the camera ini file. I have debated just adding a quick hack to it and making a setting that generates a specific gamma curve to add as the tone curve. This would fix my current issue, but to fix it once and for all it's way better to include <i>all</i> the curves generated by the calibration software.</p> <p>So what is the output of this software? 
Lumariver Profiler outputs .dcp files, which are "Adobe Digital Negative Camera Profile" files. I have used the profile inspection output that turns this binary file into readable JSON and extracted the matrices before. It would be way easier to just include the .dcp file alongside the camera configuration files to store the calibration data.</p> <p>I have not been able to find any official file format specification for the DCP format, but I saw something very familiar when throwing the file in a hex editor... The file starts with <code>II</code>. This is the byte order mark for a TIFF file. The version field directly after it is not 42 (0x2A) though, which makes this an invalid TIFF file. It turns out that a DCP file is just a TIFF file with a modified header that does not have any image data in it. This makes the Megapixels implementation pretty easy: read the TIFF tags from the DCP and save them in the DNG (which is also TIFF).</p> <p>In practice this was not that easy, mainly because I'm using libtiff and DCP is <i>almost</i> a TIFF file. Using libtiff for DNG files works pretty well since DNG is a superset of the TIFF specification. The only thing I have to do is add a few unknown TIFF tags to the libtiff library at runtime to use it. DCP is a subset of the TIFF specification instead, and it is missing some of the tags that are required by the TIFF specification. There's also no way in libtiff to ignore the invalid version number in the header.</p> <p>So I wrote my own TIFF parser for this. TIFF parsers are quite hard to write since there's an enormous number of possibilities to store things in TIFF files. Since DCP is a smaller subset of TIFF, it's quite reasonable to parse it manually instead. A parser for the DCP metadata is around 160 lines of plain C, so that is now embedded in Megapixels. The code searches for a .dcp file associated with a specific sensor and then embeds the calibration data into the generated DNG files. 
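</p> <p>The header trick is easy to verify yourself. Below is a small Python sketch (standard library only, and not the actual Megapixels parser, which is plain C) that reads the TIFF-style header and the entry count of the first IFD, assuming the little-endian <code>II</code> byte order:</p> <pre><code>def read_dcp_header(path):
    """Read the TIFF-style header of a DCP file (assumes "II" byte order)."""
    with open(path, "rb") as f:
        byte_order = f.read(2)                            # b"II"
        version = int.from_bytes(f.read(2), "little")     # 42 for a real TIFF
        ifd_offset = int.from_bytes(f.read(4), "little")  # where the first IFD starts
        f.seek(ifd_offset)
        tag_count = int.from_bytes(f.read(2), "little")   # number of IFD entries
    return byte_order, version, tag_count</code></pre> <p>A DCP file reports a version other than 42 here, which is why strict parsers refuse it, while the IFD structure after the header is still perfectly readable.</p> <p>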
If the matrices are also defined in the camera ini files then those are overwritten by the ones from the DCP file.</p> <h2>Results</h2> <p>The new calibration work is now in <a href="https://gitlab.com/postmarketOS/megapixels/-/merge_requests/30">megapixels#30</a> and needs to go through the testing and release process. There's also an upcoming release of postprocessd that removes the gamma correction.</p> <p>For the Librem 5 there's <a href="https://source.puri.sm/Librem5/millipixels/-/merge_requests/88">millipixels#88</a>, which adds correct color matrices for now until it gets the DCP code added. </p> Why I left PINE64https://blog.brixit.nl/why-i-left-pine64/62f92ad587c35a5ee6af37d0PhonesMartijn BraamWed, 17 Aug 2022 09:47:04 -0000<p>Linux hardware projects are made or broken by their community support. PINE64 has made some brilliant moves to build up a mobile Linux community, and has also made some major mistakes. This is my view on how PINE64 made the PinePhone a success, and then broke that again through their treatment of the community.</p> <p>I want to start by pointing out that this is <i>me</i> leaving PINE64 and not the projects I'm involved in like postmarketOS. These opinions are my own yadda yadda...</p> <h2>Community Editions and the PinePhone's early life</h2> <p>The original PinePhone was brought up on the existing Linux Mobile projects like Ubuntu Touch, postmarketOS, and Maemo Leste, and also spawned new Linux distributions like Mobian and Danctnix ARM. This grew until there were 25 different projects working on the PinePhone — an apparently thriving community.</p> <p>Following the initial set of Developer Editions, intended for community hackers to build the software with, the first consumer-targeted PinePhone devices were the Community Editions. A batch of PinePhones was produced with a special branded back cover for five community projects: UBports, postmarketOS, Mobian, Manjaro, and KDE Plasma Mobile. 
Every Community Edition phone sold also sent $10 to its respective Linux distribution project.</p> <p>Working together through these Community Editions, Linux distributions built a software ecosystem that works pretty well on the PinePhone.</p> <h2>The end of community editions</h2> <p>In February 2021, PINE64 <a href="https://www.pine64.org/2021/02/02/the-end-of-community-editions/" rel="nofollow noopener">announced the end of the community editions</a>. At this moment, PINE64's focus shifted from supporting a diverse ecosystem of distributions and software projects around the PinePhone to just supporting Manjaro Linux alone.</p> <p>The fact that a useful software ecosystem for the PinePhone exists at all is thanks to the diverse strategy employed by PINE64 in supporting many distributions working together on each of the many facets of the software required. Much of the original hardware bring-up was done by Ubuntu Touch. Mobian developers built the telephony stack via their eg25-manager project. And in my role for the postmarketOS distribution, I developed the camera stack.</p> <p>Manjaro Linux has been largely absent from these efforts. The people working on many of the Linux distributions represented in the community editions tend to work not just on packaging software, but on building software as well. This is not the case for Manjaro, which focuses almost entirely on packaging existing software. Supporting Manjaro has historically done very little to facilitate the development of the software stack which is necessary for these devices to work. In some cases the Manjaro involvement actually causes extra workload for the developers by shipping known broken versions of software and pointing to the developers for support, which is why <a href="https://dont-ship.it/">https://dont-ship.it/</a> was started.</p> <p>Regardless, Manjaro is now the sole project endorsed and financially supported by PINE64, at least for the Linux-capable devices. 
As a consequence it has a disproportionate level of influence in how PINE64 develops its products and manages the ecosystem.</p> <h2>The last straw</h2> <p>With community members' influence in PINE64 diminished in favor of a Manjaro mono-culture, what was once a vibrant ecosystem has been reduced to a bunch of burnt-out and maligned developers abandoning the project. The development channels are no longer the great collaboration between various distributions developing PinePhone components, and there are now only a small number of unpaid developers working on anything important. Many of PINE64's new devices, such as the PinePhone Pro, PineNote, and others, have few to no developers working on the software — a potential death blow for PINE64's model of relying on the community to build the software.</p> <p>Everyone has had a different "last straw". For me, it was the SPI flash situation.</p> <p>There is a substantial change to booting between the PinePhone and PinePhone Pro. Previously, each distribution could develop a self-contained eMMC or microSD card image, including a compatible bootloader and kernel. Installation is as simple as flashing a microSD card with the desired distribution and popping it in.</p> <p>On the PinePhone Pro, the hardware works differently: it prefers to load the bootloader from the eMMC instead of the microSD. This means that since the PinePhone Pro shipped from the factory with Manjaro on the eMMC, it will always boot the Manjaro U-Boot, even when booting from a microSD card. We no longer have any control over the bootloader for these devices.</p> <p>There is a solution, however. The hardware can have an SPI flash chip that gives a bit of storage to put U-Boot in, and that storage is always preferred over the eMMC and microSD storage. 
The problem with this is that all the distributions need to agree on a U-Boot build to put in there, and agree to never overwrite it with a distribution-specific version.</p> <p>The solution to this is Tow-Boot: a distribution of U-Boot that can be put in that flash chip. With this the U-Boot firmware can just be treated like system firmware and be updated through fwupd independently of what distributions ship. This would work not only for the PinePhone Pro, but would also enable things like installing your preferred Linux distribution on a Pinebook Pro by popping in a flash drive with a UEFI installer, much like you can on any other laptop.</p> <p>Negotiating this solution was hell. Manjaro is incentivized not to agree to this, since it cedes their sole control over the bootloader, and PINE64 listens to Manjaro before anyone else. Furthermore, PINE64 does not actually want to add SPI flash chips to their hardware. Apparently, there have been some issues with people using SPI flash as RW storage on the A64-LTS boards, which would be a support issue.</p> <p>After months of discussions between the community, Manjaro, and PINE64 leadership, we were finally able to convince them to ship the PinePhone Pro with an SPI flash chip with Tow-Boot installed on it.</p> <p>But the Pinebook Pro has a similar boot configuration, and thus a similar problem. Some time after the PinePhone Pro was shipped, it was time for a new Pinebook Pro batch, and this discussion started again. The same arguments were reiterated by all sides all over again, and the discussion went nowhere. PINE64 representatives went so far as to say, quote, "people who want [an SPI chip] can just solder one on". This batch of Pinebook Pros has ended up shipping without Tow-Boot flashed.</p> <h2>So I left</h2> <p>This is the moment I left. I left all the official channels, stepped down as PINE64 moderator. Left the private developer chat rooms. 
PINE64 cares only about Manjaro, and Manjaro does not care about working with any other distributions. This is no longer a community that listens to software developers. As a representative of postmarketOS, there is no further reason for me to be directly involved with PINE64 if the only opinions that matter are those of Manjaro.</p> <p>Like many others, I have become burnt out on this ecosystem. So I quit. I am no longer getting random requests to make Manjaro's factory software work. No longer am I enduring the stress and frustration of these meaningless discussions behind the scenes, and after not being in the PINE64 community for some weeks I can definitely say I'm way less stressed out.</p> <p>Now I can just focus on making postmarketOS work better. On the PINE64 hardware, and all the many other devices supported by postmarketOS.</p> <p>I hope that future vendors will make better choices, and actually listen to the whole community. Maybe even help with the software development side.</p> Automated Phone Testing pt.1https://blog.brixit.nl/automated-phone-testing-pt-1/62eb0dbe87c35a5ee6af369cPhonesMartijn BraamThu, 04 Aug 2022 11:14:24 -0000<p>Testing things is one of the most important aspects of a Linux distribution, especially one for phones where people rely on it being in a state where calls are possible.</p> <p>For postmarketOS the testing is largely done manually. There are CI jobs running that verify that packages build correctly, but that's as far as the automated part goes. For releases like service packs there is quite a manual test process that involves upgrading the installation and checking that the device still boots and that the upgraded applications actually work.</p> <p>With the growth of postmarketOS the things that need to be tested have grown massively. 
If there is a change that affects multiple devices, the majority of the time goes into getting an image with the changes running on all the target devices, and for big releases it often involves getting other device maintainers to test on their hardware.</p> <h2>Automation</h2> <p>Automated testing for the devices postmarketOS supports is not very easy. Phones don't have the interfaces to fully automate a test process. To get a new image flashed on a phone it usually takes a few steps of holding the right key combinations and plugging in the USB cable at the right moment. To complicate things further, this process also differs significantly between devices.</p> <p>The goals for the automated testing are quite ambitious. The plan is to get as many devices as possible in a server rack hooked up to a computer, with wires soldered to the phones to trigger the key combinations and control the boot process.</p> <p>For the software side this would require some interface to let multiple developers schedule test jobs on the devices they need and an interface to keep track of the state of all the connected hardware. This is quite a lot like the scheduling and interface for a regular CI system, and large parts of this system will be modelled after how GitLab CI works. </p> <p>This whole system will consist of many components:</p> <ul><li>A central web application that keeps track of all the available hardware and schedules jobs. The web application does not contain any implementation details about hardware except for the names.</li> <li>A server application that connects to the central web application and registers connected devices. This application has the responsibility of tracking the state of the devices and asking for new jobs from the central server when a device is free. 
There can be many instances of this server application, so there can be multiple test racks maintained by different developers.</li> <li>An application that is spawned for every job that actually executes the testcase and parses the output from the serial port of the device.</li> <li>A piece of hardware that can press the buttons on the phone and deal with plugging in the power at the right moments. This hardware should be mostly a generic PCB that can deal with the most common interfaces for devices. For devices with special requirements a new board can be made that controls those.</li> <li>A case design to hold many phones in a relatively dense configuration. It is estimated that around 8 phones can fit in 2U of rack space with a generic enclosure and some 3D printed holders.</li> </ul> <p>Splitting the software up this way should make this test setup scalable. The most complicated parts seem to be the central web application, which should present a nice web interface and make it easy to run a test job on multiple devices, and the runner application, which actually needs to deal with the hardware-specific implementation details.</p> <p>Since this is quite a setup to build I've decided to start as small as possible: first get a test running by making some prototype hardware and a prototype runner application that only supports the PinePhone.</p> <h2>The initial test hardware</h2> <figure class="kg-card kg-image-card"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1670072516/P1080468.jpg" class="kg-image"></figure> <p>For the initial test hardware I'm using an off-the-shelf Raspberry Pi Pico on a breadboard. Initial design revisions were based around an Atmel ATmega32U4 to implement a USB-to-serial adapter and a second serial port for hardware control. 
Due to the chip shortage the popular Atmel parts are practically impossible to get.</p> <p>The Pi Pico has a microcontroller that is able to emulate multiple USB serial adapters just like the ATmega32U4, and after getting the SDK running it is actually quite easy to write firmware for.</p> <p>For this initial revision I'm only running a single USB serial device on the Pi Pico, since the PinePhone has an easily accessible serial port with a PINE64 debug adapter. For controlling the buttons I have soldered a few wires to various test points on the PinePhone PCB.</p> <p>The buttons on the PinePhone normally work by shorting a signal to ground. This is easily emulated with a microcontroller by configuring the GPIO the test point is connected to as an input, so the pin on the Pico becomes a high impedance that doesn't influence the PinePhone. When pressing the button the GPIO can be set to output 0 so the signal is connected to ground.</p> <p>After some testing with the Pico this works great. It took a while to figure out that the Raspberry Pi Pico enables internal pull-down resistors by default on the GPIOs, even when they are set to input. This caused the phone to think all the buttons were held down all the time.</p> <h2>Control protocol</h2> <p>To actually control the buttons from a computer a protocol is needed for sending those commands to the Pi Pico. After first coming up with a custom protocol, I got pointed to <a href="https://github.com/andersson/cdba">cdba</a>. This is a tool for booting images on boards, which is exactly what I need.</p> <p>This tool is designed to work with some specific control boards which I don't have, but the protocol used for those boards is incredibly simple.</p> <p>Every command is a single character written to the serial port. For enabling power to the board a <code>P</code> is sent. For disabling the power a <code>p</code> is sent instead. 
This uppercase/lowercase system is also followed for holding down the power button (<code>B/b</code>) and the button required to get into a flasher mode (<code>R/r</code>). </p> <p>This is the protocol I implemented in my first iteration of this firmware. The nice thing is that the hardware should also work with cdba, at least if it is a fastboot device.</p> <p>The code at this point is <a href="https://paste.sr.ht/~martijnbraam/aef5008538a141f7d80c5c719e304b9789470bde">in this paste</a>.</p> <h2>Test application</h2> <p>To verify this design I wrote a small test application in Python. It connects to two serial ports and takes a PinePhone disk image to boot into.</p> <p>The code used for the first successful flash is <a href="https://paste.sr.ht/~martijnbraam/6e534069f77f3ded925a85c03768382178c8a469">in this paste</a>.</p> <figure class="kg-card kg-image-card"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1670072517/Screenshot-from-2022-08-04-03-10-22.png" class="kg-image"></figure> <p>This application does multiple things. It first connects to the serial port of the Raspberry Pi Pico and resets all the state. Then it will hold the power button of the PinePhone for 15 seconds to force the phone into a known state.</p> <p>It also connects to the PINE64 UART debug adapter and reads the serial debug logs of Tow-Boot and the kernel. By looking for specific lines in the serial output it knows where in the boot process the phone is, and it uses that to hold the volume button to get Tow-Boot into USB Mass Storage mode. </p> <p>It then simply dd-s the disk image onto the eMMC of the phone and restarts the device. 
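</p> <p>The single-character protocol described above is easy to drive from any language. A sketch in Python (the function and parameters are hypothetical, not taken from the linked pastes; <code>control</code> can be a pyserial <code>Serial</code> instance or anything else with a <code>write()</code> method):</p> <pre><code>import time

def power_cycle_into_flasher(control, settle=2, boot_wait=5):
    # "control" is any object with a write() method, such as an open
    # pyserial Serial port connected to the Pi Pico.
    control.write(b"p")    # power off the phone
    time.sleep(settle)     # let it power down completely
    control.write(b"R")    # hold the button for flasher mode
    control.write(b"P")    # power back on with the button still held
    time.sleep(boot_wait)  # give the bootloader time to sample the button
    control.write(b"r")    # release the button again</code></pre> <p>cdba implements the same idea in C for the control boards it supports.</p> <p>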
Now the phone will boot into this new installation because this time the volume button is not held down while booting.</p> <p>The thing that still needs to be implemented after this is detecting when the device is booted and logging in on the serial console so it can run the test script.</p> <p>This iteration of the script also hardcodes a lot of sleep times and device paths. Hardcoding the path to the block device works somewhat reliably on my laptop, but it will fail in production when multiple devices are connected and booting in a random order. This can be fixed by specifying which USB ports the phone and test board are plugged into instead, and using udev to figure out which block device and serial device belong to which phone.</p> <h2>Next steps</h2> <p>The most important next thing to figure out in the application is designing a job description system so developers can write testcases to run. This setup can also be used to iron out quirks in the boot process, like the application only flashing the phone correctly half the time because the phone sometimes boots the OS instead of getting into the bootloader.</p> <p>I have also already written some major parts of the central web application that deals with registering devices and storing data about them that can be used as variables in the test jobs. </p> <p>Once those parts integrate it would be important to get a second device up and running in the test rig, like the OnePlus 6, to avoid overfitting the design to the PinePhone.</p> PinePhone Camera pt5https://blog.brixit.nl/pinephone-camera-pt5/620934a515b5040189838db3LinuxMartijn BraamSun, 13 Feb 2022 20:20:05 -0000<p>It's been a while since I've written anything about the Megapixels picture processing. The last post still showcases the old GTK3 version of Megapixels even!</p> <p>In the meantime users figured out how to postprocess the images better to get nicer results from the PinePhone camera. 
One of the major improvements that has landed was the sigmoidal contrast curve in ImageMagick.</p> <pre><code>convert img.tiff -sharpen 0x1.0 -sigmoidal-contrast 6,50% img.jpg</code></pre> <p>This command slightly sharpens the image and adds a nice smooth contrast curve to the image. This change has a major issue though: it is a fixed contrast curve applied to all images, and it does not work that well in a lot of cases. The best results came from running this against pictures that were taken with the manual controls in Megapixels so they had the right exposure.</p> <p>On the PinePhone the auto exposure in the sensor tends to overexpose images though. Adding more contrast after that will just make the issues worse. In the header image of this post there are three images shown, generated from the same picture. The first one is the unprocessed image data, the second one is the .jpg created by the current version of Megapixels and the third one is the same data with my new post-processing software.</p> <figure class="kg-card kg-image-card kg-width-wide"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1670072511/image.png" class="kg-image"><figcaption>Waveform visualisation of the banner image</figcaption></figure> <p>This screenshot shows the waveform of the same header image. This visualizes the distribution of the image data: the horizontal axis is the horizontal position in the image, and on the vertical axis the brightness of all the pixels in that column is plotted. Here you can still see the 3 distinct images from the header image, but with different distributions of the color/brightness data.</p> <p>One of the main issues with the data straight from the sensor is that it's mostly in the upper part of the brightness range; there's no data at all in the bottom quarter, and this is visible as images that have no contrast and look grayish. 
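</p> <p>One way to fix such top-heavy data without clipping is to stretch it toward black: keep the brightest values where they are and scale everything else down until the darkest value reaches zero. A numpy sketch with made-up sample values:</p> <pre><code>import numpy as np

# Illustrative linear-light values crowded into the upper brightness
# range, the way the unprocessed sensor data looks (numbers made up).
img = np.array([0.27, 0.41, 0.63, 0.88, 0.97])

# Pin the maximum in place and scale so the darkest value lands at 0,
# removing the black-level offset without clipping the highlights.
black = img.min()
stretched = (img - black) / (img.max() - black) * img.max()
print(stretched.round(2))</code></pre> <p>This is roughly the behaviour visible in the third image of the banner: the top of the range stays put while the rest is pulled down. 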
</p> <p>The sigmoidal contrast curve in the middle image makes the pixels above the middle line brighter and the pixels below it darker. The main improvement is the data extending further into the lower part here, but due to the curve the bright parts of the image become even brighter and the top line shows that the data is clipping.</p> <p>The third image with the new algorithm instead moves the data down by keeping the bright pixels in the same spot but "stretching" the image towards the bottom. This corrects for the blacklevel of the sensor data and also creates contrast without clipping the data.</p> <h2>How</h2> <p>This started with me trying to make the postprocessing faster. Currently the postprocessing is done with a shell script that calls various image manipulation utilities to generate the final image.</p> <figure class="kg-card kg-image-card"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1670072512/old.png" class="kg-image"></figure> <p>Megapixels takes a burst of pictures and saves those as separate .dng files in a temporary directory. From that series the second one is always used and the rest is ignored. With dcraw the image is converted to rgb data and stored as tiff. Imagemagick takes that, applies the sharpness/contrast adjustment and saves a .jpg</p> <p>Because these tools don't preserve the exif data about the picture, exiftool is run last to read the exif from the original .dng files and save that in the final .jpg</p> <p>Importing and exporting the image between the various stages is not really fast, and for some reason the processing in Imagemagick is just really, really slow. My plan was to replace the 3 separate utilities with a native binary that uses libraw, libjpeg, libtiff and libexif to deal with this process instead.
</p> <figure class="kg-card kg-image-card"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1670072512/postprocessd-v1.png" class="kg-image"><figcaption>version 1 of postprocessd</figcaption></figure> <p>The new tool is postprocessd (because it's supposed to run in the background and queue processing). It uses libraw to get rgb data; this is the same library that's used in dcraw. The resulting data is then written directly to libjpeg to create the final jpegs without any processing in between. This is what actually generated the first image shown in the banner. Processing a single .dng to a .jpg in this pipeline is pretty fast compared to the old pipeline, a full processing run takes 4 seconds on the PinePhone.</p> <p>The downside is that the image looked much worse due to the missing processing. Also just having a bunch of .jpeg files isn't ideal. The solution I wanted is still the image stacking to get less noise. With the previous try to get stacking running with HDR+ it turned out that that process is way way way too slow for the PinePhone and the results just weren't that great. In the meantime I've encountered <a href="https://github.com/luigi311/Low-Power-Image-Processing">https://github.com/luigi311/Low-Power-Image-Processing</a> which uses opencv to do the stacking instead. This seemed easy to fit in.</p> <figure class="kg-card kg-image-card"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1670072512/postprocessd-v2.png" class="kg-image"><figcaption>Version 2 with opencv for stacking</figcaption></figure> <p>This new code takes all the frames and converts them with libraw. Then the opencv code filters out all the images that are low contrast or fully black, because sometimes Megapixels glitches out.
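</p> <p>That filtering step can be sketched like this; a simplified stand-in for the opencv code using plain Python lists as grayscale frames, with a made-up threshold value:</p>

```python
def frame_contrast(frame):
    """Spread between the brightest and darkest pixel of a frame."""
    flat = [p for row in frame for p in row]
    return max(flat) - min(flat)

def filter_frames(frames, min_contrast=16):
    """Drop frames that are fully black or too low-contrast to be a
    real capture, which happens when Megapixels glitches out."""
    return [f for f in frames if frame_contrast(f) >= min_contrast]

glitched = [[0, 0], [0, 0]]        # fully black frame
normal = [[12, 200], [64, 180]]    # normally exposed frame
print(len(filter_frames([glitched, normal])))  # prints 1
```

<p>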
The last .dng file is then taken as a reference image and all the other images are aligned on top of it with a full 4 point warping transform to account for the phone slightly moving between taking the multiple pictures. After the aligning the pictures are averaged together to get a much less noisy image without running an actual denoiser.</p> <p>This process produced an image that's exactly the same as the output files from v1 but with less noise. </p> <figure class="kg-card kg-image-card kg-width-wide"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1670072512/stacked.png" class="kg-image"><figcaption>Before and after stacking</figcaption></figure> <p>This is a zoomed in crop of a test image that shows the difference in noise. The results are amazing for denoising without having artifacts that make the image blurry. But for every upside there's a downside. This is very slow.</p> <p>Stacking 2 images together with the current code takes 38 seconds. For great results it's better to stack more images though.</p> <h2>Color processing</h2> <p>Now that the opencv dependency is added it's pretty easy to just use that to handle the rest of the postprocessing tasks.</p> <figure class="kg-card kg-image-card kg-width-wide"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1670072512/blacklevel-correction.png" class="kg-image"></figure> <p>The main improvement here is the automatic blacklevel and whitelevel correction. The code slightly blurs the image and then finds the darkest and brightest point. Then it simply subtracts the value of the darkest point to shift the colors in the whole image down, which removes the colored haze. Then the pixels get multiplied with a calculated value to make the brightest pixel pure white again, which "stretches" the brightness range so it fills the full spectrum.
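</p> <p>Ignoring the blur step, the stretch itself can be sketched as follows; plain Python on a single list of values, not the actual postprocessd code:</p>

```python
def stretch_levels(pixels):
    """Shift the darkest value to pure black and scale the brightest
    to pure white, stretching the data over the full [0, 1] range.
    postprocessd blurs the image first so a single hot or dead pixel
    doesn't throw off the black/white points."""
    darkest, brightest = min(pixels), max(pixels)
    span = brightest - darkest
    if span == 0:
        return list(pixels)  # flat image, nothing to stretch
    return [(p - darkest) / span for p in pixels]

# Sensor data that only covers the upper part of the brightness range:
print(stretch_levels([0.25, 0.5, 1.0]))  # prints [0.0, 0.3333333333333333, 1.0]
```

<p>Unlike the fixed sigmoidal curve this adapts to each picture, which is why it can create contrast without clipping.</p> <p>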
This process adds the contrast like the old imagemagick code did, but in a much more carefully tuned way.</p> <p>After this a regular "unsharp mask" sharpening filter is run that's fairly aggressive, but tuned for the sensor in the PinePhone so it doesn't look oversharpened.</p> <p>The last thing that's done is a slight gamma correction to darken the middle gray brightness a bit to compensate for the PinePhone sensor overexposing most things. The resulting contrast is pretty close to what my other Android phones produce, except the resolution for those phones is a lot better.</p> <h2>What's left to do</h2> <p>The proof of concept works, now the integration work needs to happen. The postprocessing is quite CPU intensive so one of the goals of postprocessd is to make sure it never processes multiple images at the same time but instead queues the processing jobs up in the background so the CPU is free to actually run Megapixels. There are also still some bugs with the exif processing and the burst length in the current version of Megapixels is a bit too short. This can probably be made dynamic to take more pictures in the burst when the sensor gain is set higher.</p> <figure class="kg-card kg-image-card kg-width-wide"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1670072513/compare.jpg" class="kg-image"></figure> <p></p> Running the kernel.org tree on the PinePhonehttps://blog.brixit.nl/running-the-kernel-org-tree-on-the-pinephone/61a39d2915b5040189838c1cPhonesMartijn BraamSun, 28 Nov 2021 20:17:49 -0000<p>To make postmarketOS run on the PinePhone Pro I started with the kernel that was already packaged for the Pinebook Pro since that is also a Rockchip RK3399 based device. The kernel to make the PinePhone Pro run initially for me is just a patch that adds a driver for the new display and a new device tree describing the hardware for the PinePhone Pro.
That's it, two patches on top of the 5.13 kernel from kernel.org.</p> <p>I've decided to see how the original PinePhone runs if a kernel straight from kernel.org is used; it should be better in theory. The display driver for the original PinePhone is already in mainline and the device trees for the various revisions also have been for quite a while. So I made a postmarketOS build with a completely unpatched kernel to try it.</p> <p>The current PinePhone kernel on postmarketOS uses the <a href="https://megous.com/git/linux/">kernel from Megi</a>. It is at 5.14 at the moment and adds 16 more patches on top of that kernel. Most of these extra patches are for supporting the original developer edition of the PineTab and the Dontbeevil devkits for the PinePhone and some other devices with Allwinner chips in them.</p> <p>For the test setup I used the 5.15.5 kernel. I also made a new build of u-boot that has crust disabled because there are warnings of crust causing issues in Megi's changelog.</p> <h2>The results</h2> <p>The first thing that's noticeable on the kernel.org build is that the orientation of the accelerometer is wrong, as can be seen in the picture at the top of this post. The sensor is offset by 90 degrees so rotating the phone will keep moving the UI in the wrong direction.</p> <p>The second big issue I noted is that the display is really unstable; it still has the bugs from 1+ year ago where the panel would stop displaying correctly and would only show the odd or even columns and have a lot of streaking. This fine detail is pretty hard to capture in a picture but the streaking is somewhat visible:</p> <figure class="kg-card kg-image-card kg-width-wide"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1670072509/PXL_20211128_182906802.jpg" class="kg-image"></figure> <p>Turning the display on and off a few times usually fixes this.
This has already been fixed in the megi kernel.</p> <p>The PinePhone also contains some complicated drivers that have been written from scratch, like the driver for the USB-C controller that handles the PD charging and all the convergence features. This is not upstream yet and that's partially because it's complex and doesn't fit neatly in one driver category.</p> <p>Convergence not working is not a huge dealbreaker for most use, but things that are also not working are the wifi, bluetooth, modem, usb and charging indication. All in all, it's not really usable on an unpatched kernel at the moment.</p> <figure class="kg-card kg-image-card kg-width-wide"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1670072509/image.png" class="kg-image"><figcaption>The latest 5.16 branch for the PinePhone</figcaption></figure> <p>This is a lot of code to keep rebasing on top of the latest kernel releases. Hopefully more will flow upstream instead.</p> Does this come with dead pixels?https://blog.brixit.nl/does-this-come-with-dead-pixels/6169cc4b4e67c223849306d7PhonesMartijn BraamFri, 15 Oct 2021 19:16:26 -0000<p>This question seems to come up a lot. On the PINE64 store pages for devices with a display there's a disclaimer about dead pixels. This makes people think these devices regularly ship with dead pixels.</p> <p>From my experience with having a lot of PINE64 devices here and having seen even more at events like FOSDEM, this isn't really the case. A disclaimer like this is actually hidden somewhere every single time you buy a product with a display in it. The main difference is that PINE64 doesn't hide it.</p> <p>A lot of websites also seem to just refer to the manufacturer warranty of displays for dead pixels and those also have similar claims.
Here are some examples from display manufacturers:</p> <figure class="kg-card kg-image-card kg-width-wide"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1670072508/image.png" class="kg-image"><figcaption>BenQ pixel policy</figcaption></figure> <figure class="kg-card kg-image-card kg-width-wide"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1670072508/image-1.png" class="kg-image"><figcaption>Dell monitor pixel policy</figcaption></figure> <p>But what about phones and laptops? Well, Dell conveniently lists it on the same page:</p> <figure class="kg-card kg-image-card kg-width-wide"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1670072508/image-2.png" class="kg-image"></figure> <p>Or a phone manufacturer, in this case Samsung:</p> <figure class="kg-card kg-image-card kg-width-wide"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1670072508/image-3.png" class="kg-image"></figure> <h2>But reality</h2> <p>In reality none of these manufacturers regularly ship dead pixels. Neither does PINE64. This is just legal stuff for the sake of legal stuff. Having this notice on the website does not mean dead pixels are more common. The main reason the notice is more visible is that refunds are more painful for smaller companies.</p> Running Octoprint on the PinePhonehttps://blog.brixit.nl/running-octoprint-on-the-pinephone/61532baa4e67c22384930669LinuxMartijn BraamTue, 28 Sep 2021 15:14:18 -0000<p>So I decided to get a 3d printer, the Creality Ender 3 pro. These things are really neat :)</p> <p>One of the big things missing from cheap 3d printers is a remote control and monitoring solution.
<a href="https://octoprint.org/">Octoprint</a> seems to be the most popular solution, especially with the Octopi images which are pre-configured Raspberry Pi images you can just hook up to the printer and make work.</p> <p>I did this as a first step and it does work neatly, except that the printer I got has some issues with the serial port, making it unreliable to print through. Also I wanted to have the camera stream so I can view the print progress remotely, since the printer is 2 rooms over because it's loud.</p> <p>I do have a raspberry pi camera and that seems to be the go-to solution for this, except that involves printing a case for the raspberry pi, something to hold up the camera, and dealing with the fact that the camera I have is an ebay knockoff that does not fit in the normal cases since it has a larger lens mount.</p> <p>Then I realized that the PinePhone is pretty much perfect for this. It has the touchscreen to run a UI on, it can do ethernet and the usb connection with the dock, it has a built-in camera, and it is fast enough to run all this.</p> <h2>Installation</h2> <p>I started with installing postmarketOS edge on the PinePhone. I chose the Phosh UI so I can easily set up the phone using the touchscreen.
Then the first step was setting up octoprint:</p> <div class="highlight"><pre><span></span><span class="go">Install the dependencies to build and run octoprint in Alpine Linux</span> <span class="gp">$ </span>apk add python3 py3-pip python3-dev alpine-sdk linux-headers <span class="go">[enjoy the speed at which apk installs software here]</span> <span class="gp">$ </span>sudo pip3 install octoprint <span class="go">[octoprint will now be installed as /usr/bin/octoprint]</span> <span class="go">The postmarketOS firewall will block the traffic by default, disable it for now:</span> <span class="gp">$ </span>sudo service nftables stop <span class="gp">$ </span>octoprint serve </pre></div> <p>Now the webinterface for octoprint should be reachable on the IP address of the PinePhone on port 5000. From there you can run through the install steps of octoprint itself.</p> <p>For the camera feed I used mjpg-streamer, which seems to be a nice lightweight tool for streaming the camera over http. The stream is light on CPU usage, so the phone can spend more of its CPU on the Octoprint things.</p> <div class="highlight"><pre><span></span><span class="gp">$ </span>sudo apk add mjpg-streamer v4l-utils </pre></div> <p>To set up the camera in the PinePhone I wrote this small shell script:</p> <div class="highlight"><pre><span></span><span class="ch">#!/bin/sh</span> media-ctl -d /dev/media1 -r media-ctl -d /dev/media1 --links <span class="s1">&#39;&quot;ov5640 4-004c&quot;:0-&gt;&quot;sun6i-csi&quot;:0[1]&#39;</span> media-ctl -d /dev/media1 --set-v4l2 <span class="s1">&#39;&quot;ov5640 4-004c&quot;:0[fmt:UYVY8_2X8/640x480@1/15]&#39;</span> mjpg_streamer -i <span class="s2">&quot;input_uvc.so -d /dev/video2 -r 640x480 -f 15 -u&quot;</span> -o <span class="s2">&quot;output_http.so&quot;</span> </pre></div> <p>This will switch the video system in the phone to the rear camera, set the resolution and mode and start mjpg streamer.
I saved this as <code>/usr/local/bin/camera</code></p> <p>The URLs that are needed for Octoprint are:</p> <pre><code># The mjpeg video stream http://ip-of-the-pinephone:8080/?action=stream # The snapshot URL for the timelapse http://127.0.0.1:8080/?action=snapshot</code></pre> <p>To start this I made 2 init scripts for the two services this needs to run:</p> <div class="highlight"><pre><span></span><span class="ch">#!/sbin/openrc-run</span> <span class="c1"># This is /etc/init.d/camera</span> <span class="nv">supervisor</span><span class="o">=</span>supervise-daemon <span class="nv">name</span><span class="o">=</span><span class="s2">&quot;camera server&quot;</span> <span class="nv">command</span><span class="o">=</span><span class="s2">&quot;/usr/local/bin/camera&quot;</span> depend<span class="o">()</span> <span class="o">{</span> need net localmount after firewall <span class="o">}</span> </pre></div> <p>and</p> <div class="highlight"><pre><span></span><span class="ch">#!/sbin/openrc-run</span> <span class="c1"># This is /etc/init.d/octoprint</span> <span class="nv">supervisor</span><span class="o">=</span>supervise-daemon <span class="nv">name</span><span class="o">=</span><span class="s2">&quot;print server&quot;</span> <span class="nv">command</span><span class="o">=</span><span class="s2">&quot;/usr/bin/octoprint&quot;</span> <span class="nv">command_args</span><span class="o">=</span><span class="s2">&quot;serve&quot;</span> <span class="nv">command_user</span><span class="o">=</span><span class="s2">&quot;user&quot;</span> depend<span class="o">()</span> <span class="o">{</span> need net localmount after firewall <span class="o">}</span> </pre></div> <p>Now to do the final setup, the account needs permission to use usb serial adapters and the right services should be enabled:</p> <pre><code>$ adduser user dialout $ rc-update add camera default $ rc-update add octoprint default $ rc-update del nftables default</code></pre> <p>After rebooting you
should have a fully functional octoprint setup.</p> Packaging music in postmarketOShttps://blog.brixit.nl/packaging-music-in-postmarketos/607609317725960d859ca932LinuxMartijn BraamTue, 13 Apr 2021 22:17:55 -0000<p>Since I'm running postmarketOS on my phone I had to find a way to sync my music collection to it. There are a lot of solutions to do that. </p> <p>But since I have a full package-managed Linux distro on my phone, and it's very easy to create repositories, I decided to just package my music.</p> <p>Upsides:</p> <ul><li>Music collection gets neatly synced to all my devices</li> <li>Music collection updates together with my system updates</li> </ul> <p>Downsides:</p> <ul><li>Need to run a repository</li> <li>This is not really efficient</li> </ul> <p>Since I already have a local repository on my server and my music collection is not terribly large, I just ignored the downsides and did it anyway.</p> <h2>The software</h2> <figure class="kg-card kg-image-card kg-width-wide"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1670072504/Screenshot-from-2021-04-14-00-01-16.png" class="kg-image"></figure> <p>I made a python script that scans through the music directory on my server and generates the required APKBUILD package definitions for Alpine Linux. Since I manually make sure that the directory structure and metadata is perfect I just don't deal with that at all in the script and always assume music is in $dir/$artist/$album/$track.
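</p> <p>The scanning half of such a script can be sketched roughly like this. It is not the actual script (that one is linked below) and the package naming scheme here is made up:</p>

```python
import os

def scan_collection(music_dir):
    """Walk a $dir/$artist/$album/$track layout and yield one entry
    per album. Assumes the directory structure is already perfect."""
    for artist in sorted(os.listdir(music_dir)):
        artist_dir = os.path.join(music_dir, artist)
        for album in sorted(os.listdir(artist_dir)):
            album_dir = os.path.join(artist_dir, album)
            yield artist, album, sorted(os.listdir(album_dir))

def album_pkgname(artist, album):
    """Turn an artist/album pair into an apk-safe package name."""
    slug = lambda s: s.lower().replace(" ", "-")
    return "musicdir-" + slug(artist) + "-" + slug(album)
```

<p>Each yielded album then gets an APKBUILD written for it, plus meta packages per artist and one for the whole collection that just depend on the album packages.</p> <p>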
It also packages every file in the album directory even if it's not related to the music.</p> <p><a href="https://paste.sr.ht/~martijnbraam/8e65134a083d32b6039889344be4bdaac60bfe59">You can find the python script here</a></p> <p>The script takes 2 arguments: one is the directory the music is stored in, the other is a directory where it will create the package definitions; the music files will be symlinked between those.</p> <pre><code>$ python3 musicrepo.py /mnt/music /mnt/files/pmaports/custom-music $ pmbootstrap build musicdir-all</code></pre> <p>I already have pmbootstrap on my server for building the custom packages I use on various devices; in this case there's just an apache vhost that reads the packages from the pmbootstrap builddir, and this repository is added on my postmarketOS phones and Alpine desktop machines.</p> <p>The only thing I need to do now is install the <code>musicdir-all</code> package on my devices to get all the music synced or add the <code>musicdir-$artist</code> package to get a subset.</p> <h2>On a phone</h2> <p>As you've probably seen in the top image, the updates for the music collection just show up as software updates. Sadly it only shows that there's an update for the meta package for the artist and not that it will pull in a new album as a dependency. But the important part is that it <i>works</i>.</p> <figure class="kg-card kg-image-card"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1670072504/DSC0006.jpg" class="kg-image"></figure> <p>The music in these packages is stored inside <code>/usr/share/music</code> after installing. In Lollypop I just added that path as an additional directory in the collection so I can both have music in my homedir and in the system packages.
After that and restarting Lollypop a few times it will show the music.</p> <figure class="kg-card kg-image-card kg-width-wide"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1670072505/DSC0007.jpg" class="kg-image"></figure> <p>So it's not very CPU efficient to compress the music into archives and unpack them again, and I have the music stored twice now on the server. But aside from that it works great :D</p> Do you really want Linux phoneshttps://blog.brixit.nl/do-you-really-want-linux-phones/605b3ab27725960d859ca78aPhonesMartijn BraamWed, 24 Mar 2021 14:23:10 -0000<p>The community around Linux phones is interesting. The phones do sell to a lot of people, but it seems a lot of those people come back to complain that Linux phones aren't what they expected.</p> <p>For some reason all the distributions for the PinePhone are bending over backwards to provide an android or ios experience on the phone. The operating systems are judged on the amount of apps preinstalled and every tiny issue labels the distribution as completely unusable. </p> <p>Stability doesn't matter at all, as long as there are features! More features! It doesn't matter that there are 20 patches on top of every package and things aren't upstreamed. It doesn't matter if the kernel is full of hacks with no upstream in sight for most things.</p> <h2>Development</h2> <p>There are 6000+ people in the PINE64 discord, 1300+ in the Telegram room for the PinePhone, 3000+ in the Matrix channel and probably there are people in the IRC room. You'd expect some people to actually pick up app development but as far as I can see it's still mostly the same people as a year ago that are developing the operating systems.</p> <p>Megapixels isn't the best camera application because it's terribly great code or because it's well designed.
It's basically my first C project, first v4l2 project and first GTK3 project, but I started it because nobody who actually knows any of these things wants to, and so far nobody who knows these things has stepped in to do better (don't worry, I get enough comments that my code is horrible and performance is bad, it's just that nobody has ever been able to improve the performance except for Benjamin)</p> <p>Megapixels has had a fully user-configurable post-processing pipeline basically since the start. It allows you to do anything you want after clicking the shutter button in the app since it's a shell script. Still people complain about how they don't like that there's an extra file they don't want (the dng) or that photo upload is missing. IT'S A SCRIPT, CHANGE IT. Do you really want a Linux phone?</p> <h2>Distributions</h2> <figure class="kg-card kg-image-card"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1670072501/image.png" class="kg-image"></figure> <p>There are 18 distributions now for the PinePhone, slightly more if you count derivatives of those distributions. Still people want to create more distributions (mostly from scratch, trying to emulate an android/ios experience more) instead of actually implementing missing features in the existing distributions or fixing issues in the upstream applications those use.</p> <p>Even the distributions that exist don't seem to follow the philosophy of the distributions they are based on. Arch mobile uses pacman and alarm repositories but doesn't ship like Arch Linux does: empty, ready to set up like you want. Mobian also ships with every app that works preinstalled so it looks less empty. postmarketOS would also be considered bloated by Alpine Linux standards. People even want to use Flutter to make UIs; was the phone lacking in google technologies for your liking?</p> <p>Having a "one true way" to use the UI and tons of preinstalled apps is the reason I dislike the android ecosystem.
But even weirder, there's tons of people asking for Android roms for the PinePhone. What's the point? Do you believe in the church of papa Alphabeticus, the pope of the internet? Have you used Linux at all? Do you really want a Linux phone? </p> <h2>The versus culture</h2> <p>A lot of people still think they need to defend the PinePhone against the Librem 5 or vice versa. Breaking news: you're totally allowed to own both and be happy about it. People keep saying the PinePhone would be nowhere without Purism and that the Librem 5 wouldn't have lasted this long without the PinePhone. The truth is that we need both to succeed; these aren't exclusive options, you're allowed to like both.</p> <p>It's the community that's against "the other side". It's the developers that are actually working together to make something.</p> <p>There are so many options of software you can install, and there are multiple hardware options. People are still complaining to other people that they have chosen wrong. This is Linux, you can do what you want with your device. Do you really want a Linux phone?</p> PinePhone Camera pt4https://blog.brixit.nl/pinephone-camera-pt4/5f79b6a67725960d859ca4fdPhonesMartijn BraamMon, 05 Oct 2020 19:39:01 -0000<p>I keep writing these because things keep improving. One of the main improvements is visible in the picture above: autofocus is working. </p> <p>The OV5640 sensor in the PinePhone is pretty small. This limits possible image quality but as an upside it means the camera has a way larger area of the picture in focus. Due to this it can get away with not having autofocus at all. The camera just sets the lens to infinity focus when starting (which produces the clicking sound when accessing the camera) and then the focus is mostly fine for landscapes.</p> <p>The downside is that things that are close to the camera aren't in focus.
This is quite a problem for me because half the pictures I take with my phone are photos of labels on devices like routers so I don't have to write the password down to enter it on a device in another room. These photos would be quite out of focus on the PinePhone. </p> <p>The autofocus support is actually only a single line change in Megapixels to make the basic functionality work. The main changes for this have been done in the kernel driver for the ov5640. The sensor chip has a built-in driver for the focus coil in the camera module. It only needs some firmware loaded to make the focus work and some commands need to be sent to configure the focusing. The firmware upload is needed because the sensor doesn't have any built-in storage for the 4kB of firmware that can be sent; it's just stored in RAM when the sensor is powered up by Linux.</p> <figure class="kg-card kg-image-card kg-width-wide"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1670072499/focus.jpg" class="kg-image"></figure> <p>The firmware itself is a closed blob from Omnivision. It basically runs on the embedded 8051 core (which hasn't been used at all so far) and it gets an 80x60 pixel downscaled version of the camera view; from there it sends values to the focus distance register in the sensor that normally would be used for manual focusing. To trigger focus there are a few registers on the ov5640 you can write to to send commands to the firmware. The datasheet for example defines command 0x03 to focus on the defined area and 0x07 to release the focus and go back to infinity.</p> <p>After implementing this Megi figured out that there's an undocumented command 0x04 to trigger continuous autofocus in the sensor. This is way more user friendly and is what's now enabled by default by Megapixels.</p> <p>One of the remaining issues is that V4L2 doesn't seem to have controls defined to select <i>where</i> the focus should be measured.
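</p> <p>The command interface can be sketched like this. This is a mock, not driver code: the command values are the ones named above, but the command register address is my assumption from the datasheet and the firmware-ready handshake is left out entirely:</p>

```python
AF_TRIGGER = 0x03     # focus once on the defined area (datasheet)
AF_RELEASE = 0x07     # release focus, back to infinity (datasheet)
AF_CONTINUOUS = 0x04  # undocumented: continuous autofocus
AF_CMD_REG = 0x3022   # assumed address of the firmware command register

class MockSensor:
    """Stand-in for the i2c register writes the kernel driver does."""
    def __init__(self):
        self.writes = []

    def write_reg(self, reg, value):
        self.writes.append((reg, value))

def enable_continuous_af(sensor):
    # Once the firmware blob is uploaded to the sensor's RAM, a single
    # command write is enough to keep the camera focusing on its own.
    sensor.write_reg(AF_CMD_REG, AF_CONTINUOUS)

sensor = MockSensor()
enable_continuous_af(sensor)
print(sensor.writes)  # prints [(12322, 4)]
```

<p>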
The current implementation just makes the ov5640 focus on the center of the image but the firmware allows defining the zones it should use to get focus. </p> <h2>User facing manual controls</h2> <p>One of the new developments in Megapixels is a UI that allows users of the app to switch from automatic exposure to manual controls.</p> <figure class="kg-card kg-image-card"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1670072499/ui.png" class="kg-image"></figure> <p>In the top row of the image the current state of the controls is shown. In this case it's the gain and exposure controls from V4L2. These controls don't have defined ranges so that's set by the config file for Megapixels.</p> <p>If you tap a control in the top row, the adjustment row in the bottom of the screenshot will open, allowing you to change the value by dragging the slider, or enabling the in-sensor automatic mode for it by clicking the "Auto" toggle button.</p> <p>These controls also slightly work for the GC2145 front camera; the main issue is that the datasheets don't define the range for the built-in analog and digital gain so it can't really be mapped to useful values in the UI. The automatic gain can also only be disabled if you first disable the automatic exposure, something that can't really be enforced in Megapixels currently, so it's not super user friendly.</p> <p>The next step for this would be implementing the whitebalance controls for the cameras.
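</p> <p>The slider itself is just a linear mapping between the configured range and the raw control value; a sketch of that mapping (hypothetical helper functions, not Megapixels code):</p>

```python
def slider_to_control(position, ctrl_min, ctrl_max):
    """Map a slider position in [0, 1] to a raw control value using
    the range from the Megapixels config file, since the sensor
    controls themselves don't come with useful ranges."""
    return round(ctrl_min + position * (ctrl_max - ctrl_min))

def control_to_slider(value, ctrl_min, ctrl_max):
    """Inverse mapping, used to draw the current state in the UI."""
    return (value - ctrl_min) / (ctrl_max - ctrl_min)

# e.g. a gain control with a configured range of 16..1023:
print(slider_to_control(0.5, 16, 1023))  # prints 520
```

<p>Whitebalance would be the next control to map on top of these scalar ones.</p> <p>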
Implementing whitebalance controls would involve some math, since the UI would show the whitebalance as a color temperature and tint but V4L2 deals with whitebalance as R/G/B offsets.</p> <h2>Color calibration</h2> <p>Another huge step in image quality for the rear camera is the color calibration matrices.</p> <figure class="kg-card kg-image-card kg-width-wide"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1670072499/lumariver.PNG" class="kg-image"></figure> <p>The calibration for this is done by making correctly exposed pictures of a calibrated color target like the X-Rite colorchecker passport above. Even the slightest amount of exposure clipping ruins the calibration result, but due to the manual controls in Megapixels I was now able to get a good calibration picture.</p> <p>For the calibration two photos are needed. Those need to be as far away from each other as possible in the color spectrum. A common choice for this is using a Standard D65 illuminant and a Standard A illuminant, which is a fancy way of saying daylight on a cloudy day and an old tungsten lightbulb. By knowing the color/spectrum of the light used and the color of the paint in the calibration target it's possible to calculate a color matrix and curve to transform the raw sensor data into correct RGB colors.</p> <p>To do this calculation I used <a href="http://lumariver.com/">Lumariver Profile Designer</a>, which is a closed source tool that gives a very nice UI around <a href="http://rawtherapee.com/mirror/dcamprof/dcamprof.html">DCamProf</a> from the RawTherapee people. The license is paid for by the donations from <a href="https://www.patreon.com/martijnbraam?">my patreon</a> sponsors and the license cost is used by Lumariver to continue development on DCamProf. </p> <p>After running all the steps in the calibration software I get a .dcp file that contains the color matrices and curves for the sensor/lens combination.
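</p> <p>What such a matrix from the .dcp does can be sketched with a plain 3x3 multiply; an illustrative example, not actual DCamProf output:</p>

```python
def apply_color_matrix(matrix, pixel):
    """Transform an (r, g, b) pixel with a 3x3 color matrix, the kind
    of operation the colormatrix/forwardmatrix metadata describes.
    The raw developer applies this (plus curves) to every pixel."""
    return tuple(
        sum(matrix[row][col] * pixel[col] for col in range(3))
        for row in range(3)
    )

identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
# Made-up matrix that boosts red and trades some green into blue:
example = [[1.2, -0.1, -0.1],
           [0.0, 1.0, 0.0],
           [0.0, -0.2, 1.2]]
print(apply_color_matrix(example, (0.5, 0.5, 0.5)))
```

<p>Note that each row of the made-up matrix sums to 1, so neutral gray stays neutral; a matrix like this mostly shifts color balance and saturation rather than brightness.</p> <p>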
The matrices from this file are then written to the hardware config file in Megapixels in the colormatrix and forwardmatrix keys. Megapixels doesn't actually process any of this information itself; it only gets added to the exported raw files as metadata, and the post processing tools use it to produce better output files.</p> <p>The result of the matrices is that it now takes way less messing around with raw tools like RawTherapee and Darktable to get a good looking picture. The pictures just look better by default.</p> <figure class="kg-card kg-image-card kg-width-wide"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1670072499/color-demo.jpg" class="kg-image"></figure> <p>It might look like it's just a saturation change, but it also corrects a few color balance issues. The second thing the calibration tool outputs is a set of calibration curves for the sensor. These apply brightness-dependent or hue-dependent changes to the image. They are a bit too large to store in the config file, so to use them I need to find a way to merge the calibration data from a .dcp file into the generated .dng raw files instead.</p> <h2>New post processing pipeline</h2> <p>Just writing the metadata doesn't magically do anything. To take advantage of it I rewrote the processing pipeline for the photos, building on the DNG code that I wrote to test the burst function. I removed the debayer code from Megapixels that did the good quality debayer when taking a photo, and I removed the JPEG export function. Megapixels now only writes the burst of photos to a temporary directory as .dng files with all the metadata set. After taking a photo Megapixels calls a shell script which takes those files and runs them through the new processing pipeline.</p> <p>Megapixels ships with a postprocess.sh file that is stored in /usr as the default fallback.
You can override the processing by copying or creating a new shell script and storing it in ~/.config/megapixels/postprocess.sh or /etc/megapixels/postprocess.sh.</p> <p>The included processing script takes the burst and saves the first .dng file into ~/Pictures as the raw photo. Then it runs the same file through dcraw to do a very good quality debayer and at the same time apply the color matrices stored in the .dng files. Finally it runs ImageMagick to convert the resulting TIFF file into a .jpg in ~/Pictures as the final JPEG image.</p> <p>The output .jpg files are what's used in the garden pictures above.</p> <h2>HDR+ and stacking</h2> <p>Last but not least is the further development of the <a href="https://www.timothybrooks.com/tech/hdr-plus/">hdr-plus</a> implementation. After my previous blog post one of the developers in the #pinephone channel (purringChaos) made the hdr-plus repository build, which I couldn't get working last time.</p> <p>The hdr-plus tool takes a burst of raw frames and aligns them to do noise reduction and get a larger dynamic range. Then it runs a tonemap on the HDR file and makes a nice processed image. It's basically an implementation of the photo pipeline that Google made for the Pixel phones.</p> <p>So far running the hdrplus binary on the photos has resulted in very overexposed or very weirdly colored images, which might be a result of the camera not providing the raw files in the way the HDR tool expects. Hopefully that can be solved in the future.</p> <p>The hdr-plus repository does have another tool in it that just does the stacking and noise reduction though, and that binary <i>does</i> work. If the postprocess.sh script in Megapixels detects that the stack_frames command is installed it will use it to stack together the 5 frames of the burst capture and then use the .dng from that tool in the rest of the post-processing pipeline.
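</p>

<p>The intuition behind that stacking step can be shown with plain averaging: combining several independently noisy exposures of the same scene keeps the signal but shrinks the noise by roughly the square root of the frame count. This toy sketch is only the averaging idea — the real stack_frames tool also aligns the frames before merging:</p>

```python
import random

random.seed(1)

# Simulate a flat gray scene captured 5 times with independent sensor
# noise, then compare the noise of one frame against the averaged stack.
def noisy_frame(signal, noise, n):
    return [signal + random.gauss(0.0, noise) for _ in range(n)]

def stack(frames):
    n = len(frames[0])
    return [sum(f[i] for f in frames) / len(frames) for i in range(n)]

def rms_noise(frame, signal):
    return (sum((v - signal) ** 2 for v in frame) / len(frame)) ** 0.5

signal, noise = 128.0, 10.0
frames = [noisy_frame(signal, noise, 10000) for _ in range(5)]
print(f"single frame: {rms_noise(frames[0], signal):.2f}")
print(f"5-frame stack: {rms_noise(stack(frames), signal):.2f}")
```

<p>With 5 frames the residual noise lands around 1/&#8730;5 of a single frame. Averaging alone would of course ghost anything that moved between frames, which is exactly why the real tools do alignment first.</p>

<p>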
It seems to reduce the noise in images quite a lot, but it also loses most of the metadata in the .dng file, so this would also need some improvements to work correctly.</p> <figure class="kg-card kg-image-card kg-width-wide"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1670072499/demo.jpg" class="kg-image"></figure> <p>I think with these changes the ov5640 is basically stretched to the limit of what's possible in photo quality. The rest of the planned tasks are mainly UX improvements and supporting more different camera pipelines. Since postmarketOS supports a lot of phones, and more and more of them run mainline Linux, it should be possible to also run Megapixels on some of those.</p> <p>I've also received a few patches to Megapixels with interesting features, like support for more than two cameras and a proof-of-concept QR code reading feature. Hopefully I can integrate those in this new code soon.</p> PinePhone camera adventures, part 3https://blog.brixit.nl/pinephone-camera-adventures-part-3/5f6cb41a7725960d859ca40cPhonesMartijn BraamFri, 25 Sep 2020 17:15:32 -0000<p>Armed with the knowledge gathered by making the python-pinecamera implementation I started making a GTK3 version, this time using C instead of Python, partially for performance and partially because the documentation is hard enough for C and the Python abstraction makes it even harder.</p> <figure class="kg-card kg-image-card kg-width-wide"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1670072498/megapixels-bar.png" class="kg-image"></figure> <p>The app is far from perfect and still has a bunch of low-hanging fruit for performance improvements.
At this moment it's still decoding the images in the UI thread, which makes the whole app feel unresponsive. It's also decoding at a different resolution than it's displayed at; this can probably be sped up by decoding at exactly the right resolution and rotating while decoding.</p> <p>One of the ways the image quality is increased on the sensors is by using a raw pixel format instead of a subsampled one. Resolution on cameras is basically a lie, because while displays have subpixels, sensors don't. When you display an image on a 1080p monitor, you're actually sending 5760x1080 subpixels to the display, and those subpixels are neatly in red/green/blue order. </p> <p>On a 1080p camera sensor you only have 1920x1080 subpixels in total. These subpixels are laid out in a Bayer matrix where (in the ov5640 case) the first line is a row of blue, green, blue, green... pixels and the next row is green, red, green, red. Camera software then takes every subpixel, fetches the nearest subpixels of the other colors and interpolates them into one "real" pixel. This makes for a slightly blurrier image than a 4k sensor being downscaled to 1080p.</p> <figure class="kg-card kg-image-card"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1670072498/subpixels.png" class="kg-image"></figure> <p>To make the quality issue worse, the image is then subsampled: you get a 1080p grayscale image while the color channels are at quarter resolution. To make the quality even worse, the camera does quite aggressive noise reduction, causing the oil paint look in bad pictures. </p> <p>The fun part of debayering the image in the sensor and then subsampling it is that the bandwidth required for the final picture is about twice as much as the original raw sensor data. At higher resolutions this causes the sensor to hit a bandwidth limit when sending the video stream at a reasonable framerate.
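</p>

<p>Back-of-the-envelope numbers make the problem obvious. These are my own assumptions, not figures from the datasheet: 1 byte per pixel for 8-bit raw Bayer data, and 2 bytes per pixel for the debayered YUV stream, which matches the "about twice" above:</p>

```python
# Compare the link bandwidth of raw Bayer vs debayered+subsampled YUV
# for the ov5640's full 5 megapixel mode (2592x1944) at 15 fps.
# Byte sizes are assumptions for illustration: 1 byte/pixel raw,
# 2 bytes/pixel for the processed YUV stream.
def stream_bytes_per_second(width, height, bytes_per_pixel, fps):
    return width * height * bytes_per_pixel * fps

W, H, FPS = 2592, 1944, 15
raw = stream_bytes_per_second(W, H, 1, FPS)
yuv = stream_bytes_per_second(W, H, 2, FPS)
print(f"raw: {raw / 1e6:.1f} MB/s")
print(f"yuv: {yuv / 1e6:.1f} MB/s ({yuv / raw:.1f}x raw)")
```

<p>Under these assumptions that's roughly 75 MB/s for raw against roughly 150 MB/s for processed YUV at the same resolution and framerate, which is why the raw mode can fit where the processed mode doesn't.</p>

<p>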
This is why it wasn't possible before to get a 5 megapixel picture from the 5 megapixel sensor. But by using a raw mode you can get the full sensor resolution at 15fps, and this is what Megapixels does.</p> <h2>Speed</h2> <p>One of the major improvements over the previous solutions for the cameras is that the preview has a way higher framerate. The ffmpeg based solutions ran in the 0.5-2 FPS region depending on the rotation and the resolution the sensor was running at. These were also running in YUV mode instead of raw. The issue with YUV for the preview is that the sensor takes the RGB subpixels and returns the image as YUV pixels. The display in the phone is RGB again, so this has to be converted back, which takes quite some CPU power. </p> <p>The easiest way to speed up data processing in computers is to just process less data, so that's what Megapixels does: instead of debayering a full frame properly at 5 megapixels, it does a quick'n'bad debayer by taking a single red, green and blue pixel from every 12x12 block of raw pixels and discarding the rest, with no interpolation at all.</p> <p>This is technically not a correct solution and it produces way worse image quality than proper debayering. That's why, when you take a picture, the whole raw frame is run through a proper debayering implementation. </p> <iframe width="100%" height="360" src="https://www.youtube.com/embed/n_BfUV0v3UM?feature=oembed"></iframe> <h2>Image quality</h2> <p>The main issue with the image quality currently is the lack of focusing, making all images slightly soft if you're photographing things far away and very blurry if you try to take a close-up. The second big issue is that the auto exposure algorithm in the ov5640 is pretty bad: by default it overexposes large parts of the image to try to get the average exposure correct.</p> <p>To test exposure improvements I made a local build of Megapixels that sets a fixed exposure on the sensor.
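</p>

<p>As an aside, the quick'n'bad preview debayer described above fits in a few lines. This sketch assumes a BGGR pattern like the ov5640's and simply picks one 2x2 cell per block; the exact indices and edge handling in Megapixels may differ:</p>

```python
# Nearest-sample preview debayer: emit one RGB pixel per block x block
# tile of raw Bayer data, discarding everything else.
# `raw` is a flat list of width*height samples in BGGR order:
#   row 0: B G B G ...
#   row 1: G R G R ...
def quick_debayer(raw, width, height, block=12):
    out = []
    for by in range(0, height - 1, block):
        row = []
        for bx in range(0, width - 1, block):
            b = raw[by * width + bx]            # blue sample
            g = raw[by * width + bx + 1]        # one of the two greens
            r = raw[(by + 1) * width + bx + 1]  # red sample
            row.append((r, g, b))
        out.append(row)
    return out
```

<p>With 12x12 blocks this touches only about 2% of the samples, which is where the preview speedup comes from.</p>

<p>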
I also added some WIP .dng raw exporting code so the sensor data can actually be loaded into photo processing software as raw data.</p> <figure class="kg-card kg-image-card"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1670072498/darktable.jpg" class="kg-image"></figure> <p>The resulting image quality is a lot better. With the post-processing happening after taking the photo, the processing software (in this case Darktable) can run way more CPU-intensive debayering algorithms, and you also have way more control over the exposure and colors of the final image.</p> <p>This is how the image looks with all the post processing disabled in Darktable:</p> <figure class="kg-card kg-image-card"><img src="https://blog.brixit.nl/image/w1000//static/files/blog.brixit.nl/1670072498/raw.JPG" class="kg-image"></figure> <p>Another feature I added for testing is burst capture; the photo above is one of the 5 images captured in a burst (at full resolution, 15fps). One of the possibilities is running this burst through the HDR+ software Google uses, but I haven't found any open source implementation that actually compiles. Another possibility is the Handheld Multi-Frame Super-Resolution algorithm that's also used by Google. I also haven't been able to compile any of those implementations. </p> <p>If you want to try to post-process a PinePhone raw image yourself, here's the 5-frame burst used for the images above: <a href="http://brixitcdn.net/pinephone-burst/">http://brixitcdn.net/pinephone-burst/</a></p> <h2>Conclusion</h2> <p>Lots more to do, but it's getting better every release.</p>