Apple keeps the M1 processor family’s inner workings hidden from the public, but dedicated developers have reverse-engineered it to create open-source drivers and a Linux distro, Asahi Linux, for M1 Macs. They’ve discovered some interesting features along the way.
Alyssa Rosenzweig discovered a bug in the render pipeline of the M1’s GPU while working on an open-source graphics driver for it.
She was rendering increasingly complex 3D models until she came across a bunny complex enough to break the render.
The issue stems from how the GPU uses memory. It's a powerful GPU, but like the A-series iPhone chips it descends from, it lacks the dedicated video memory and raw bandwidth of a discrete card, so it takes shortcuts to maximize efficiency.
Instead of rendering directly into the framebuffer, as a discrete GPU might, the M1 splits a frame into two passes: the first transforms the geometry's vertices, and the second shades the pixels.
Because the second pass is by far the more memory-intensive, dedicated hardware divides the frame into tiles (mini-frames, essentially), and the fragment pass works through them one tile at a time so that each tile fits in fast on-chip memory.
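To make the split concrete, here is a small, self-contained C sketch of the idea. It is not Apple's hardware interface or Rosenzweig's driver code; the framebuffer, tile sizes, and shade_tile routine are invented purely to show how a fragment pass that walks one tile at a time differs from shading the whole frame against a large framebuffer in memory.

```c
/* Toy illustration (not Apple's hardware interface): a tiler's fragment
 * pass walks the frame in small tiles so each tile's pixels fit in fast
 * on-chip memory; a discrete-style renderer would shade the whole frame
 * at once against a large framebuffer in VRAM. */
#include <stdio.h>
#include <stdint.h>

#define FB_W   256
#define FB_H   256
#define TILE_W 32
#define TILE_H 32

static uint8_t framebuffer[FB_H][FB_W];

/* Stand-in for the fragment pass over one tile: here it just writes a
 * gradient, but on real hardware this is where pixel shading happens,
 * touching only a tile-sized chunk of memory at a time. */
static void shade_tile(int tx, int ty)
{
    for (int y = ty * TILE_H; y < (ty + 1) * TILE_H && y < FB_H; y++)
        for (int x = tx * TILE_W; x < (tx + 1) * TILE_W && x < FB_W; x++)
            framebuffer[y][x] = (uint8_t)((x + y) & 0xff);
}

int main(void)
{
    /* Second pass: one tile at a time, never the whole frame at once. */
    for (int ty = 0; ty * TILE_H < FB_H; ty++)
        for (int tx = 0; tx * TILE_W < FB_W; tx++)
            shade_tile(tx, ty);

    printf("shaded %d tiles of %dx%d pixels\n",
           (FB_W / TILE_W) * (FB_H / TILE_H), TILE_W, TILE_H);
    return 0;
}
```

The point of the design is that the working set of each shade_tile call is tiny, so the GPU can keep it in fast on-chip memory instead of constantly reading and writing a full-resolution framebuffer.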
Tiling solves the problem of limited memory bandwidth, but the GPU must keep a buffer of all the per-vertex data from the first pass so the tiles can later be stitched back into a complete frame.
Rosenzweig discovered that when this buffer overflows, the hardware stops and flushes what it has as a partial render; the first bunny in her write-up shows the incomplete result.
The partial renders can be combined to create a full-body render, but with a dozen extra steps in between.
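The overflow behavior can be sketched the same way. The buffer size, the per-triangle record, and the flush_partial_render helper below are hypothetical; they only illustrate the pattern described above: bin per-vertex data until the buffer would overflow, flush a partial render, and repeat until the frame is finished.

```c
/* Sketch of the overflow-and-flush pattern described above. The buffer
 * size, record layout, and helper names are invented for illustration. */
#include <stdio.h>
#include <string.h>

#define PARAM_BUFFER_BYTES 4096   /* pretend on-chip limit */

static unsigned char param_buffer[PARAM_BUFFER_BYTES];
static size_t param_used;
static int partial_renders;

/* Stand-in for a partial render: on real hardware this is where the tiles
 * binned so far would be shaded and written out to memory; here we just
 * count it and empty the buffer so binning can continue. */
static void flush_partial_render(void)
{
    partial_renders++;
    param_used = 0;
    memset(param_buffer, 0, sizeof(param_buffer));
}

/* Bin one triangle's post-vertex-shading data into the parameter buffer,
 * flushing a partial render first if it would not fit. */
static void bin_triangle(const unsigned char *data, size_t len)
{
    if (param_used + len > PARAM_BUFFER_BYTES)
        flush_partial_render();
    memcpy(param_buffer + param_used, data, len);
    param_used += len;
}

int main(void)
{
    unsigned char tri[96] = {0};    /* fake per-triangle record */

    for (int i = 0; i < 1000; i++)  /* a "complex bunny": many triangles */
        bin_triangle(tri, sizeof(tri));

    flush_partial_render();         /* final flush completes the frame */
    printf("frame took %d partial renders\n", partial_renders);
    return 0;
}
```

A simple scene never hits the overflow path at all, which is why the problem only surfaced once Rosenzweig's test geometry became complex enough.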
Even then, the result is still not quite right: artifacts appear on the bunny's foot. A frame actually lives in two buffers, a color buffer and a depth buffer, and the depth buffer was not being loaded correctly at the start of each partial render.
A configuration reverse-engineered from Apple's own driver resolves the issue, and the bunny finally renders cleanly.
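Here is a toy model of that fix, assuming (as described above) that each pass can either clear its depth buffer or load the one written out by the previous partial render; the enum and helpers are illustrative, not Apple's actual driver configuration.

```c
/* Toy model: after a partial render, the next pass must begin by LOADING
 * the depth buffer that was written out, not by clearing it, or depth
 * tests against earlier geometry are lost and artifacts appear. */
#include <stdio.h>
#include <string.h>

#define FB_PIXELS 16

enum load_action { LOAD_ACTION_CLEAR, LOAD_ACTION_LOAD };

static float depth_in_memory[FB_PIXELS];   /* written out by the previous partial render */
static float depth_on_chip[FB_PIXELS];     /* tile-local depth used while shading */

static void begin_pass(enum load_action depth_load)
{
    if (depth_load == LOAD_ACTION_LOAD)
        memcpy(depth_on_chip, depth_in_memory, sizeof(depth_on_chip));
    else
        for (int i = 0; i < FB_PIXELS; i++)
            depth_on_chip[i] = 1.0f;       /* "cleared" = farthest depth */
}

int main(void)
{
    /* Pretend a first partial render already drew something close to the camera. */
    for (int i = 0; i < FB_PIXELS; i++)
        depth_in_memory[i] = 0.25f;

    begin_pass(LOAD_ACTION_CLEAR);
    printf("clear: depth[0] = %.2f (earlier geometry forgotten -> artifacts)\n",
           depth_on_chip[0]);

    begin_pass(LOAD_ACTION_LOAD);
    printf("load:  depth[0] = %.2f (earlier geometry still occludes)\n",
           depth_on_chip[0]);
    return 0;
}
```

With the clear action, the depth values written by the earlier partial render vanish, so later geometry can incorrectly draw over it; loading the saved depth buffer keeps occlusion consistent across partial renders.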
It isn't just Rosenzweig's open-source graphics driver that has to jump through these hoops to render an image: this is simply how the GPU works, and Apple's own driver does the same.
Its architecture was designed for power-efficient mobile rendering rather than raw desktop horsepower, yet Apple has turned it into something that, by its own claims, can rival or even outperform the latest discrete GPUs.