Last December we released Pencil, a natural stylus that activates three special features when connected with Paper: Erase, Palm Rejection, and Blend. With Pencil, we wanted to eliminate barriers between your ideas and your tools so that you can stay in the creative flow without distractions.

Pencil’s Bluetooth LE connection, coupled with our advanced touch classification, allows Paper to disambiguate a Pencil stroke from one made with the finger. This opens up the possibility of working with two separate tools (and a third if you flip the Pencil around) in rapid succession, minimizing the interruption required to switch tools, and hence keeping you in the creative flow.

Because the tools we chose stay true to their real-world counterparts, they require no additional interface in Paper and work together seamlessly in the way we subconsciously expect. We’re accustomed to using our fingers to smudge colors or charcoal sketches, we know to flip a pencil around to use the eraser, and we don’t think twice about resting our hand on the page. Actions that are second nature to us, however, don’t always translate digitally, and getting a stylus and a touchscreen device to behave as they do in “real life” is far from simple.

Switch effortlessly between drawing and Blend

In this post we will focus on the tools and techniques we developed for Blend, starting with a general overview of its features. We’ll then delve into more technical information about managing complexity in our prototypes, and conclude by examining some of the intricate details that make Blend’s OpenGL shaders work across devices.

Will It Blend?

While we knew a smudge gesture would feel immediately familiar, the desired Blend appearance was far less obvious. Stroke speed alters the characteristics of every tool in Paper, and it was clear to us that Blend would be no exception. The final look and feel emerged slowly after months of experimenting with a wide range of ideas, from a wet, watercolor-like look to a dry, almost chalk-like appearance.

Ultimately, we settled on three distinct Blend phases for slow, medium, and fast strokes that seamlessly transition into one another as you vary the speed at which you swipe across the screen:

  • At slow speeds, Blend behaves like smudging: colors are pulled along the stroke and mix together, and because the stroke has a well-defined outline, you can “shape” objects.
  • At medium speeds, Blend gradually loses its ability to transport color and instead blurs the image under the mask; this is great for smoothing color gradients and removing unwanted detail or noise.
  • At higher speeds, we dial down Blend’s opacity, making the stroke more transparent. Use this for very faint, subtle blur effects, for instance if you want to imitate out-of-focus photography or haze effects.
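
To make the transitions concrete, here’s a minimal GLSL-style sketch of how three such behaviors could be cross-faded by normalized stroke speed. The function, parameter names, and thresholds are our own illustration, not Paper’s actual code:

        // Illustrative sketch: cross-fading the three Blend phases by stroke
        // speed (normalized to [0, 1]). Names and thresholds are assumptions.
        void blendPhases(float speed, out float smudge, out float blur, out float opacity)
        {
            smudge  = 1.0 - smoothstep(0.0, 0.5, speed);        // slow: pull and mix color
            blur    = smoothstep(0.0, 0.5, speed);              // medium: blur under the mask
            opacity = 1.0 - 0.8 * smoothstep(0.5, 1.0, speed);  // fast: fade the stroke
        }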

Blend Mask

Unlike blur tools you may know from Photoshop or Pixelmator, Blend only affects the area under your finger. One might think of this area as painted out or “masked” by an invisible brush.

Just like the opacity and the ability to transport color, the mask shape also changes with the stroke speed. At slow speeds it has a sharp fall-off, creating crisp transitions between the blended areas and the underlying image. As the speed increases, blended areas appear to feather out, resulting in softer transitions.
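
One way to picture the mask math is a radial fall-off whose hardness shrinks with speed. The following is a sketch under our own assumptions (the names and the hardness range are illustrative), not Paper’s shader:

        // dist: fragment's distance from the stroke center; radius: brush radius;
        // speed: normalized stroke speed. Hardness is assumed to shrink with speed.
        float blendMask(float dist, float radius, float speed)
        {
            float hardness = mix(0.9, 0.1, speed);  // sharp fall-off when slow, feathered when fast
            return 1.0 - smoothstep(hardness * radius, radius, dist);
        }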

Mapping Velocity to Blend Parameters

Every feature has its secret sauce. We mentioned a few aspects of how Blend changes with a stroke’s velocity. But while Paper’s Draw tool, for instance, boils down to a single velocity-dependent scalar parameter, Blend requires six of them in total, each with its own characteristics. This required us to hand-tune six separate velocity-to-parameter curves simultaneously. Put differently, Blend’s parameters span a six-dimensional space, in which we were looking to carve out the one-dimensional velocity-to-parameter curve that feels just right.

We started with one of the simplest velocity-to-parameter curves: a clamped linear ramp that depends on four adjustable values (min/max ramp velocity and the corresponding min/max parameter values). Even though a model this simple already yields 4 × 6 = 24 distinct values to tune, the mapping did not prove expressive enough for what we had in mind.
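
In GLSL terms, that initial ramp looks something like this (our reconstruction from the description above; the four adjustable values appear as the first four parameters):

        // Clamped linear ramp: maps t linearly from [t_min, t_max] to
        // [y_min, y_max], clamping outside the input range.
        float linearRamp(float t_min, float t_max, float y_min, float y_max, float t)
        {
            t = clamp((t - t_min) / (t_max - t_min), 0.0, 1.0);
            return mix(y_min, y_max, t);
        }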

To keep the tuning process manageable, we first had to find a suitable family of curves with as few tweakable values as possible, and then build a visualization interface that would make fine-tuning each curve fast and intuitive.

Gamma-Corrected Linear Interpolation

We settled on a gamma-corrected version of the linear ramp above that allowed us to create a non-linear behavior at the cost of only one additional fixed-range parameter:

        // Maps an input t (here: stroke velocity) to an output parameter value.
        // t_min/t_max bound the input range, y_min/y_max the output range, and
        // gamma in [-1, 1] controls the non-linearity of the ramp.
        float gammaRamp(float t_min, float t_max, float y_min, float y_max, float gamma, float t)
        {
            // Normalize t to [0, 1] over the ramp interval.
            t = clamp((t - t_min) / (t_max - t_min), 0.0, 1.0);

            // pow(10.0, gamma) maps the fixed-range gamma to an exponent in
            // [0.1, 10]; the two branches mirror the curve around the diagonal.
            float u = ((gamma >= 0.0)
                       ? pow(t, pow(10.0, gamma))
                       : 1.0 - pow(1.0 - t, pow(10.0, -gamma)));

            float y = mix(y_min, y_max, u);

            return y;
        }

The new gamma parameter takes values in the fixed range [−1, 1]: gamma = 0 reduces to the simple linear ramp above, while positive values make the curve ease in (start shallow, finish steep) and negative values make it ease out (start steep, finish shallow).
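
For example, a hypothetical velocity-to-blur-radius mapping might ease in gently with a positive gamma. The units and numbers here are made up purely for illustration:

        // Hypothetical call: a velocity in [50, 900] maps to a blur radius
        // in [0, 12] with a gentle ease-in.
        float blurRadius = gammaRamp(50.0, 900.0, 0.0, 12.0, 0.4, velocity);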

This gamma-corrected ramp allowed us to dial in the desired velocity curves for every Blend parameter, at the moderate cost of a total of 5 × 6 = 30 tweakable values.

A Graph Says More Than a Thousand Sliders

The slider-based interface we had used so far for fine-tuning Paper did not scale to the number of adjustable variables in Blend. So we broke the fine-tuning out into a prototype written with openFrameworks (using ofxUI), and set up a separate interface element for each Blend parameter, laying out the variables in a way that let the design team quickly zero in on the desired velocity curves:

Having these controls on the iPad screen beside the Blend canvas proved to be crucial: when tuning the feel of a feature, visual reasoning about the exact shape of a response curve is difficult. The right values usually emerge through a lot of trial and error, so quick iteration times are essential.

Shaders and 16-Bit Blur Buffers

Unlike any other brush tool in Paper, Blend composites the current image pixels with a blurred version of the same image. This creates a feedback loop, much like audio feedback. And just as with audio feedback, RGB values start to drift to the extremes of their range when the feedback algorithm is unstable.

Left: Unstable Blur. Right: Stabilized Blur.
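
To make the loop concrete, here is a minimal sketch of the composite step under our own assumptions about names and structure (not the shipping shader). The key point is that the output of one pass becomes part of the input to the next:

        precision highp float;
        uniform sampler2D canvasTexture;  // current canvas pixels
        uniform sampler2D blurTexture;    // blurred copy of the same canvas
        uniform sampler2D maskTexture;    // the invisible mask painted by the stroke
        uniform float strokeOpacity;      // speed-dependent opacity
        varying vec2 uv;

        void main()
        {
            vec4 src  = texture2D(canvasTexture, uv);
            vec4 blur = texture2D(blurTexture, uv);
            float m   = texture2D(maskTexture, uv).a * strokeOpacity;
            gl_FragColor = mix(src, blur, m);  // written back to the canvas, closing the loop
        }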

The main culprits for instability in Blend were rounding and quantization errors when shader values were written back into video memory. For this reason we switched the internal blur buffers to 16-bit floating-point RGBA textures, which are only converted back to 8-bit RGBA at the end of each stroke. This stabilizes the blur even for very long strokes.

iPad Air Hardware Challenges

Just as we were wrapping up development, our blur feedback became unstable once again on the newly released A7 chip used in iPad Air and Retina iPad mini. After some investigation, we found that the framebuffer conversion specs had been changed for OpenGL ES 3, and no longer guaranteed accurately rounded results:

OpenGL ES 3.1 Specification: 2.3.4.2 Conversion from Floating-Point to Normalized Fixed-Point

        The conversion from a floating-point value f to the corresponding unsigned normalized fixed-point value c is defined by first clamping f to the range [0, 1], then computing

        f′ = convert_float_uint(f × (2^b − 1), b)    (2.3)

        where convert_float_uint(r, b) returns one of the two unsigned binary integer values with exactly b bits which are closest to the floating-point value r (where rounding to nearest is preferred).

To test our theory that the lack of accurate rounding was the source of our blur shader’s instability, we built a test app, which confirmed that the A7 GPU hardware indeed truncates extra bits instead of rounding them. This happens even when you open an OpenGL ES 2 context.
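
A probe for this behavior can be as simple as a fragment shader that outputs a value just past the halfway point between two adjacent 8-bit levels; reading the pixel back on the CPU then reveals whether the hardware rounded or truncated. This is a sketch of the idea, not the actual test app:

        // 127.6/255 scaled back to 8 bits gives 127.6: round-to-nearest
        // stores 128, truncation stores 127.
        precision highp float;
        void main()
        {
            gl_FragColor = vec4(vec3(127.6 / 255.0), 1.0);
        }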

To restore stable blur feedback on A7 GPUs, we resorted to performing the rounding operations in the shader ourselves. For a fixed-point framebuffer, this is simple enough to do. However, because our blur buffers are 16-bit floating-point, ensuring consistent rounding behavior is trickier: the precision depends on the number being represented, hence the term “floating-point.”

In the end, we decided to sacrifice some accuracy for a consistent rounding behavior. 16-bit half-floats can represent the integers 0-2048 exactly, so we settled on the following rounding scheme in the shader:

        // Snap each channel to the nearest multiple of 1/2048, which half-floats represent exactly.
        finalColor = floor(finalColor * 2048.0 + 0.5) / 2048.0;

This restored a robust blur without color drift for many iterations on all devices. The manual rounding is only turned on for A7 GPUs, which luckily have enough computational power that we didn’t have to worry about the impact on performance.
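
A natural way to gate this per device is a compile-time switch that the host code defines only when it detects an affected GPU. A sketch of the mechanism, where the flag name is hypothetical and finalColor is the value about to be written, as above:

        #ifdef NEEDS_MANUAL_ROUNDING
            // Only compiled in on GPUs that truncate during the framebuffer write.
            finalColor = floor(finalColor * 2048.0 + 0.5) / 2048.0;
        #endif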

The Perfect Blend

Since Pencil’s launch, creators around the world have used Blend to make sketches that push its boundaries far beyond what we thought possible. Whether it’s used for subtle shading, depth-of-field effects, or playing up colors on the screen, Blend has given rise to a distinctly new visual style in Paper.

Paper sketch by Michael Rose

We believe this creative proliferation is largely due to the uninterrupted workflow that Pencil and Paper facilitate. A comparably dynamic Blend experience simply wouldn’t be possible if it were implemented as a separate tool.

And we are only getting started. With iOS 8, we will be introducing Surface Pressure to add an entirely new dimension to the creative experience in Paper.

If you are interested in tapping into the technology we have built for Pencil, sign up to get access to our FiftyThree SDK.
