Alex Widua

@alexwidua

Followers 12K · Following 802 · Media 32 · Statuses 80

Human Interface Designer at Apple

California
Joined May 2010
@alexwidua
Alex Widua
2 months
excited and proud to share what we've been working on!
@chan_k
Chan Karunamuni
2 months
The latest project I’ve been working on is the design of Liquid Glass, alongside an army of designers and engineers. We're designing it to bend and shape light while feeling like an elastic, flexible material that can dynamically shape-shift, making apps feel fluid and organic.
59
12
714
@alexwidua
Alex Widua
2 years
When designing the app icon I learned about ‘crease patterns’ – structural representations of intricate origami folds. The app icon shows the crease pattern for an Origami Crane – a nod to Origami Studio’s original app icon.
9
16
308
@alexwidua
Alex Widua
2 years
In my day-to-day prototyping I use it mostly to create little utility patches and ‘jigs’ for all sorts of things (transforming values, validating strings, randomizing this and that, etc.). Just describing the desired inputs and outputs of a patch helps a lot – it makes you break down the logic.
1
0
15
@alexwidua
Alex Widua
2 years
I’ve made a little AI copilot for Origami Studio. Think of it as a tiny code editor that sits on top of Origami's canvas and lets you generate patches using GPT-4 – all within the same surface. It’s a native macOS app and you can get it here:
19
26
330
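A hedged sketch of the inputs/outputs prompt shape implied above, sent to OpenAI's chat completions endpoint from Swift. The patch format Origami expects isn't specified in these posts, so the system prompt and the requestPatch helper below are purely illustrative:

```swift
import Foundation

// Illustrative only: describe a patch by its inputs and outputs and ask
// GPT-4 for a definition. The system prompt is an assumption, not the
// app's actual prompt.
func requestPatch(inputs: String, outputs: String, apiKey: String) async throws -> Data {
    var request = URLRequest(url: URL(string: "https://api.openai.com/v1/chat/completions")!)
    request.httpMethod = "POST"
    request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    let body: [String: Any] = [
        "model": "gpt-4",
        "messages": [
            ["role": "system",
             "content": "You generate Origami Studio patches. Reply with the patch definition only."],
            ["role": "user",
             "content": "Inputs: \(inputs)\nOutputs: \(outputs)"]
        ]
    ]
    request.httpBody = try JSONSerialization.data(withJSONObject: body)
    let (data, _) = try await URLSession.shared.data(for: request)
    return data
}
```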
@alexwidua
Alex Widua
2 years
PS: a little accidental find: you get a directional Motion Blur for free if you blur the layer *before* applying the distortion shader. The shader compresses the blur horizontally but stretches/emphasizes it vertically
1
3
36
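A minimal sketch of that modifier ordering in SwiftUI (iOS 17+). ShaderLibrary.stretch stands in for a custom distortion shader defined in a .metal file – its name and parameter are assumptions:

```swift
import SwiftUI

struct DirectionalBlurCard: View {
    var progress: CGFloat = 0.5

    var body: some View {
        RoundedRectangle(cornerRadius: 24)
            .fill(.blue.gradient)
            .frame(width: 240, height: 160)
            // 1. Blur first: a uniform gaussian blur...
            .blur(radius: 8)
            // 2. ...then distort: the shader compresses the blurred pixels
            // horizontally and stretches them vertically, turning the
            // uniform blur into a directional motion blur for free.
            .distortionEffect(
                ShaderLibrary.stretch(.float(progress)), // assumed shader
                maxSampleOffset: CGSize(width: 0, height: 60)
            )
    }
}
```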
@alexwidua
Alex Widua
2 years
The animation progressively stretches and squeezes the layer at the same time. One trick was to delay each value a little bit, giving each one room to breathe and do its thing. The code is up here:
1
6
58
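A minimal sketch of that delay trick, assuming the effect is driven by two scale factors. The spring parameters and the 0.08 s delay are illustrative, not the original's:

```swift
import SwiftUI

struct StretchSqueeze: View {
    @State private var expanded = false

    var body: some View {
        RoundedRectangle(cornerRadius: 20)
            .fill(.purple.gradient)
            .frame(width: 160, height: 160)
            // The horizontal squeeze starts immediately...
            .scaleEffect(x: expanded ? 0.8 : 1.0)
            .animation(.spring(response: 0.4, dampingFraction: 0.6), value: expanded)
            // ...while the vertical stretch is delayed slightly, so the
            // two values don't move in lockstep.
            .scaleEffect(y: expanded ? 1.3 : 1.0)
            .animation(.spring(response: 0.4, dampingFraction: 0.6).delay(0.08), value: expanded)
            .onTapGesture { expanded.toggle() }
    }
}
```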
@alexwidua
Alex Widua
2 years
Couldn't resist remixing @jmtrivedi's mesh animation and tried to recreate a 'genie effect' using SwiftUI
13
27
585
@alexwidua
Alex Widua
2 years
The blur strength responds to the drag/scale velocity. I'm using @jmtrivedi's fantastic Wave package, which makes it easy to work with the velocity and animate the shader values. The code is up here:
1
2
89
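A sketch of the velocity-to-blur mapping in plain SwiftUI. The original uses the Wave package plus a custom shader; here iOS 17's DragGesture velocity and a plain .blur stand in, and the tuning constants are guesses:

```swift
import SwiftUI

struct VelocityBlurCard: View {
    @State private var offset: CGSize = .zero
    @State private var blur: CGFloat = 0

    var body: some View {
        RoundedRectangle(cornerRadius: 24)
            .fill(.teal.gradient)
            .frame(width: 200, height: 140)
            .offset(offset)
            .blur(radius: blur)
            .gesture(
                DragGesture()
                    .onChanged { value in
                        offset = value.translation
                        // Map velocity magnitude to a clamped blur radius;
                        // the divisor and cap are arbitrary tuning values.
                        let speed = hypot(value.velocity.width, value.velocity.height)
                        blur = min(speed / 200, 12)
                    }
                    .onEnded { _ in
                        withAnimation(.spring()) {
                            offset = .zero
                            blur = 0
                        }
                    }
            )
    }
}
```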
@alexwidua
Alex Widua
2 years
Prototyped a subtle Zoom Blur and Motion Blur SwiftUI shader this weekend. It's a little detail that changes the feel of the dragged element entirely (and is fun to play around with)
11
16
531
@alexwidua
Alex Widua
2 years
It’s an amalgam of SpriteKit particle layers, gradients and blend modes. Together they create a nice materiality. I liked the idea of using a turbulence burst to change between the material's colors
7
8
175
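A rough sketch of one such particle layer plus a turbulence 'burst', assuming a SpriteKit scene hosts everything. All values, and the 'spark' texture asset, are illustrative:

```swift
import SpriteKit

class MaterialScene: SKScene {
    override func didMove(to view: SKView) {
        let sparkle = SKEmitterNode()
        sparkle.particleTexture = SKTexture(imageNamed: "spark") // assumed asset
        sparkle.particleBirthRate = 400
        sparkle.particleLifetime = 1.5
        sparkle.particleSpeed = 8
        sparkle.particleColor = .white
        sparkle.particleBlendMode = .add  // additive blending for shimmer
        sparkle.fieldBitMask = 1          // opt particles into field effects
        sparkle.position = CGPoint(x: size.width / 2, y: size.height / 2)
        addChild(sparkle)

        // Turbulence field, normally dormant...
        let turbulence = SKFieldNode.turbulenceField(withSmoothness: 0.5,
                                                     animationSpeed: 2)
        turbulence.categoryBitMask = 1
        turbulence.strength = 0
        addChild(turbulence)

        // ...then briefly spiked to scatter the particles while the
        // material's color is swapped underneath.
        turbulence.run(.sequence([
            .run { turbulence.strength = 40 },
            .wait(forDuration: 0.3),
            .run { turbulence.strength = 0 }
        ]))
    }
}
```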
@alexwidua
Alex Widua
2 years
Tried to re-create the new iPhone’s Titanium particle effect last night and made this interactive ‘material sample’ in SwiftUI
62
177
3K
@alexwidua
Alex Widua
2 years
@ollybromham For the timing, I just had a hunch of how it should move and feel. I pulled some of the variables into sliders and built a little 'design tool' that allowed me to fiddle with the values and feel it out (this is what makes SwiftUI such a neat prototyping tool!) :-)
4
2
114
@alexwidua
Alex Widua
2 years
@ollybromham The stick itself is a spline that gets more flexion the longer it gets (like a flimsy piece of wood). It's a very subtle detail that gives the interface more materiality and makes it feel less like a slider – even if you're not consciously aware of it
1
0
44
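A minimal sketch of a stick whose bend grows with its length, assuming a single quadratic Bézier. The flexion constant is an arbitrary stand-in:

```swift
import SwiftUI

struct FlexibleStick: Shape {
    var length: CGFloat       // how far the stick is pulled out
    var flexion: CGFloat = 0.15

    func path(in rect: CGRect) -> Path {
        var path = Path()
        let base = CGPoint(x: rect.midX, y: rect.maxY)
        let tip = CGPoint(x: rect.midX, y: rect.maxY - length)
        // The control point drifts sideways proportionally to length,
        // so a short stick is nearly straight and a long one visibly bends.
        let control = CGPoint(x: rect.midX + length * flexion,
                              y: rect.maxY - length / 2)
        path.move(to: base)
        path.addQuadCurve(to: tip, control: control)
        return path
    }
}

// Usage: FlexibleStick(length: 120).stroke(.brown, lineWidth: 4)
```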
@alexwidua
Alex Widua
2 years
@ollybromham I liked the idea of having a flick 'matchstick' gesture to start the timer. It punctuates the moment and contrasts the otherwise calm and ephemeral feel. Relying on drag velocity alone to detect a flick wasn't enough – I found that flick gestures fired fewer touch events, so I
1
1
24
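Since the post is cut off mid-explanation, here is only a guess at the heuristic: combine a velocity threshold with a count of touch updates (using iOS 17's DragGesture velocity). The thresholds are invented:

```swift
import SwiftUI

struct FlickDetector: ViewModifier {
    let onFlick: () -> Void
    @State private var sampleCount = 0

    func body(content: Content) -> some View {
        content.gesture(
            DragGesture(minimumDistance: 5)
                .onChanged { _ in sampleCount += 1 }
                .onEnded { value in
                    let speed = hypot(value.velocity.width, value.velocity.height)
                    // A flick is fast *and* short-lived: few touch updates
                    // arrive before the finger leaves the screen.
                    if speed > 800 && sampleCount < 10 {
                        onFlick()
                    }
                    sampleCount = 0
                }
        )
    }
}
```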
@alexwidua
Alex Widua
2 years
@ollybromham Realized that the thread didn't post, so here's an addendum: The smoke is created with SpriteKit (an Emitter and a Turbulence Field). I found that quickly oscillating the strength creates a dispersion effect, which goes nicely with the device's accelerometer (the
4
6
59
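A sketch of the oscillating field strength, assuming the smoke emitter itself is configured in an .sks file. The oscillation frequency and amplitude are illustrative:

```swift
import SpriteKit

class SmokeScene: SKScene {
    private let turbulence = SKFieldNode.turbulenceField(withSmoothness: 0.8,
                                                         animationSpeed: 1)

    override func didMove(to view: SKView) {
        if let smoke = SKEmitterNode(fileNamed: "Smoke") { // assumed .sks asset
            smoke.fieldBitMask = 1
            smoke.position = CGPoint(x: size.width / 2, y: 0)
            addChild(smoke)
        }
        turbulence.categoryBitMask = 1
        addChild(turbulence)
    }

    override func update(_ currentTime: TimeInterval) {
        // Rapidly oscillating the field strength makes the smoke disperse
        // and re-gather instead of drifting uniformly.
        turbulence.strength = Float(8 + 6 * sin(currentTime * 12))
    }
}
```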
@alexwidua
Alex Widua
2 years
Riffed on @ollybromham's Incense timer in SwiftUI. A pinch of gestures, haptics, motion, blurs and particles
24
116
2K
@alexwidua
Alex Widua
2 years
One little detail is the grid animation while detecting a frame. I wanted it to feel like the frame is being assembled/constructed. It’s pretty simple and almost entirely driven by a spring delay value (SwiftUI). (Commented gist:
7
21
413
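A minimal sketch of an index-staggered spring, assuming a simple square grid. The per-cell delay constant is the single value doing the work:

```swift
import SwiftUI

struct AssemblingGrid: View {
    @State private var shown = false
    let columns = 6, rows = 4
    let delayPerCell = 0.02

    var body: some View {
        Grid(horizontalSpacing: 4, verticalSpacing: 4) {
            ForEach(0..<rows, id: \.self) { row in
                GridRow {
                    ForEach(0..<columns, id: \.self) { col in
                        let index = row * columns + col
                        RoundedRectangle(cornerRadius: 3)
                            .fill(.green)
                            .frame(width: 20, height: 20)
                            .scaleEffect(shown ? 1 : 0.2)
                            .opacity(shown ? 1 : 0)
                            // Each cell runs the same spring, offset by its
                            // index – the "assembly" read comes entirely
                            // from this delay.
                            .animation(
                                .spring(response: 0.35, dampingFraction: 0.7)
                                    .delay(Double(index) * delayPerCell),
                                value: shown
                            )
                    }
                }
            }
        }
        .onAppear { shown = true }
    }
}
```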
@alexwidua
Alex Widua
2 years
Once detected, a high-res version is loaded, mapped onto a plane, and stored as a reference for future updates. It's a lot of hot glue and cobbled-together parts, but it feels pretty magic to just pull frames from the screen like that :-) 2/2
1
1
121
@alexwidua
Alex Widua
2 years
It's just a little idea I had while toying around with the visionOS SDK. How it works: it's a Figma plugin that talks to an iOS app via WebSockets. All Figma frames are loaded as AR reference images which makes them detectable by ARKit (works surprisingly well) 1/2.
3
5
256
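A sketch of the ARKit half: turning downloaded frame images into detectable reference images. The WebSocket transport is omitted, and the 0.1 m physical width is an assumption:

```swift
import ARKit

// Register each Figma frame's image as an ARReferenceImage so ARKit can
// detect it in the camera feed.
func startDetecting(frames: [String: CGImage], in session: ARSession) {
    var referenceImages = Set<ARReferenceImage>()
    for (name, cgImage) in frames {
        // physicalWidth tells ARKit the real-world size to expect.
        let ref = ARReferenceImage(cgImage,
                                   orientation: .up,
                                   physicalWidth: 0.1)
        ref.name = name // links the anchor back to its Figma frame later
        referenceImages.insert(ref)
    }
    let config = ARWorldTrackingConfiguration()
    config.detectionImages = referenceImages
    session.run(config, options: [.resetTracking])
}
```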
@alexwidua
Alex Widua
2 years
Prototyped a little app that allows you to take frames from @figma and 'pull them into space'. The frames stay linked to the canvas and update on any changes
214
2K
13K