Hiding actions in buttons with UIKit
May 13, 2024
Today's post was inspired by this fancy list of hidden actions on Instagram that appears when a user holds their finger on the share button.
We won't reproduce the component style verbatim, but we'll work through its animations and composition, and study what UIKit has to offer that isn't obvious at first glance.
The composition of this component is relatively simple: a primary action and a few secondary actions in a hidden list, all accompanied by animations and transformations. You can't tell from the video, but there is also haptic feedback when the list is shown and when hovering over an action in it.
Let's dive into details and take each part individually.
Laying the groundwork
The component in the example has two types of actions:
- a primary action that is triggered when the button is pressed
- a secondary action that corresponds to the selected item in the list
Working with SwiftUI, you get used to the fact that any component can be composed with any other. Coming back to UIKit, you try to bring over similar approaches. In our case, the `UIInteraction` protocol is a suitable tool: it describes an interface for interactions that can be attached to arbitrary views.
One standard example is the `UIContextMenuInteraction` type, which shows a context menu with actions when an item is held:
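As a refresher, attaching such an interaction takes little code. A minimal sketch, where the delegate type and the menu items are purely illustrative:

```swift
import UIKit

// Hypothetical delegate that supplies the menu; titles are placeholders.
final class MenuProvider: NSObject, UIContextMenuInteractionDelegate {
    func contextMenuInteraction(
        _ interaction: UIContextMenuInteraction,
        configurationForMenuAtLocation location: CGPoint
    ) -> UIContextMenuConfiguration? {
        UIContextMenuConfiguration(identifier: nil, previewProvider: nil) { _ in
            UIMenu(children: [
                UIAction(title: "Copy") { _ in /* handle copy */ },
                UIAction(title: "Share") { _ in /* handle share */ },
            ])
        }
    }
}

// Attach it like any other interaction.
let menuButton = UIButton(type: .system)
let menuDelegate = MenuProvider()
menuButton.addInteraction(UIContextMenuInteraction(delegate: menuDelegate))
```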
Let's start by describing a new type, `SecondaryActionsInteraction`, and conforming it to the `UIInteraction` protocol. The compiler will force us to inherit from `NSObject`. Don't be afraid: this is standard practice for UIKit, since most of it has its roots in Objective-C.
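A bare skeleton of the type might look like this:

```swift
import UIKit

// NSObject inheritance is required because UIInteraction is an Objective-C protocol.
final class SecondaryActionsInteraction: NSObject, UIInteraction {
    // UIInteraction requires read access to the view it is attached to.
    private(set) weak var view: UIView?

    func willMove(to view: UIView?) {
        // Called before the interaction is attached to (or detached from) a view.
    }

    func didMove(to view: UIView?) {
        self.view = view
    }
}
```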
The `willMove` and `didMove` methods represent the `UIInteraction` lifecycle; through them we will customise the view for secondary actions. To handle the finger-hold gesture on the view, we will add an instance of `UILongPressGestureRecognizer`.
To display the list of secondary actions, we define an instance of `UIStackView`, which will be filled with `UILabel` instances.
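A minimal version of the container, kept to labels as described; the spacing, fonts, and action titles here are placeholders:

```swift
import UIKit

// The container for secondary actions; hidden until the long press begins.
let actionsStackView: UIStackView = {
    let stack = UIStackView()
    stack.axis = .vertical
    stack.alignment = .leading
    stack.spacing = 12
    stack.alpha = 0
    return stack
}()

// One label per action; styling is purely illustrative.
func makeActionLabel(_ title: String) -> UILabel {
    let label = UILabel()
    label.text = title
    label.font = .preferredFont(forTextStyle: .callout)
    return label
}

["Repost", "Send", "Add to story"].forEach {
    actionsStackView.addArrangedSubview(makeActionLabel($0))
}
```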
💡 This view can be replaced with one closer to the demo, or with any other custom view. For the sake of brevity, we will continue with the stack of labels.
The implementation of lifecycle methods contains gesture and actions container re-binding between views.
To see intermediate progress, we will make the stack visible at the beginning of the gesture and hide it again at the end.
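Putting the pieces together, a sketch of the re-binding and the intermediate alpha toggle might look like this (layout of the stack is omitted, and the handler will grow in the following sections):

```swift
import UIKit

final class SecondaryActionsInteraction: NSObject, UIInteraction {
    private(set) weak var view: UIView?
    private let actionsStackView = UIStackView()
    private lazy var longPress = UILongPressGestureRecognizer(
        target: self, action: #selector(handleLongPress)
    )

    func willMove(to view: UIView?) {
        // Detach the gesture and the actions container from the previous view.
        self.view?.removeGestureRecognizer(longPress)
        actionsStackView.removeFromSuperview()
    }

    func didMove(to view: UIView?) {
        self.view = view
        guard let view else { return }
        view.addGestureRecognizer(longPress)
        view.addSubview(actionsStackView)
        actionsStackView.alpha = 0
        // Positioning of the stack relative to the view is omitted for brevity.
    }

    @objc private func handleLongPress(_ gesture: UILongPressGestureRecognizer) {
        switch gesture.state {
        case .began: actionsStackView.alpha = 1
        case .ended, .cancelled, .failed: actionsStackView.alpha = 0
        default: break
        }
    }
}
```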
As a test bench, create an instance of `UIButton` and attach the interaction to it by calling the `addInteraction` method.
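The test bench can be as simple as this (button styling is arbitrary):

```swift
import UIKit

let shareButton = UIButton(type: .system)
shareButton.setTitle("Share", for: .normal)

// The interaction attaches itself via willMove/didMove.
shareButton.addInteraction(SecondaryActionsInteraction())
```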
💡 `addInteraction` is part of the `UIView` interface, so the interaction can be added to any other view; it doesn't necessarily have to be a button.
Moving the flow
In the Instagram demo, the list of secondary actions moves diagonally from the bottom left corner to the top right corner, alongside changing its transparency.
For this purpose, we will use `CGAffineTransform`. Add an `applyPinTransform` method to a `UIView` extension. Its implementation makes a translation on both axes equal to the height of the actions list.
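A sketch of the extension, assuming the list moves right and up by its own height (the exact offsets in the original may differ):

```swift
import UIKit

extension UIView {
    // Shift the view towards the top-right corner by its own height on both axes.
    func applyPinTransform() {
        transform = CGAffineTransform(translationX: bounds.height, y: -bounds.height)
    }
}
```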
💡 We put `-bounds.height` because in iOS the origin of the coordinate system is at the top-left corner and the y-axis grows downward, so we need to move the element closer to the origin to shift it up.
We also shouldn't forget the animation, so that the transformations we apply don't look choppy.
Alongside the transparency changes, we will add the transformation calls. At the beginning of the gesture the list will expand, and at the end it will shrink and return to its initial location.
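Inside the interaction's gesture handler, the begin and end stages might be wired up like this (durations are our pick; the surrounding switch is abbreviated):

```swift
// Inside handleLongPress(_:), within the switch over gesture.state:
case .began:
    UIView.animate(withDuration: 0.3) {
        self.actionsStackView.alpha = 1
        self.actionsStackView.applyPinTransform()
    }
case .ended, .cancelled, .failed:
    UIView.animate(withDuration: 0.3) {
        self.actionsStackView.alpha = 0
        self.actionsStackView.transform = .identity
    }
```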
In the original animation, the list doesn't just move out to the corner, but also scales from a point to its full size.
Let's add the appropriate scaling to the transformation method by introducing a `scale` parameter. We will need different scale values at different stages of the gesture, so the parameter makes configuration more convenient.
Before starting the gesture, we set the scale of the list to `0.1` so that it always starts compressed. Inside the animation block, we reset it to full scale, and at the end of the gesture we set it back to `0.1`.
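A sketch with the `scale` parameter added, and the corresponding handler changes:

```swift
import UIKit

extension UIView {
    // Translation plus scaling: 0.1 for the collapsed state, 1.0 for the expanded one.
    func applyPinTransform(scale: CGFloat) {
        transform = CGAffineTransform(translationX: bounds.height, y: -bounds.height)
            .scaledBy(x: scale, y: scale)
    }
}

// In the gesture handler:
// case .began:
//     actionsStackView.applyPinTransform(scale: 0.1) // compressed start, no animation
//     UIView.animate(withDuration: 0.3) {
//         self.actionsStackView.alpha = 1
//         self.actionsStackView.applyPinTransform(scale: 1)
//     }
// case .ended, .cancelled, .failed:
//     UIView.animate(withDuration: 0.3) {
//         self.actionsStackView.alpha = 0
//         self.actionsStackView.applyPinTransform(scale: 0.1)
//     }
```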
💡 We use a value of 0.1 rather than 0.0 because with the latter the shrink to the initial state happens without animation: the framework fails to interpolate a zero-scale transform.
But it's still not completely faithful to the original animation, in which the list is scaled not from its centre but from a corner. Let's fix this by tweaking the view's `anchorPoint`.
Changing `anchorPoint` also affects the position of the view, so to compensate for the possible offset we calculate an additional translation.
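A common recipe for this (the helper name is ours): move `layer.anchorPoint` and compensate `layer.position` by the same amount so the view doesn't jump:

```swift
import UIKit

extension UIView {
    // Change the anchor point without visually moving the view.
    func setAnchorPoint(_ point: CGPoint) {
        let newPoint = CGPoint(x: bounds.width * point.x,
                               y: bounds.height * point.y)
        let oldPoint = CGPoint(x: bounds.width * layer.anchorPoint.x,
                               y: bounds.height * layer.anchorPoint.y)
        // Shift the position by exactly the amount the anchor moved.
        layer.position.x += newPoint.x - oldPoint.x
        layer.position.y += newPoint.y - oldPoint.y
        layer.anchorPoint = point
    }
}

// For a bottom-left corner scale, as in the demo:
// actionsStackView.setAnchorPoint(CGPoint(x: 0, y: 1))
```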
Interacting with secondary actions
Next, we are going to breathe some life into the interaction with items from the secondary actions list. At the stage when the user moves the gesture across the screen, only the `.changed` gesture state is involved. Within it, we are interested in two scenarios.
The first is when the user holds their finger on top of an action view. In this case, we memorise that particular view and apply a small lifting transformation to it. As an optimisation, we do this only when the active action view changes.
💡 The transformation to highlight the active action can be adjusted, for example by adding scaling
The second scenario is when the user moves their finger outside the list. Then we reset the currently active action view and its transformation to put the view back in place.
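A sketch of the `.changed` branch covering both scenarios; `activeOptionView` is a `UIView?` property on the interaction, and the lift offsets are our guess:

```swift
// Inside handleLongPress(_:):
case .changed:
    let location = gesture.location(in: actionsStackView)
    let hitView = actionsStackView.arrangedSubviews.first { $0.frame.contains(location) }

    if let hitView, hitView !== activeOptionView {
        // Scenario 1: the finger moved onto a new action, lift it slightly.
        activeOptionView?.transform = .identity
        activeOptionView = hitView
        UIView.animate(withDuration: 0.2) {
            hitView.transform = CGAffineTransform(translationX: 8, y: -2)
        }
    } else if hitView == nil, activeOptionView != nil {
        // Scenario 2: the finger left the list, put the active view back.
        UIView.animate(withDuration: 0.2) {
            self.activeOptionView?.transform = .identity
        }
        activeOptionView = nil
    }
```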
All that remains is to teach this interaction to work with different actions and call their respective handlers. For this purpose, we define an `Action` type, which holds the action title and its handler. An array of these values will be used to populate the `actionsStackView`.
💡 You can also add an icon, colour, etc. to the `Action` type.
The action will be considered triggered if the long press ends while we have an instance of `activeOptionView`. Let's define a separate scenario for the `.ended` state, which will check for the presence of `activeOptionView` and, through its index in `actionsStackView`, get the corresponding action to call.
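The `Action` type and the `.ended` branch might look like this, with `actions` being the array the interaction was initialised with:

```swift
// The action model: a title plus a handler to call.
struct Action {
    let title: String
    let handler: () -> Void
}

// Inside handleLongPress(_:):
// case .ended:
//     if let activeOptionView,
//        let index = actionsStackView.arrangedSubviews.firstIndex(of: activeOptionView) {
//         actions[index].handler()
//     }
//     // ...then hide the list and reset state, as before.
```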
We can now populate the interaction initialiser with actions, as well as specify a handler for each specific action.
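Usage could then look like this (titles and handlers are placeholders):

```swift
let interaction = SecondaryActionsInteraction(actions: [
    Action(title: "Repost") { print("repost tapped") },
    Action(title: "Send") { print("send tapped") },
    Action(title: "Add to story") { print("story tapped") },
])
shareButton.addInteraction(interaction)
```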
Feeling the feedback
The `UIImpactFeedbackGenerator` type is used when we want to engage the haptic engine. Its initialiser accepts a feedback style, which determines the strength of the vibration.
In our case, it is sufficient to trigger the vibration once when the hold gesture starts, and every time the user moves their finger onto a secondary action.
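A minimal sketch, assuming a single shared generator on the interaction (the `.light` style is our pick):

```swift
import UIKit

let feedback = UIImpactFeedbackGenerator(style: .light)

// On .began, and whenever activeOptionView changes:
feedback.prepare()        // optional: warms up the Taptic Engine to cut latency
feedback.impactOccurred() // fires the actual haptic tap
```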
Conclusion
What could have been done differently? Well, for example, instead of using `UIInteraction`, we could inherit from `UIControl` and handle the long-press gesture and list display the same way.
Our implementation allows us to add this pin interaction to any view. It could be a plain `UIView` instance, or even a `UILabel` with text.
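One caveat worth noting for the label case: `UILabel` has user interaction disabled by default, so it must be enabled before the long press can be recognised (`someActions` is a placeholder):

```swift
let label = UILabel()
label.text = "Hold me"
label.isUserInteractionEnabled = true // labels ignore touches by default
label.addInteraction(SecondaryActionsInteraction(actions: someActions))
```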
Depending on the conditions, it may be necessary to change the initial parameters of the pin layout. Attention should also be paid to existing gestures and interaction availability to avoid ambiguous behaviour and gesture conflicts.
Thanks for reading, see you in the next articles 🙌