Illustrated avatars have become a great way to personalize the user experience of an app or website, add some fun, and establish a unique online presence. With Rive, you can bring those avatars to life by adding animation and interactivity.
Feel free to download and play around with the Rive file here.
Design and Planning
Starting with a strong design goes a long way, so we’ve developed a simple yet effective look for the avatar we are planning to animate. There are some things to keep in mind when it comes to planning a character animation like this:
- Start with simple shapes. You can add details later if needed, but because avatars are displayed at varying sizes across the web, the design should read well at a small size.
- Think about what parts of the avatar you’d like to animate. For this design, we will give the character the ability to look around. Adding some parallax to the hair and glasses will sell the idea of 3d movement. We’d also like the character to emote (happy, sad), so we need features like a mouth and eyes.
- List out the animations you’ll need to breathe life into the avatar. We are planning on creating an idle state, a happy state, and a sad state.
Rigging and Setup
Here we’ve got our character rigged and ready to animate. Let’s break down the steps we took to make this happen:
- To make controlling and animating our character’s head a bit easier to manage, we added an empty group and set its style to “Target”. This gives us a control point that we can parent the features of the head to (hair, glasses, eyes, mouth, ears, etc.).
- Instead of just grouping or nesting the features, we’ve applied transform constraints. By adjusting the strength of the constraint on each feature or layer, we can quickly add some depth to our character. This helps give the feeling of 3d to our animations.
- The ears move in the opposite direction of the rest of the features, and we can achieve this by using a negative value for the strength of our transform constraint.
- We’ve also added two bones to the rig and have parented the glasses hinges to one bone and the temple tips to the other bone. This allows the glasses to move in “3d” along with the movement of the head and face.
- Adding a “distance” constraint to the face control group limits how far it can move, which keeps the facial features on the face where they belong.
The way the face is rigged is similar to how a 2d video game adds depth with a parallax effect: objects in the foreground (the hair and glasses, in our avatar’s case) move at a much greater rate than objects off in the distance (the surface of the face itself), which appear to move more slowly.
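The constraint setup above can be sketched in a few lines of code. This is an illustrative model only — the strengths and offsets below are made-up example values, not the ones in the Rive file:

```typescript
// Sketch of the parallax rig: each feature copies the face target's
// offset, scaled by its transform constraint's strength.
interface Point {
  x: number;
  y: number;
}

// strength > 1 exaggerates movement (foreground features like the hair
// and glasses), strength < 1 dampens it (the face surface), and a
// negative strength moves a feature (the ears) the opposite way.
function constrainedOffset(target: Point, strength: number): Point {
  return { x: target.x * strength, y: target.y * strength };
}

const faceTarget: Point = { x: 10, y: -4 };       // where the head is "looking"
const hair = constrainedOffset(faceTarget, 1.2);  // foreground: moves more
const face = constrainedOffset(faceTarget, 0.6);  // background: moves less
const ears = constrainedOffset(faceTarget, -0.3); // opposite direction
```

Tuning a single strength value per feature is all it takes to place that feature nearer or farther in the implied depth.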
We’ve gone ahead and created an “idle” animation state for the avatar, which consists of keying the position of the face control group so the character looks around in a somewhat random manner. We’ve also animated the opacity of the eyes open and eyes closed groups, making the character blink a few times during the idle animation; using “hold” interpolation for these keys makes it easy to toggle the visibility of each group on and off. This animation would be displayed whenever the user isn’t interacting with anything onscreen. One thing to note: because we have tied the facial animation to a target group, on desktop we could have the avatar’s gaze follow the mouse cursor (we could also have the avatar track typing across an input field).
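On the runtime side, that cursor-following idea could be implemented by clamping the pointer offset the same way the distance constraint clamps the face control group. A hedged sketch — the function name and the radius value are our own assumptions, not part of the Rive file:

```typescript
// Map a cursor position to a face-target offset, clamped to a maximum
// distance from the face center, mimicking the "distance" constraint.
function cursorToTarget(
  mouseX: number,
  mouseY: number,
  centerX: number,
  centerY: number,
  maxDist: number // example radius; tune to match the constraint
): { x: number; y: number } {
  const dx = mouseX - centerX;
  const dy = mouseY - centerY;
  const dist = Math.hypot(dx, dy);
  if (dist <= maxDist) return { x: dx, y: dy }; // within range: follow directly
  const scale = maxDist / dist;                 // clamp onto the limit circle
  return { x: dx * scale, y: dy * scale };
}
```

Feeding the result into the face target each frame would make the gaze track the pointer while the features stay on the face.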
Now we want our animated avatar to emote and react to different actions the user might take within an app. Maybe our app contains a feature where users can like or dislike something. We’ve gone ahead and animated a “happy” emote animation and a “sad” animation as well, giving our character an excellent range of emotion.
Interactivity (State Machine)
To make this avatar interactive, we will add a State Machine to the project and rename it “avatar”. It’s worth noting that if you plan on handing the project off to an engineer, you should sync with them on the naming conventions to use within your project. Let’s drop our idle animation onto the graph and hook the Entry node up to our idle state, so that when the State Machine loads, it starts playing the idle animation. Next, we want to be able to trigger the happy and sad animations, so let’s pull those onto the graph as well and connect idle to the happy state and idle to the sad state.
In order to trigger these states, we’ll need some inputs. Let’s add a couple of boolean inputs and rename them isHappy and isSad. Over in the conditions panel for the transition between idle and happy, we will select our input isHappy. We want to trigger the animation when isHappy is set to true. Let’s also make sure that we can only activate one emotion at a time, so we will add an additional condition that checks to make sure isSad is set to false. Once these two conditions are met, our happy animation will play!
We also need to transition from happy or sad back to our idle state, so let’s drag a transition from happy back to our idle node. For our conditions, let’s add one that requires the isHappy input to be set to false and one that requires isSad to be set to false as well.
We can repeat these steps for our sad state: the transition from idle to sad should require isSad to be true and isHappy to be false, and the transition from sad back to idle should require both inputs to be false. Let’s play our State Machine and test these out!
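The transition logic we just wired up can be summarized in code. This is a simplified model of the graph’s conditions, not how Rive evaluates state machines internally:

```typescript
// States and inputs match the names used in the State Machine.
type AvatarState = "idle" | "happy" | "sad";

interface Inputs {
  isHappy: boolean;
  isSad: boolean;
}

// One evaluation step of the graph: idle -> happy or idle -> sad fires
// only when the matching input is true and the other is false; happy or
// sad returns to idle once both inputs are false.
function nextState(current: AvatarState, inputs: Inputs): AvatarState {
  if (current === "idle") {
    if (inputs.isHappy && !inputs.isSad) return "happy";
    if (inputs.isSad && !inputs.isHappy) return "sad";
  } else if (!inputs.isHappy && !inputs.isSad) {
    return "idle";
  }
  return current; // no transition condition met: stay put
}
```

Note how the "only one emotion at a time" rule falls out of the paired conditions: setting both inputs to true leaves the avatar in idle.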
Not bad, but our avatar jumps from our idle state right to the happy state. It’s a bit jarring, so let’s set the duration between idle and happy, as well as idle and sad, to around 150ms. This gives us a nice blend between the two animation states.
OK, much better, but you might have noticed that the opacity for our mouth groups is animating during that transition, which isn’t the behavior we’d like to see. We can solve this by creating animations for each mouth state (closed, happy, sad) where we key the opacity of the mouth group we want visible to 100%, and the other mouth groups we key the opacity to 0%. We then want to add these animations to a new layer within our State Machine. We will hook these up to the “Any State” node and set these animations to trigger along with the main character animations. We won’t set any duration time for these transitions, which will make the mouth jump to each state without fading in or out. This gives a much more natural look to the mouth transitions. Let’s take our State Machine for a test drive... yep, everything is looking great!
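The second layer’s job boils down to picking exactly one visible mouth group from the same two inputs. A rough sketch of that mapping (opacity values as percentages; the function is our own illustration, not a Rive API):

```typescript
type MouthGroup = "closed" | "happy" | "sad";

// Return the keyed opacity for each mouth group so that exactly one is
// visible at a time, with the closed mouth as the fallback. Because the
// transitions on this layer have no duration, the values snap between
// 0 and 100 instead of cross-fading.
function mouthOpacities(
  isHappy: boolean,
  isSad: boolean
): Record<MouthGroup, number> {
  const happy = isHappy && !isSad;
  const sad = isSad && !isHappy;
  return {
    happy: happy ? 100 : 0,
    sad: sad ? 100 : 0,
    closed: !happy && !sad ? 100 : 0,
  };
}
```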
Now that we’ve got a few emote states for our avatar, our developer can render the avatar within our app and trigger animations using the State Machine inputs! All we need to do is go to the “Export” icon and select “Download → For newest runtimes” to get our .riv file.