If you want to get an understanding of the importance, scale and ubiquity of computer animation, all you need to do is take a look at the top 50 highest-grossing films of all time.

The number one spot is taken by James Cameron's Avatar, a film that relies so heavily on computer animation that its development was delayed for years to allow the technology to catch up.

There are numerous fully computer-animated movies in the list – Frozen, Toy Story, The Lion King, Minions, and loads more. In fact, we think you would be hard pushed to find a single movie in the entire list that doesn’t involve some type of computer-generated imagery.

It’s hard to imagine a time before computer animation, and without it, many modern films would be unrecognizable, if they would even exist at all. The video game industry would be a very different place too.

How did this technique become so huge and important? Let’s take you on a journey to find out.

Jumpstart your ideas with Linearity Curve

Take your designs to the next level.

Hold up, what is computer animation?

First off, we need to pin down what we mean by computer animation. It's a general term for the art of creating animated images using computers and digital tools.

It follows the same basic principle as other kinds of animation, such as stop-motion animation and illustrated 2D animation, where tiny steps are made on a frame-by-frame basis to create the illusion of movement.

In the case of computer animation, an image is displayed on a computer monitor before being rapidly replaced by a new image that is almost identical, but just a very tiny step forward in time.

This process takes place at a rate of 24, 25, or 30 frames per second, and as a result, the human eye perceives it as continuous movement.

2D animation vs. 3D animation

The way computer animation works differs depending on whether it is 2D or 3D. In 3D animation, objects are digitally modeled and then attached to conceptual frameworks, or ‘skeletons’.

In 2D animation, separate objects are drawn on separate transparent layers, and they can be animated with or without a virtual skeleton.

Whether it is 2D or 3D, the technique by which the animation is produced is the same. Take the example of a figure animation: the animator sets the placement of a character's limbs and facial expressions on what are called key frames.

The computer then automatically works out the differences in appearance between the key frames and fills in the gaps, so to speak. This gap-filling process is known as tweening or morphing.

The final stage in the computer animation process is rendering. For 3D animations, rendering takes place after all of the modeling and animation work is complete. For 2D animations, rendering is tied to the key frame process itself, with tweened frames rendered as needed.

Computer animation has massive advantages over other techniques. A key benefit is that it requires far less labor than stop-motion or conventional hand-drawn cartoon animation.

Rather than painstakingly painting every single frame, or moving 3D figures step by step, computing power can do both of these things much more quickly, which is ideal for animation studios.

This has led computer animation to be regarded as the digital successor to stop-motion and hand-drawn animation, and it's now far and away the most popular animation technique.

When the ball started rolling

There were some experiments in computer-graphics animation during the 1940s and 1950s, with the most notable coming from John Whitney in 1958, when he created what is credited as the first-ever computer animation.

Collaborating with legendary graphic designer Saul Bass, Whitney created the title sequence for Alfred Hitchcock’s Vertigo using an analog computer. Whitney is now considered one of the fathers of computer animation and has earned a spot in animation history.

The real acceleration of modern computer animation came after the advent of digital computers in the 1960s.

Universities like Ohio State University and the University of Utah established departments to support computer animation, and other institutions like the National Film Board of Canada began experimenting with the new discipline, many with the shared goal of creating an animation program.

These early explorations in computer animation were mainly directed towards scientific, research, and engineering purposes. One of the first ever computer-generated films came out of Bell Laboratories in 1963. 

It had the catchy title A Two Gyro Gravity Gradient Attitude Control System and showed a box with edge lines that represented a satellite orbiting around a sphere that represented the Earth.

In 1973, computer animation made it out of the lab and onto the big screen when digital image processing was used in a feature film for the first time.

Transform Your Ideas into Animations

Dive into the world of animation with Linearity Move. Perfect for beginners and professionals alike, our course guides you through creating stunning animations for any purpose.

Director Michael Crichton enlisted John Whitney Jr (son of John Whitney) and Gary Demos of Information International, Inc. for some of the scenes in Westworld. To reflect the point of view of the android in the movie, they digitally processed the film to make it appear pixelated.

Soon after, wireframes appeared in films such as George Lucas’ Star Wars and Ridley Scott’s Alien. George Lucas was very interested in pursuing advances in CGI around this time. In 1979 he took some of the top talent from the highly respected Computer Graphics Laboratory at the New York Institute of Technology and set up his own special effects division.

This division was later spun off as an independent company with funding from Apple co-founder Steve Jobs. And what was this new company called? Pixar.

Throughout the 1970s and early 1980s, technological advances continued, such as the introduction of the framebuffer. By the 1980s, this new technology was pushing digital animation into new places.

More than acceptable in the 80s

Advances in computing power, combined with increased affordability and new developments in commercial software, meant that throughout the 1980s, the quality and prevalence of computer animation and computer-generated imagery kept increasing.

This was the era in which solid 3D CGI was developed to the point where it could be used in a movie for the first time. Disney's Tron was released in 1982 and is now regarded as a real milestone in the movie industry: its use of solid 3D CGI, a first for a film, represented a giant step forward.

The vehicles and digital terrains of the film were all produced with CGI and showed what could be achieved with the technology. From here on in, we see CGI being used in more and more movies, right up to the present day.

Morphing, or tweening, also improved dramatically in the 80s. Up to this point, morphing had mainly been used with vector animations, but by the early 1980s, the technology was making it possible to morph between photographic images to create photorealistic animation.

The first public example of this in action came from the New York Institute of Technology in 1982, when Tom Brigham presented a video sequence of a woman morphing into a lynx at a conference.

By 1988, morphing had made its way onto the big screen in Ron Howard's movie Willow, and it was also used to great effect in Terminator 2: Judgment Day in 1991. The technique probably reached the peak of its popularity when Michael Jackson used it in his music video for Black or White.

The video premiered simultaneously in 27 countries to reach an audience of 500 million people and brought morphing to the forefront of public consciousness. Computer animation was heading for the big time.

Breaking on through

The 1990s was the decade in which computer animation really started to take over and become a significant part of the film and TV industry. The CGI and morphing used in Terminator 2: Judgment Day were regarded as the most substantial use of CGI in a movie since Tron way back in 1982. The 90s also saw the first 2D animated movies produced using only the Computer Animation Production System (CAPS).

In 1990, Disney released The Rescuers Down Under, which was created using just the CAPS system. Disney then followed up with Beauty and the Beast in 1991. It was once again made using only CAPS, but it went even further and incorporated 3D CGI effects too. The movie was a huge box office success and also became the first animated film to be nominated for the Best Picture Oscar.

CGI was used in an increasing number of movies and TV shows, like Jurassic Park, Babylon 5, and The Lion King. Then, in 1995, another huge milestone was reached – Disney-Pixar released the first-ever fully computer-animated feature film, Toy Story.

A runaway success, Toy Story became one of the highest-grossing films of all time, revealing the true potential of computer animation and 3D characters. Now, computer animation is far and away the most prevalent type of animated film.

Motion capture

Another big development in computer animation in the 1990s came from improvements in motion capture. In short, motion capture records the movements of people or objects.

To capture human motion, a performer wears a series of markers placed near each joint, and the system tracks how those markers move. Often, a special suit fitted with markers is worn. Motion capture can also record details such as facial expressions and hand movements. The captured movement data is then mapped onto a 3D model, and new graphic and animation elements can be layered over it.

Motion capture’s initial application was as a biomechanics research tool. It was first used for commercial purposes in video game production in the late 1980s before being adopted by the film industry in the late 1990s.

A notable example from this time was the use of motion capture to create the Jar Jar Binks character animation in Star Wars: Episode I – The Phantom Menace. Many people strongly disliked this character, so perhaps it wasn’t the best showcase for the new computer animation technique after all.

One of the most significant breakthroughs in motion capture was Andy Serkis’ performance as Gollum in The Lord of the Rings: The Two Towers, the first feature film to use a real-time motion capture system. The technique was able to carry the nuances of Serkis’ performance into Gollum’s facial animation and gave the CGI a real human character.

Computer animation today

Nowadays, computer animation and computer-generated images are the absolute norm in the television, film, and video game industries. Incremental improvements in technology have increased the capabilities of CGI while simultaneously making it more accessible.

Increased processing power and better software mean that high-quality computer animation is no longer confined to major players with powerful workstation computers.

Many home computers are now capable of producing computer animations that would have previously required giant dedicated rendering machines. This has created new opportunities for individuals and companies to experiment with animation.

The intersection of computer-generated animation and traditional animation techniques has significantly shaped the entertainment industry. Companies like DreamWorks Animation and Pixar Animation Studios have mastered the art of blending these methodologies, creating films rich in both visual effects and emotional depth.

Ready to create brand assets that pack a punch?

Visit our Academy for free marketing design courses.

This fusion has been further enhanced by advances in affective computing, which allow for more nuanced digital characters capable of complex motion and realistic animation.

The types of animated content have also diversified, spanning film, real-time video games, and even procedural animation for interactive media.

This expansion has required compatibility across multiple file formats and industry-standard software.

While file size used to be a constraining factor, advances in visualization technology have made it easier to maintain complete, high-quality images.

Graphical user interfaces in animation software have also evolved, becoming more user-friendly while offering a more extensive range of technical controls, from tweaking the frame rate to making intricate facial motion adjustments.

The animation series and movies produced today are nothing short of dynamic images brought to life, with intricate motion techniques ranging from the simple to the sublime.

And whether it's the realistic ambiance created by Industrial Light & Magic or the heartwarming characters from Pixar, the industry has come a long way in defining what animation can achieve.

The blend of traditional and modern, whether it's a simple 2D animation or a complex 3D animation, highlights how far the animation world has come and sets the stage for future innovations.

If you want to learn how to animate, the barrier to entry has never been lower, and the potential has never been greater. You no longer need an animation degree to get started in the field. You can start right now.

CGI and computer animation have become so popular because they make almost anything possible – the only limit is your imagination.

The same goes for illustrators and graphic designers using Linearity Curve (formerly Vectornator) and animators using Linearity Move. Thanks to our unique tools and cool templates, there’s no limit to what you can create!

