Live2D
Live2D is an animation technique (not to be confused with software used to create Live2D animation, such as Live2D Cubism) used to animate static images—usually anime-style characters—by separating an image into parts and animating each part accordingly, without the need for frame-by-frame animation or a 3D model. This enables characters to move with 2.5D movement while maintaining the original illustration.
Live2D models consist of layered parts saved as a Photoshop file (.psd format). Layers are moved independently to produce the character's full animation and expression, such as tilting the head. Parts can be as simple as the face, hair, and body, or as detailed as individual eyebrows, eyelashes, and even effects like glinting metal.
The number of layers influences how the Live2D character moves and how three-dimensional the result appears: simpler models have fewer layers (around 50), while larger, more complex projects can have hundreds, with no hard upper limit. Unlike a 3D model, there is no traditional skeleton; instead, the flat layers are warped and rotated.
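The warping-without-a-skeleton idea can be illustrated with a minimal sketch. This is not the Live2D Cubism API; the function, parameter names, and depth values below are hypothetical, chosen only to show how shifting flat layers by different amounts in response to a single head-turn parameter produces a 2.5D parallax effect:

```python
import math

def warp_layer(vertices, angle_x, depth):
    """Shift a flat layer's vertices horizontally in proportion to a
    head-turn parameter (angle_x, in degrees). Layers with a greater
    'depth' value shift further, which makes purely 2D art appear to
    rotate in three dimensions."""
    shift = math.sin(math.radians(angle_x)) * depth
    return [(x + shift, y) for (x, y) in vertices]

# Two illustrative quad layers: a "face" and a deeper "back hair" layer.
face = [(-1.0, -1.0), (1.0, -1.0), (1.0, 1.0), (-1.0, 1.0)]
hair = [(-1.2, -1.2), (1.2, -1.2), (1.2, 1.2), (-1.2, 1.2)]

# Turning the head 30 degrees shifts the deeper hair layer further
# than the face layer, creating parallax between the two.
face_turned = warp_layer(face, 30.0, depth=0.2)
hair_turned = warp_layer(hair, 30.0, depth=0.5)
```

A real rigging tool would warp dense per-layer meshes with many interacting parameters, but the principle is the same: deformation of flat layers, not rotation of a 3D skeleton.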
Live2D can be combined with real-time motion capture to track head and eye movements and to perform lip syncing, for real-time applications such as VTubing. The downside of the technology is its limited capability for 360° rotation of complex objects and for full-body tracking.
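The tracking pipeline described above can be sketched as a per-frame mapping from raw tracker readings into clamped model parameters. The parameter names below resemble Cubism's conventional IDs but are used here illustratively; the tracker fields and ranges are assumptions, not any specific SDK's API:

```python
def clamp(value, lo, hi):
    """Keep a value within the parameter's allowed range."""
    return max(lo, min(hi, value))

def map_tracking(tracking):
    """Convert raw face-tracker output (head yaw in degrees, eye and
    mouth openness in 0..1) into clamped model parameters for one frame.
    Clamping matters: trackers can report values outside the rig's
    supported range, e.g. a head turn beyond what the art allows."""
    return {
        "ParamAngleX": clamp(tracking["head_yaw"], -30.0, 30.0),
        "ParamEyeOpen": clamp(tracking["eye_open"], 0.0, 1.0),
        # Lip sync: the mouth parameter follows detected mouth openness.
        "ParamMouthOpen": clamp(tracking["mouth_open"], 0.0, 1.0),
    }

# A frame where the streamer turned their head past the rig's limit.
frame = {"head_yaw": 45.0, "eye_open": 0.9, "mouth_open": 0.3}
params = map_tracking(frame)
```

Software such as VTube Studio performs this kind of mapping continuously, typically with smoothing between frames to avoid jitter.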
Live2D has been used in a wide variety of video games, visual novels, virtual YouTuber channels, and other media. Well-known examples of software and media that use Live2D include FaceRig, VTube Studio, Nekopara, Azur Lane, and VTuber Legend, as well as virtual YouTubers themselves (popularized by agencies such as Hololive, Nijisanji, and VShojo).
Live2D was first introduced in 2008 to meet the demand for interactive media. Since then, the technology has also changed how games enhance the user experience through lively character movement and expressions.
In 2009, Cybernoids Co. Ltd. (now Live2D Ltd.) released its first Live2D application, Live2D Vector. The application deforms vector graphics so that flat character images can achieve three-dimensional head-turning and movement effects. Although such characters can only perform limited actions, they are far more engaging than static pictures or slideshows. Users can also customize their moving character by adjusting parameters in the software or by supplying materials such as images of the character from different angles. Although storage requirements are reduced compared to 3D models, rendering complex images consumes significant CPU and RAM. In addition, while traditional art styles such as oil painting or gouache can be difficult to work with, creators have experimented with these styles successfully.
The first application of the Live2D technique was HibikiDokei, an alarm-clock app released in 2010 by sandwichproject (株式会社レジストプランニング). The app features a girl character named "Hibiki" who talks and moves.
