The latest issue of IEEE Computer Graphics and Applications is focused on sketch-based interaction. In "A Sketch-Based Interface for Clothing Virtual Characters" (PDF format, 10 pages, 1.61 MB), an international team led by French computer scientists, working with fashion designers, presents an intuitive way to design virtual clothing. Their method "determines a garment's shape and how the character wears it based on a user-drawn sketch. The system then uses distances between the 2D garment silhouette and the character model to infer remaining distance variations in 3D." This method could soon be used not only by the video-game industry, but also for real garments.
Before going further, here are some examples of garments created using this sketch-based approach. It took the designer less than five minutes to draw each of these garments. "The strokes' jagged appearance in the drawings resulted from our using a mouse as the input device, rather than a more adequate graphics tablet." (Credit: IEEE Computer Graphics and Applications)
The research team, which includes Laurence Boissieux, an engineer at INRIA Rhône-Alpes in Grenoble, France, also presented its results at EUROGRAPHICS 2006. The presentation, "Virtual Garments: A Fully Geometric Approach for Clothing Design," is included in the EUROGRAPHICS 2006 proceedings (Volume 25, Number 3, September 2006). Here are two links, to the abstract and to the full paper (PDF format, 10 pages, 1.94 MB).
The following introduction clarifies the team's approach.
Modeling dressed characters is known as a very tedious process. It usually requires specifying 2D fabric patterns, positioning and assembling them in 3D, and then performing a physically-based simulation. The latter accounts for gravity and collisions to compute the rest shape of the garment, with the adequate folds and wrinkles. This paper presents a more intuitive way to design virtual clothing. We start with a 2D sketching system in which the user draws the contours and seam-lines of the garment directly on a virtual mannequin.
Our system then converts the sketch into an initial 3D surface using an existing method based on a precomputed distance field around the mannequin. The system then splits the created surface into different panels delimited by the seam-lines. The generated panels are typically not developable. However, the panels of a realistic garment must be developable, since each panel must unfold into a 2D sewing pattern. Therefore our system automatically approximates each panel with a developable surface, while keeping them assembled along the seams. This process allows us to output the corresponding sewing patterns.
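To make the first geometric step more concrete, here is a minimal, hypothetical sketch, not the authors' code, of the core idea: the 2D gap ("clearance") between a sketched garment stroke and the character's silhouette is reused as the 3D distance between the garment surface and the body. The mannequin here is simplified to a sphere so its distance field is analytic; the real system precomputes a distance field around a full character mesh.

```python
import math

# Hypothetical mannequin: a unit sphere at the origin. The paper's system
# precomputes a distance field around the actual character model; a sphere
# keeps the distance query analytic for this illustration.
CENTER = (0.0, 0.0, 0.0)
RADIUS = 1.0

def distance_to_body(p):
    """Signed distance from a 3D point p to the mannequin surface."""
    dx, dy, dz = (p[i] - CENTER[i] for i in range(3))
    return math.sqrt(dx * dx + dy * dy + dz * dz) - RADIUS

def lift_silhouette_point(x, y, clearance, iters=60):
    """Map a sketched 2D point (x, y) to a 3D garment point.

    The 2D distance between the garment silhouette and the body
    ('clearance') becomes the 3D distance between the garment surface and
    the mannequin: we search along the front-facing view direction (+z)
    for the depth where the body's distance field equals the clearance.
    """
    lo, hi = 0.0, 2.0 * RADIUS  # hi is well in front of the body
    # distance_to_body grows with z on this interval, so bisect
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if distance_to_body((x, y, mid)) < clearance:
            lo = mid
        else:
            hi = mid
    return (x, y, 0.5 * (lo + hi))
```

For example, a stroke point sketched at (0.5, 0.0) with a clearance of 0.2 is lifted to a 3D point whose distance to the body is 0.2, producing a garment surface that floats off the mannequin by exactly the margin the designer drew.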
Below is a picture describing the different steps from an initial sketch to the final garment. From left to right, you can see the sketched contours, the 3D shape computed using the distance field, the piecewise developable surface, and the final virtual garment, which you can compare with the real one sewn from the 2D patterns generated by the research team (Credit: EUROGRAPHICS 2006).
Finally, here are some of the researchers' conclusions about how their method could be used for dynamic animation of garments.
The 2D patterns we generate can be used by a cloth animation system to compute the rest lengths of the springs that model the cloth material. Thus, the garments we design can be animated using standard techniques. Another option would be to take advantage of our procedural modeling of fabric folds during animation. Then, only the control mesh would need to be animated using physically-based simulation, while fine details such as folds, costly to simulate since they are caused by stiff, buckling phenomena, would be added procedurally at no cost prior to rendering.
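The rest-length idea above can be sketched in a few lines. This is an illustrative toy, not the researchers' code: a hypothetical quad-shaped pattern panel is given in flattened 2D coordinates, each spring's rest length is read straight off the 2D pattern (valid because each panel is approximately developable, so 2D pattern distances match distances on the 3D garment), and a standard Hooke spring force uses those rest lengths during simulation.

```python
import math

# Hypothetical flattened sewing pattern: vertices in 2D pattern space and
# the edges that become springs in a mass-spring cloth model.
pattern_uv = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]  # (0, 2) is a shear spring

def rest_lengths(uv, springs):
    """Rest length of each spring = its length in the flattened 2D pattern."""
    out = []
    for i, j in springs:
        du = uv[i][0] - uv[j][0]
        dv = uv[i][1] - uv[j][1]
        out.append(math.hypot(du, dv))
    return out

def spring_force(p_i, p_j, rest, stiffness=100.0):
    """Hooke force on 3D particle p_i from the spring connecting it to p_j."""
    d = [p_j[k] - p_i[k] for k in range(3)]
    length = math.sqrt(sum(c * c for c in d))
    scale = stiffness * (length - rest) / length
    return [scale * c for c in d]
```

Once the rest lengths are fixed from the 2D patterns, any standard cloth integrator (explicit or implicit) can animate the 3D garment vertices; the pattern geometry only enters through these rest lengths.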
But the team is also thinking about other applications, such as the prototyping of real garments. "A fashion designer could use our sketching system to quickly sketch some clothing, automatically get different 3D views of the garment, edit the model as necessary, and finally print the corresponding 2D patterns."
Best wishes to the team for 2007: these researchers deserve it.
Sources: IEEE Computer Graphics and Applications, January/February 2007 issue; and various websites