We developed an interactive mesh painting interface in JavaScript. Users can create texture maps by painting directly onto 3D meshes in a viewer, or by painting on UV coordinates in a separate drawing application window. Additionally, users can import their own meshes and texture maps, save their newly created textures, and generate default meshes for painting.
Our platform offers a fast and simple user interface for texturing meshes, catering to quick and casual needs, unlike more complex solutions such as Substance 3D Painter or Blender, which are typically used in a professional context. We achieved this by leveraging front-end tools like the Node.js environment, the Three.js API (which provides methods for creating a WebGL viewer, loading meshes and textures, raycasting, etc.), and the HTML canvas element.
We break our project down into several subparts:
Following the Three.js documentation, we set up on-screen rendering with a WebGL renderer, a camera, lights, materials, a mesh object, viewer interactivity, and a raycaster. We also created a list of file paths referring to mesh objects that can be loaded when the user presses “Generate Mesh.”
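The “Generate Mesh” path list can be sketched with a small helper that cycles through the available files. The names and paths below are illustrative, not the project's actual identifiers:

```javascript
// Hypothetical list of mesh files; the real project keeps its own paths.
const meshPaths = ['models/cube.glb', 'models/sphere.glb', 'models/head.glb'];

// Returns a function that yields the next mesh path each time the user
// presses "Generate Mesh", wrapping around at the end of the list.
function makeMeshPicker(paths) {
  let i = -1;
  return () => paths[(i = (i + 1) % paths.length)];
}

const nextMesh = makeMeshPicker(meshPaths);
// In the viewer, each call would be followed by loading the returned
// path with a Three.js loader such as GLTFLoader.
```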
We expanded upon the code from CodingNepal for our drawing app, which is built on the HTML canvas element. Beyond CSS styling, our modifications include displaying the brush/eraser size as a circle around the moving cursor and supporting texture import and UV display. We also fixed some details of the original drawing application, such as smoother lines and correct brush alignment on resized windows.
Three.js supports reading a texture directly from an HTML canvas element. We used this to render the texture map from the drawing app straight onto our mesh. Because updating the texture every frame would be costly, we chose to update it only when the user draws on the screen.
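The update-on-draw strategy can be sketched with a simple dirty flag. Here `texture` stands in for a Three.js `CanvasTexture`, whose real `needsUpdate` flag tells the renderer to re-upload the canvas contents; the helper names are our own:

```javascript
// Wraps a texture-like object so the expensive GPU re-upload happens only
// on frames where the user actually drew something.
function makeTextureUpdater(texture) {
  let dirty = false;
  return {
    // Called from the canvas draw/erase event handlers.
    markDirty() { dirty = true; },
    // Called once per render frame; flips needsUpdate only when needed.
    flush() {
      if (!dirty) return false;
      texture.needsUpdate = true;
      dirty = false;
      return true;
    },
  };
}
```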
We initially debugged this by rendering a cube rather than an imported mesh, since we were simultaneously running into loading errors. That problem was later fixed as described here.
To support importing files, we create an input element with createElement('input'). This lets us restrict which file types can be imported and define what to do with their contents. For texture images, the handler maps the loaded image onto the current mesh. For a glTF/GLB file, the variables that reference the mesh are reassigned to the newly imported mesh.
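A minimal sketch of the dispatch on file type; the helper name is ours, and the browser-side wiring is shown as comments because it requires the DOM and Three.js loaders:

```javascript
// Pure helper: choose which loader to hand the file to, by extension.
function loaderKind(filename) {
  return /\.(glb|gltf)$/i.test(filename) ? 'gltf' : 'texture';
}

// Browser-side sketch (assumes a Three.js GLTFLoader and TextureLoader
// named gltfLoader and textureLoader):
//   const input = document.createElement('input');
//   input.type = 'file';
//   input.addEventListener('change', () => {
//     const file = input.files[0];
//     const url = URL.createObjectURL(file);
//     if (loaderKind(file.name) === 'gltf') {
//       gltfLoader.load(url, (gltf) => { /* swap in gltf.scene */ });
//     } else {
//       textureLoader.load(url, (tex) => { /* map tex onto the mesh */ });
//     }
//   });
//   input.click();
```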
This was one of the more challenging parts of the project. While researching techniques for painting over meshes, we came across a few resources. This demo takes inspiration from the following paper, which dynamically creates and unwraps the UVs of a mesh so that painting on it is fast and efficient, as opposed to drawing over a mesh that is already UV-unwrapped and static. However, the paper notes that this approach is limited to casual mesh painting rather than animated film or game assets, which require carefully placed UVs from the start for texture mapping and skinning. We therefore decided to base our implementation on drawing over a static, UV-unwrapped mesh, because we wanted our program to mirror other professional 3D paint programs and to teach others how UV mapping works.
The diagram below illustrates how we implemented drawing directly onto the mesh via raycasting. Three.js provides its own Raycaster class, which handles intersections with the faces of a mesh. Calling the intersection method gives us the face the mouse ray hit, along with its interpolated UV coordinates. Once the UV coordinates are obtained, we compute their position on the drawing app by scaling by the canvas width and height. We scale (1 - v) by the height instead of v because the canvas places its origin (0, 0) at the top left, so the v axis must be flipped.
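The UV-to-canvas mapping itself is a one-line transform; a sketch (the function name is ours):

```javascript
// Map interpolated UV coordinates from a raycast hit to pixel coordinates
// on the drawing canvas. v is flipped because canvas y grows downward.
function uvToCanvas(u, v, canvasWidth, canvasHeight) {
  return { x: u * canvasWidth, y: (1 - v) * canvasHeight };
}
```

For example, the UV point (0, 1) lands at the canvas origin in the top-left corner.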
Each face connects three vertices into a triangle, and each vertex in a face corresponds to a UV coordinate. To render the UV shell layer, we iterate through the faces of the mesh, look up the corresponding UV coordinates, and connect them into a triangle. The difficulty lies in identifying faces: in code, the faces are represented as a buffer of indices where each consecutive trio of indices forms a face. These indices let us fetch the corresponding UV coordinates from the UV buffer.
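With a BufferGeometry-style layout, the index buffer and a flat UV buffer are enough to recover the triangles. A sketch, with our own helper name:

```javascript
// indices: flat array where each consecutive trio of entries is one face.
// uvs: flat array of interleaved (u, v) pairs, one pair per vertex.
// Returns one [[u,v],[u,v],[u,v]] triangle per face, ready to stroke
// onto the UV shell canvas.
function facesToUvTriangles(indices, uvs) {
  const triangles = [];
  for (let f = 0; f < indices.length; f += 3) {
    const tri = [];
    for (let k = 0; k < 3; k++) {
      const i = indices[f + k];
      tri.push([uvs[2 * i], uvs[2 * i + 1]]);
    }
    triangles.push(tri);
  }
  return triangles;
}
```

Drawing each returned triangle with moveTo/lineTo on the “uv-mesh” canvas reproduces the UV shells.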
To make the UV shell toggleable, we used two canvas elements: “drawingapp” is the drawable texture layer and “uv-mesh” isolates the UV shells. By setting their position to “absolute” in CSS, we stacked the two canvas elements on top of each other, treating them as canvas “layers”. This lets us draw and erase them separately.
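The layering itself is plain CSS; a sketch using the two canvas ids from the project (the pointer-events rule is our assumption about how strokes reach the lower layer):

```css
/* Stack both canvases at the same origin so they act as layers. */
#drawingapp,
#uv-mesh {
  position: absolute;
  top: 0;
  left: 0;
}

/* Assumption: let mouse strokes pass through the UV overlay to the
   drawable texture layer underneath. */
#uv-mesh {
  pointer-events: none;
}
```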
Along the way, we encountered various issues with our project that we had to solve. We highlight the two most prominent problems we successfully tackled below:
We initially ran into many timing problems: assets were sometimes accessed before they had finished loading. For example, the init function, where we initialize the mesh and lights, should run first, followed by the main render loop, where we read information off the mesh. However, print debugging showed that assets had not finished loading until after the main loop had already started. We solved this with JavaScript's Promise API and its async/await syntax. Handling this was especially challenging when we had to combine the WebGL renderer and the canvas.
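The fix can be sketched with async/await. `loadAssets` below is a stand-in for wrapping Three.js loader callbacks in a Promise, and the other names are illustrative:

```javascript
// Stand-in loader: the real code would wrap GLTFLoader/TextureLoader
// callbacks in a Promise that resolves once the asset is ready.
function loadAssets() {
  return new Promise((resolve) => {
    setTimeout(() => resolve({ mesh: 'ready', texture: 'ready' }), 0);
  });
}

function startMainLoop(assets) {
  // The real loop would call renderer.render(scene, camera) each frame.
}

async function init() {
  // Awaiting here guarantees the main loop never sees half-loaded assets.
  const assets = await loadAssets();
  startMainLoop(assets);
  return assets;
}
```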
Originally, we intended to expand the functionality of the MeshEdit code for our project. After working on this for three weeks, we opted to switch platforms because MeshEdit relies on OpenGL 2.1, which is largely deprecated on Windows and Linux computers. Additionally, we encountered suspected errors within the texture coordinate and vertex buffers, causing our textures to render as individual triangles on the mesh.
After consulting with Prof. Ng and conducting our own research on the best graphics API for our purposes, we stumbled across the Three.js API. This high-level framework provides a comprehensive set of tools for implementing core graphics programs and algorithms in WebGL. Utilizing Three.js enables us to integrate our project into the UCBUGG: 3D Modeling and Animation course webpage, where two of us serve as facilitators. This integration aims to enhance students' understanding of UV mapping and texturing in a more intuitive manner, while also providing a faster and more accessible way to texture meshes for their semester-long animated short film.
Throughout the time we spent working on this project, we took away the following lessons:
Please watch the final video explanation here for the best overview of our project!
- Created the 3D WebGL interface and controls using the Three.js API, implemented cross-compatibility between the drawing app and 3D viewer, implemented raycasting and UV coordinate transformations for drawing directly on the mesh, created the list of meshes to generate, implemented the opacity slider, and attempted UV mapping on the deprecated MeshEdit version of our project. Also modeled the anime head.
- Implemented Promise handling for preloading assets, added a toggleable UV mesh layer for the canvas, did general debugging, and developed the draw and import code in the deprecated MeshEdit version.
- Implemented the brush/eraser, the drawing app, texture export, and website styling.
- Loaded glTF/GLB files and implemented imports so users can supply their own textures or models.