With all the hubbub around generative AI, it isn't a stretch to wonder what new areas of making this stuff might proliferate into next. You can easily have ChatGPT write text for you or analyze your writing. You can instruct Midjourney, DALL-E, and other image generators to draw highly detailed, pixel-perfect creations in a variety of styles. What about 3D printing, though? Can you type into a text box and obtain the perfect custom 3D printable model? Right now the answer is: kind of. However, in the very near future, that answer might be a resounding yes.
As of winter 2023–24, there really aren't any systems marketed specifically for 3D printing, so I'll talk about the general concept of text-to-3D-model generation. This goal was out of reach a year ago when we published our guide to "Generative AI for Makers" (Make: Volume 84). In the short time since, the AI landscape has changed extremely fast, and now we have a few different options for playing with text-prompted 3D model generators.
Ultimately, these tools are aimed primarily at video game assets, so they have issues when it comes to 3D printing. While they do technically work, what you'll find is that the current generation of AI model generators relies on the color texture layer to convey many details that simply will not exist when you 3D print. This means your print may come out blobby, lacking detail, or even oddly formed.
There are a few places where you can try this kind of thing, such as 3DFY.ai, Sloyd, Masterpiece X, and Luma AI. Since Luma is free and easy, I tried it.
Text Prompt to 3D Model
In Figure A you can see the results of the prompt “cute toad, pixar style, studio ghibli, fat.” (Don’t judge me, I know what I like.) The textured version looks OK from certain angles, but we can see the feet and belly have some issues (Figure B), and fine detail is lacking (Figure C).
3D Model to 3D Print
I had to convert the GLB file output by Luma AI to an STL file using Blender (Figure E); a script for doing that conversion is sketched below. Aside from that, the model was ready to print. What you see in the figure is the result of a successful print from my Bambu X1 Carbon.
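If you want to do that GLB-to-STL conversion without clicking through Blender's menus, here's a minimal sketch using Blender's Python API, run headless from the command line. The file names are hypothetical, and the exact operator names vary between Blender versions; this assumes a Blender 3.x-era API.

# Minimal sketch: convert a GLB file to STL with Blender's Python API.
# Run it headless:  blender --background --python glb_to_stl.py
# File names are hypothetical; operator names may differ in newer Blender releases.
import bpy

GLB_PATH = "toad.glb"   # hypothetical input from the AI generator
STL_PATH = "toad.stl"   # hypothetical output for the slicer

# Start from an empty scene so only the imported mesh gets exported
bpy.ops.wm.read_factory_settings(use_empty=True)

# Blender's glTF importer handles both .glb and .gltf files
bpy.ops.import_scene.gltf(filepath=GLB_PATH)

# Select every imported mesh object for export
for obj in bpy.data.objects:
    obj.select_set(obj.type == 'MESH')

# Export the selection as a single STL, ready for slicing
bpy.ops.export_mesh.stl(filepath=STL_PATH, use_selection=True)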
While we can now say that we have used AI to generate a 3D printable model, we can also see that the geometry around the belly is very messed up. Printing it this way resulted in trapped supports that made a mess when I tried to remove them. I could bring the model into modeling software and rebuild the feet and belly, but at that point, with those skills, what do I need the AI for in the first place?
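If you'd rather catch broken geometry like this before wasting filament, a quick mesh sanity check helps. Here's a minimal sketch using the open-source trimesh Python library (not something Luma AI provides; the file names are hypothetical) that reports whether an exported STL is watertight and attempts a basic automatic repair.

# Minimal sketch: sanity-check an AI-generated mesh before slicing.
# Uses the open-source trimesh library (pip install trimesh);
# file names here are hypothetical examples.
import trimesh

mesh = trimesh.load("toad.stl")

# A printable solid should be watertight with consistent face winding
print("watertight:", mesh.is_watertight)
print("winding consistent:", mesh.is_winding_consistent)

if not mesh.is_watertight:
    # Try a basic automatic repair: fix normals and fill small holes
    trimesh.repair.fix_normals(mesh)
    trimesh.repair.fill_holes(mesh)
    mesh.export("toad_repaired.stl")
    print("repaired and saved; watertight now:", mesh.is_watertight)

A check like this won't fix badly malformed features such as the toad's belly, but it will flag meshes that are likely to confuse the slicer or generate support problems.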
We’ve already seen 2D AI generative tools built into laser cutter software such as the xTool Creative Space. As these 3D tools improve, I can envision a near future where this kind of AI is built into slicers. Very soon you might just open your slicer, tell it what object you want, pick the best result, and hit Print!
This article first appeared in Make: Volume 88.