2014 has been a full-throttle year for 3D printing, ever since January's Consumer Electronics Show (CES) introduced us to dozens of new machines. It's clear that additive fabrication has caught the attention of major brands in all sectors (Adobe, Microsoft, Hasbro, Dremel), and the push to mainstream this technology has reached a new level.
Although there's been a lot of hoopla, most of the changes to actual functionality have been small, with slow and steady improvements made to existing (and sometimes cloned) hardware, software, and documentation. Many machines are still in their adolescence, but a few have blossomed early, and their polished appearance has begun to attract wider consumer attention.
When unpacking the machines for our third annual Shootout weekend, I immediately noticed a dramatic, consumer-product-style change in packaging and overall fit and finish. Printers once arrived in packing peanuts and were built from laser-cut plywood; now most ship in custom foam inserts reminiscent of desktop computer packaging, with bodies of injection-molded plastic. These machines are steadily evolving, but does their performance meet the expectations set by their consumer-ready facades?
We were keen to find out. The core group of 3D-printing test-team veterans (some of whom have been present at all three Shootouts) began preparing more than a month before our trip to this year's new location at America Makes in Youngstown, Ohio. With the addition of 3D-printing research scientist Andreas Bastian, our test methods advanced beyond mere visual inspections of Thingiverse objects. We drafted a flexible evaluation protocol and created parametric models that could be quickly adapted to any unexpected situation. These preparations, combined with the onsite, real-time data-crunching diligence of Kacie Hultgren (aka PrettySmallThings), have yielded quantified comparison data that we could only dream of previously.
DOWNLOAD AND PARTICIPATE!
The 2015 3DP Test Geometries, created by Andreas Bastian, are available from Make:'s YouMagine and Thingiverse accounts. Print them yourself and report your settings and scores!
As you read through our reviews, you will see two distinct, complementary types of data: quantified print-quality scores and a qualitative evaluation of our team's personal experience with each machine. As with last year's testing, each machine was run by several different 3DP experts to ensure that personal preferences did not skew the results, and we systematically and anonymously contacted customer support. The materials, host, and slicing software listed in each review are manufacturer recommended, but we verified hardware and software openness by tracking down the source files and their licenses.
We're proud of what we've accomplished during this year's testing, although there's always room for improvement. We used UltiMachine orange PLA as a control variable (the team agreed that it was a solid, widely available choice, representative of what commonly runs through desktop machines), but some exceptions had to be made (noted in our print-quality summary) for machines that jammed or refused to function without proprietary filament.
In addition, our fused filament fabrication XY and Z resonance mechanical tests did not yield the granularity of data they were designed to collect and were downgraded to weighted pass/fail scores. Many of our SLA tests proved far too ambitious and were abandoned. That may sound bleak, but it was all part of the plan: as Andreas relates on page 34, these models were designed to fail.
Why does all this matter? Because, as Kacie states in "Print-in-Place: The Additive Holy Grail," "consumers want accurate prints at the push of a button," and consumer adoption of 3DP (along with the lower prices and widespread technological transformations that adoption could enable) depends directly on how we answer two key questions: "What is print quality?" and "What should we expect from our 3D printers?"