Data-Driven Multi-material Prosthetic Socket
In the months following Adaprox's pivot, I embarked on a project to build a prototype of the aforementioned socket as a portfolio piece.
This work began with finding open-source CT scan data online. I found a suitable dataset through the software provider OsiriX, whose software I used to view and manipulate the data. After importing a selection of slices from the knee region, I isolated the bone and the outer skin geometry.
I then imported these meshes into a CAD package called Rhinoceros, where the scan data would serve as the foundation for the socket.
However, I first needed to perform “digital surgery,” editing this data to resemble that of an amputee. I did this in a mesh-editing tool called Meshmixer, cutting down the tibia and fibula and capping off the outer geometry. This approximation would be good enough for a proof of concept.
With the input data in place, the next step was to have it drive the CAD model, which I did through a Python script I wrote. Rhino's built-in Python scripting is the reason I chose it for this project. I should mention that I knew neither Rhino nor Python before starting this work; I taught myself both through online courses, and it took about two weeks to become proficient.
The Python script I wrote color-codes an outer mesh based on its distance to a set of inner meshes. Essentially, a number of inner objects are contained by an outer surface: areas of the outer surface where an inner object comes close appear red, while areas far from any inner object appear green. This map shows which parts of the leg have bone near the surface and would therefore benefit from softer, more flexible material.
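To give a sense of the logic, here is a minimal sketch of the distance-to-color mapping in plain Python. The real script ran inside Rhino's Python environment; the vertex lists, the nearest_distance helper, and the 20 mm falloff below are illustrative assumptions rather than the actual implementation.

```python
import math

def nearest_distance(point, inner_points):
    """Distance from one outer-surface vertex to the closest inner (bone) vertex."""
    return min(math.dist(point, p) for p in inner_points)

def distance_to_color(d, falloff=20.0):
    """Map a distance in mm to an RGB color: red when bone is right at the
    surface, shading to green once it is at least `falloff` mm away."""
    t = min(d / falloff, 1.0)          # 0 = bone at surface, 1 = bone far away
    return (int(255 * (1 - t)), int(255 * t), 0)

# Hypothetical vertex data standing in for the skin and bone meshes.
outer_vertices = [(0.0, 0.0, 0.0), (30.0, 0.0, 0.0), (15.0, 40.0, 0.0)]
inner_vertices = [(2.0, 1.0, 0.0), (28.0, 35.0, 0.0)]

vertex_colors = [distance_to_color(nearest_distance(v, inner_vertices))
                 for v in outer_vertices]
print(vertex_colors)   # red-ish where bone is close, green-ish where it is not
```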
With this information in hand, the socket could be divided into regions. The goal at this stage was a multi-material socket, with soft, flexible sections and hard, sturdy sections placed intelligently based on the CT scan data.
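As a rough illustration of that separation step, the same per-vertex bone distances could be thresholded into material regions. The 8 mm cutoff and the distance values here are assumed for demonstration only.

```python
def assign_material(distance_to_bone_mm, soft_cutoff_mm=8.0):
    """Label a surface region: soft where bone sits near the skin, rigid elsewhere.
    The 8 mm cutoff is an assumed value, not taken from the actual design."""
    return "soft" if distance_to_bone_mm < soft_cutoff_mm else "rigid"

# Hypothetical per-vertex bone distances (in mm) from the color-mapping step.
bone_distances = [2.2, 28.0, 13.9]
print([assign_material(d) for d in bone_distances])   # ['soft', 'rigid', 'rigid']
```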
As of August, I have successfully printed a 1/3-scale prototype from this model. It was printed on a Stratasys Connex 500 in three materials of arbitrary Shore hardness (chosen for demonstration purposes). The white material is a rigid plastic, the black is an elastic rubber, and the blue is a blend of the two.
This is, however, only the first step in what I hope will be a three-step process. While the current design takes advantage of additive manufacturing’s ability to create custom objects tailored to an individual, it has yet to take advantage of its ability to create extraordinarily complex objects. I think this socket can be made even better, and that it can hold even more information.
Steps two and three involve using finite element analysis (FEA) to determine which parts of the socket carry a reasonable share of the stress from the dynamic loads of someone walking and which parts do not. Rather than simply verifying that the part will not fail, which is how FEA is most commonly used, I want to integrate the method into a feedback loop where the analysis results actually affect the model’s geometry. Essentially, this loop removes material from parts of the socket that are not needed and builds up the parts that are. After running the loop a number of times, a complex geometry forms that is just as strong as, or stronger than, the original piece, yet uses less material.
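A toy version of that loop, operating on an abstract list of element stresses rather than a real FEA model, might look something like the following. The fake_fea placeholder, the stress thresholds, and the growth rule are all assumptions made purely for illustration.

```python
import random

def fake_fea(thicknesses):
    """Stand-in for a real FEA solve: returns one stress value per element.
    Here stress simply drops as an element gets thicker, plus some noise."""
    return [random.uniform(0.5, 1.5) / t for t in thicknesses]

def optimize(thicknesses, iterations=20, low=0.4, high=1.2, min_t=0.2, step=0.1):
    """Evolutionary-style loop: thin out lightly loaded elements,
    thicken heavily loaded ones, then re-analyse."""
    for _ in range(iterations):
        stresses = fake_fea(thicknesses)
        for i, s in enumerate(stresses):
            if s < low:                        # under-used: remove material
                thicknesses[i] = max(min_t, thicknesses[i] - step)
            elif s > high:                     # over-stressed: build material up
                thicknesses[i] += step
    return thicknesses

print(optimize([1.0] * 10))   # each element settles at the thickness its load demands
```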
I was first exposed to this idea two years ago at the London 3DPrintShow. There I saw a steel structural piece that, as best I can describe it, resembled tree roots. I find it very curious that the result of the optimization reflected nature. Either the researcher steered the design that way (thinking nature should be emulated to achieve great results), or, what I would find truly fascinating, the optimization process itself tended towards nature on its own, suggesting nature has had it right all along. At the time of this writing, the group at the University of Nottingham, who created the piece I saw in London (pictured below), is still preparing their first publication on the subject.
As for how this applies to my project, for steps two and three I plan to write my own FEA program (most likely in Python) that will drive the digital model (most likely in Rhino). Step two will resemble what I saw in London: the analysis would be run separately for each material, treating each as its own discrete part.
However, for the final step, I would like the entire multi-material piece to undergo a single analysis that blends the materials together rather than keeping them separate. As far as I can tell, this has yet to be achieved by anyone, and I can only imagine what the results would look like.
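One simple way to picture such a blended analysis is to treat stiffness as a continuous field, mixing the rubber-like and rigid materials by a local fraction. The rule-of-mixtures form and the modulus values below are illustrative assumptions, not data for the actual Connex materials.

```python
def blended_modulus(fraction_rigid, e_soft=5.0, e_rigid=2000.0):
    """Voigt rule-of-mixtures estimate of Young's modulus (MPa) for a region
    that is part rubber-like, part rigid. Both moduli are assumed values."""
    return fraction_rigid * e_rigid + (1.0 - fraction_rigid) * e_soft

# A graded transition from fully soft to fully rigid across five sample regions.
for f in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f, round(blended_modulus(f), 1))
```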
I am currently teaching myself the math behind FEA through MIT’s OpenCourseWare, specifically the lecture series by Dr. Bathe on structural FEA.