Brainstorm by Nick Youngson CC BY-SA 3.0 Alpha Stock Images
Last fall/winter, Humanitarian Makers members participated in an Item Testing pilot that opened items designed in the field (focusing on a curated list of Field Ready's items) to testing by members around the world. We shared that experience after the pilot, and you can read about it here. One initiative that spun out of this effort was an informal, digital brainstorm among three of the members who participated in the testing. The brainstorm focused on a key question that emerged from the pilot:
How might humanitarian makers advance item design and development at the post-testing stage?
The three individuals invited to delve into this question were:
Jonathan Rushton (Made In...Shoes / Trillium - UK)
Kobina Abakah-Paintsil (Klaks3D - Ghana)
Tommy Sebastian (MIT - USA)
The brainstorming conversation was kept small in an attempt to keep the dialogue rich and efficient. These three were selected because of their initiative during the testing, their vision for doing this better, and their shared goals with Humanitarian Makers. Naiomi Lundman facilitated the brainstorm. Many thanks to these three for the effort and time they gave to delve into this.
This question was asked within this context:
HM and Field Ready piloted community engagement on the item testing and feedback piece. A next step was to think intentionally about the feasibility of community engagement at the next item-development step: evaluating testing feedback and developing the next iterations. Process visual here (thanks to Field Ready).
A main challenge: remote item development is several steps removed from an understanding of the initial problem statement; as such, a next iteration developed remotely may not suit the original intent. (This may be okay?)
The ultimate goal: shared items are usable and reliable (quality) in the humanitarian context [achieve the 4A target of accessible, affordable, appropriate, available].
Version control software is key, yet remains a challenge
Flexibility with file revisions is important
Support for multiple production processes is desired (e.g., CNC apps)
Cloud-based tools offer a lower barrier to use, provided there is good internet access
Integrated simulation is desirable
An easy-to-use interface is attractive
Testing feedback needs to account for the variability in materials and production technologies
Testing needs to be functional, inclusive of qualitative assessments and of load testing
Load testing may be possible by testing intended use (e.g., an IV hook tested by hanging an IV bag)
How to do this in a representative and useful way? (e.g., do I worry about cyclic failure of a part if it only needs to last a little while?)
Balancing complexity v. simplicity may always be a challenge (to what length do we test?)
To enable wide production and use, hobbyist-level CNC manufacturing methods may need to be used predominantly
Sharing source files is critical
Co-development will enable stakeholders to determine the appropriate testing and reduce the need to communicate risk separately
Photo by: Monique Randolph
If you'd like to read the full brainstorm (conducted via email), it is shared below. Have additional thoughts and insights you'd like to share? Please do so in the comment section.
TS: Version control of parts is a hard challenge, particularly since fabrication is possible using a wide range of materials and technologies. Feedback needs to account for this somehow, I think. Testing needs to be functional in nature as well… along with qualitative assessments, it seems to me that load testing (for many of the parts of interest) is necessary… that said, it may not be possible for many to execute.
KA: For version control I'd suggest using GrabCAD (https://grabcad.com/workbench/). It can offer flexibility on file revisions and the like. If all files are available there, different revisions for different technologies can be saved without losing the original files.
For load testing, I think much of the load testing most people can undertake is simply using the items for their actual purposes, for example, hanging IV bags to test IV hooks. Perhaps we can also look into identifying some common artifacts that can be used as loads for testing.
NL: Thanks for getting this ball rolling you two. Kindly, allow me to chime in with some questions in response to your comments.
Kobina - there's an analysis we did of many different file sharing platforms. You're welcome to look through it if you'd like, considering your interest in GrabCAD. (I've shared the links below.) Also, how do you see the people you work with in Ghana using online file sharing platforms? Is this something they can easily access and utilize currently?
Considering the points Tommy raised on load testing, what type of artifacts are you thinking Kobina?
Tommy, why do you think many may not be able to execute load testing?
There is a thought that testing equipment itself can be made open online. Is it feasible for many to make and calibrate their own testing machines so that some standardization or reliability in testing is achieved? I imagine the robustness of testing depends on the item's purpose and risks. How might this be navigated or communicated?
As noted above, here is the documentation we have on digital collaboration type platforms for open hardware related functions.
1. Platform analysis spreadsheet
2. Criteria for determining "field readiness" (important because factors into what needs to be communicated on these platforms)
3. Specific platform/documentation reviews:
- Wevolver analysis
- Docubricks analysis
JR: Just to throw a recommendation into the version control question: I have been using a CAD package called Onshape (https://www.onshape.com/). It is cloud-based and has integrated simulation and CNC apps. Version control is built into the document, which can be edited live by multiple people. They are the new kids on the CAD block, but I haven't found an easier interface to learn. Because it runs in the cloud, it doesn't require an expensive computer for modeling, simulation, or rendering; it's also free for public projects (which I assume most humanitarian items will be).
However, it needs the internet to function so might not work in remote locations.
On the testing question: I might be biased, but I believe that circulating solutions to the most people will force manufacturing to rely predominantly on hobby-scale CNC methods such as 3D printing, CNC routing, and laser cutting. I have made many items since starting work five years ago and believe that with this combination of tools you can manufacture pretty much anything. These methods lend themselves to lightning-fast iteration, and this prototyping methodology allows for quick incremental improvements after testing. In short, I agree that the testing may have to be more practical in nature. Smaller companies tend to work in this manner, so it should be possible to solve a lot of issues without involving complex testing rigs and simulation.
TS: Not so much an inability to conduct load testing, but rather doing it in a representative and useful way… Example: am I worried about cyclic failure of a part if it only needs to last a little while?
Nifty article on HackADay about tangential topics – how to apply open-source software approaches better to open-source hardware: https://hackaday.com/2018/02/27/can-open-source-hardware-be-like-open-source-software/
NL: Thanks, you two.
Great article. Looks like we're asking the key question(s). :)
In light of the article's comments on modular design and sub-units, how might this possibility, combined with version control, affect your collaborative CAD work? Perhaps reflect on your experiences with Onshape and GrabCAD - which I'm assuming don't have this capability? Note, I read in the comments section that some folks don't see this as realistic - parts and sub-units are too interdependent to mix and match into other items. Thoughts?
Regarding testing, it seems there's a rough consensus that practical testing will need to be done. Is the question then how to communicate risk and/or item expectations?
Does the criteria list we created seem to address this last question? Or does it take things in the direction of complexity versus simplicity?
TS: Balancing complexity v. simplicity will always be a challenge I think. Perhaps in situations where there are no other viable workarounds, a part solution that’s “good enough” may be sufficient, but that needs to be balanced with the regrets associated with part failure (injury, property damage, etc).
Also, I came across a nifty paper in PLOS One by researchers at Western University in Ontario that outlines the design and testing of a printable stethoscope for $2.40 that is comparable in performance to a ~$100 Littmann Cardiology III stethoscope, with parts files available on GitHub. There are a few ideas on process here, I think.
NL: This stethoscope was on HM's first "business" card as a visual of what's possible. Have been following Tarek's work for a while. Field Ready and Tarek had a mini informal collaboration around otoscope design.
They offer very thorough documentation on PLOS. I especially appreciate these points:
Hospitals in Gaza are self-sufficient producers of these stethoscopes.
Low cost does not mean low quality.
and the point you highlighted: comparable to Littmann Cardiology model in performance.
Now what would be very interesting is inviting a spectrum of makers to test this design, considering how well it's been prepared. Test the instructions, the making, the using. What are their findings? Are they consistent with Tarek's (and his team's)? How would this new knowledge be incorporated so others could see it and build off it? I noticed that on GitHub they also include the source files. When I discussed the possibility of adding source files to Field Ready's open documentation with an FR-Nepal intern, he expressed concern about how to maintain the integrity of the source files if many people are tweaking them. Thoughts on this?
TS: There would have to be a process in place to link mods to the original source and to log files that outline the changes, with the ability for an end-user to move back through the revision chain to the original. I think having the source files available is critical.
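[Editor's note: the revision-chain idea above can be sketched with ordinary Git commands. This is a minimal illustration, not a process the group agreed on; the file name (stethoscope.scad), tag, and branch names are hypothetical.]

```shell
set -e
mkdir -p /tmp/hm-demo && cd /tmp/hm-demo
git init -q .
git config user.email "maker@example.org"
git config user.name "Maker"

# 1. Publish the original source file and tag it as the tested baseline.
echo "// original design" > stethoscope.scad
git add stethoscope.scad
git commit -q -m "Original design as tested"
git tag v1.0-original

# 2. A field modification lives on its own branch; the commit
#    message logs what changed and why.
git checkout -q -b mod/thicker-earpiece
echo "// earpiece wall 2mm -> 3mm" >> stethoscope.scad
git commit -q -am "Thicken earpiece wall for PLA printing"

# 3. An end-user can walk the chain back to the original:
git log --oneline v1.0-original..HEAD   # lists every mod since the baseline
git checkout -q v1.0-original -- stethoscope.scad   # recover the original file
```

The point of the sketch is that each mod stays linked to the tagged original, so the "revision chain" is always traversable in both directions.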
JR: I may be a little naive, but I feel that if you involve the affected communities in development (the co-development route), you can find suitable testing and also reduce the need to communicate risk separately, as you ensure the knowledge is shared appropriately.
Again, many thanks to Tommy, Jon and Kobina for getting this ball rolling.
After reading the above brainstorm, what thoughts do you have to share?