From Singapore to the USA and all around Europe, Edible Innovations profiles food makers that engage in improving the global food system at every stage, from production to distribution to eating and shopping. Join us as we explore the main trends in the industry from a maker perspective. Chiara Cecchini of Food Innovation Program — an ecosystem with a strong educational core that promotes food innovation as a key tool to tackle the great challenges of the future — introduces you to the faces, stories, and experiences of food makers around the globe. Check back on Tuesdays and Thursdays for new installments.


Setting sustainability goals at home can be hard when you are juggling a busy schedule. Yet between the many hungry mouths to feed and the continuous meal prep, the average American household throws out over $2,000 worth of food a year. Country-wide, that adds up to about $165 billion a year! Even accounting for the produce that retailers throw out, which skews the statistic, that is still an uncomfortably large amount of food going to waste.

Not content to sit back and simply hope that number goes down, Gustav Nipe and Abi Ramanan turned to technology to help reduce the average household's food waste. Realizing that the way humans judge a food's freshness and nutrition is flawed, since it typically begins and ends with what the naked eye can see, the two created software that uses hyperspectral imaging to analyze pictures of food. That analysis yields a more detailed picture of the food's freshness and nutritional value. They called the software ImpactVision.

You have probably found yourself Googling how long you can keep your favorite proteins fresh, since it can be difficult to spot any physical change in meat until it is too late. ImpactVision works with cameras to see what your eyes cannot, so you know exactly how fresh your food is. To make the technology truly effective, Nipe and Ramanan designed it to be both easily accessible and affordable.

The two started their project by looking at the food industry's largest consumer of land: beef. Raising cattle for protein drives deforestation, reshapes landscapes, and contributes heavily to climate change. As beef moves through production, equipment determines whether or not it is edible, but the technology that makes that safety call is often unavailable to individual consumers and retail stores because of its cost and complexity. ImpactVision changes that.

Rather than tossing out food because you are uncertain of its quality, ImpactVision's camera and software look for markers that suggest whether the food has gone off. While we may believe a cursory look can tell us if our beef has spoiled, our eyes are simply not sensitive enough to detect all of the colors and qualities of safe meat. ImpactVision gives our eyes a much-needed upgrade.
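To make the idea concrete, here is a minimal sketch of how a spectral "marker" check might work. ImpactVision's actual models and wavelengths are proprietary and not described in this article; the band numbers, threshold, and function names below are illustrative assumptions, loosely inspired by the food-science observation that spoiling meat changes its reflectance at certain wavelengths.

```python
# Hypothetical sketch -- NOT ImpactVision's actual algorithm.
# A hyperspectral camera records reflectance at many wavelengths per pixel.
# A toy freshness check can compare reflectance at two assumed bands.

def freshness_score(reflectance):
    """Toy score: ratio of reflectance at two illustrative bands (nm keys assumed)."""
    # Assumption for illustration: as meat ages, reflectance near 635 nm
    # rises relative to 560 nm, so a lower ratio suggests older meat.
    return reflectance["560"] / (reflectance["635"] + 1e-9)

def classify(reflectance, threshold=1.0):
    """Label a sample from its toy freshness score (threshold is an assumption)."""
    return "fresh" if freshness_score(reflectance) >= threshold else "check before use"

fresh_sample = {"560": 0.42, "635": 0.30}   # made-up reflectance values
older_sample = {"560": 0.25, "635": 0.38}
print(classify(fresh_sample))   # fresh
print(classify(older_sample))   # check before use
```

A production system would of course train on hundreds of bands and labeled samples rather than a single hand-picked ratio, but the principle is the same: the camera sees spectral differences that the eye cannot.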

Nipe and Ramanan want ImpactVision to help on the production side of food as well. Imagine being an avocado farmer about to ship your produce to locations around the world. Naturally, you want the food to still be edible when it reaches its final destination. Imaging software like ImpactVision's would let you make decisions based on each avocado's ripeness: which produce can be shipped far away and still arrive fresh, and which needs to go to local retail stores to be sold and eaten sooner rather than later. Not only does this build profit for you and your farm, it means less of your produce is thrown away before it ever hits the market. The accuracy ImpactVision provides goes beyond what the human senses can do.
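The routing decision described above can be sketched in a few lines. This is not ImpactVision's API; assume only that imaging has already produced a ripeness estimate between 0 and 1 for each avocado (higher meaning riper), and that the cutoff value is an arbitrary placeholder.

```python
# Illustrative sketch: split a batch into long-haul export vs. local sale
# using a per-item ripeness estimate assumed to come from imaging.

def route_shipments(batch, ripeness_cutoff=0.5):
    """Return (export, local) item IDs; riper items stay local (cutoff assumed)."""
    export, local = [], []
    for item_id, ripeness in batch:
        # Firm fruit can survive a long trip; ripe fruit should sell nearby.
        (local if ripeness >= ripeness_cutoff else export).append(item_id)
    return export, local

batch = [("A1", 0.2), ("A2", 0.7), ("A3", 0.45), ("A4", 0.9)]  # made-up data
export, local = route_shipments(batch)
print(export)  # ['A1', 'A3'] -- firm enough to ship far
print(local)   # ['A2', 'A4'] -- ripe, sell locally
```

The value is in the input, not the logic: a reliable per-item ripeness measurement is what lets a farmer make this call before the produce leaves the farm.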

What does the future hold for the two co-founders? For now, shrinking both the size and cost of the hyperspectral sensors so that more food retailers can afford these products and incorporate them into their stores. After that, they want to sell the sensor at a consumer level and find a way to integrate this kind of software into smartphones. They also envision sensors installed in refrigerators, scanning produce and proteins for ripeness and sending alerts when it is best to use those ingredients.