Fruit Sizing App

Collecting Image Data For Apple Recognition

July 12, 2019

Machine learning requires a large amount of data so the system can learn to distinguish correct from incorrect output. In other words, our system needs to learn how to identify an apple under various conditions and how to measure its size as accurately as possible. Our approach required a field trip to collect large amounts of data, followed by a lengthy annotation process to teach the system what an apple really looks like. In return, this gives us greater flexibility and robustness when measuring fruit size under varying conditions, including weather/illumination, scale and angle.
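To give a feel for what the annotated data might look like once labelling is done, here is a minimal sketch in Python. The file layout, field names and the `annotations.json` path are assumptions made for illustration, not a description of our actual pipeline: each image is paired with its apple bounding boxes and the conditions it was captured under.

```python
# Minimal sketch (assumed layout): one JSON file pairing each orchard image
# with its apple bounding boxes and capture-condition tags.
import json
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class AnnotatedImage:
    path: str     # e.g. "images/bin_0042.jpg" (hypothetical path)
    weather: str  # "sunny", "overcast" or "rainy"
    boxes: list   # [[x_min, y_min, x_max, y_max], ...] one box per apple

def load_annotations(annotation_file: str) -> list:
    """Read the hypothetical annotations.json produced during labelling."""
    with open(annotation_file) as f:
        records = json.load(f)
    return [AnnotatedImage(r["path"], r["weather"], r["boxes"]) for r in records]

def by_weather(images: list) -> dict:
    """Group images by capture condition so training splits stay balanced."""
    groups = defaultdict(list)
    for img in images:
        groups[img.weather].append(img)
    return groups
```

Grouping by capture condition like this makes it easy to, for example, hold out a purely overcast subset for validation.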

The 2-day field trip led us to the beautiful Nelson area on the South Island of New Zealand, where Hoddy’s Fruit Company, Tyrella Orchards & McLean Orchard kindly agreed to let us take approximately 3000 suitable apple bin images. Working with forward-thinking, generous growers is critical to the success of a project like this, and we are humbled to be able to work so closely with them to build Spectre.

Unfortunately, we were not particularly lucky with the weather, even though this area is known for its sunshine: we now have more than sufficient images in the ‘rainy’ and ‘overcast’ weather categories, whereas ‘sunny’ conditions appeared only briefly.
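A quick tally of images per weather category is a useful sanity check before training. The sketch below assumes a simple capture-metadata CSV with a `weather` column; the file name and format are hypothetical, used only to illustrate the balance check.

```python
# Balance check (sketch): tally images per weather tag before training.
import csv
from collections import Counter

def weather_counts(metadata_csv: str) -> Counter:
    """Count how many captured images fall into each weather category."""
    counts = Counter()
    with open(metadata_csv, newline="") as f:
        for row in csv.DictReader(f):
            counts[row["weather"]] += 1
    return counts

if __name__ == "__main__":
    counts = weather_counts("capture_metadata.csv")  # hypothetical file
    total = sum(counts.values())
    for weather, n in counts.most_common():
        print(f"{weather:>9}: {n:5d} images ({100 * n / total:.1f}%)")
```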

Custom dataset image of a full apple bin in the orchard

Custom dataset image of a full apple bin in overcast conditions

Once again, a very special thank you to Hoddy’s Fruit Company, Tyrella Orchards & McLean Orchard for all your help; this would not have been possible without you!

The next blog post will go into more detail on how we built the initial apple detector. Keep in touch!