As soon as Google Clips was announced (at the October 4th Google event), the first thought that struck me was: did Google purchase Narrative, or were they inspired by Narrative's wearable cameras? For those who don't know, Narrative Clip is a wearable camera for moments that matter, an always-on HD camera made for lifelogging. I've been using one for 16 months now.
Narrative Clip captures HD videos and photos, accessible locally or via the cloud-based web and mobile apps. Google's machine-learning Clips camera seems highly inspired by the Narrative Clip, billed as "a new way to capture big moments and little ones, too."
Google Clips is ultra-portable just like the Narrative, though it seems less comfortable to wear. It has a 130-degree field of view, 15 fps capture, and 16 GB of built-in storage. Google Clips features Moment IQ, a machine-learning algorithm smart enough to recognize great expressions, lighting, and framing. And it's always learning, Google says.
Narrative Clip, on the other hand, is aimed primarily at lifelogging: it is always on and picks the best photo from a series of burst shots. Narrative's app works on both Android and iOS, whereas the Google Clips app supports only select phones, including the Google Pixel and Samsung Galaxy S7/S8 running Android 7.0 Nougat or later.
And here is an actual photo taken with a worn Narrative Clip 2, along with how you take one:
I leave it to you to judge the two cameras by their intended purposes: the latter is meant for lifelogging, while the former, of course, comes with Google's machine learning built in.
Narrative may well be winding down its business, and Google Clips is here to fill the gap, literally! Whether or not the two can fairly be compared, they definitely have one thing in common: capturing photographs in action!