Many of our favorite photos are very hard to get: they're the candid, natural moments that happen in between the posed photos, like the fleeting yet adorable looks our pets or kids give us. Last October we shared that we’ve been working on Google Clips, a lightweight, hands-free camera that uses on-device machine learning to help you capture beautiful and spontaneous moments of family, friends, pets, and yourself. Simply turn the camera on and it will capture and edit clips of these moments, while letting you join in as well.
Starting today, Clips is available in the U.S. for $249 from the Google Store, Best Buy, B&H and Verizon.
Clips isn’t designed to replace your smartphone camera or your DSLR. It’s a new type of camera that captures the moments that happen in between posed pictures by using on-device machine learning to look for great facial expressions from the people—and pets—in your life. It turns these into short clips without you having to use video editing software. Clips comes with a companion app on Android or iOS that lets you share your content with friends or other apps. You can also pick any frame from these clips to save as a high-resolution still photo.
Designed for privacy and control
From day one working on Clips, we knew privacy and control were extremely important, and we’ve been careful to design and engineer Clips to uphold those principles:
- It looks like a camera and has an indicator light, so everyone around knows what it does and when it’s on. It also works best when it’s less than 10 feet away from what it’s capturing, so you can see where it is in the room.
- It doesn’t need a data connection to function, nor does it require an account. We miniaturized machine learning models to run locally on the device.
- Just like a traditional point-and-shoot camera, none of your clips leave your device until you decide to save or share them. If you decide to save clips to Google Photos, they will be backed up to your Google Account if you have Backup and Sync turned on in the Google Photos app.
Tips for using Clips
We think you’ll find that the camera is one part familiar, like the point-and-shoots you’ve used in the past, and one part brand new. Here are a few pointers on how it works:
Clips looks for stable, clear shots of people and then looks for good facial expressions, such as joy. We also trained it to recognize dogs and cats, and it prefers when there’s some motion in the scene. Twist the lens clockwise to turn the camera on and start capturing.
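To make the idea concrete, here is a toy sketch of a capture heuristic in the spirit of what's described above. Google hasn't published the actual Clips model, so the signal names, weights, and threshold below are all invented for illustration:

```python
# Illustrative only: a toy per-frame scoring heuristic, NOT the real Clips
# model. All weights and the threshold are invented for this sketch.

def clip_score(sharpness, face_expression, pet_present, motion):
    """Combine per-frame signals (each in [0, 1]) into a capture score."""
    score = 0.40 * sharpness          # stable, clear shots come first
    score += 0.35 * face_expression   # e.g., joy on a detected face
    score += 0.15 * (1.0 if pet_present else 0.0)  # dogs and cats count too
    score += 0.10 * motion            # some movement in the scene helps
    return score

CAPTURE_THRESHOLD = 0.5  # invented cutoff: keep a clip when exceeded

# A sharp, smiling moment with a bit of motion clears the bar:
print(clip_score(0.9, 0.8, False, 0.5) > CAPTURE_THRESHOLD)  # True
# A blurry, empty scene does not:
print(clip_score(0.1, 0.0, False, 0.0) > CAPTURE_THRESHOLD)  # False
```

The point of a weighted combination like this is that no single signal decides a capture; a great expression in a blurry frame, or a sharp frame of an empty room, both score low.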
People and pets look best when they’re three to eight feet away from the camera—think playroom, not soccer field.
The Clipstand that comes with the Clips device makes it easy to set down, hold, or clip the camera to things like a chair or vase to get unique vantage points. There’s also a feature called Live Preview in the Clips app to provide a clear view of the action as it happens.
Clips has a shutter button, on the front of the device and in the app, for the times you want to capture something specific manually. It understands faces, smiles, dogs and cats, but it doesn’t know a surfboard from a ski slope (there’s a great phone for those pictures!).
Over time, Clips will learn who you want to photograph frequently. You can also give it a head start by letting it learn from photos in your Google Photos library, or by taking a portrait using the shutter button.
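One common way a camera can "learn who you want to photograph" is to compare detected faces against examples it has seen before and boost their score. As a hedged illustration only (the real Clips pipeline is unpublished; the vectors, threshold, and bonus here are invented), a face-matching boost might look like:

```python
# Illustrative only: a toy "familiar face" bonus, NOT Google's method.
# Faces are represented as plain vectors; the threshold and bonus are invented.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def familiarity_boost(face, known_faces, threshold=0.9):
    """Return a score bonus if the detected face closely matches
    someone the camera has learned (e.g., from a saved portrait)."""
    if any(cosine(face, k) > threshold for k in known_faces):
        return 0.2  # invented bonus for familiar people
    return 0.0

known = [[0.6, 0.8, 0.0]]  # e.g., learned from a portrait you took
print(familiarity_boost([0.6, 0.8, 0.0], known))  # 0.2
print(familiarity_boost([1.0, 0.0, 0.0], known))  # 0.0
```

Running everything on-device, as the post describes, means a gallery of known faces like this never has to leave the camera.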
Use the Clips app to view, save, delete and share your clips, and to choose any frame to save as a high-resolution still or motion photo. If you use Google Photos, finding your clips and stringing them into beautiful movies is a snap. And Pixel users get unlimited backup.
Since Clips has machine learning at its core, it will keep getting better over time. We’re excited to help you capture more of the moments you love with Clips.