Google explains the Pixel 2's super-stable video recording
Google's Pixel 2 phones have a clever trick up their sleeve when recording video: they can use both electronic image stabilization (EIS) and optical image stabilization (OIS), delivering largely jitter-free clips even if you're walking down the street. But how does it combine those two technologies, exactly? Google is happy to explain: it just posted an in-depth exploration of how this stabilization works. As you might guess, Google uses some of its machine learning know-how to fuse both anti-shake technologies where many phones can only use one or the other.
The system starts off by collecting motion data from both OIS and the phone's gyroscope, making sure it's in "perfect" sync with the image. But it's what happens next that matters most: Google uses a "lookahead" filtering algorithm that pushes image frames into a deferred queue and uses machine learning to predict where you're likely to move the phone next. This corrects for a wider range of motion than OIS alone, and can neutralize common video quirks like wobbling, rolling shutter (the distortion effect where parts of the frame seem to lag behind) or focus hunting. The algorithm even introduces virtual motion to mask sudden variations in sharpness when you move the phone quickly.
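To make the "lookahead" idea concrete, here is a minimal sketch (not Google's actual pipeline) of the core trick: frames are held back long enough that each one can see a window of future motion samples, and the correction steers the frame toward a smoothed "virtual camera" path instead of merely reacting to past shake. The function name and the simple moving-average filter are illustrative assumptions; the real system uses learned prediction.

```python
def lookahead_smooth(gyro_angles, lookahead=5):
    """Toy lookahead stabilizer: frames sit in a deferred queue until
    `lookahead` future gyro samples are available, then each frame's
    correction is the smoothed (virtual) camera angle minus the
    measured one.  A centered moving average stands in for the
    machine-learned motion prediction described in the article."""
    n = len(gyro_angles)
    corrections = []
    for i in range(n):
        lo = max(0, i - lookahead)       # past samples in the window
        hi = min(n, i + lookahead + 1)   # future samples (the lookahead)
        smoothed = sum(gyro_angles[lo:hi]) / (hi - lo)
        corrections.append(smoothed - gyro_angles[i])
    return corrections

# Jittery measured path: applying the corrections yields a much
# flatter virtual path, at the cost of a few frames of latency.
angles = [0.0, 2.0, 0.0, 2.0, 0.0]
corrected = [a + c for a, c in zip(angles, lookahead_smooth(angles, 2))]
```

Real EIS additionally warps each frame (cropping a margin of pixels) by the per-frame correction, which is why aggressive stabilization trades away some field of view.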
This isn't to say that Google's approach is flawless. As others have noted, the Pixel 2 can crop the frame in unexpected ways and smooths low-light footage more than it should. On balance, though, this shows just how much AI-related technology can help with video. It can fix image errors that EIS or OIS might not catch by themselves, and produces footage so smooth it can look like it was captured with the help of a gimbal.

Source: Engadget