Yet have you ever wondered how they work and how Snapchat got them? First off, what we call filters, Snapchat actually calls lenses. They acquired the technology in 2015 when they bought a Ukrainian company called Looksery for about $150m, which went down as the biggest acquisition in the country's tech scene (Nigerians, we next or nah?). Here is their original Kickstarter campaign from July 2014.
They create their augmented reality filters using a growing field known as 'computer vision', the same technology that powers check deposits through your phone camera, Facebook's photo tags and lane detection in self-driving cars. The tech uses the Viola-Jones algorithm to detect faces. Because the algorithm relies on contrast (patterns of light and dark, such as the eye region being darker than the cheeks) to locate your face, it's often tricked if you tilt your head or move around a lot.
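To make that contrast idea concrete, here's a minimal sketch of the trick at the heart of Viola-Jones: an integral image (summed-area table) lets you compare the brightness of any two rectangles in constant time. The toy "image" and all names below are illustrative assumptions, not Snapchat's actual code.

```python
# A minimal sketch of the contrast trick behind Viola-Jones face detection.
# The toy image and feature placement are made up for illustration.

def integral_image(img):
    """Build a summed-area table: ii[y][x] = sum of img[0..y][0..x]."""
    h, w = len(img), len(img[0])
    ii = [[0] * w for _ in range(h)]
    for y in range(h):
        row_sum = 0
        for x in range(w):
            row_sum += img[y][x]
            ii[y][x] = row_sum + (ii[y - 1][x] if y > 0 else 0)
    return ii

def region_sum(ii, top, left, bottom, right):
    """Sum of pixels in the inclusive rectangle, in constant time."""
    total = ii[bottom][right]
    if top > 0:
        total -= ii[top - 1][right]
    if left > 0:
        total -= ii[bottom][left - 1]
    if top > 0 and left > 0:
        total += ii[top - 1][left - 1]
    return total

# Toy 4x4 "image": a dark band (eyes) above a bright band (cheeks).
img = [
    [10, 10, 10, 10],
    [10, 10, 10, 10],
    [200, 200, 200, 200],
    [200, 200, 200, 200],
]
ii = integral_image(img)

# A two-rectangle Haar-like feature: bright bottom minus dark top.
contrast = region_sum(ii, 2, 0, 3, 3) - region_sum(ii, 0, 0, 1, 3)
print(contrast)  # 1600 - 80 = 1520: a strong, face-like contrast response
```

A real detector evaluates thousands of these rectangle features in a cascade, which is why it runs fast enough for a phone camera, and also why tilting your head breaks the expected light/dark pattern.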
Once it locates your face, it builds a 2D mesh that captures the shape of all your features.
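Here's a small sketch of why that mesh matters: if a graphic's position is stored in barycentric coordinates relative to one mesh triangle, it stays "glued" to the face when the triangle's vertices move. The landmark coordinates below are hypothetical, picked just for illustration.

```python
# A sketch of anchoring a filter graphic to a face-mesh triangle using
# barycentric coordinates. All coordinates here are made-up examples.

def barycentric(p, a, b, c):
    """Barycentric coordinates (u, v, w) of point p in triangle abc."""
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    det = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    u = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / det
    v = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / det
    return u, v, 1.0 - u - v

def reproject(coords, a, b, c):
    """Map barycentric coords back to a 2D point on the moved triangle."""
    u, v, w = coords
    return (u * a[0] + v * b[0] + w * c[0],
            u * a[1] + v * b[1] + w * c[1])

# One triangle of a (hypothetical) face mesh: eye corner, nose tip, cheek.
a, b, c = (0.0, 0.0), (4.0, 0.0), (0.0, 4.0)
sticker = (1.0, 1.0)                  # where the graphic sits on the face
coords = barycentric(sticker, a, b, c)

# The head moves: the same triangle shifts right by 10 pixels.
moved = reproject(coords, (10.0, 0.0), (14.0, 0.0), (10.0, 4.0))
print(moved)  # (11.0, 1.0): the sticker moved with the face
```

Do this for every triangle in the mesh and the dog ears, flower crowns and face swaps all deform naturally as you talk and turn.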
The technology behind this dates back to 2001, but only recently did the necessary processing power become available on mobile devices. Since 2014, however, its uses and applications have really taken off, and databases of our faces are growing accordingly (a little worrying, that).
Beyond the fun and games, Snapchat also sees revenue potential in its filters through ads and customized displays. The increase in user engagement they encourage is also a big plus for the company. Considering that these filters helped push Snapchat to its current $20bn valuation, the $150m investment has more than paid off.
In my next post, I'll delve into how Target (the retail company) uses the same technology, along with other techniques, to help law enforcement solve murders and other crimes. It really shows how business and technology are intersecting in interesting ways.
Till next time…go Ye into the world and get those filters on.