There are a ton of filters you can use with Instagram Stories. But how do you go about making an Instagram filter?
Software Engineer and Designer Stephanie Harris is not only one of my oldest friends but also a very talented creative technologist! She filled me in on how she recently made this vote glasses design and how you can make one like it too!
1. What is the first step in creating an IG filter?
The first step for anyone looking to create an Instagram filter is to download Facebook’s augmented reality software, Spark AR. It’s a pretty intuitive platform, so if you’re familiar with products such as the Adobe suite, Maya, or Blender, you’ll feel right at home!
What you do from there sort of depends on the effect you’re trying to create, but there are plenty of extremely helpful tutorials and sample projects/templates made available by Spark AR as soon as you open the app. Poring over some of those project files really helped me get my bearings. I think I was up and running within 20 minutes!
2. What is the entire process in a nutshell?
One of the coolest parts about creating in Spark AR, as with most creative platforms, is that there really isn’t a right or wrong way to do anything (as long as it meets Facebook’s guidelines, which I’d highly recommend skimming before you start). There are so many awesome and inspiring projects from the community, ranging from 2D interactive games to effects with particle systems. Personally, I just started learning with some simple makeup/photo filter tutorials and worked my way up.
My project, specifically, included a combination of 3D objects (the glasses), 2D animation & texture mapping (also, for the glasses), some basic retouching/blush, and the addition of a color LUT (Lookup Table). I began with the retouch/makeup layer and then had a very basic version of the initial glasses animation working before attempting to add in the color LUT and “tap to change” interaction.
Breaking things up into smaller projects helped, as did working in environments with techniques I was already familiar with to create the related project assets. For example, the glasses texture animation sequence is actually just a simple frame animation I created and exported from Photoshop.
There was definitely a point at which I thought I might not be able to get a second pair of glasses to work, or that the file size would be too large when I exported. I just had to keep tweaking and testing components. It wasn’t too complicated, just a lot of trial and error (especially when it came to the Patch Editor).
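For readers unfamiliar with the color LUT mentioned above: a lookup table simply remaps every input color value through a precomputed table, which is how filters apply a consistent "grade" to the whole frame. This is a hypothetical sketch in Python/NumPy, not Spark AR code; it uses a simplified per-channel 1D table (Spark AR consumes 3D LUT textures, but the idea is the same), and the "warming" values are made up for illustration.

```python
import numpy as np

def make_warm_lut(size=256):
    """Build a toy 'warming' LUT: boost reds, dampen blues slightly.

    Returns an array of shape (size, 3): one remap table per RGB channel.
    The 1.10 / 0.90 factors are arbitrary example values.
    """
    ramp = np.arange(size, dtype=np.float32)
    r = np.clip(ramp * 1.10, 0, 255)  # brighten the red channel
    g = ramp                          # leave green unchanged
    b = np.clip(ramp * 0.90, 0, 255)  # darken the blue channel
    return np.stack([r, g, b], axis=1).astype(np.uint8)

def apply_lut(image, lut):
    """Remap each pixel channel of an HxWx3 uint8 image through the LUT."""
    out = np.empty_like(image)
    for c in range(3):
        # Fancy indexing: each channel value becomes an index into its table.
        out[..., c] = lut[image[..., c], c]
    return out

# A tiny 1x2 "image": one mid-gray pixel and one pure-blue pixel.
img = np.array([[[128, 128, 128], [0, 0, 255]]], dtype=np.uint8)
graded = apply_lut(img, make_warm_lut())
```

Because the table is computed once, the per-pixel work at runtime is just an index lookup, which is why LUT-based color grading is cheap enough to run live on a phone camera feed.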
3. Did you encounter anything during the process you hadn’t expected?
For some reason I thought more programming would be involved in creating a simple AR filter, but this really felt like more of a design and animation project. I was able to dust off the illustration cobwebs a tad, though, which was fun!
One unanticipated technical issue I ran into was the difficulty of combining face retouching with a color LUT face mesh. The two meshes would intersect and create some strange and undesirable behavior. Apparently this is a known issue within the Spark AR design community, but it was news to me at the time. Luckily, people have found work-arounds and someone even created a free patch to help with this issue specifically.
4. How does it get “approved” and rolled out on IG?
Once you’re happy with your filter, you simply export the project and, assuming it all checks out, upload the file to your Spark AR Hub. You’re then asked to submit an icon and a demo video for review, along with some other information about the effect, and that’s it!
When you submit an effect for an Instagram filter, Facebook says approval can take up to 10 days, but mine was approved in less than 24 hours. The filter then automatically appears in a new “effects” tab on your Instagram profile, where people can use it.
5. Anything else we forgot to ask?
A lot of people believe that AR (specifically smart glasses) is going to be the next computing platform in the same way smartphones are now. Everyone from Apple to Google to The New York Times is hiring for augmented reality roles, and universities now offer XR (extended reality) courses, something that obviously did not exist when I was in school.
Most devs and designers currently employed in AR (and VR, for that matter) are entirely self-taught, so it’s exciting to think about what the implications are for the future and how spatial computing may be used to enhance our experience of reality.