Among the new wave of communication and social apps to hit the App Store lately, Secret has piqued my interest the most. Anyone who has the guts to embrace and advocate anonymity in the age of the omnipresent blue button deserves our respect.
Secret’s main functionality and UI are dead simple. You type in some text to express your thoughts (anything from “I have a crush on my friend” to “Sometimes I eat just to stay awake at work”), spice it up with a colorful background or a custom photo, then choose whom to share the post with. No superfluous features like geo-tagging or hashtags get in your way (yet).
The ingenuity of the app lies in the design – a swipe in one direction is all it takes to change the background, and a long press and pan opens up powerful features like real-time blurring. I’m sure it would have taken much less effort for the developers to achieve the same functionality with modals and sliders, but the result wouldn’t have been nearly as impactful or intuitive. It really brought to mind the “Good enough isn’t” mantra Apple engineers emphasized in WWDC 2012 Session 243.
And that’s why I wanted to dig deeper and try to re-create their text view.
The source code for this project is up on my GitHub.
Dissecting the Text View
Secret’s text view has the following components:
- Color chooser: swiping horizontally on the empty areas of the text view changes the background color. Six default colors are available.
- Texture chooser: swiping vertically on the empty areas of the text view changes the background texture, which is superimposed on the selected color.
- Image editor: after selecting an image from the image picker, panning up and down modifies its brightness, and panning left and right its blurriness. Filters are applied to the selected image in real time.
Building out the components
My first impulse was to start subclassing UITextView and adding subviews to manage the background color or photo. I didn’t like this approach because the view would get bloated in no time. Instead, I wanted to modularize the components and have a view controller handle each layer, then bring them all together with UIViewController containment.
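The containment wiring is the standard UIKit dance; a minimal sketch (the method name here is hypothetical, not from the repo):

```objc
// Inside the parent view controller: adopt a component controller
// as a child so its view participates in the layer stack.
- (void)addComponentViewController:(UIViewController *)child
{
    [self addChildViewController:child];
    child.view.frame = self.view.bounds;
    [self.view addSubview:child.view];
    [child didMoveToParentViewController:self];
}
```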
Color Chooser
In Secret, the color-choosing feature isn’t immediately obvious, as the background doesn’t change until the swipe gesture is complete. I wanted to improve on this by implementing the view with a scroll view (more specifically, a UICollectionView) so the user can get a glimpse of the adjacent color as they scroll.
Essentially this is what my version of the color chooser looks like as a UICollectionView using Flow Layout. Each color background is a vanilla UICollectionViewCell.
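The data source boils down to coloring plain cells; a sketch of the idea, assuming a `colors` array of UIColor objects (the reuse identifier is hypothetical):

```objc
// Each cell is a vanilla UICollectionViewCell tinted with one of the
// preset colors. With pagingEnabled and an item size equal to the
// collection view's bounds, one color fills the screen at a time.
- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView
                  cellForItemAtIndexPath:(NSIndexPath *)indexPath
{
    UICollectionViewCell *cell =
        [collectionView dequeueReusableCellWithReuseIdentifier:@"ColorCell"
                                                  forIndexPath:indexPath];
    cell.backgroundColor = self.colors[indexPath.item];
    return cell;
}
```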
An additional benefit to using UICollectionView is that we can switch layouts on the fly. This gives us the opportunity to do something dynamic like collapsing all the cells on a long press:
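Swapping layouts is a one-liner on UICollectionView; a sketch under the assumption of a custom collapsed flow layout (the `KTCollapsedFlowLayout` class name is hypothetical):

```objc
// On long press, animate from the full-screen flow layout to a
// collapsed layout that shows all the color cells at once.
- (void)handleLongPress:(UILongPressGestureRecognizer *)gesture
{
    if (gesture.state == UIGestureRecognizerStateBegan) {
        UICollectionViewFlowLayout *collapsed =
            [[KTCollapsedFlowLayout alloc] init];
        [self.collectionView setCollectionViewLayout:collapsed animated:YES];
    }
}
```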
The KTSecretColorChooserViewController class is responsible for this logic. Clients of this class can adhere to the KTSecretColorChooserViewControllerDelegate protocol and implement the didSelectColor:name: method to get hold of the selected UIColor and its name.
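One plausible shape for that protocol (the exact declaration in the repo may differ):

```objc
// Delegate protocol for the color chooser: the client receives the
// selected color and its display name.
@protocol KTSecretColorChooserViewControllerDelegate <NSObject>
- (void)didSelectColor:(UIColor *)color name:(NSString *)name;
@end
```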
Texture Chooser
This falls in the same vein as the Color Chooser, except the swipe direction is vertical. I could have used a UICollectionView here as well, but for the initial version I went with the simpler approach of a UISwipeGestureRecognizer on the container view to handle the texture choosing. The KTSecretTextureChooserViewController then displays the selected texture.
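The gesture setup is straightforward; a sketch with a hypothetical action selector (a second recognizer with the opposite direction would cycle backwards):

```objc
// Recognize an upward swipe on the container view and advance to
// the next texture, which is superimposed on the selected color.
UISwipeGestureRecognizer *swipeUp =
    [[UISwipeGestureRecognizer alloc] initWithTarget:self
                                              action:@selector(nextTexture:)];
swipeUp.direction = UISwipeGestureRecognizerDirectionUp;
[self.view addGestureRecognizer:swipeUp];
```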
Photos Editor
The Photos Editor view exists at the same level as the Color Chooser, and one is hidden while the other is visible. The Photos Editor view consists mainly of a UIImageView and a UIPanGestureRecognizer that applies a combination of Core Image filters to the image.
To improve performance and memory usage, I re-scale the images acquired from the UIImagePickerController. It works fine on the iPhone 5 without out-of-memory issues, but it might be worth switching to GPUImage altogether. Information about the applied filters is passed to the delegate via the method didUpdateFilters:brightnessLevel, and the filtered image is exposed as a readonly property.
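The filter chain itself can be sketched with two stock Core Image filters – CIColorControls for brightness and CIGaussianBlur for blur. The method name and `ciContext` property below are assumptions for illustration; reusing a single CIContext across pan updates is what keeps this fast enough for real time:

```objc
// Map the pan translation (brightness, radius) to a filtered image.
- (UIImage *)filteredImageWithBrightness:(CGFloat)brightness
                              blurRadius:(CGFloat)radius
{
    CIImage *input = [CIImage imageWithCGImage:self.originalImage.CGImage];

    CIFilter *color = [CIFilter filterWithName:@"CIColorControls"];
    [color setValue:input forKey:kCIInputImageKey];
    [color setValue:@(brightness) forKey:kCIInputBrightnessKey];

    CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
    [blur setValue:color.outputImage forKey:kCIInputImageKey];
    [blur setValue:@(radius) forKey:kCIInputRadiusKey];

    // Crop back to the original extent, since the blur pads the edges.
    CGImageRef cgImage =
        [self.ciContext createCGImage:blur.outputImage fromRect:input.extent];
    UIImage *result = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return result;
}
```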
Stitching it together
It wasn’t enough to simply stack the views, as the topmost view (in our case, the UITextView) would capture all the touch gestures. To get around this, I had to override hitTest in a UITextView subclass and ask the containing view controller to decide who should respond to the touch.
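A sketch of what that override can look like – the `touchDelegate` property and its method are hypothetical names standing in for the actual ones in the repo:

```objc
// UITextView subclass: when the touch misses the text itself, defer
// to the containing view controller for the view that should get it.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *hit = [super hitTest:point withEvent:event];
    if (hit == self) {
        UIView *override = [self.touchDelegate textView:self
                                    viewForTouchAtPoint:point];
        if (override) {
            return override;
        }
    }
    return hit;
}
```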
The containing view controller can in turn ask the ColorChooser and PhotosEditor view controllers to see if they wanted to handle the event:
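One way that arbitration might look inside the container (property and method names are hypothetical):

```objc
// KTSecretViewController: forward the point to whichever chooser
// is currently on screen, since only one is visible at a time.
- (UIView *)textView:(UITextView *)textView
 viewForTouchAtPoint:(CGPoint)point
{
    if (!self.photosEditorViewController.view.hidden) {
        return self.photosEditorViewController.view;
    }
    return self.colorChooserViewController.collectionView;
}
```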
So the classes are organized like this, with KTSecretViewController as the facade and mediator for touch event propagation.
This exercise of replicating Secret’s text view proved once again that the more intuitive and natural a UI control appears, the greater the effort and time that must have gone into its design and implementation. Thanks to the team at Secret and all fellow iOS developers out there who work hard to explore new patterns of interaction. Your efforts are well appreciated.
Do check out this post where @natashatherobot replicates Secret’s text animation to great effect.
Again, the source code for this post is up in this repo – https://github.com/kenshin03/KTSecretTextView. Please feel free to fork it, raise issues, or send suggestions. Thanks!