[Rails] Mask an image using ImageMagick, Paperclip and S3

A project I’m currently working on requires image masking. I had already developed the site with plans to deploy to Heroku, using Paperclip for image attachments and Amazon S3 for storage (you can read how I set that up here). I scoured the web for existing tutorials and documentation but found little relevant to my situation. My first inclination was to write a post-process method – grab the image, mask it and write it back to S3 – but that proved to be a dead end (you can see the problem I ran into here; granted, it’s possible you could still go that route). After a bit of Googling I ended up using a Paperclip processor. Here’s what I did (I must confess the “boilerplate” code for the processor was something I found; unfortunately I did not keep the link – if you recognize it, please let me know!):

In my model:

  has_attached_file :image,
    :styles => {
      :main_feature => { :geometry => "1020x470", :processors => [:masker] },
      :large => "1020x470",
      :event_page => "460x212",
      :top_feature => "345x159",
      :smallest => "229x131"
    },
    :storage => :s3,
    :s3_credentials => "#{Rails.root}/config/s3.yml",
    :path => ":attachment/:id/:style.:extension",
    :url => "/:id/:style/:basename.:extension",
    :bucket => "yo-bucket-name"

Note line 3, where I’ve added a processor called “masker”. I created a folder called ‘paperclip_processors’ inside my lib directory and created masker.rb. In that same folder I included the png of my mask (mine is simply called mask.png). I’m using an alpha mask. In masker.rb I placed the following code:

module Paperclip
  class Masker < Processor
    def initialize(file, options = {}, attachment = nil)
      super
      @format = File.extname(@file.path)
      @basename = File.basename(@file.path, @format)
    end

    def make
      src = @file
      dst = Tempfile.new([@basename, @format])
      dst.binmode

      begin
        parameters = []

        parameters << ':source'
        parameters << ':mask'
        parameters << '-alpha'
        parameters << 'on'
        parameters << '-compose'
        parameters << 'CopyOpacity'
        parameters << '-composite'
        parameters << ':dest'

        parameters = parameters.flatten.compact.join(" ").strip.squeeze(" ")

        mask_path = File.expand_path('lib/paperclip_processors/mask.png')

        # Builds and runs a command along the lines of:
        #   convert /path/to/source.png[0] /path/to/mask.png[0] -alpha on -compose CopyOpacity -composite /path/to/dst.png
        success = Paperclip.run("convert", parameters,
                                :source => "#{File.expand_path(src.path)}[0]",
                                :mask   => "#{mask_path}[0]",
                                :dest   => File.expand_path(dst.path))
      rescue PaperclipCommandLineError => e
        raise PaperclipError, "There was an error during the mask for #{@basename}" if @whiny
      end

      dst
    end
  end
end

The parameters array holds the arguments we pass to ImageMagick’s convert command (the documentation for this is found here). Basically we’re feeding in the source image and the mask, turning the alpha flag on, and using -compose with the CopyOpacity method to copy the opacity of the mask onto the final composite (masked) image, which is then written to the destination. Now when an image is added to my model, a new masked image is sized and created – and best of all, it works on Heroku!

iOS 6 UIRefreshControl – Pull To Refresh Like Mail App

In your UITableViewController’s viewDidLoad:

self.refreshControl = [[UIRefreshControl alloc] init];    
[self.refreshControl addTarget:self action:@selector(refreshView:) forControlEvents:UIControlEventValueChanged];

Refresh:

- (void)refreshView:(UIRefreshControl *)sender {
    //Refresh...
    NSLog(@"Refresh");
    
    [sender endRefreshing];
}
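
UIRefreshControl also has a couple of cosmetic properties you can set in viewDidLoad if you’d like (my addition, not part of the linked answer):

self.refreshControl.tintColor = [UIColor darkGrayColor];
self.refreshControl.attributedTitle = [[NSAttributedString alloc] initWithString:@"Pull to refresh"];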

Source: http://stackoverflow.com/questions/12607015/uirefreshcontrol-ios-6-xcode

Extract (uncrush) Images from an iOS App

Often I’ll see an app incorporate a new UI implementation that really impresses me. Still being a learner, I always wonder how these elements are coded or put together: is it an image trick? Maybe a CAGradientLayer? How are they able to achieve so much speed? It’s times like these that it helps to get a small peek under the hood. Like Mac apps, iOS apps are just bundles; an IPA is basically a zip file, so you can change the extension to ZIP and unarchive the app very easily. The limitation here is that Xcode compresses (or ‘crushes’) the PNGs used in an app to keep the file size down. Luckily, since iOS 3.2, the command line tool that compresses these images (a modified pngcrush) has also been able to decompress them. Peter Boctor (of iDevRecipes) wrote a Ruby script called App Crush that would uncrush the PNG files (with instructions here), but it hasn’t been updated in about a year. Since then, Apple has changed how Xcode is deployed (through the App Store these days) and where it gets installed. It’s very easy to update the Ruby script to find the new Xcode, and several people have. I wanted to make it just a tad easier, so I created an AppleScript droplet that lets you drag the IPA onto it and uncrushes the PNG files. You can download it here. I’ve also made this version’s source available here. This is my first attempt at an AppleScript droplet (which seems like a dying tech), so if you see something that could be done better, let me know!

Usage instructions:

  1. If you don’t know the location of the IPA (app) you’d like to uncrush, open iTunes and click the apps tab, right click on the app you’d like and click “Reveal in Finder”
  2. Drag IPA onto the App Uncrusher
  3. It’ll take up to a few minutes to finish and you’ll have a folder called “APPNAME images” on your desktop that should be full of pngs.

Be ethical with this. Don’t steal another app’s images and call them your own.

Load it Faster: Speed Up Your iOS App Loading Time

I’m always thinking about how to make my apps run and load faster. It’s incredibly important, especially to your users. Earlier today I ran across a Twitter conversation (http://twitter.com/flyosity/status/239044820394471424) between a few guys I consider to be top notch: Sam Soffes, Jake Marsh and Mike Rundle. They were discussing best practices for getting your app’s load time down.

There’s a great article by Brent Simmons in which he discusses his methods for making Glassboard 2.2 load faster, it’s available here:
http://inessential.com/2012/08/23/an_iphone_app_startup_performance_techni

Mike also pointed out Sam’s code from Cheddar:
https://github.com/nothingmagical/cheddar-ios/blob/master/Classes/CDIAppDelegate.m#L68

In his didFinishLaunching method, he performs only the most essential tasks and puts everything else in an async queue to run in the background without blocking the main thread. This frees the app up to get the UI and other elements rolling.
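
A minimal sketch of that pattern (my own illustration, not Sam’s actual code – the helper methods are placeholders):

- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    // Do only what's needed to get the window and first view on screen.
    self.window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
    self.window.rootViewController = self.viewController; // whatever gets your UI up
    [self.window makeKeyAndVisible];

    // Defer everything non-essential so it never blocks the main thread.
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        [self setUpAnalytics];   // placeholder for non-essential work
        [self warmCaches];       // placeholder

        dispatch_async(dispatch_get_main_queue(), ^{
            // Hop back to the main queue for anything that touches UIKit.
        });
    });

    return YES;
}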

On top of this optimization you can do a few things to make your app APPEAR to load faster – namely, a proper Default.png. It’s easy to fall into the temptation to create a cool splash screen, but it’s my belief that a good Default.png helps “lead” your user into the app and reduces the perceived load time. Apple recommends (http://developer.apple.com/library/ios/#DOCUMENTATION/iPhone/Conceptual/iPhoneOSProgrammingGuide/App-RelatedResources/App-RelatedResources.html) you use a screenshot of your initial view with the text and buttons removed. Here are a few examples of good Default.pngs:

WordPress

Colloquy

Cheddar

In reality, the Default.png will only show for a second or so, but when done properly – in conjunction with a few of the techniques mentioned by the other guys – it can help make your app’s load feel snappier.

Using custom fonts on iOS (iPhone iPad)

This is a quick walk-through on using a font other than those supplied by Apple on an iOS device.

For reference purposes, you can find out what fonts are available to you “out of the box” by checking out this comprehensive list:
http://iosfonts.com/

I’m going to be using Bebas for my example, a great font created by Dharma Type. You can pick it up here: http://www.dafont.com/bebas.font or use a font of your own choice. It’s important to note you should check a font’s license before you use it in an app you intend to distribute in the app store.

If your font’s not installed on your Mac, go ahead and install it. Before we get too deep into coding and while you’re in or around Font Book let’s go ahead and get the PostScript name of your font. You can do this by selecting your font from the list inside of Font Book and pressing Command + I to toggle the font information. The right side of the window will look like this:

The PostScript name is listed at the top. With Bebas the PostScript name is simple – it’s just Bebas – but most are more complicated. Take the PTSans family, for example: PTSans-Regular through PTSans-CaptionBold. Keep this PostScript name handy, as we’ll reference it later.

Moving on let’s get the ttf file into an Xcode project.

I started with a Single View Application template; go ahead and get that going as normal. Inside my Supporting Files folder I’m going to create a group named “Fonts”. I’m going to drag BEBAS___.TTF into that group and make sure “Copy items into destination group’s folder (if needed)” is checked. Click Finish.

Next, open your app’s plist. Right click and add a row; we’re going to add the key “Fonts provided by application” (UIAppFonts), which is an array of the ttf font files. Toggle that down and for Item 0 add BEBAS___.TTF.

Now head over to your project’s Build Phases tab. Click “Copy Bundle Resources”, then click the + icon to add a new item and choose BEBAS___.TTF.

Now, when your window looks like this, you’re ready to use the font in the application:

I put some simple code to create a UILabel in my viewDidLoad method like this:

    UILabel *bebasFlavoredLabel = [[UILabel alloc] initWithFrame: CGRectMake(0, 0, 320, 44)];
    bebasFlavoredLabel.text = @"Bebas on iPhone";
    [bebasFlavoredLabel setFont: [UIFont fontWithName:@"Bebas" size:15]];

    [self.view addSubview: bebasFlavoredLabel];

On line 3 you see where we use [UIFont fontWithName:@"Bebas" size:15]. The name you use there is the PostScript name you found at the beginning. Go ahead and run:

Voila! Your font is ready to be used as you wish!
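
If the label falls back to the system font instead, a quick sanity check (my addition, not part of the original walkthrough) is to log every font name the app can actually see and make sure your PostScript name made it into the bundle:

// Logs each font family and the PostScript names it contains.
for (NSString *family in [UIFont familyNames]) {
    for (NSString *fontName in [UIFont fontNamesForFamilyName:family]) {
        NSLog(@"%@: %@", family, fontName);
    }
}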

Twitter on iOS: Tweeting a Tweet, The TweetSheet

Since iOS 5, tweeting from your app has never been easier.

To get started, add the Twitter.framework to your project:





Next, import Twitter.h into your ViewController:

#import <Twitter/Twitter.h>


You’ll probably want to create a new action on a UIButton, so we’ll call that tweetButtonPress:


- (void)tweetButtonPress:(id)sender {
    
}

Inside that method we’ll create a TWTweetComposeViewController; this is what’s referred to as the “TweetSheet”:

if ([TWTweetComposeViewController canSendTweet]) {
    // Create the tweet sheet
    TWTweetComposeViewController *tweetSheet = [[TWTweetComposeViewController alloc] init];

    // Set the initial text of the tweet
    [tweetSheet setInitialText:@"Hello Twitter World"];

    // Add a completion handler for the tweet sheet
    tweetSheet.completionHandler = ^(TWTweetComposeViewControllerResult result){
        [self dismissModalViewControllerAnimated:YES];
    };

    // Show the tweet sheet
    [self presentModalViewController:tweetSheet animated:YES];
}

else {
    NSLog(@"Handle inability to send Tweet");
}

Most of this is pretty straightforward. On line 6 we set the initial text of the Tweet; this is user editable, but you can certainly offer a “recommended tweet”. On line 1 we check that the user can even send a Tweet. There are a few reasons they might not be able to: they may not have a Twitter account set up on their device, or they may not currently have an internet connection – you can handle these issues however you please in the else starting on line 17. On lines 9-11 we set up a completion handler that dismisses the TweetSheet when the user is done. Finally, on line 14 we present the TweetSheet.
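
One thing the snippet above glosses over: the result parameter tells you whether the user actually sent the tweet or cancelled. A small sketch of a fuller handler (my addition, not from the original post):

tweetSheet.completionHandler = ^(TWTweetComposeViewControllerResult result){
    if (result == TWTweetComposeViewControllerResultDone) {
        NSLog(@"Tweet sent");
    } else if (result == TWTweetComposeViewControllerResultCancelled) {
        NSLog(@"Tweet cancelled");
    }
    [self dismissModalViewControllerAnimated:YES];
};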

Now, there are a few additional methods you can call to attach images and links to the user’s tweets.

To add a URL to the Tweet:

[tweetSheet addURL:[NSURL URLWithString:@"http://jaysonlane.net"]];

Twitter will automatically shorten this to a t.co link.

To add an image to the Tweet:

[tweetSheet addImage:[UIImage imageNamed:@"image.png"]];

And Twitter will handle the uploading for you. These attachments are displayed on the right side of the TweetSheet, held on by the paperclip graphic, to let the user know they’ve been added.

Lock screen “Now Playing” with MPNowPlayingInfoCenter

Note: Example project is available here

One of the great additions iOS 5 brought us is the ability to display meta data about currently playing tracks on the lock screen. This is a great convenience to users and a must if your app has background audio playing. I know I use this to great extent, especially when driving. Best of all, it’s actually quite simple to get going.

For the sake of this tutorial, we’re going to be focusing mainly on the MPNowPlayingInfoCenter and not much on how to play streaming audio. If you have questions, as always, please feel free to leave a comment. I, as I’ve stated in the past, am still fairly new to the iOS/Objective-C world so if you see something that makes you say ‘UR DOING IT WRONG!’, please let me know. If you’d like to review the details of MPNowPlayingInfoCenter, you can read the Apple documentation.

To get started, create a new project. I created one using the single view template, but feel free to do whatever you’d like. Once you’ve created the project there are a few things we need to do to get it set up. By default, audio playback won’t persist if the application leaves the foreground; we need to tell iOS we’d like to play background audio. To do this, open up your application’s Info.plist file and add a new row: “Required background modes” (UIBackgroundModes). This creates an array; for Item 0, change the value to “App plays audio” (audio). I’ve got a screen shot of what this should look like:

Next, there are a few frameworks we need to link to our project:

  1. AVFoundation.framework
  2. MediaPlayer.framework
You can do this by clicking on your project, selecting the Target and expanding “Link Binary With Libraries”, then clicking the + at the bottom and typing those names. Once you’re done, it should look like this:

Now we’re ready to start playing audio. In my view controller, I’ve created a simple IBOutlet UIButton called playButton and linked it in the nib. I’ve also attached an IBAction, playButtonPress, to the button’s touch up inside event.

You can see those items in my ViewController.h:


@interface ViewController : UIViewController {

IBOutlet UIButton *playButton;

}

@property (nonatomic, retain) IBOutlet UIButton *playButton;

-(IBAction)playButtonPress:(id)sender;

Next I’m going to import the MPMoviePlayerController header to add a player to my view controller class:

#import <MediaPlayer/MPMoviePlayerController.h>

To add the player controller we’ll add these lines to the ViewController.h:

MPMoviePlayerController *audioPlayer;

and

@property (nonatomic, retain) MPMoviePlayerController *audioPlayer;

Swing over to your ViewController.m file and we’re going to add a few more headers:

#import <MediaPlayer/MPNowPlayingInfoCenter.h>
#import <MediaPlayer/MPMediaItem.h>
#import <AVFoundation/AVFoundation.h>

And we’ll synthesize our playButton and audioPlayer:

@synthesize playButton, audioPlayer;

In viewDidLoad we’ll initialize our audio session and audioPlayer and pre-load it with content from the web. We’re going to use Oliver Drobnik’s Cocoanetics podcast as our audio feed for this tutorial. Oliver has an awesome podcast packed with great info for iOS developers, you can find out more about it at http://cocoanetics.com.


    [[AVAudioSession sharedInstance] setDelegate:self];

    NSError *myErr;

    // Initialize the AVAudioSession here.
    if (![[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&myErr]) {
        // Handle the error here.
        NSLog(@"Audio Session error %@, %@", myErr, [myErr userInfo]);
    }
    else {
        // Since there were no errors initializing the session, begin receiving remote control events
        [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    }

    // Initialize our audio player
    audioPlayer = [[MPMoviePlayerController alloc] initWithContentURL:[NSURL URLWithString:@"http://www.cocoanetics.com/files/Cocoanetics_031.mp3"]];

    [audioPlayer setShouldAutoplay:NO];
    [audioPlayer setControlStyle:MPMovieControlStyleEmbedded];
    audioPlayer.view.hidden = YES;

    [audioPlayer prepareToPlay];

It’s important we include the beginReceivingRemoteControlEvents call, as this is what tells iOS we want to receive remote control events from the lock screen (i.e. backward, play, pause, forward). I’ve found that without that line your information will never display (here’s a link to my Stack Overflow question chronicling my journey to that discovery).
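
The example project stops at registering for those events, but if you also want the lock screen’s play/pause buttons to actually control your player, a minimal sketch (my addition, assuming the same audioPlayer ivar) looks like this:

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    [self becomeFirstResponder]; // required to receive remote control events
}

- (BOOL)canBecomeFirstResponder {
    return YES;
}

- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    if (event.type == UIEventTypeRemoteControl) {
        switch (event.subtype) {
            case UIEventSubtypeRemoteControlPlay:
            case UIEventSubtypeRemoteControlTogglePlayPause:
                [audioPlayer play];
                break;
            case UIEventSubtypeRemoteControlPause:
                [audioPlayer pause];
                break;
            default:
                break;
        }
    }
}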

Moving onward, next we need to create our action that actually plays the audio and posts the information to the lock screen:


- (IBAction)playButtonPress:(id)sender {
    [audioPlayer play];

    Class playingInfoCenter = NSClassFromString(@"MPNowPlayingInfoCenter");

    if (playingInfoCenter) {
        NSMutableDictionary *songInfo = [[NSMutableDictionary alloc] init];

        MPMediaItemArtwork *albumArt = [[MPMediaItemArtwork alloc] initWithImage:[UIImage imageNamed:@"AlbumArt"]];

        [songInfo setObject:@"Audio Title" forKey:MPMediaItemPropertyTitle];
        [songInfo setObject:@"Audio Author" forKey:MPMediaItemPropertyArtist];
        [songInfo setObject:@"Audio Album" forKey:MPMediaItemPropertyAlbumTitle];
        [songInfo setObject:albumArt forKey:MPMediaItemPropertyArtwork];

        [[MPNowPlayingInfoCenter defaultCenter] setNowPlayingInfo:songInfo];
    }
}

Let’s walk through this. The NSClassFromString check makes sure the MPNowPlayingInfoCenter class exists, since this functionality was only added in iOS 5. Next, we create an NSMutableDictionary called songInfo that will contain the information for the lock screen, along with an MPMediaItemArtwork item that stores the album art image. The setObject:forKey: calls set the title, artist, album and artwork – these are the properties that display on the lock screen. Something to keep in mind: there are several 3rd party peripherals that interact with the iPhone/iPod and play music, and they may pull this information too. There are a few additional properties you can assign that may be accessed by other devices; you can read about those in the Apple docs. Last and certainly not least, the magic happens on the final line as we hand songInfo to the default MPNowPlayingInfoCenter and the information appears on the lock screen. Voila!
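
For example (my addition, not in the example project), the duration, elapsed time and playback rate keys let the lock screen show playback progress:

[songInfo setObject:[NSNumber numberWithDouble:1800.0] forKey:MPMediaItemPropertyPlaybackDuration];       // total length in seconds
[songInfo setObject:[NSNumber numberWithDouble:0.0] forKey:MPNowPlayingInfoPropertyElapsedPlaybackTime];  // where playback currently is
[songInfo setObject:[NSNumber numberWithDouble:1.0] forKey:MPNowPlayingInfoPropertyPlaybackRate];         // 1.0 = playing at normal speed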

NSDateFormatter

NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];

[dateFormatter setDateFormat:@"yyyy-MM-dd"];

I’ve enjoyed PHP’s date() documentation but have yet to find comprehensive documentation for NSDateFormatter. I have not verified all of the following so if you find an error, leave a comment and I’ll try to get it fixed as quickly as possible. A ton of this information was gathered from Alex Curylo.
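
A quick round trip using a few of the specifiers below (my own example):

NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
[dateFormatter setDateFormat:@"EEEE, MMMM d, yyyy h:mm a"];

// Date -> string
NSString *formatted = [dateFormatter stringFromDate:[NSDate date]];
NSLog(@"%@", formatted); // e.g. "Thursday, August 23, 2012 4:05 PM"

// String -> date, using the same format
NSDate *parsed = [dateFormatter dateFromString:formatted];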

Character Description Example Returned Value
a Ante Meridiem and Post Meridiem AM/PM
A Millisecond of the Day 0..86399999
c/cc Numeric representation of day of the week 1..7
ccc Abbreviated day of the week Sun, Mon, Tue…
cccc Written day of the week Sunday, Monday, Tuesday…
d 0 padded Day of Month 1..31
D 0 padded Day of Year 01..366
e Day of Week with leading zero 01..07
E..EEE Abbreviated day of the week Sun, Mon, Tue…
EEEE Written day of the week Sunday, Monday, Tuesday…
F Week of Month, first day of week = Monday, with leading zero 1..5
g Julian Day Number (number of days since 4713 BC January 1)
G..GGG Era Designator Abbreviated BC, AD
GGGG Era Designator Before Christ, Anno Domini
h Hour (12 hr) with leading zero 1..12
H Hour (24 hr, starting at 0) with leading zero 0..23
k Hour (24 hr, starting at 1) with leading zero 1..24
K Hour (12 hr) with leading zero 0..11
m Minute with leading zero 0..59
s Second with leading zero 0..59
S Rounded sub-second
v..vvv General GMT Timezone Abbreviation GMT
vvvv General GMT Timezone Name Atlantic/Azores
z..zzz Specific GMT Timezone Abbreviation
zzzz Specific GMT Timezone Name
Z RFC 822 Timezone +0000
L..LL Month with leading 0 01..12
LLL Month abbreviation Jan, Feb, Mar…
LLLL Full Month January, February, March…
w Week of Year, 1st day of week is Sunday, 1st week of year starts from the last Sunday of last year, with leading zero 01..53
W Week of Month, 1st day of week = Sunday, with leading 0 01..05
M..MM Month of the year 1..12
MMM Month Abbreviated Jan, Feb, Mar…
MMMM Full Month January, February, March…
q..qq Quarter of the year 1..4
qqq Quarter abbreviated Q1, Q2, Q3, Q4
qqqq Quarter written out 1st quarter, 2nd quarter, 3rd quarter…
Q..QQ Quarter of the year 1..4
QQQ Quarter abbreviated Q1, Q2, Q3, Q4
QQQQ Quarter written out 1st quarter, 2nd quarter, 3rd quarter…
y/yyyy Full Year 2012, 2013, 2014…
yy..yyy 2 Digits Year 12, 13, 14…
Y/YYYY Full Year, starting from the Sunday of the 1st week of year 2012, 2013, 2014…
YY/YYY 2 Digits Year, starting from the Sunday of the 1st week of year
u Year

Location with iBooks Author

[Updated 3/9/12: Dashcode is no longer supplied with Xcode 4.3+; you can download it here: https://developer.apple.com/downloads/index.action. I’ve also added some screen shots to help clarify the process. Feel free to tweet me any questions you have @jaysonlane]

iBooks Author was released just a few days ago. I mostly ignored it at first, but the more I hear about it on Twitter and blogs, the more interested I become. Today I read an article about the possibility of using location to create a more interactive book. After spending a few minutes playing around, I’ve decided to write a little about how you do that.

iBooks Author uses Dashcode for widgets (they’re .wdgt files). Word of warning: I’m not very familiar with them and I’ve only created a basic widget once in the past, but this is a pretty simple implementation and it’s quite easy to get going. Because Dashcode widgets use JavaScript, you can simply use the same location code you’d use for a mobile website.

To get going, fire up Dashcode and create a new widget, I chose the custom template:

Once you’ve done that, you’ll see a blank “Hello World” widget — so far, so good:

Next, we’ll need to edit the JavaScript of the project. You can do this by clicking View >> Source Code. This should open you right into the main.js file. In the main.js load() function, just after the dashcode.setupParts() call, add the following:

navigator.geolocation.getCurrentPosition(getLoc); 

And then at the bottom, or wherever you’d like, add this function:


function getLoc(position) {
    alert(position.coords.latitude);
    alert(position.coords.longitude);
}

Don’t worry about running this in Dashcode – in my experience it did nothing there, and that’s OK. Scroll to the bottom of the “Workflow Steps” toolbar on the lower left side and click “Test & Share”:

Then click the Play icon beside Share:

Now choose a name and click “Save to Disk” at the bottom. Save your wdgt file; we’re all done with Dashcode.

 

Next, open iBooks author and create a new book. Once you’ve created your book, click the Widgets drop-down in the toolbar and select HTML:

Click the Edit HTML button at the bottom of the widget:

Now simply drag your wdgt into the HTML box, your page will now look like this:

Now it’s time to test on your iPad, where you’ll see your latitude and longitude. From there you can couple this raw data with an AJAX request to convert it into some usable, relevant text to delight (or completely creep out) your reader.