Using the Management API on iOS

This post will walk you through implementing a simple iOS app for uploading images from the device's Camera Roll to Contentful.

As with the CDA SDK, you can simply install the CMA SDK via CocoaPods:

platform :ios, '7.0'
pod 'ContentfulManagementAPI'

In the project's README, you can also find explanations for other options, like Git submodules or a pre-built static framework.

After you have successfully installed the library, you need to create a client object. For this, you first need a Content Management API token, which can easily be obtained in the developer center.

With it, you can instantiate a client object:

CMAClient* client = [[CMAClient alloc] initWithAccessToken:@"your-token"];

Next, we need to fetch a space to use. Unlike with the Content Delivery API, we can simply list all spaces the given account has access to:

[client fetchAllSpacesWithSuccess:^(CDAResponse *response, CDAArray *array) {
    self.spaces = array.items;

    [self.tableView reloadData];
} failure:^(CDAResponse *response, NSError *error) {
    NSLog(@"Error: %@", error);
}];

In our app, the spaces are displayed as a list of names in a table view.
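For reference, the data source for that screen could look like the following sketch. It assumes self.spaces holds the fetched spaces from the previous snippet, that a cell with the reuse identifier "SpaceCell" is registered, and that each space exposes a name property (as CDASpace, which CMASpace builds on, does):

```objc
- (NSInteger)tableView:(UITableView *)tableView numberOfRowsInSection:(NSInteger)section {
    return self.spaces.count;
}

- (UITableViewCell *)tableView:(UITableView *)tableView
         cellForRowAtIndexPath:(NSIndexPath *)indexPath {
    UITableViewCell *cell = [tableView dequeueReusableCellWithIdentifier:@"SpaceCell"
                                                            forIndexPath:indexPath];

    // Display the space's name; "SpaceCell" is a hypothetical reuse identifier.
    CMASpace *space = self.spaces[indexPath.row];
    cell.textLabel.text = space.name;
    return cell;
}
```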

Once we have selected a specific space, we can create resources, in our case an asset. This can be done using the -createAssetWithTitle:description:fileToUpload:success:failure: method. However, it requires a URL to a file, and at the moment we only have a local image in the photo library. For a temporary upload, we use the BBUUploadsImUploader Pod, which gives us back a file URL, just as we need:

[[BBUUploadsImUploader sharedUploader] uploadImage:someImage
                                 completionHandler:^(NSURL *uploadURL, NSError *error) {
  if (!uploadURL) {
    NSLog(@"Error: %@", error);
    return;
  }

  NSLog(@"URL of uploaded image: %@", uploadURL);
}];

We can obtain the latest photo from the Camera Roll using the AssetsLibrary framework quite easily:

-(void)fetchLatestPhotoWithCompletionHandler:(void (^)(UIImage* latestPhoto, NSError* error))handler {
  NSParameterAssert(handler);

  ALAssetsLibrary *library = [ALAssetsLibrary new];

  [library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos usingBlock:^(ALAssetsGroup *group,
  BOOL *stop) {
    [group setAssetsFilter:[ALAssetsFilter allPhotos]];

    [group enumerateAssetsWithOptions:NSEnumerationReverse usingBlock:^(ALAsset *alAsset,
    NSUInteger index,
    BOOL *innerStop) {
      if (alAsset) {
        ALAssetRepresentation *representation = [alAsset defaultRepresentation];
        UIImage *latestPhoto = [UIImage imageWithCGImage:[representation fullScreenImage]];

        *stop = YES; *innerStop = YES;

        handler(latestPhoto, nil);
      }
    }];
  } failureBlock: ^(NSError *error) {
    handler(nil, error);
  }];
}

Using the upload URL, we can create our asset and start the processing of the image:

[self.space createAssetWithTitle:@{ @"en-US": @"Some image caption" }
                     description:@{ @"en-US": @"Upload from iOS" }
                    fileToUpload:@{ @"en-US": uploadURL.absoluteString }
                         success:^(CDAResponse *response, CMAAsset *asset) {
    [asset processWithSuccess:^{
        NSLog(@"Upload successful.");
    } failure:^(CDAResponse *response, NSError *error) {
        NSLog(@"Error: %@", error);
    }];
} failure:^(CDAResponse *response, NSError *error) {
    NSLog(@"Error: %@", error);
}];

As you can see, the Content Management API allows us to specify values for different locales in a single API call. Keep in mind that -processWithSuccess:failure: is asynchronous and will not report errors from the actual image processing back to you. If processing was successful, you can publish the resulting asset from the Contentful web app as you normally would.
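If you would rather publish from code as well, the CMA SDK also offers a publish method on resources. A minimal sketch, assuming -publishWithSuccess:failure: is available on CMAAsset and that processing has already finished (publishing an unprocessed asset will fail on the API side):

```objc
// Sketch: publish the asset we just created and processed.
// Note: processing is asynchronous, so in a real app you may need to
// re-fetch the asset or retry before publishing succeeds.
[asset publishWithSuccess:^{
    NSLog(@"Asset published.");
} failure:^(CDAResponse *response, NSError *error) {
    NSLog(@"Publish failed: %@", error);
}];
```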

such content, very uploading, wow

When building the app, be aware that it will automatically upload the latest image from your Camera Roll. The SDK is available on GitHub, and the API documentation is on CocoaDocs.