Ben Ford

benford.me

UIAutomation on the Command Line

You can run UIAutomation scripts from the command line. It doesn’t seem to be very well documented by Apple. Here are my notes.

The command to run UIAutomation from the command line is:

instruments -t <template> <path to app in simulator> -e UIASCRIPT <path to automation script>

A real-world example:

instruments -t /Applications/Xcode.app/Contents/Applications/Instruments.app/Contents/PlugIns/AutomationInstrument.bundle/Contents/Resources/Automation.tracetemplate \
"/Users/ben/Library/Application Support/iPhone Simulator/7.1/Applications/4E07A63F-F14A-482B-9E50-E18EB29D668C/InstrumentsPlayground.app" \
-e UIASCRIPT "~/TestDemo.js"    

Path to App in Simulator

To find the app’s path in the simulator, run something like this in your app and check the console output:

NSLog(@"bundle path: %@", [[NSBundle mainBundle] bundlePath]);

How to Find the UIAutomation Template Path

To find the UIAutomation template path, call:

instruments -s

You will get something like this. Find the path that references Automation.

/Applications/Xcode.app/Contents/Applications/Instruments.app/Contents/Resources/templates/Activity Monitor.tracetemplate
/Applications/Xcode.app/Contents/Applications/Instruments.app/Contents/Resources/templates/Allocations.tracetemplate
/Applications/Xcode.app/Contents/Applications/Instruments.app/Contents/Resources/templates/Blank.tracetemplate
/Applications/Xcode.app/Contents/Applications/Instruments.app/Contents/Resources/templates/Counters.tracetemplate
/Applications/Xcode.app/Contents/Applications/Instruments.app/Contents/Resources/templates/Event Profiler.tracetemplate
/Applications/Xcode.app/Contents/Applications/Instruments.app/Contents/Resources/templates/Leaks.tracetemplate
/Applications/Xcode.app/Contents/Applications/Instruments.app/Contents/Resources/templates/Network.tracetemplate
/Applications/Xcode.app/Contents/Applications/Instruments.app/Contents/Resources/templates/System Trace.tracetemplate
/Applications/Xcode.app/Contents/Applications/Instruments.app/Contents/Resources/templates/Time Profiler.tracetemplate
/Applications/Xcode.app/Contents/Applications/Instruments.app/Contents/PlugIns/AutomationInstrument.bundle/Contents/Resources/Automation.tracetemplate
/Applications/Xcode.app/Contents/Applications/Instruments.app/Contents/PlugIns/OpenGLESAnalyzerInstrument.bundle/Contents/Resources/OpenGL ES Analysis.tracetemplate
/Applications/Xcode.app/Contents/Applications/Instruments.app/Contents/PlugIns/XRMobileDeviceDiscoveryPlugIn.bundle/Contents/Resources/Energy Diagnostics.tracetemplate
/Applications/Xcode.app/Contents/Applications/Instruments.app/Contents/PlugIns/XRMobileDeviceDiscoveryPlugIn.bundle/Contents/Resources/OpenGL ES Driver.tracetemplate
/Applications/Xcode.app/Contents/Applications/Instruments.app/Contents/PlugIns/XRMobileDeviceDiscoveryPlugIn.bundle/Contents/Resources/templates/Core Animation.tracetemplate
/Applications/Xcode.app/Contents/Applications/Instruments.app/Contents/PlugIns/XRMobileDeviceDiscoveryPlugIn.bundle/Contents/Resources/templates/System Usage.tracetemplate
/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/Library/Instruments/PlugIns/CoreData/Core Data.tracetemplate
/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/Library/Instruments/PlugIns/templates/Cocoa Layout.tracetemplate
/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/Library/Instruments/PlugIns/templates/Dispatch.tracetemplate
/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/Library/Instruments/PlugIns/templates/File Activity.tracetemplate
/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/Library/Instruments/PlugIns/templates/GC Monitor.tracetemplate
/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/Library/Instruments/PlugIns/templates/Multicore.tracetemplate
/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/Library/Instruments/PlugIns/templates/Sudden Termination.tracetemplate
/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/Library/Instruments/PlugIns/templates/UI Recorder.tracetemplate
/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/Library/Instruments/PlugIns/templates/Zombies.tracetemplate

This will probably change, but this is the current template path:

/Applications/Xcode.app/Contents/Applications/Instruments.app/Contents/PlugIns/AutomationInstrument.bundle/Contents/Resources/Automation.tracetemplate
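
Rather than hard-coding that long path, you can pull it out of the `instruments -s` listing. A minimal sketch (the inlined sample here stands in for real `instruments -s` output; on a real machine, pipe `instruments -s` into the grep directly):

```shell
# Hypothetical helper: find the Automation template path in `instruments -s` output.
list='/Applications/Xcode.app/Contents/Applications/Instruments.app/Contents/Resources/templates/Leaks.tracetemplate
/Applications/Xcode.app/Contents/Applications/Instruments.app/Contents/PlugIns/AutomationInstrument.bundle/Contents/Resources/Automation.tracetemplate'

# Keep only the line that mentions the Automation template.
TEMPLATE=$(printf '%s\n' "$list" | grep 'Automation\.tracetemplate')
echo "$TEMPLATE"
```

You can then pass `"$TEMPLATE"` as the `-t` argument, which keeps your scripts working if the path moves between Xcode releases.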

iBeacons in the Background

Standard Core Bluetooth advertisements can broadcast while the app is in the background, but not if they were started with a CLBeaconRegion dictionary. The workaround is to ditch the Core Location framework altogether and create your own proximity “framework” using only Core Bluetooth.

You still need to use the appropriate background specifiers in the Info.plist file (e.g. bluetooth-peripheral and bluetooth-central).
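
For example, the relevant Info.plist fragment looks like this (the specifiers live in the UIBackgroundModes array):

```xml
<!-- Info.plist: declare Bluetooth background execution modes -->
<key>UIBackgroundModes</key>
<array>
    <string>bluetooth-central</string>
    <string>bluetooth-peripheral</string>
</array>
```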

The code looks something like this:

1) Create a standard peripheral advertisement using CBPeripheralManager.

NSDictionary *advertisingData = @{CBAdvertisementDataLocalNameKey:@"my-peripheral",
                                  CBAdvertisementDataServiceUUIDsKey:@[[CBUUID UUIDWithString:identifier]]};

// Start advertising over BLE
[peripheralManager startAdvertising:advertisingData];

2) Use CBCentralManager to scan for that service, using the UUID you specified.

NSDictionary *scanOptions = @{CBCentralManagerScanOptionAllowDuplicatesKey:@(YES)};
NSArray *services = @[[CBUUID UUIDWithString:identifier]];

[centralManager scanForPeripheralsWithServices:services options:scanOptions];

3) In the CBCentralManagerDelegate method didDiscoverPeripheral, read the RSSI value of the advertisement.

- (void)centralManager:(CBCentralManager *)central didDiscoverPeripheral:(CBPeripheral *)peripheral
     advertisementData:(NSDictionary *)advertisementData RSSI:(NSNumber *)RSSI
{
    NSLog(@"RSSI: %d", [RSSI intValue]);
}

4) Translate the RSSI values into a distance.

- (INDetectorRange)convertRSSItoINProximity:(NSInteger)proximity
{
    // RSSI is negative; values closer to zero mean a stronger (nearer) signal.
    if (proximity < -70)
        return INDetectorRangeFar;
    if (proximity < -55)
        return INDetectorRangeNear;
    if (proximity < 0)
        return INDetectorRangeImmediate;

    // An RSSI of zero or above means there was no usable reading.
    return INDetectorRangeUnknown;
}

I found that I needed to “ease” or “average” the RSSI values to get anything workable. This is no different from working with any other sensor data (e.g. accelerometer data).

I have this concept fully working and hope to publish it somewhere at some point.

Also, use the docs (Core Bluetooth Programming Guide) if you get stuck.

Start With the Customer

One of my favorite Steve Jobs quotes is this:

“One of the things I’ve always found is, that, you’ve got to start with the customer experience and work backwards to the technology. You can’t start with the technology and figure out where you’re going to sell it. I’ve made this mistake more than anybody else in this room, and I have the scar tissue to prove it.” — Steve Jobs, 1997 WWDC.

It’s a simple, powerful concept. Start with the customer or the idea or the design, not the technology. Sure, the technology enables the creation of the idea, but the technology isn’t the product. The magic of software is in making that technology disappear. The challenge for us programmers is to solve problems the best way possible, despite limitations in the frameworks or platforms we depend upon.

Steve Jobs at the 1997 WWDC

Audience member: “Mr. Jobs, you’re a bright and influential man.”

Steve Jobs: “Here it comes.” [holds chair]

Audience member: “…on several accounts that you don’t know what you’re talking about. I would like, for example, for you to express in clear terms how, say, Java, in any of its incarnations, addresses the ideas embodied in OpenDoc. And when you’re finished with that, can you tell us what you personally have been doing for the past 7 years.”

[sits down]

Steve Jobs: [drinks]

“Uhh… You know, you can please some of the people some of the time. But one of the hardest things when you’re trying to effect change is that people like this gentleman are right in some areas. I’m sure there are some things OpenDoc does, probably even more that I’m not familiar with, that nothing else out there does. And I’m sure you could make some demos, maybe a small commercial app, that demonstrates those things. The hardest thing is: how does that fit into a cohesive, larger vision that’s going to allow you to sell 8 billion or 10 billion dollars of product a year?

One of the things I’ve always found is, that, you’ve got to start with the customer experience and work backwards to the technology. You can’t start with the technology and figure out where you’re going to sell it. I’ve made this mistake more than anybody else in this room, and I have the scar tissue to prove it. I know it’s the case. And as we have tried to come up with a strategy for Apple, it started with: what incredible benefits can we give to the customer, and where can we take the customer? Not starting with: let’s sit down with the engineers and figure out what awesome technology we have, and then figure out how we’re going to market that. I think that’s the right path to take.

I remember with the LaserWriter. We built the world’s first small laser printer, as you know, and there was awesome technology in that box: we had the first Canon laser printer engine in the United States; we had a very wonderful printer controller that we designed; we had Adobe’s PostScript software in there; we had AppleTalk in there. And I remember seeing the first printout come out of it, and just picking it up, and looking at it, and thinking, you know, we can sell this. Because you don’t have to know anything about what’s in that box; all we have to do is hold this up and go “do you want this?” And if you can remember back in 1984, before laser printers, it was pretty startling to see that. People went: “Whoa! Yes!” And that’s where Apple’s got to get back to, and I’m sorry that OpenDoc is a casualty along the way.

I readily admit there are many things in life that I don’t have the faintest idea what I’m talking about, so I apologize for that too. But there are a whole lot of people working super, super hard right now at Apple: Avi, John, Greeno, Fred, the whole team is working, burning the midnight oil, and hundreds of people below them, to execute on some of these things, and they’re doing their best. And some mistakes will be made, by the way. Some mistakes will be made along the way. That’s good, because at least some decisions are being made along the way, and we’ll find the mistakes, and we’ll fix them. And I think what we need to do is support that team going through this very important stage, as they work their butts off. They’re all getting calls being offered 3x as much money to go do this and that; the valley’s hot, and none of them are leaving. And I think we need to support them, and see them through this, and write some damn good applications to support Apple out in the market. Mistakes will be made, some people will be pissed off, some people will not know what they’re talking about, but I think it is so much better than where things were not very long ago, and I think we’re going to get there.”

NOTE: This video was posted by someone else on YouTube. I did my best to transcribe it by hand in an attempt to preserve it.

Function Over Color

Note: I originally published this post during the iOS 7 beta and decided to pull and republish it once iOS 7 was released. This post is a “coming to terms” with the radical color changes and flat design—which I originally didn’t like.

Judging iOS 7 by looks alone will lead to shallow conclusions. Both the color scheme and icon design are only skin deep—a superficial detail in the massive overhaul of the entire OS. The color isn’t what matters. What stands out in iOS 7 is how fluid things move; it feels very tangible and enjoyable to use.

The icons and adornments take advantage of the retina display, and the UI is full of sharp single-pixel details. This detail is surrounded by carefully placed negative space, which makes it easy for apps to prioritize the content over the UI. It’s a fantastic experience. Many UI elements exhibit far more depth than any two-dimensional screenshot could portray, including parallax and 3D effects that give depth and great variety to the flat UI.

The color palette is certainly unconventional, but this doesn’t affect the way it works or the way it feels. Color is a trend and will soon change. Everything else—the good parts—is here to stay.

The Macro Behind Nil

  • We all know and love nil, but what is it and how does it work?
  • How is nil different from NULL?
  • What gives nil the ability to respond to messages?

(That last one was a trick question.)

The three nothings

nil is the Objective-C version of the C language macro NULL, which is used to indicate a “null pointer”.

Some claim that nil is defined as (id)0, which would justify how it responds to messages. But if you look at the system headers, nil is actually defined as __DARWIN_NULL, which in Objective-C is defined as (void *)0.

So actually: nil, NULL, and (void *)0 are all the same thing.

If, like me, you find that this raises more questions than it answers, hold on; it will make sense momentarily.

The beginning

Both nil and NULL use the macro __DARWIN_NULL, which is defined in /usr/include/sys/_types.h.

When compiling Objective-C code, __DARWIN_NULL is defined as (void *)0.

Excerpt from /usr/include/sys/_types.h:

#ifdef __cplusplus
#ifdef __GNUG__
#define __DARWIN_NULL __null
#else /* ! __GNUG__ */
#ifdef __LP64__
#define __DARWIN_NULL (0L)
#else /* !__LP64__ */
#define __DARWIN_NULL 0
#endif /* __LP64__ */
#endif /* __GNUG__ */
#else /* ! __cplusplus */
#define __DARWIN_NULL ((void *)0)
#endif /* __cplusplus */

The reason for this chain of #ifdef statements is portability. In the C language standard, the actual value of NULL isn’t important; what matters is that it never compares equal to a valid (non-null) pointer. The value is implementation defined, meaning it may change depending upon the compiler in use.

The takeaway here is that when using clang, the compiler that ships with Xcode, nil and NULL are both a zero cast to void *.

What is void *

In C, void * can be assigned to and from any pointer type without an explicit cast. Objective-C programmers can think of void * as id; it serves the same purpose.

So void * is basically equivalent to id.

This is why, although nil is not explicitly defined as (id)0, it may as well be.

What is the point of nil, NULL, and (void *)0

To recap: nil is a convention and is identical to NULL. Both nil and NULL are macros for (void *)0.

These three statements are identical:

id nilObject1 = nil;
id nilObject2 = NULL;
id nilObject3 = (void *)0;

The three separate spellings exist purely for convention, and each should be used in a different context.

Conventions

In C, NULL is used to represent “null pointers”. It exists to allow a programmer to make the distinction between 0 and a null pointer. In Objective-C, nil replaces NULL as the way to represent “null pointers”.

Who receives messages

A quick aside on the id type: in Objective-C, id is a reference to an object of unknown type. The compiler will allow any method to be called on a value of type id, as long as that method has been declared somewhere.

The idea that the compiler lets you send any message to nil without an error is actually a feature of id, not nil. (At runtime, messages to nil are simply ignored by the Objective-C message dispatch.)

id is also like void * in that you never have to cast the r-value (i.e. the value being assigned) during an assignment.

Zero effort

Ultimately, the nil and NULL macros are defined as zero. They are only a convention to make code more readable.

nil is used for “null pointer” in Objective-C.

NULL is used for “null pointer” in C.

Technically you could use a plain 0 in place of either of them, but the conventions clearly illustrate intent.