Privacy Consent in Mojave (part 2: AppleScript)

This two-part series discusses lessons learned about managing user consent for access to private information by third-party programs in macOS Mojave. In Part 1, we saw how to query the user for consent to privacy-restricted areas, how to do it synchronously, and how to recover when your program has been denied consent.

Consent for automation (using AppleScript) is more complicated. You won’t know whether you can automate another application until you ask, and you won’t find out for sure unless the other application is running. The API for automation consent is not as well-crafted as the API for other privacy consent.

The source code for this article is the same project I used in Part 1. It is available at https://github.com/Panopto/test-mac-privacy-consent under an Apache license. The product that drove this demonstration needs automation control only for Keynote and PowerPoint, but the techniques apply to any other scriptable application. Note that this sample application is not sandboxed. You’ll need to add your own entitlements for AppleScript control if you need to be sandboxed; see https://developer.apple.com/library/archive/documentation/Miscellaneous/Reference/EntitlementKeyReference/Chapters/EnablingAppSandbox.html#//apple_ref/doc/uid/TP40011195-CH4-SW25.

 

You will want to think carefully about whether to ask your user for automation permission, and when to ask. You don’t want to bombard your customer with requests for control of applications that won’t be relevant to the task at hand. For the Panopto video recorder, we don’t ask for permission to control Keynote or PowerPoint until we see that someone is recording a presentation and is running Keynote or PowerPoint. If you’re running just Keynote, we won’t ask for PowerPoint access.

One other wrinkle for automation consent that’s different from media consent: you have only one string in your Info.plist to explain what you’re doing. You can have separate (localizable) strings to explain each of camera, microphone, calendar, and so on, but automation gets only one explanation, presented for each application you want to automate. You’ll have to be creative, perhaps adding a link to your own website with further explanation.
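That one string lives under a single Info.plist key. Here’s a sketch of the entry, assuming the NSAppleEventsUsageDescription key that Mojave consults for automation; the wording and the support URL are placeholders, not Panopto’s real copy:

<key>NSAppleEventsUsageDescription</key>
<string>Panopto uses automation to capture slide titles from Keynote and PowerPoint during a recording. See example.com/automation-help for details.</string>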

 

[Screenshot: the system’s automation consent alert]

The newer beta versions of macOS Mojave provide an API to query the automation consent status for a particular application: the C API AEDeterminePermissionToAutomateTarget(), defined in AppleEvents.h. You’ll call it with an Apple event descriptor, created either with Core Foundation or with NSAppleEventDescriptor. The descriptor targets one specific external application by its bundle identifier; you’ll need a different descriptor for each external application you want to control. Here’s how to set it up, using the C-style API just for fun (you were expecting Swift???):

 

- (PrivacyConsentState)automationConsentForBundleIdentifier:(NSString *)bundleIdentifier promptIfNeeded:(BOOL)promptIfNeeded
{
    PrivacyConsentState result;
    if (@available(macOS 10.14, *)) {
        AEAddressDesc addressDesc;
        // We need a C string here, not an NSString
        const char *bundleIdentifierCString = [bundleIdentifier cStringUsingEncoding:NSUTF8StringEncoding];
        OSErr createDescResult = AECreateDesc(typeApplicationBundleID, bundleIdentifierCString, strlen(bundleIdentifierCString), &addressDesc);
        if (createDescResult != noErr) {
            NSLog(@"Could not create descriptor for %@: %d", bundleIdentifier, createDescResult);
            return PrivacyConsentStateUnknown;
        }
        OSStatus appleScriptPermission = AEDeterminePermissionToAutomateTarget(&addressDesc, typeWildCard, typeWildCard, promptIfNeeded);
        AEDisposeDesc(&addressDesc);
        switch (appleScriptPermission) {
            case errAEEventWouldRequireUserConsent:
                NSLog(@"Automation consent not yet granted for %@, would require user consent.", bundleIdentifier);
                result = PrivacyConsentStateUnknown;
                break;
            case noErr:
                NSLog(@"Automation permitted for %@.", bundleIdentifier);
                result = PrivacyConsentStateGranted;
                break;
            case errAEEventNotPermitted:
                NSLog(@"Automation of %@ not permitted.", bundleIdentifier);
                result = PrivacyConsentStateDenied;
                break;
            case procNotFound:
                NSLog(@"%@ not running, automation consent unknown.", bundleIdentifier);
                result = PrivacyConsentStateUnknown;
                break;
            default:
                NSLog(@"%s switch statement fell through: %@ %d", __PRETTY_FUNCTION__, bundleIdentifier, appleScriptPermission);
                result = PrivacyConsentStateUnknown;
        }
        return result;
    }
    else {
        return PrivacyConsentStateGranted;
    }
}

There’s an unfortunate choice in AppleEvents.h: the definition of the result code errAEEventWouldRequireUserConsent is wrapped in an #ifdef that defines it only for macOS 10.14 and higher. I want my code to build and run on earlier releases too, so I’ve added my own conditional definition. If you do the same thing, you’ll probably have to fix your code when Apple fixes their header:

// !!!: Workaround for Apple bug. Their AppleEvents.h header conditionally defines errAEEventWouldRequireUserConsent and one other constant, valid only for 10.14 and higher, which means our code inside the @available() check would fail to compile. Remove this definition when they fix it.
#if __MAC_OS_X_VERSION_MIN_REQUIRED <= __MAC_10_14
enum {
    errAEEventWouldRequireUserConsent = -1744, /* Determining whether this can be sent would require prompting the user, and the AppleEvent was sent with kAEDoNotPromptForPermission */
};
#endif

Finally, let’s wrap this up in a shorter convenience call:

 

NSString *keynoteBundleIdentifier = @"com.apple.iWork.Keynote";

- (PrivacyConsentState)automationConsentForKeynotePromptIfNeeded:(BOOL)promptIfNeeded
{
    return [self automationConsentForBundleIdentifier:keynoteBundleIdentifier promptIfNeeded:promptIfNeeded];
}
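A hypothetical call site, using the gatekeeper singleton from Part 1, might query quietly first and defer the prompt until the user actually starts scripting Keynote:

// Hypothetical call site: query without prompting, then prompt only when
// the user takes an action that requires scripting Keynote.
PrivacyConsentState state = [[PrivacyConsentController sharedController] automationConsentForKeynotePromptIfNeeded:NO];
if (state != PrivacyConsentStateGranted) {
    // Defer the Keynote work, or call again with promptIfNeeded:YES
    // at a moment when the consent alert will make sense to the user.
}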

 

Caution: this code will not always give you a useful answer. If the automated program is not running, you won’t know the state of consent, even if consent was granted previously. You’ll want to test whether the automated program is running, react to changes in NSWorkspace’s list of running applications, or perhaps even launch the automated application yourself. It’s worth taking some time to experiment with the buttons in the sample application while your scripted app is running, not running, never queried for consent, or previously granted or denied consent. In particular, methods like showKeynoteVersion will not work correctly when the scripted application is not running.
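Checking for a running application is straightforward with NSRunningApplication. Here’s a minimal sketch (this helper is mine, not part of the sample project); you could also observe NSWorkspace’s runningApplications array with KVO:

// Returns YES if at least one instance of the given app is running.
- (BOOL)isApplicationRunningWithBundleIdentifier:(NSString *)bundleIdentifier
{
    return [NSRunningApplication runningApplicationsWithBundleIdentifier:bundleIdentifier].count > 0;
}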

 

[Screenshot: the sample application’s automation consent controls]

We can nag for automation consent, just as we do for camera and microphone consent. But the Security & Privacy Automation pane behaves differently: it does not prompt the user to restart your application. So let’s add a warning to the nag screen, in hopes of warding off at least a few support requests.
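The nag itself follows the same pattern as the media nag from Part 1 (shown later on this page); the only new piece is the sub-pane URL. A sketch; Privacy_Automation is one of the undocumented anchors found by guesswork, so verify it against your target OS:

- (void)launchPrivacyAndSecurityPreferencesAutomationSubPane
{
    [[NSWorkspace sharedWorkspace] openURL:[NSURL URLWithString:@"x-apple.systempreferences:com.apple.preference.security?Privacy_Automation"]];
}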

 

[Screenshot: the automation nag screen, with its restart warning]

Automation consent is more complicated than media and device consent. Felix Schwarz, Paulo Andrade, Daniel Jalkut, and several others have written about the incomplete feel of the API. This pair of posts is meant to show you how to ship software today, with the API we have today.


Privacy Consent in Mojave (part 1: media and documents)

macOS Mojave brings new user control over applications’ access to user data, camera, microphone, and AppleScript automation. This two-part series describes our experience adopting the new privacy requirements in the Panopto Mac Recorder. We needed to smooth out the process for camera, microphone, and AppleScript, but our approach will work for any of the dozen or so privacy-restricted information categories.

 

Because the Panopto Mac Recorder is a video and audio capture application, we need to comply with Camera and Microphone privacy consent. Any call to AVFoundation that would grant access to camera or microphone data triggers an alert from the system, and an opportunity for the user to grant or deny access. 

 

However, the view controller that needs camera and microphone access has multiple previews, and a live audio level meter. The calls from AVFoundation to request access are asynchronous. That means that bringing up that one view controller triggers six different alerts in rapid succession, each asking for camera or microphone access. That’s not a user experience we want to present.

 

I talked with Tim Ekl about the problem. He said that Omni Group was using a single gatekeeper object to manage all of their privacy consent requests. That’s the approach we decided to take. A singleton PrivacyConsentController is now responsible for handling all of the privacy consent requests, and for recovering from rejection of consent.
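A minimal sketch of the singleton entry point, matching the class name used in this post (the full class is in the repository linked below):

+ (instancetype)sharedController
{
    static PrivacyConsentController *sharedController = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        sharedController = [[PrivacyConsentController alloc] init];
    });
    return sharedController;
}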

 

The source code for PrivacyConsentController is available at https://github.com/Panopto/test-mac-privacy-consent under an Apache license.

 

The method -requestAccessForMediaType: on AVCaptureDevice requests access for audio and video devices. It takes a completion handler (as a block), which fires asynchronously after a one-time UI challenge. If the user has previously granted permission for access, the completion handler fires immediately. If it’s the first time requesting access, the completion handler fires after the user makes their choice.

 

For simplicity’s sake, we require that the user grant access to both the camera and the microphone before we proceed to the recording preview screen. We ask for audio access first, and then, in the completion handler, ask for camera access. Finally, in the completion handler for the camera request, we fire a developer-supplied block on the main thread.

 

We need to support macOS versions back through 10.11. So we’ll wrap the logic in an @available clause, and always invoke the completion handler with a consent status of YES for macOS prior to 10.14. We track the consent status in a property, with a custom PrivacyConsentState enum having values for granted, denied, and unknown. We use the custom enum because the AVAuthorizationStatus enum (returned by -authorizationStatusForMediaType:) is not defined prior to 10.14, and we want to know the status on earlier OS versions.
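A sketch of what that enum can look like, using the names that appear throughout this post:

typedef NS_ENUM(NSInteger, PrivacyConsentState) {
    PrivacyConsentStateUnknown = 0, // never asked, or the OS won't tell us
    PrivacyConsentStateDenied,
    PrivacyConsentStateGranted,
};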

 

There’s another complication, though. The user alert for each kind of privacy access (camera, microphone, calendar, etc.) is only presented once for each application. If the user clicked “grant”, that’s great, and we’re off and running. If they clicked “deny”, though, we’re stuck. We can’t present another request via the operating system, and we can’t bring up our recording preview.

 

Enter the nag screen. The nag screen points the user to the correct Security & Privacy pane. We show the nag screen (optionally, depending on a parameter to our gatekeeper method) from the completion handler if permission is not granted.

 

Putting it all together, here’s what the IBAction looks like for macOS 10.14, with the guard code in place, restricting access to the AVFoundation-heavy view controller until we get the consent we need. This code works all the way back to macOS 10.11.

 

- (IBAction)newRecording:(id)sender
{
    [[PrivacyConsentController sharedController] requestMediaConsentNagIfDenied:YES completion:^(BOOL granted) {
        if (granted) {
            [self openCreateRecordingView];
        }
    }];
}

- (void)openCreateRecordingView
{
    // Presentation of the AVFoundation-heavy view controller elided.
}

 

Here’s the entry point for media consent:

 

[Screenshot: the sample application’s media consent entry point]

- (void)requestMediaConsentNagIfDenied:(BOOL)nagIfDenied completion:(void (^)(BOOL))allMediaAccessGranted
{
    if (@available(macOS 10.14, *)) {
        [AVCaptureDevice requestAccessForMediaType:AVMediaTypeAudio completionHandler:^(BOOL granted) {
            if (granted) {
                self.microphoneConsentState = PrivacyConsentStateGranted;
            }
            else {
                self.microphoneConsentState = PrivacyConsentStateDenied;
            }
            [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
                if (granted) {
                    self.cameraConsentState = PrivacyConsentStateGranted;
                }
                else {
                    self.cameraConsentState = PrivacyConsentStateDenied;
                }
                if (nagIfDenied) {
                    dispatch_async(dispatch_get_main_queue(), ^{
                        [self nagForMicrophoneConsentIfNeeded];
                        [self nagForCameraConsentIfNeeded];
                    });
                }
                dispatch_async(dispatch_get_main_queue(), ^{
                    allMediaAccessGranted(self.hasFullMediaConsent);
                });
            }];
        }];
    }
    else {
        allMediaAccessGranted(self.hasFullMediaConsent);
    }
}

 

The call to -requestAccessForMediaType: is documented as taking some time to fire its completion handler. That is in fact the case when you’re asking for consent for the first time. But on the second and subsequent requests, the completion handler is in practice invoked immediately, with granted set to the user’s previous answer.
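If you want to know the current answer without any chance of prompting, 10.14 also lets you ask quietly. A sketch:

if (@available(macOS 10.14, *)) {
    // +authorizationStatusForMediaType: never prompts; NotDetermined means
    // the user has never been asked for this media type.
    AVAuthorizationStatus status = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
    if (status == AVAuthorizationStatusNotDetermined) {
        // Requesting access now will present the system alert.
    }
}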

 

Here’s a sample nag screen, to recover from a denial of consent:

 

[Screenshot: the sample nag screen]

- (void)nagForMicrophoneConsentIfNeeded
{
    if (self.microphoneConsentState == PrivacyConsentStateDenied) {
        NSAlert *alert = [[NSAlert alloc] init];
        alert.alertStyle = NSAlertStyleWarning;
        alert.messageText = @"Panopto needs access to the microphone";
        alert.informativeText = @"Panopto can't make recordings unless you grant permission for access to your microphone.";
        [alert addButtonWithTitle:@"Change Security & Privacy Preferences"];
        [alert addButtonWithTitle:@"Cancel"];

        NSInteger modalResponse = [alert runModal];
        if (modalResponse == NSAlertFirstButtonReturn) {
            [self launchPrivacyAndSecurityPreferencesMicrophoneSubPane];
        }
    }
}

 

How do we respond to the alert? By linking to a URL that is not officially documented, using the x-apple.systempreferences: scheme. I worked out the URLs by starting with the links at https://macosxautomation.com/system-prefs-links.html and applying some guesswork. You can see many of the URL targets I found in the source code at https://github.com/Panopto/test-mac-privacy-consent.

 

- (void)launchPrivacyAndSecurityPreferencesMicrophoneSubPane
{
    [[NSWorkspace sharedWorkspace] openURL:[NSURL URLWithString:@"x-apple.systempreferences:com.apple.preference.security?Privacy_Microphone"]];
}

Take note: when you’re working with camera, microphone, calendar, reminders, and the other media- and data-based consent categories, your program’s privacy consents will NEVER change from PrivacyConsentStateDenied to PrivacyConsentStateGranted within a single run of your program. The user must quit and restart your program for the control panel’s consent to take effect. For standard media/calendar/reminders consent, your users will see a reminder to quit and restart your app. We will see in the next post that this is NOT the behavior for AppleScript consent.

[Screenshot: the system’s reminder to quit and reopen the app]

For testing, use the command-line invocations “tccutil reset All”, “tccutil reset Camera”, “tccutil reset Microphone”, or “tccutil reset AppleEvents”.

Next up, in a separate post: how do we deal with AppleScript consent requests? It’s a bit more complicated.

Updating SceneKit WWDC 2013 slides for Xcode 7

With recent changes to the AppKit headers, you need to make a couple of changes to the WWDC 2013 SceneKit Slides code to get it to build. There are some cool examples in that year’s talk/sample code that didn’t make it into 2014’s.

In the ASCPresentationViewController, switch from a method declaration for the -view superclass override to a property in the header, and specify @dynamic for that property in the implementation.

@property (strong) SCNView *view;
//- (SCNView *)view;

@dynamic view;
//- (SCNView *)view {
//    return (SCNView *)[super view];
//}

I also updated the .xcodeproj to current standards, and fixed a couple of int/NSInteger/NSUInteger mismatches.

I’ve submitted it to Apple as rdar://23829155. In the meantime, here are the diffs:

diff --git a/SceneKit_Slides_WWDC2013/Scene Kit Session WWDC 2013/Sources/ASCPresentationViewController.h b/SceneKit_Slides_WWDC2013/Scene Kit Session WWDC 2013/Sources/ASCPresentationViewController.h
index 7d66316..bb0e54f 100644
--- a/SceneKit_Slides_WWDC2013/Scene Kit Session WWDC 2013/Sources/ASCPresentationViewController.h
+++ b/SceneKit_Slides_WWDC2013/Scene Kit Session WWDC 2013/Sources/ASCPresentationViewController.h
@@ -55,7 +55,9 @@
@property (weak) id <ASCPresentationDelegate> delegate;

// View controller
-- (SCNView *)view;
+// Hal Mueller change: make this a property, @dynamic, to compile under Xcode 7/10.11 SDK
+@property (strong) SCNView *view;
+//- (SCNView *)view;
- (id)initWithContentsOfFile:(NSString *)path;

// Presentation outline
diff --git a/SceneKit_Slides_WWDC2013/Scene Kit Session WWDC 2013/Sources/ASCPresentationViewController.m b/SceneKit_Slides_WWDC2013/Scene Kit Session WWDC 2013/Sources/ASCPresentationViewController.m
index 46d9e00..1c914b6 100644
--- a/SceneKit_Slides_WWDC2013/Scene Kit Session WWDC 2013/Sources/ASCPresentationViewController.m
+++ b/SceneKit_Slides_WWDC2013/Scene Kit Session WWDC 2013/Sources/ASCPresentationViewController.m
@@ -91,9 +91,10 @@ typedef NS_ENUM(NSUInteger, ASCLightName) {

#pragma mark - View controller

-- (SCNView *)view {
- return (SCNView *)[super view];
-}
+@dynamic view;
+//- (SCNView *)view {
+// return (SCNView *)[super view];
+//}

- (id)initWithContentsOfFile:(NSString *)path {
if ((self = [super initWithNibName:nil bundle:nil])) {
@@ -660,12 +661,12 @@ typedef NS_ENUM(NSUInteger, ASCLightName) {

#pragma mark - Misc

-CGFloat _lightSaturationAtSlideIndex(int index) {
+CGFloat _lightSaturationAtSlideIndex(NSInteger index) {
if (index >= 4) return 0.1; // colored
return 0; // black and white
}

-CGFloat _lightHueAtSlideIndex(int index) {
+CGFloat _lightHueAtSlideIndex(NSInteger index) {
if (index == 4) return 0; // red
if (index == 5) return 200/360.0; // blue
return 0; // black and white
diff --git a/SceneKit_Slides_WWDC2013/Scene Kit Session WWDC 2013/Sources/ASCSlideTextManager.m b/SceneKit_Slides_WWDC2013/Scene Kit Session WWDC 2013/Sources/ASCSlideTextManager.m
index ce17c6f..cdc12a4 100644
--- a/SceneKit_Slides_WWDC2013/Scene Kit Session WWDC 2013/Sources/ASCSlideTextManager.m
+++ b/SceneKit_Slides_WWDC2013/Scene Kit Session WWDC 2013/Sources/ASCSlideTextManager.m
@@ -71,7 +71,7 @@ static CGFloat const TEXT_FLATNESS = 0.4;
return self;
}

-- (NSColor *)colorForTextType:(ASCTextType)type level:(int)level {
+- (NSColor *)colorForTextType:(ASCTextType)type level:(NSUInteger)level {
switch (type) {
case ASCTextTypeSubtitle:
return [NSColor colorWithDeviceRed:160/255.0 green:182/255.0 blue:203/255.0 alpha:1];

Brent Simmons: Notes from Mac programming class guest lecture

Last week, Brent Simmons was kind enough to visit the Mac programming class I’m teaching. He’s posted the notes from his talk online:

Notes from Mac programming class guest lecture: “The idea behind the lecture was to talk about what makes a great Mac app. I took that as an excuse to talk about everything from work habits to UI to marketing. “

(Via Brent Simmons inessential.com.)

Timeline 3D on sale

BeeDocs is running a Macworld Expo special for their wonderful app Timeline 3D. Through January 11, it’s only $30, less than half price.

I’ve been using this app for some personal history and genealogy stuff. But the prospects are really unlimited. Timeline 3D speaks XML. It can import RSS feeds, iCal calendars, iPhoto images, and other formats. It communicates directly with NetNewsWire, iPhone, iTunes, iCal, and Contacts.

This is really a cool, beautiful product. Support response has been fast and helpful. You should buy this app.

ATI Radeon 9600 graphics card and G5 Mac under Leopard

The ATI 9600 AGP card does not appear to be a good choice for upgrading the graphics on a dual G5 Mac.

Yesterday and today I went through at least a dozen cycles of shut down computer, open computer, swap graphics cards, close computer, reboot in safe mode, reboot in normal mode, in a failed attempt to upgrade my dual G5 Mac desktop’s graphics card. I’d really like to get my 30″ Cinema Display connected to this machine.

I just got off the phone with a senior support tech at ATI/AMD. He told me that there are known issues with this particular card in a G5 running the latest sub-version of Leopard (10.5.5). He suggested that I revert the computer to 10.5.4 or earlier, which I am not willing to do.

I asked about driver updates. They are no longer provided separately by ATI. The senior tech I spoke with says he believes there will be a compatible driver update in Leopard 10.5.6. I’m just going to RMA the card, and live with the older graphics card, and the smaller monitor, until I replace this G5 with an 8-core Intel Mac Pro (how many days until Macworld?).