Tag Archives: OS X
Google distributes several iOS specific APIs and SDKs via CocoaPods. CocoaPods is an open source dependency manager for Swift and Objective-C Cocoa projects. CocoaPods makes it easy to install or update new SDKs when working with Xcode.
You can install the CocoaPods tool on OS X by running the following command from the terminal. Detailed information is available in the Getting Started guide.
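As a reference sketch: the standard installation command is `sudo gem install cocoapods`, and once installed, dependencies are declared in a Podfile at the root of your Xcode project. The target name and pod below are illustrative examples, not specific to any one Google SDK:

```ruby
# Podfile — a minimal illustrative example.
# The target name 'MyApp' and the pod name are placeholders.
platform :ios, '8.0'
use_frameworks!

target 'MyApp' do
  # Google distributes SDKs such as Google Analytics as pods.
  pod 'GoogleAnalytics'
end
```

Running `pod install` from the project directory then fetches the declared pods and generates an `.xcworkspace` to open instead of the `.xcodeproj`.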
With Xcode 7, the Xcode development tools have expanded to support the new watchOS 2 platform in addition to iOS 9 and OS X El Capitan, with a host of new features that make development, testing, debugging, and deployment more seamless and efficient. Xcode 7 lets you do things that weren’t possible before, such as detect memory corruption just before it happens, test your app’s UI while reporting on test coverage, and ensure that apps and products downloaded to users’ devices have a sensible footprint and don’t consume precious resources unnecessarily.
Xcode 7 enables you to create products that are tailored for each Apple device’s unique platform. Whether that’s implementing glances to help your users stay up to date right from their Apple Watch, or size classes, which let you take advantage of the wide array of configurations of a MacBook Air, an iPhone, or an iPad, Xcode provides the tools to quickly translate your ideas into a reality that fits the target perfectly.
Xcode 7 requires a Mac running OS X version 10.10.4 or later. It includes SDKs for watchOS 2.0, iOS 9, and OS X version 10.11.
Xcode 7 includes a number of highlighted features.
A few months ago Apple introduced a new programming language, Swift, that left us excited about the future of iOS and OS X development. People were jumping into Swift with Xcode Beta 1 immediately, and it didn’t take long to realize that parsing JSON, something almost every app does, was not going to be as easy as in Objective-C. Because Swift is a statically typed language, we can no longer haphazardly throw objects into typed variables and have the compiler trust us that they are actually the types we claim. Now the compiler does the checking, making sure we don’t accidentally cause runtime errors. This lets us lean on the compiler to write bug-free code, but it means we have to do a bit more work to make it happy. In this post, I discuss a method of parsing JSON APIs that uses functional concepts and generics to make readable and efficient code.
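To make the idea concrete, here is a minimal sketch of the approach in current Swift syntax. The `decode` helper, the `User` type, and the field names are hypothetical illustrations, not the API from the post itself; the point is that a generic helper lets the compiler check each field’s type while keeping the parsing code readable:

```swift
import Foundation

// Hypothetical illustration of functional JSON parsing with generics.
typealias JSON = [String: Any]

// Generic helper: pull a typed value out of a JSON dictionary,
// returning nil when the key is missing or the type doesn't match.
func decode<T>(_ json: JSON, _ key: String) -> T? {
    return json[key] as? T
}

struct User {
    let id: Int
    let name: String
}

// Build a User only when every field parses; otherwise return nil.
func parseUser(_ json: JSON) -> User? {
    guard let id: Int = decode(json, "id"),
          let name: String = decode(json, "name") else { return nil }
    return User(id: id, name: name)
}

let raw: JSON = ["id": 1, "name": "Ada"]
print(parseUser(raw)?.name ?? "parse failed")  // prints "Ada"
```

Because `decode` is generic over its return type, the annotations on `id` and `name` drive the casts, and a single missing or mistyped field makes the whole parse fail cleanly instead of crashing at runtime.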
Playgrounds are fast and focused, allowing me to rapidly prototype Swift code. I use the OS X version more than I ever expected, even when my actual coding destination is iOS. OS X playgrounds run natively and support cross-platform solutions for technologies like SpriteKit, AVFoundation, and Core Image. Importantly, you can add interactive controls like sliders to your OS X playgrounds using nibs, so you can explore outcomes more flexibly, skipping many edit/compile/run tasks.
Simplicity is the key to effective interactive playgrounds. You want your code to focus on just the material you’re developing and no more. If you’re doing more than establishing a few callbacks in your set-up code, you should seriously consider moving large portions of that set-up into your playground’s sources folder.
This document grew from a set of notes I produced while working on SwiftGraphics. Most of the recommendations in this guide are opinions, and arguments could be made for other approaches. That’s fine. When other approaches make sense, they should be presented as well.
These best practices do not dictate or recommend whether Swift should be used in a procedural, object-oriented, or functional manner. Instead, a pragmatic approach is taken. Individual recommendations might be focused on object-oriented or functional solutions as needed.
The scope of this document is mostly the Swift language and the Swift standard library. That said, specific recommendations on using Swift with OS X, iOS, watchOS, and tvOS may be included when there is a unique Swift angle or insight to offer. Hints-and-tips-style recommendations on using Swift effectively with Xcode and LLDB might also be included.
This is very much a work in progress. Contributions are very much appreciated in the form of pull requests or filing of issues.
Discussion can be found on the Swift-Lang slack (in the #bestpractices channel).
Update note: This tutorial was updated for iOS 8 and Swift by Andy Pereira. Original post by Abdul Azeem with fixes and clarifications made by Joseph Neuman.
Recording videos (and playing around with them programmatically) is one of the coolest things you can do with your phone, but not nearly enough apps make use of it. Doing so requires the AV Foundation framework, which has been part of OS X since Lion (10.7) and which Apple added to iOS 4 in 2010.
AV Foundation has grown considerably since then, with well over 100 classes now. This tutorial covers media playback and some light editing to get you started with AV Foundation. In particular, you’ll learn how to:
- Select and play a video from the media library.
- Record and save a video to the media library.
- Merge multiple videos together into a combined video, complete with a custom soundtrack!
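As a taste of the merging step above, here is a minimal sketch in current Swift syntax of appending two clips end to end with `AVMutableComposition`. It assumes `firstAsset` and `secondAsset` are `AVAsset`s you have already loaded, each with at least one video track; error handling and audio tracks are omitted for brevity, and this is not the tutorial’s exact code:

```swift
import AVFoundation

// Sketch: append two video clips into one composition.
func merge(firstAsset: AVAsset, secondAsset: AVAsset) throws -> AVMutableComposition {
    let composition = AVMutableComposition()
    let track = composition.addMutableTrack(
        withMediaType: .video,
        preferredTrackID: kCMPersistentTrackID_Invalid)!

    // Insert the first clip at time zero...
    try track.insertTimeRange(
        CMTimeRange(start: .zero, duration: firstAsset.duration),
        of: firstAsset.tracks(withMediaType: .video)[0],
        at: .zero)
    // ...then the second clip immediately after it.
    try track.insertTimeRange(
        CMTimeRange(start: .zero, duration: secondAsset.duration),
        of: secondAsset.tracks(withMediaType: .video)[0],
        at: firstAsset.duration)
    return composition
}
```

The resulting composition can then be played with `AVPlayer` or written out with `AVAssetExportSession`, which is how the tutorial saves the combined video.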
If you run the code in this tutorial on the simulator, you’ll have no way to capture video, and you’ll need to figure out a way to add videos to the media library manually. In other words, you really need to test this code on a device! To do that, you’ll need to be a registered Apple developer.
Are you ready? Lights, cameras, action!