There have been some rumors about an Apple TV OS being introduced at WWDC, and presumably an SDK. And it seems everyone is hoping Apple will ‘fix’ TV. It is a seriously broken system, but an SDK alone won’t fix it.
The path of least resistance.
It would be easy for Apple to offer an SDK for Apple TV. The platform is there and many content providers already have an app that they could port to a TV interface. ESPN, ABC, NBC, etc. all have the infrastructure to deliver video to iOS on the phone/tablet, so adding Apple TV wouldn’t be too difficult. But I think it would be a mistake.
The problem with that is consistency. ESPN would have a layout totally different from ABC’s, and re-learning each new app you downloaded would be a painful experience. Changing “channels” by switching apps would be a completely crazy context switch. TV’s current redeeming quality is its ease of use.
There’s a better way.
What if instead of allowing “apps,” Apple required structured content for inclusion in their ‘TV Store’? The basic pieces that would cover most use-cases are:
- Live video stream. Using HTTP Live Streaming of course.
- Program Guide. An XML representation of the past and upcoming shows.
- On-demand video. Links to video clips, though they could be full episodes.
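Since Apple never published such a format, here is a sketch of how a client might parse structured program-guide data. Every element and attribute name in this XML layout is an assumption invented for illustration:

```python
import xml.etree.ElementTree as ET

# A hypothetical program-guide format. Apple never specified a schema,
# so every element and attribute name here is made up.
GUIDE_XML = """
<guide channel="example-channel">
  <show start="2012-06-11T09:00:00Z" end="2012-06-11T10:00:00Z">
    <title>Morning News</title>
  </show>
  <show start="2012-06-11T10:00:00Z" end="2012-06-11T11:00:00Z">
    <title>Whale Documentary</title>
  </show>
</guide>
"""

def shows_in_guide(xml_text):
    """Return (title, start, end) tuples for every show in the guide."""
    root = ET.fromstring(xml_text)
    return [(s.findtext("title"), s.get("start"), s.get("end"))
            for s in root.findall("show")]

for title, start, end in shows_in_guide(GUIDE_XML):
    print(f"{start}  {title}")
```

The point is that a machine-readable guide like this is what would let Apple build one consistent interface, and searches like “the next show about whales,” across every channel.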
And now imagine Apple lets users subscribe monthly to each “channel.” Prices would be set by the channel, but you could imagine $2 for most cable channels and maybe even free for the broadcast stations. Movie channels like HBO could charge more, at the $10 price point. Ads could be served by you or by Apple’s iAds. You could bootstrap a TV channel without needing millions of dollars. Independent video podcast networks like TWiT could have a channel. I could start a channel in my garage, all for just the cost of a camera and a server.
Siri and the interface.
Because they’re getting structured data, Apple could provide a clean and consistent interface. And it seems a solid bet that the interface will be primarily voice based. You could ask Siri “Tune to Wolf Blitzer.” and she would change to the CNN channel. Or “Remind me when the next show about whales airs.” and she would do a search and schedule a reminder.
But the Apple TV doesn’t have a mic? Apple could introduce a new bluetooth remote with a mic built in (idea from @rickriemer). Or one could simply use the new “TV” app built into iOS on iPad, iPhone and iPod touch. When you’re in your living room, it acts as a remote control and interface for Siri for your TV. And when you’re on the bus, you can watch streaming TV just as you can at home.
The sad part.
This won’t happen. Not because it’s hard or because users won’t like it. Because of copyright holders. I don’t know much about the industry, but I’m guessing channels won’t be able to just add their content to the Apple TV without restructuring all their movie and TV show contracts. Lawyers…what can you do?
A lot has been written about why Apple is keeping locations in a file called ‘consolidated.db’, so I’ll try and keep it brief.
Apple is constantly doing cell triangulation in the background. These locations are stored in ‘consolidated.db’ along with a timestamp. They are used for the significant location change API.
I say it’s just cell triangulation because the points recorded aren’t exactly where you’ve been. There is also a column in the database for the ‘HorizontalAccuracy’. This never falls below 500 (meters presumably) in my file and is often higher. I have one point that has an error of 79130m!
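You can check this yourself by querying the accuracy column. A sketch, using an in-memory stand-in for the file; the `CellLocation` table and `HorizontalAccuracy` column names follow what researchers reported finding in consolidated.db, so treat them as assumptions:

```python
import sqlite3

# Stand-in for a copy of consolidated.db; the schema below is an
# assumption based on published descriptions of the file.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE CellLocation (
    Timestamp REAL, Latitude REAL, Longitude REAL, HorizontalAccuracy REAL)""")
sample = [(324000000.0, 37.33, -122.03, 500.0),
          (324000600.0, 37.34, -122.02, 1414.0),
          (324001200.0, 37.40, -121.90, 79130.0)]  # illustrative rows
conn.executemany("INSERT INTO CellLocation VALUES (?, ?, ?, ?)", sample)

# If the smallest accuracy value never dips below ~500m, GPS clearly
# isn't the source of these points.
lo, hi = conn.execute(
    "SELECT MIN(HorizontalAccuracy), MAX(HorizontalAccuracy) FROM CellLocation"
).fetchone()
print(f"accuracy ranges from {lo:.0f}m to {hi:.0f}m")
```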
Why are there no points at my home or work? I spend all day there!
Cell triangulation isn’t very accurate. I would guess that indoors, cell triangulation puts you all over the place. It would be interesting to see the error circles around each point. That would make things a little clearer.
Why is Apple keeping a complete history?
Engineers over-engineer. They’d rather be safe than introduce a bug. It’s a lot easier to keep all the data than to purge it.
What if they’re cell tower locations as suggested by this article?
Why would Apple be tracking cell tower locations? That wouldn’t make sense. They already know where the cell towers are.
I doubt there’s anything malicious going on here. Just a solution to a problem that failed to take into account how insane people are about location information.
It’s a little last minute, but here are my 100% correct predictions.
Haven’t heard much about this, but Apple isn’t going to announce a new product with a new feature (front-facing camera) and not come out with software that takes advantage of it. If you think they’ll just have a little switch in the camera app for front or back, you’re crazy.
I’m guessing it will be available on the App Store instead of being part of 4.0.
iBooks for iPhone
I correctly predicted that iBooks would become available for iPhone, so why wouldn’t you believe me when I say it’ll be available tomorrow or shortly thereafter? You know, after Apple gets through the approval process.
Turn by Turn
Haven’t heard one peep about this…but I think Apple is going to at least demo some turn-by-turn features in Maps. Perhaps even debut it as a separate app.
Free iPhone for Attendees
Maybe this is just wishful thinking, but there are a few signs pointing to a free phone tomorrow. First of all, increased conference cost and decreased swag. Not necessarily an indication, but I would imagine the phone costs about what the price of admission went up by.
Developers always have to buy the latest phone. Going from 2G to 3G didn’t affect anyone’s upgrade eligibility because the 2G wasn’t subsidized. Going from 3G to 3GS was a little rockier. Some developers were left in the lurch, but most didn’t have to wait long. Now the problem has gotten out of hand and compounded, because the developers who waited to get the newest phone are now pushed back even further. AT&T announced they will be allowing some to upgrade, but I for one am not eligible.
Apple can nip this in the bud and make sure all the developers who need the phone have it by providing one tomorrow. Not only that, but since everyone can skip an upgrade cycle, next time Apple can stick to the low conference price and shitty bags.
Update: iPhone 4 is a much better name. Glad iChat isn’t standalone and is instead a feature of the phone. Game changing… iBooks for iPhone. No turn-by-turn…although that’s definitely in the pipeline. And damn, no free phone.
Wil Shipley had an idea for Adobe:
“If Adobe were smart, they’d modify their Flash ▶ iPhone code to just output Obj-C source code. Not much Apple could do.”
I’m sure Adobe is smart. And that’s why they aren’t getting into a battle they can’t possibly win. Let’s explore a what-if scenario of Adobe going down this path. How would Apple react in this cat and mouse game? Could they create a tool to automatically filter Flash created apps without the sourcecode to your program?
Step 1: Class dump executable.
For those of you who don’t know, you can get all Objective-C symbols out of a Mach-O binary with a wonderful tool called class-dump. If they aren’t there, you’ve already detected a possible obfuscated build.
Step 2: Search class-dump output.
Look for suspicious symbols. Because anything outputted by a source-generating engine is, um, source, you can see exactly what pops out of a typical compile. It isn’t as though Apple can’t afford a copy of CS5. You can search for “CS5FlashView”1 or any of a myriad of classes/symbols that would be put into the boilerplate output. This necessitates something way more devious, as suggested by rentzsch: polymorphic code.
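The automated check is trivial once you have the dump. A sketch; the symbol names here (“CS5FlashView” and friends) are invented for illustration, not real Adobe output:

```python
# Sketch of the hypothetical "step 2": scan class-dump output for
# boilerplate symbols a Flash-to-ObjC exporter might emit.
# All class names in SUSPICIOUS are made up for illustration.
SUSPICIOUS = {"CS5FlashView", "CS5MovieClip", "CS5ActionScriptBridge"}

def flagged_symbols(class_dump_output):
    """Return any known exporter boilerplate found in the dump."""
    return sorted(sym for sym in SUSPICIOUS if sym in class_dump_output)

dump = """
@interface CS5FlashView : UIView
@interface MyGameController : NSObject
"""
print(flagged_symbols(dump))  # a non-empty list means: reject the binary
```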
Step 3: Search for patterns.
After spending a good chunk of time creating an Objective-C polymorphic coder, Adobe could distribute a code generation engine that produces binaries that look like a home-made animation and application framework. But wait: if all the binaries contain the same obfuscated code, that is trivial to detect. Just look for “ER0GlashView.” This necessitates a nondeterministic Objective-C polymorphic coder. A perfect Ph.D. topic, but nothing that’s been done to my knowledge. If it existed, this contest would be utterly pointless.
And you’re not just dealing with hiding a 200-line C function. Imagine randomizing the following:
If you produce class names like “QTi3uigejr” dictionary attacks can trivially red flag programs.
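A sketch of such a dictionary attack: names built from real words pass, gibberish gets flagged. The word list is tiny and illustrative; a real checker would use a full dictionary:

```python
import re

# Illustrative word list; a real dictionary attack would use a full one.
WORDS = {"tweet", "table", "view", "cell", "controller", "image", "feed"}

def looks_generated(class_name):
    """Flag a name if no chunk of it matches a known word."""
    chunks = re.findall(r"[A-Z][a-z]+|[a-z]{3,}", class_name)
    return not any(c.lower() in WORDS for c in chunks)

print(looks_generated("TweetTableViewCell"))  # False: built from real words
print(looks_generated("QTi3uigejr"))          # True: random gibberish
```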
Ok, so now you use perfectly hidden class names, but every class hierarchy produces a recognizable fingerprint for a library. Without knowing anything about the classes, you can see the relative relationships between them. The bigger the code base, the easier this is. I’d assume the Flash support structure is fairly large and its interdependencies very complicated.
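A sketch of that name-free fingerprinting: even with every identifier randomized, the shape of the class graph survives. The class names and hierarchies below are invented, and a real fingerprint would be far richer than this subclass-count multiset:

```python
from collections import Counter

def fingerprint(hierarchy):
    """hierarchy: {class: superclass}. Return the sorted multiset of
    subclass counts per superclass, a crude name-free shape signature."""
    counts = Counter(hierarchy.values())
    return sorted(counts.values())

# An invented library, and the "same" library with every name scrambled.
library    = {"A": "Root", "B": "Root", "C": "A", "D": "A", "E": "B"}
obfuscated = {"X9": "Qq", "Zr": "Qq", "Mm": "X9", "Pp": "X9", "Kk": "Zr"}
print(fingerprint(library) == fingerprint(obfuscated))  # True: same shape
```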
Let’s say you find an awesome little UIView subclass, but Apple has banned anyone from using it on the store. It wouldn’t be too hard to get around. Copy/paste it, rename some classes, change a few method names around, etc. You’d then have plausible deniability and could say it wasn’t the banned UIView.
Imagine me asking you to incorporate three20 into your app and obfuscate it in a way where I couldn’t tell you were using it. And assume the code generator produces deterministic output, so I effectively have access to all your source code.2 How hard would that be? You aren’t just dealing with changing the names of things. You are tasked with changing the fundamental structure of internal APIs while preserving assumptions that were undoubtedly made when making the code work with itself.
Now imagine making a program that could produce unique obfuscations of three20 and in a way that made it look from the outside like it was redesigned and rewritten from scratch by a human.3 Brain. Explode.
What if Adobe successfully hides its entire Flash subsystem and debuts CS5.1 with the feature turned back on? Hundreds of apps get onto the store by tricking Apple’s gatekeepers. What do you think will happen when Apple figures out a way to detect with 99% accuracy that an app was generated with Flash?4 Yep, they’ll pull those apps faster than Brian J. Hogan picked up that phone.
The point is, there is no reason for Adobe to take that chance with their customers.
I have been waiting to buy an eBook reader in anticipation of the iPad, so one of the most interesting apps to me is iBooks.
My predictions for iBooks:
- There will be an iPhone version.
- You will be able to drop in your own books.
The thing that bothers me most about the iPad is this: if I had had an iPad rather than a real computer as a kid, I’d never be a programmer today. I’d never have had the ability to run whatever stupid, potentially harmful, hugely educational programs I could download or write. I wouldn’t have been able to fire up ResEdit and edit out the Mac startup sound so I could tinker on the computer at all hours without waking my parents. The iPad may be a boon to traditional education, insofar as it allows for multimedia textbooks and such, but in its current form, it’s a detriment to the sort of hacker culture that has propelled the digital economy.
I’m a fairly young person, yet I realize these sentiments are shared in every generation. “My kids won’t have the same great experiences I had because of this new technology. They won’t know the joy of riding a horse on the open range because of this newfangled mechanical carriage!”
Your kids are going to have the wonderful experience of opening up an interactive textbook on their iPad and learning about how their cells work. Or tapping on a circuit design simulation application and dragging components into a circuit to create a ripple adder. Or posting to their virtual reality homebase that because of the new GenomeEditor 3000 (with presets!!), their kids aren’t going to have the fun of tinkering with their genome. So the “think of the children!” argument isn’t going to sway me. You can leave those up to Fox News.
The new paradigm of a simplified, consumer electronics computing experience isn’t going to catch on overnight. Look on the bright side though. By the time the only computers left are the ones abstracted away enough to where you can’t edit the startup sound, we’ll be retired or, fingers crossed, dead.
It’s a common problem now on the App Store. User buys the Lite version. User likes the Lite version. User buys the paid version and loses all of his or her data. Not a very good experience.
How can we fix this?
UIPasteboard of course!
Apple introduced an excellent and oft overlooked addition to the iPhone SDK when it gave developers the same power that they have on the desktop for sharing large chunks of data between applications. But Zac, you say, you of all people should know you can’t share data between apps…they’re sandboxed! Not true! Think of UIPasteboard as those little paper airplanes you used to send notes in back in school. No sandbox can hold UIPasteboard…
This first method is fairly straightforward and easy for small data sets: Keep custom pasteboard filled with data you’d want users to import into your paid app.
UIPasteboard *sharedPasteboard = [UIPasteboard pasteboardWithName:@"com.mydomain.myapplite.sharedpasteboard" create:YES];
sharedPasteboard.persistent = YES;
By executing that on launch and keeping your sharedPasteboard up to date, you’ll always be able to request that from your paid app and auto-discover data from the lite version. The nicest thing about this method is most people won’t notice you did anything. Maybe you want users to notice your work…but sometimes the best features are the ones users don’t notice.
This is for apps that have lots of content to put on the pasteboard and updating it all the time wouldn’t necessarily be so great. It’s a little weird but bear with me.
First, make sure your app can respond to the MyAppLite-export:// scheme. The parameters to this don’t really matter, but you could use them to specify some subset of data to export. You could also make this a little more generic, and perhaps a little more secure, by specifying a pasteboard name in the URL.
When MyAppLite is launched using this scheme, the user is presented with an “Exporting…” screen or something of the sort while MyAppLite puts everything requested on a special, one-time-use pasteboard. After the pasteboard has been filled, MyAppLite opens MyApp-import://NAME_OF_PASTEBOARD. Now the paid app knows where to look to get all the necessary MyAppLite data off of the temporary pasteboard.
Method 2 definitely has some security risks, which possibly could be worked around. For one, if someone finds out your schemes, they can read users’ data from the Lite version and overwrite data in the paid version. With some obfuscated keys, this might not be much of an issue. Also, you wouldn’t want to send passwords or sensitive data using this method. That kind of data should be stored in the keychain anyway, which is already available across your apps.
I hope this gives you some ideas on how to transfer data with UIPasteboard. I’ll try and post some sample code illustrating some of these concepts soon.
I made a little app for my personal use the other day and thought I would share it. It basically keeps a window that is globally accessible through a keyboard command (cmd+shift+m by default). It lets you jot down any notes and use them in any other app.
Semi-useful, but more importantly…free!
Note: This is also mirrored on openclip.org.
Let me first explain how OpenClip works:
It’s relatively simple and doesn’t break the SDK agreement. OpenClip works by looking into the Documents folders of other applications to get their pastes. Applications are allowed to write all they want to their own Documents directory (for copy), so no foul there.
Applications are also allowed to read outside their sandbox into the Documents directories for other apps (for paste), so no foul there.
How could that ever go wrong? What’s the problem?
The problem is Apple is probably going to shut down reading into other applications’ sandboxes. I’m all for that, as long as one of two things happens first:
- Apple ports NSPasteboard to the iPhone SDK (radar://6158362)
NSPasteboard is what I modeled OpenClip after. It allows copy and paste between applications, but more than that, it adds communication between applications. Porting NSPasteboard would make OpenClip moot, and developers could easily respond by changing all references from OCPasteboard to NSPasteboard. Easy for everyone involved.
- Apple shuts down sandbox reading but creates a public folder for apps to write to. (radar://6156881)
If Apple does this, inter-app communication would still be possible through OpenClip or some other file communication method.
Applications need to work together. That’s the Apple Way®. I don’t know what you think, but to me the smaller and simpler an app is on the iPhone, the better it is. To make apps that are simple but powerful, developers need to make applications that can communicate.
What happens if Apple shuts it down?
Here’s what happens if a new version of the iPhone OS comes out and OpenClip can’t communicate anymore:
- Developers who implement OpenClip to either *only* copy or *only* paste can easily check if that will work before displaying any UI. If it breaks, those options just disappear.
- Developers who implement to do both copy and paste retain copy and paste within their application. That means that when and if the next update breaks OpenClip, applications won’t stop launching, they won’t crash when you try and copy and they won’t get garbage data. All that will happen is the app will only be able to copy and paste in the application. Not a horrible way to degrade if you ask me.
What about the UI? There are going to be like 20 implementations for copy/paste!
UI is a hard problem to solve. One of OpenClip’s goals is to provide examples of UIs for copy and paste. But to be honest, the best thing you can do with OpenClip is use it only for simple data. Only copy interesting data that a user would want to copy, and present UI similar to Apple’s press-to-save-image UI. Doing what MagicPad does and providing a whole way to select text is not what 90% of apps need. Look at what Twittelator did with copy in the video on GeekBrief.tv (http://www.geekbrief.tv/copy-and-paste-for-iphone). Press and hold needs to be the way most apps utilize copy and paste.
You can check out Proximi’s MagicPad UI proposal video here. It’s worth a look.
Let’s face it: hardly anyone is not buying an iPhone because it doesn’t copy and paste. It’s useful, but not necessary. Apple knows this, so they put it at the bottom of their to-do list. If Apple were to implement NSPasteboard, however, 90% of apps could gain some really great functionality. Until then, maybe OpenClip can serve as a sneak peek for Apple, developers and users that this kind of framework would benefit the iPhone greatly.
I created an open source project for iPhoneDevCamp 2. It allows cross application copy and paste.
Basically, all you have to do to get the benefits is include a few classes and use the very simple API to copy data or paste data. The special part is cross application. Copy a cocktail in Cocktails and paste it into MagicPad (Video of this in action).
There are some limitations. This technically complies with all Apple agreements, but it is completely possible that apps that use it won’t get on the App Store. Not for any real reason other than that it will eventually step on Apple’s toes. It is also conceivable that the technology this is built on will break in the future. The hope is that the update that breaks this also brings copy and paste support.
If you are interested in looking at some code, send me an email to zacwhite+copypaste@@at@@gmail..dot..com.
Update: Ok people. Go download the code at http://code.google.com/p/touchclipboard/. Willing to hear name ideas.
Update 2: Video: http://www.viddler.com/explore/mager/videos/36/.
Ignore all that.
Update 3: Site is up. Check out OpenClip.org.