I've recently seen some interesting videos about using Automator to do more stuff, so I thought I'd give it another try. (See my first post about Automator for some background.)
Turns out that you can pass multiple variables into an action, if you are very, very careful. In general it's easy - if you use two "Get Value of Variable" actions chained together you get an array with the two arguments. Unless you don't. Going back to our example, I need to provide two arguments: one is the URL of an image and one is a text string describing the map. The Automator script that I currently use saves the filename into a text file, then asks for the description, which it saves into a separate text file. Then it fires off a Ruby script that processes the filename into the URL for the image and makes the necessary changes to the web server. It works, but it's a bit slow and it looks ridiculous.
OK, so you're with me so far, right? We have a file that the user selected (via an 'Ask for Finder Items' action), and a description that the user provided (via an 'Ask for Text' action). So let's store each of those in a variable and get them back to back. It works! We have two arguments in the data stream. We can pass that to a 'Run Shell Script' action and $1 will be our filename, and $2 will be the description. I work with that for a bit and it's all great, so I go ahead and start plugging those changes into the "real" script. Except it blows up when I try to retrieve the description variable.
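To make that concrete, here's roughly what the body of that 'Run Shell Script' action looks like with the 'Pass input' popup set to 'as arguments' (the variable names are just mine):

    # $1 is the file from 'Ask for Finder Items', $2 is the text from 'Ask for Text'
    filename="$1"
    description="$2"
    echo "got $filename with description: $description"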
After a lot of messing around I realized that the type of each variable is important. 'Ask for Finder Items' gives you back a file alias. (If you inspect the result it will say 'Alias' in blue and then the filename.) If you manipulate that (say with a 'Rename Finder Items' action) it changes from an alias to the actual file. If you get an alias variable and then a text variable you will be fine. If you get a file and then attempt to get a text variable, the second get fails. There's a fix for this: you can run a 'Store Disk Item References' action and it will turn the file back into an alias. That seems wonky, but sure, whatever.
Next problem. My action *MOVED* the image file from my local drive to the server, so the original file doesn't actually exist by the time you get the variables together. That will also fail and screw things up, even if you have an alias. After some thought I decided to copy the file and then delete the local copy later. Turns out you can send the "alias" to the trash and there's no error, but the original file is still there. You have to do the opposite of that 'Store Disk Item References': get the variable, run a 'Retrieve Disk Item References', THEN run the 'Move Finder Items to Trash'. Goofy, but I can hang with it.
OK, so now we're set. I've got a script action getting the two arguments, I can write the necessary sed mojo (sed is a UNIX command that lets you manipulate text) to convert the filename over to the URL, and we're cooking with gas. It all works! Fantastic! Now, there are a couple of actions where Automator seems really slow: both of those Disk Item References actions take a while, and I'm now using Preview to convert the RAW file from my camera to a JPG (which means I don't have to change the file type on my camera every week). The conversion is instant but Automator takes several seconds before moving to the next step. I wonder if making it into a stand-alone "application" would help? I try that. The application doesn't work. I open the workflow (the one that just worked, mind you) and run it. It crashes. After a lot of playing around I confirm what it looked like from the start: I can take a running workflow, add a new 'Run Shell Script' action, paste in the commands I want, and that workflow will run. Save it and run it again and it still runs. Close and reopen Automator and the Run Shell Script action now errors out!
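For the curious, the sed bit is just a one-liner along these lines (the web prefix here is made up - substitute the real URL for wherever the images live):

    # strip everything up to the last slash and bolt the web prefix onto the bare filename
    url=$(echo "$1" | sed 's|^.*/|http://www.example.com/maps/|')
    echo "$url"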
I have no idea what that's about. When I last complained about Automator I felt like I didn't really understand how it worked. Now, after several hours, I understand how it's supposed to work, and it just plain doesn't a lot of the time. The distinction between a file and an alias to a file is subtle and it's not documented. The Automator documentation calls both of them a Files/Folders object. The 'Rename Finder Items' action claims its input is a 'Files/Folders' and its result is a 'Files/Folders'. Which is true enough, but the change from an alias to a real file changes what you can do with variables later. The second problem, where the shell script action doesn't work the same after the workflow is reloaded as it does when the action is first created, is just bullshit.
So, I really need to get rid of Automator. It has a lot of promise, and if I were just automating something simpler it would be OK. But in a way I preferred the way I thought it worked (where it seemed very limited but it worked) to what it actually does (where some things don't work right at all and there are undocumented types that you need to care very much about).
A Slight Snag
Here I go with more about backups again! I recently posted about using JungleDisk to back up my documents to Amazon's S3 storage. When I posted that, the original big old backup to the cloud was still running (plus an annoying update where I moved around a few gigabytes of stuff), so my system hadn't been sleeping for a week or so. I assumed that was because it was grinding away at the backup. Since then I've discovered that TinyGod won't sleep automatically when JungleDisk is running. I can put it to sleep manually (with the sleep option on the System menu) but that's not what I want. I have a support ticket open with the JungleDisk people; we'll see what happens. In all fairness the current version of the JungleDisk software is listed as beta, so maybe this problem will go away soon.
In the meantime, after I reboot TinyGod I run this command to unload JungleDisk:
sudo launchctl unload /Library/LaunchDaemons/com.jungledisk.service.plist
Obviously the usual caveats apply. If you don't know what that means, probably don't run it. If you're not comfortable mucking around at the command line, don't run this. It may not fix the problem. Etc.
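When I want JungleDisk back to run a backup, the counterpart should just be the matching load command (assuming the same plist path):

    sudo launchctl load /Library/LaunchDaemons/com.jungledisk.service.plist

Rebooting also brings it back on its own, which is why I have to unload it again each time in the first place.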
If you followed my Twitter feed on Friday you know it was a little more complex than this. There are two other issues that complicated my diagnosis. The first is that running this shutdown doesn't always seem to fix the sleep issue. I think the problem is that if I run the JungleDisk Monitor application then sleep is just screwed up until I reboot, even if I later unload the daemon.
The second thing is that I had installed an application called FuzzyClock while the big backup was running. FuzzyClock also prevents sleep. I was somewhat amused by getting the "fuzzy times" in my menu bar, but not enough to put up with this side effect so FuzzyClock got the boot. I probably would have normally noticed the sleep-blocking the first day I ran the application, but it snuck under the radar while the big backup ran.
At any rate, if you care about having a system sleep automatically when idle I'd recommend staying away from FuzzyClock and JungleDisk at the moment. Personally I'm willing to manually shut down JungleDisk like this and make it a process I run every week or so, as opposed to having no offsite backup, but your mileage could certainly vary.

Just in case you thought the TV studios/networks were accidentally "getting it"
I haven't written about it but I've been running Boxee on my Mac Mini for a while now. It's pretty sweet. To give one real-world example, during the month of December Heroes seemed to consistently run long or late and the TiVo didn't know it, so we had like three episodes with the last minute or two cut off. No problem, I'd fire up Boxee and we'd stream the show off Hulu. We'd have to watch an ad or two and it wasn't HD, but we'd get the important final line of dialog and be on about our business. Note the most important thing there: we'd end up watching an ad or two.
Well, Boxee has been asked to pull the Hulu support. Hulu says they are asking this because the content providers made them. Which is ridiculous. Let's get this straight. Watching the show, plus unskippable ads in a web browser is OK, but watching the exact same content in a program designed to work with a remote control isn't? Why is that again?
They do realize that I can get the same show, minus the ads, via Bittorrent right? And feed that into Boxee (or Front Row or whatever)? Whatever these content folks smoke, I gotta get me some of that ....
(Word on the street is that adding "rss://thejakemarsh.com/boxee/" as a feed in Boxee patches it up, but the feed is getting hammered right now.)
A big tip 'o the hat to Veronica Belmont for reporting both the problem and the possible fix.
Offsite Backups, the Final Piece of the Puzzle
Back in 2007 I did a whole set of posts about backups, laying out my "clone your drives" strategy and listing what software I used to make the backups, and going through several different choices for Windows. About a year and a half later that's still what I do for the most part - every machine has an external drive and I weekly clone the entire boot drive to the external backup. I've added Time Machine more for the "oh shit, I'd like to get yesterday's version of a particular file" coverage, and since a lot of things are heavily sync'ed between my laptop and my desktop that means I really have something like six copies of my main files. Windows has been mainly ghetto-ized into a virtual machine so it's a single file that gets cloned and backed up that way and to hell with trying to run a Windows backup. (I still have Boot Camp and the Bart PE/DriveImage XML solution for that install, but I don't think I've used Boot Camp in 2009 yet. Which means the first day I want it I'll have to let Microsoft play patch-and-reboot for quite a while. Be nice if they could get their shit together about updates.)
There was one remaining glaring hole in the entire system: if a tragedy destroys all of the hard drives in my office and living room then all six copies of whatever can be destroyed at once. D'oh! I finally got around to solving that last week, and my suggestion is Jungle Disk. Jungle Disk is a cross-platform piece of software that backs up data to Amazon S3 storage. When I installed the OS X version it walked me through adding S3 to my existing Amazon account and then configuring the backup process. Jungle Disk can run backups automatically, throttle bandwidth usage, encrypt your data, and archive multiple versions of files, and it costs $20 for a license on as many computers as you want. It took around a week, being throttled during the day, to get my Documents folder and my Aperture photo library up to the cloud - that's roughly 26 gigabytes of data. Now that it's all up there it looks like it will be able to fairly easily handle updates and changes as we go. I just finished installing it on Horton to back up my Subversion archive, MySQL databases, and all of the web pages.
S3 charges $0.15 per GB of storage per month and $0.10 per GB of transfer in (plus some other twiddly charges for overhead requests). My projected bill for S3 as of March 1st is $4.98. So for $25 ($20 for JungleDisk and $5 for Amazon S3) I've backed up every document and photo I have to the cloud. If every hard drive I own is destroyed at the same time, once the insurance claim clears and I buy a new Mac I can pull down everything I care about. Sure it would take a while, but none of it's GONE. That's worth $5/month, no doubt.
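To put the ongoing cost in perspective: 26 GB x $0.15/GB works out to about $3.90 a month for storage, and the initial upload was a one-time transfer-in charge of roughly 26 GB x $0.10/GB, or $2.60, plus the per-request overhead.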
If you care about your data, the Amazon S3/Jungle Disk combo seems pretty solid to me. I'm aware of some free services that do this, but there's no business model there that makes sense to me, and if you're using a backup service you don't really trust then what's the point? I don't think Amazon is going anywhere and since they are charging money (even if it is a piddly amount of money) then if I have a problem I can be aggressive about complaining and yelling.
Anyway, so far I'm pretty happy with how it all works and I can sleep easy knowing that even if a meteor crushes my house and demagnetizes the rubble I'm still able to restore my critical data.
WiFi 2009
Last week I had my laptop in the living room and it was getting miserable wireless performance. This was outrageous and unacceptable because my laptop has an 802.11n card and the Time Capsule that plugs right into the DSL modem has 802.11n as well, and it was less than ten feet away, in the same room. This made me decide to take the plunge and set up tiered WiFi in my house. I had the Time Capsule, the Airport Extreme, and the Airport Express all participating in a single 802.11b/g/n network, which meant I had great coverage throughout the house and yard, but performance was pretty sub-par.
The first thing to do was to make the Time Capsule run a separate 802.11n-only network. This has a much lower range - it only reaches about half of my office, but that's OK, because if my laptop is in my office I usually plug it into the LAN with a cable. This was pretty easy - configuring Airport stuff has gotten a lot easier since the last time I tried it.
The next trick was to take the Airport Extreme (which was in my office for coverage) and make it the main base station of a new 802.11g-only network. I also moved it to Karin's office for a more central location. I suppose in practice I could put it in the living room, but since it can reach the Airport Express in the kitchen as an extender, putting those on either side of the center of the house is good. Once I got the Extreme running 802.11g I connected the Airport Express up to that and we're off to the races.
The Wii is happy to talk to the 802.11g network, and it's the most primitive home device I had. The iPhones and all of the laptops talk to g without a hassle. Lorax (my MacBook Pro) can connect to N through a large chunk of the house and will gracefully degrade onto the G network at the extreme edges.
There are two devices that won't talk to the G network: the PSP and the DS. I'm not sure what the PSP's issue is - a lot of people online claim it will connect to a G network, and it does try, but it fails. It used to talk to the exact same hardware when it was running b/g mode, so it almost has to be a G issue. The DS won't even TRY to connect to the G network.
The solution for that is pretty simple: plug Lorax into the wired LAN and run internet sharing on it. Set up a basic poor-man's b/g network and you're off to the races with the PSP. The DS is odd: it complains about not being able to find a DHCP server. The solution is to give it a manual IP address, which is a minor pain to configure but it works. For the record, Lorax sets up a subnet on 10.0.2.x and becomes 10.0.2.1. I gave the DS 10.0.2.3 for an address (since I had just been messing with the PSP and figured it might have taken 10.0.2.2), 10.0.2.1 for the gateway, and then gave it my ISP's DNS servers and it works. A hat tip to Tony who told me how he had configured his DS to work with Mac internet sharing.
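For reference, the manual settings on the DS end up looking like this (the subnet mask is my assumption - it's the standard /24 mask that Mac internet sharing hands out):

    IP address:  10.0.2.3
    Subnet mask: 255.255.255.0
    Gateway:     10.0.2.1
    DNS:         your ISP's DNS servers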
Speaking anecdotally, the performance of the g-only network seems better than the b/g one. Since the only reasons I was keeping the b network alive were the portable gaming systems, and I can't tell you the last time I actually USED either system online, this seems like a win. Plus I have the fancypants n-only network with the "wide channels" for super-plus-fast WiFi for the newer hardware in the house.
Of course this means if you've set up your computer or phone to use my WiFi you'll have to redo that on your next visit. :-) Keeping everyone on their toes!