A cute little trick with Ruby hashes and .try()

Recently I encountered a situation where incoming JSON was converted to a hash, and certain keys were not always present. Naturally I wanted to safely grab the values that were present without barfing on the nils.

    
    hsh = { foo: { bar: [2] } }
    bad_hsh = { foo: nil }
    should_be_2 = hsh[:foo][:bar].first  # works
    uhoh = bad_hsh[:foo][:bar].first     # NoMethodError: undefined method `[]' for nil:NilClass

At first I began looking for a way to use .try() to get at nested hash values (since we’re not on Ruby 2.3 yet and can’t use the lonely operator). But the normal try syntax didn’t work:

    uhoh = bad_hsh[:foo].try([:bar]).try(:first) # does not work

But then I remembered that [] is actually the name of a method, and what goes inside the brackets is its first argument, so I tried passing it as a symbol:

    win = bad_hsh[:foo].try(:[], :bar).try(:first) # It works!

This is not fantastic code, I know there are better ways to do this, and I can’t wait to upgrade to Ruby 2.3, but I do find this interesting.
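
For what it’s worth, here’s a sketch of one of those better ways in plain Ruby (the deep_fetch name and implementation are mine, not from any library): it walks a list of keys and bails out at the first value that isn’t a hash, which is roughly what Ruby 2.3’s Hash#dig gives you for free.

    # Hypothetical helper: dig through nested hashes, returning nil instead of
    # raising NoMethodError when a key is missing along the way.
    def deep_fetch(hash, *keys)
      keys.reduce(hash) { |memo, key| memo.is_a?(Hash) ? memo[key] : nil }
    end

    deep_fetch(hsh, :foo, :bar).try(:first)      # => 2
    deep_fetch(bad_hsh, :foo, :bar).try(:first)  # => nil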

WordPress Multisite Tips

You can’t start a multi-site install from scratch; you have to install the base site first, then turn on multi-site. If you turn it on right out of the box, you’ll get stuck with a database error and probably spend an hour messing with MySQL permissions before you figure it out. Don’t ask how I know 😉

Also, for running WordPress locally, it took a while to get auto-update working (it’s easier than updating manually on the local machine).

To do that, I had to enable FTP on this machine, which I kind of hate to do because it’s a security hole. To turn FTP on, execute this command:

    sudo launchctl load -w /System/Library/LaunchDaemons/ftp.plist

When WordPress asks for FTP info, you can use ‘localhost’ as the hostname.

Permissions updates may have been a factor too, but I’m not sure; mostly it just worked.

The big revelation is the steps for multi-site setup. The WordPress docs are weak in this area. I really wish they would create two different documents, one for subdomain installs and one for subfolder installs, because it’s a lot of work to mentally filter out one or the other as you’re reading, and it’s easy to miss details.

Of particular note is the order: first you turn on WP_ALLOW_MULTISITE in wp-config.php, then the Network settings item appears in your admin, then you enable the network, and THEN you add the code it recommends, including the line that sets the MULTISITE constant to true.

More than I ever wanted to know about cookies

Here is a list of things I learned working on a project a while back. I wrote them down because they’re so screwy I knew I’d force myself to forget them:

  • Cookie size limits can make both the browser and the server barf, and it’s the total size of all the cookies being sent that matters, not just any individual cookie.
  • Deleting a cookie is tricky; sometimes IE won’t delete a session cookie without a restart.
  • When deleting, use ‘0’ as the expiration; the full “Thu, 01 Jan 1970 00:00:00 GMT” fails in IE, which sets it to 2070 for some reason. (There’s a small sketch after this list.)
  • If you have two cookies with the same name, one scoped to a subdomain and one without, the one without the subdomain will “win”.
  • It can be hard to mess with a higher-scoped cookie from a subdomain. Avoid it if you can.
  • And here’s the kicker: Sometimes the buffer space on your load balancer can be too small for the total size of cookies that your browser is sending, even if Apache can handle larger cookies. In that situation you’ll get an ugly 500 error page. This buffer space may only become an issue if there are rules on the LB that require it to inspect the headers.
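
To make the deletion bullets concrete, here’s a minimal Rack sketch (the cookie name and domain are placeholders, not anything from that project). Deleting really just means overwriting the cookie with an expiration in the past, and the domain and path have to match whatever the cookie was originally set with, or the browser keeps the old one around.

    require "rack"

    # Overwrite the cookie with an expired Set-Cookie header. "session_hint"
    # and ".example.com" are made-up placeholders.
    res = Rack::Response.new
    res.delete_cookie("session_hint", domain: ".example.com", path: "/")

    status, headers, body = res.finish
    puts headers["Set-Cookie"]
    # Something like:
    #   session_hint=; domain=.example.com; path=/; expires=Thu, 01 Jan 1970 00:00:00 -0000
    # Note that Rack defaults to the epoch "expires" date, which is the very one
    # old IE mishandles (per the bullet above), so in that situation you may want
    # to set the cookie yourself with a different expiration instead.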

On a related note, just the other day I saw this article: Let’s Break The Internet

Image Uploads to AWS S3 using Rails 3, Paperclip and PLUpload

For the Invest Your Heart project, I’ve been working on getting image uploads perfected. That’s easy enough with the Paperclip gem. But storing uploads on your Heroku dyno isn’t a good idea, so I wanted to use S3. Oh and throw in upload previews, resizing and a progress bar.

I managed to get it all working, though it took a while because there are so many moving parts. I’m using PLUpload to provide the upload and progress bar interface, and Paperclip for resizing. And I’m uploading directly to S3, to avoid timeouts on Heroku when large files are uploaded. After the initial upload, Paperclip actually downloads the image, processes it, and uploads the different resized versions back to S3.
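
For reference, the Paperclip side of that setup looks roughly like the sketch below. The model name, attachment name, styles, and env var names are my own placeholders rather than the project’s actual code; the point is just that the s3 storage option plus s3_credentials is what tells Paperclip to write the resized styles back to S3, and reprocess! is what regenerates them from the original.

    # Rough sketch of a Paperclip model storing on S3 (all names are placeholders).
    class Photo < ActiveRecord::Base
      has_attached_file :image,
        styles: { thumb: "100x100#", medium: "600x600>" },
        storage: :s3,
        s3_credentials: {
          bucket:            ENV["S3_BUCKET"],
          access_key_id:     ENV["AWS_ACCESS_KEY_ID"],
          secret_access_key: ENV["AWS_SECRET_ACCESS_KEY"]
        }
    end

    # After the direct-to-S3 upload, regenerating the thumb/medium styles from
    # the original is a matter of:
    #   photo.image.reprocess!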

I couldn’t have done it without these two incredibly helpful sites: The Rails-S3-Plupload demo project, and Swarut’s post. Thanks guys! I would never have gotten it working without you.

Swarut also turned me on to the Gon gem, which provides a simple way to send data from your controllers to JS variables. Very handy for when your app doesn’t need something like Backbone.js.
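
In case it’s useful, Gon usage is roughly this (the controller and variable names are made up for illustration): assign values to gon in a controller action, render include_gon in your layout’s head, and the same names show up on a global gon object in your JavaScript.

    # Controller: anything assigned to gon becomes available to the page's JS.
    class UploadsController < ApplicationController
      def new
        gon.s3_bucket     = ENV["S3_BUCKET"]   # placeholder values
        gon.max_file_size = 10.megabytes
      end
    end

    # Layout (ERB), inside <head>:
    #   <%= include_gon %>
    #
    # Then in JavaScript:
    #   console.log(gon.s3_bucket, gon.max_file_size);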