Winter scenes on Cootes Paradise

Contrary to the industrial stereotype of Hamilton, a great natural resource can be found west of downtown: the Cootes Paradise wetland area including Princess Point and the Westdale Ravine, all part of the Royal Botanical Gardens organization.

Living in an urban environment, one feels the need to travel some distance to find peaceful isolation. Not so. It’s too easy to forget about this natural setting only minutes away.

While the sound of traffic humming along Highway 403 never abates, given the right conditions, it can be obscured. Such was the case this week. There had been snow over the weekend and temperatures remained well below zero. The ice on Cootes Paradise was once again thick and solid, invitingly calling for winter wanderings.

With the weather app on my phone indicating -11C and the winds gusting strongly, it was cold. Really cold! Mistakenly, I browsed my phone gloveless for 30 seconds, resulting in a hand that needed 10 minutes to recover full functionality. It was a slightly painful reminder of just how vulnerable one can be in such an environment, even only minutes away from home. But layers of technical clothing were effective insulation and kept me cozy while I wondered how people survived in similar conditions hundreds and thousands of years ago.

Overcoming the usual uneasy feeling of venturing out on ice for the first time, I made my way west from Princess Point towards Cootes Drive. About halfway, the sound of the highway was carried away by the gusting wind. Facing west, there were few signs of urban life, only some distant electrical towers and a few houses on the edge of the Dundas escarpment. Ignoring these and letting my imagination wander, I could picture the natural wonder of this nook at the west end of Lake Ontario and how it earned its ‘paradise’ designation. (The ‘Cootes’ comes from British officer Captain Thomas Coote, who was stationed in the area in the 1780s.)

The cold kept the snow as icy granules and the wind created a sea of desert-like dunes. Sitting down and just watching, I could observe their edges slowly growing and eroding, gently creeping east.

Off in the western distance, I watched each wave of wind approach, kicking up billowing clouds of snow along the ice’s surface, occasionally culminating in small vortices. There was something peaceful about sitting in the path of these chaotic waves of drifting snow. The howling sound of the wind. The tinkling, whooshing of icy snow crystals bouncing over the undulating surface. So detached from assurances of safe, modern urban life.

Ripples were etched across the dunes, packed hard by the wind, offering secure footing compared to the cold, hard, slippery ice over which my boots squeaked like wet shoes on linoleum. The snow was capped by a crust like shaped styrofoam, but my weight on the deeper drifts was too much and I would break through to softer snow underneath. It made for more of a trudge, like walking over a soft sandy beach.

I often avoided these areas, not because of the harder slog, but to preserve their pristine, beautifully wind-sculpted surfaces.

Always keeping an eye out for potential photos, it would frustrate me to find I had inadvertently wandered through a particularly interesting area of shapes or patterns, causing me to generally skirt the snow and stay on the ice, lest I should ruin more photo opportunities. But the slippery ice made for difficult progress against the powerful wind. Not that I was in a rush. Had there been someone watching, they might have been puzzled by my frequent tendency to walk briefly then stop, standing in one spot for a few seconds, or minutes. I was simply observing, but to someone else, it might seem strangely paranoid.

At times I felt boxed in by the abundance of potential scenes. It was a problem of information overload and took time to process for photographic merit. I didn’t want to mar potential scenes with footprints. But Cootes Paradise is expansive, and the western half appeared to receive few winter explorers. Occasionally I would run across other footprints, both old and relatively new. But they were exceptions; I was certain to always find fresh, untrampled scenes.

Sometimes it was interesting to follow these footsteps. Usually they hugged the shoreline, which I could understand. There is a certain vulnerability when out in the middle of the ice. Nothing is nearby in case of disaster. I had to remind myself that Cootes averages only 60cm deep. The ‘deeper’ question in my mind was the gooey, silty mud under the water…

In any case, there was no risk of falling through; it had been very cold for long enough. Yet along the shore, tucked in small nooks, were occasional open patches of water, perhaps where a spring was bubbling through. Deer often seemed to stop here, judging by their tracks in the snow.

The shoreline provided some interesting motifs. Often these consisted of oak leaves, long turned richly brown, embedded in the ice.

Out in the middle were interesting patches of opal ice. Not the dark steel gray, reassuring kind. Walking here created painfully discomforting crackling and crunching sounds underfoot.

Stopping to examine these areas revealed an interesting ‘honeycomb’ layer of ice lattice sandwiched between a thin surface layer and solid ice underneath. Kicking through it brought a childish feeling of smashing things simply because I could. I suppose a kind of empowerment over the micro-environment underfoot; the shattering of glass-like ice shards provided a momentary diversion. Breaking through to the solid ice below was reassuring, though the unease of hearing the ice crunch underfoot as I continued wandering never fully subsided.

The afternoon sun skimmed along the treetops and soon was setting in the southwest. Rich, golden light feathered across sculpted snow, accentuating patterns of frozen ripples, while backlit windblown snow sparkled like fine diamond dust in the air.

And then it was over. The sun set, its warm glow enveloped by cold blue shadows. The wind picked up and instantly it felt colder.

It was time to head home before it became too dark! At least the walk back was fast, pushed along by the unrelenting wind.

Digital photo archiving and Amazon’s S3 online storage service

It should be fairly safe to assume that any semi-serious digital photographer has some sort of archiving system in place, be it CDs, DVDs, second or third duplicate hard drives, etc., because we’ve all experienced some form of data loss. Generally one approaches this topic with the goal of finding an ideal physical storage medium… But what about all the recent talk about cloud storage?

And what is cloud storage anyway? It’s data stored on remote servers, typically owned by someone else, and accessed through the internet. The benefit is that it’s physically separate from one’s other copies, preventing total data loss from a local disaster (fire, storm, war, etc.). Downsides include the difficulty of quickly transferring multi-gigabyte file sets even with decent high-speed internet access; third-party reliability (being able to serve the archived files, or even just staying in business for a few years); and cost: while there are some free options, those come with restrictions.

I’ve contemplated this for some time and about 4-5 years ago joined PhotoShelter as a possible solution. While PhotoShelter is an excellent service, it’s not a simple, no frills online storage service, therefore one pays for additional features which may or may not be relevant to one’s needs. For pure online data storage, I’ve decided to give Amazon’s S3 service a try.

What is S3? It’s pay-as-you-go redundant online data storage at a reasonable set rate without volume restrictions. I can imagine some thinking, “well, I already pay for web hosting, can’t I just use that?” Yes and no. It will depend a lot on your web host. You may get away with 5 or 10GB, but eventually you’ll be called out, as I discovered when I was too lax at letting certain ftp accounts accumulate. My particular host has a no-storage clause tucked away in the TOS, meaning that if the files are not relevant to the operation of the website, they must be removed. This generally means no bloated ftp accounts. For photographers it’s possible to keep files on the http side of the hosting account by creating web galleries, etc., but at some point the web host will grow unhappy. In my case the threshold seems to be in the 30GB neighbourhood. And I’m not even storing photos for the purpose of online archiving, just galleries created for various clients and projects that have added up and eventually have to be removed or down-rezzed. Here’s a link to an unhappy Dreamhost customer who thought she could use the service for storage purposes.

Another option could be a service like Flickr or Smugmug (which is actually hosted on S3). For $2 per month, Flickr will allow unlimited storage, though there are bandwidth restrictions. My issue with Flickr is whether one could actually download a larger number of images quickly in one sitting, in addition to some slightly sketchy privacy issues. While Flickr does allow users to set privacy to eliminate public access, will it also ensure that none of their third-party developers/partners have some sort of access to these files? You would think so, but I’m somewhat suspicious about this aspect of social networking sites.

Back to S3. If one spends some time over at the Amazon Web Services website it can quickly become confusing and/or overwhelming. Much of the AWS service is aimed at web developers; photographers can skip straight to S3. Even there, some of the jargon is a bit obscure for the average photographer. What matters is that it’s an online redundant storage solution at a relatively low price.

Why S3 and not some other service? I suppose this is somewhat subjective, but from my point of view the primary factors were cost and reputation. I’ll address cost a bit later, but in terms of reputation, Amazon is a huge company with the resources to create and maintain a redundant storage solution. The bottom line is they’re likely to be in business for the long run, which can’t be guaranteed for smaller operations. Of course, I’m not advocating that anyone move their entire archive to S3 in place of maintaining a local, physical archive. As we’ve seen from the recent Great Recession, there is no such thing as too big to fail, so don’t rely on anyone else as the sole solution. For some history on this from a photographic perspective, read up on the collapse of the Digital Railroad photo site.

How to set up S3:

Sign up for Amazon Web Services (AWS) via the link at the top of the page.
If you don’t currently have an Amazon account, sign in as a new user.
Sign up for S3 on the S3 page by following the link under the Products category, or go here and click on the Sign up for Amazon S3 button.

When you sign up for AWS, Amazon creates a set of unique access credentials. These are different from the sign-in credentials used to access your AWS account. The access credentials are two alphanumeric keys: one is called the Access Key, the other the Secret Access Key. They are, respectively, the user id and password for accessing your S3 account.

Amazon has a number of user guides explaining how to access S3 via methods such as javascript, php, etc., but they are all aimed at web developers. Photographers can ignore these and instead use a number of current, photographer-friendly tools. Photo Mechanic includes an upload function to S3, as does the ftp program Cyberduck. I’m sure there are other options, such as Firefox plug-ins I’ve seen discussed on other blogs, but my current workflow uses both Photo Mechanic and Cyberduck, so it’s welcome to be able to use familiar tools.

While Cyberduck is typically thought of as an ftp tool, S3 is not accessed via traditional ftp. Cyberduck makes it appear as though one is accessing an S3 account via ftp, which keeps the user experience friendly and consistent; one in fact accesses S3 via https, in other words, secure http. You might read somewhere that S3 is a folderless storage system, but not to worry: you can upload contents in folders and Cyberduck will translate them into the proper form for S3, displaying folders when you later access the files and allowing you to maintain a familiar organizational structure.

One bit of S3 terminology that might seem unfamiliar is what they call a bucket. It’s more or less a root folder that one creates in which to store files, and one can create up to 100 buckets. The one catch with buckets is that they must be uniquely named, not just for one’s own account, but across all S3 users. Each bucket becomes the prefix of the web address for each file uploaded, meaning each file will have a unique, permanent web address. For example, if you’re uploading archive photos from 1999, and your name is Joe Smith, you could create a bucket called joesmitharchive1999. The bucket will appear as a folder in Cyberduck, into which you can then upload your files and folders.

Because the system works via http, one can access the files from a web browser, but it’s not quite that simple. Each file is set to be either publicly or privately visible. If one uploads via an ftp program like Cyberduck, that setting is located in the preferences. But even if files are left publicly visible, it’s not as if anyone will be able to see all of your uploads, or even find them with Google, etc. The only way to find the files is if the name of your bucket(s) and also the exact file name is known.
For example: https://joesmitharchive1999.s3.amazonaws.com/january/filename.jpg where joesmitharchive1999 is the name of your bucket, which holds a folder called january, which in turn contains a file called filename.jpg. Since uploads to S3 are done via https (secure and therefore encrypted), chances are very remote that someone will be able to sniff out this information. So while it might seem like you’re putting all your files out there in cyberspace, the odds of someone finding them are very low if left publicly accessible, and zero if set to private. If this is a concern, it’s possible to change Cyberduck’s upload preferences, or to change privacy settings after upload.

On the flip side, there are benefits to having images stored with public access. One would be serving files to a website, though based on what I’ve read from others, S3 can at times be fairly slow for this purpose; a better Amazon service for that would be Amazon CloudFront. An example would be a popular video file that is frequently accessed but sits on a bare-budget web hosting account: if your site suddenly gets a huge spike in traffic, your host might suspend service due to what they deem excessive bandwidth use, even if you’re on a so-called unlimited plan.

Another aspect to S3 is remote file delivery to clients, such as when you’re away from the office and don’t have direct access to your regular archive. It would be possible to send a client a given file’s S3 web address and allow them to download it at their leisure. Or you could access it from the web link, or via Cyberduck, then email it to them. Client delivery via S3 could be quite useful for very large files. The only catch is that it will cost you money. While it’s only pennies, it could add up with a relatively large file linked to a popular web page.
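Since every object’s address follows the same bucket-plus-key pattern, it’s trivial to build these links programmatically. Here’s a minimal sketch; the bucket and file names are the hypothetical Joe Smith example from above, not anything real:

```python
def s3_object_url(bucket, key):
    """Build the https address for an object stored in S3.

    The bucket name becomes the host prefix, and the key (which may
    contain '/' separators that tools like Cyberduck display as
    folders) becomes the path.
    """
    return "https://{0}.s3.amazonaws.com/{1}".format(bucket, key)

# The hypothetical example from the text:
print(s3_object_url("joesmitharchive1999", "january/filename.jpg"))
# https://joesmitharchive1999.s3.amazonaws.com/january/filename.jpg
```

A helper like this could be handy for generating client delivery links in bulk after an upload.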

This is a good time to address costs. S3 is an attractive option for long term storage where files will be infrequently accessed, most typically only for disaster recovery or when access to one’s main archive is not possible.

Full pricing details can be seen here.

As of November 2010 (prices in $US):

Storage (Designed for 99.999999999% Durability)
$0.140 per GB – first 1 TB / month of storage used
$0.125 per GB – next 49 TB / month of storage used

Reduced Redundancy Storage (Designed for 99.99% Durability)
$0.093 per GB – first 1 TB / month of storage used
$0.083 per GB – next 49 TB / month of storage used

Data Transfer
$0.000 per GB – first 1 GB of data transferred out per month
$0.150 per GB – up to 10 TB / month data transfer out

Requests
$0.01 per 1,000 PUT, COPY, POST, or LIST requests
$0.01 per 10,000 GET and all other requests (DELETE is free)

As of late June 2010 Cyberduck now supports the Reduced Redundancy Storage (RRS) option. It can be set as the default in preferences under the S3 tab, or individual files can be changed between regular and RRS via the info window.

For example, PhotoShelter’s basic plan offers 10GB for $10 US per month. For the same price, one could store 66.7GB on S3 at the regular rate, or 100GB at the reduced redundancy rate, though that’s not taking into account upload/download fees. But for pure storage, S3 is attractive and scales incrementally rather than in PhotoShelter’s coarsely tiered jumps (though as of July 1, 2010, additional storage rates have been reduced by as much as 50%). Where one has to be careful with S3 is if using it as a file hosting solution because one has no control over how often visitors access a given page. With a very popular site, thousands of hits could quickly add up to relatively significant fees. In this respect, the best plan would be to host as much as possible with a regular ‘all inclusive’ web host and push it as far as possible while using S3 purely as an online long term archive.
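To make the comparison concrete, here’s a small sketch of the storage-cost arithmetic, using the same rounded rates as above ($0.15/GB regular, $0.10/GB reduced redundancy rather than the exact posted prices, and ignoring transfer and request fees):

```python
def monthly_storage_cost(gb, rate_per_gb):
    """Monthly S3 storage cost in US dollars: flat rate, storage only."""
    return gb * rate_per_gb

def gb_for_budget(budget, rate_per_gb):
    """How many GB a fixed monthly budget buys at a given per-GB rate."""
    return budget / rate_per_gb

# $10/month at the rounded regular rate of $0.15/GB:
print(round(gb_for_budget(10.0, 0.15), 1))   # 66.7 GB
# $10/month at a rounded reduced-redundancy rate of $0.10/GB:
print(round(gb_for_budget(10.0, 0.10), 1))   # 100.0 GB
# A 2 TB archive at $0.15/GB:
print(round(monthly_storage_cost(2000, 0.15)))  # $300/month
```

The same two functions make it easy to re-run the numbers whenever Amazon adjusts its rates.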

My immediate goal for S3 is as an online archive for client work. The benefits are twofold: mitigating potential data loss in case of a local disaster, either for me or for a client, and being able to access client work while traveling, because it’s happened enough times that I’m out of town and get an email from a client who can’t find a given file. In the longer term I will also archive personal work, but first I want to see how much storage I’ll require for client work and how much it will cost. $0.15 per GB seems cheap, and it is, but considering that my current client archive is nearing 2TB, that would translate into $300 per month for storage alone, which isn’t cheap (for me). Therefore I’ve decided on the following plan: only final Jpeg images will be archived to S3. Images greater in resolution than 10MP will be resized to 3600 pixels and saved to a final size of approximately 2.5MB per file. At the same time as the upload to S3, I’ll create a low resolution (800 pixel) set of images to keep on my laptop. This low resolution set will allow me to quickly search for and identify images, even on the road, in order to quickly find them in S3. It’s currently the only solution I can think of to offset the fact that there is no graphical user interface with S3, meaning there is no way to browse and identify images in S3 without first downloading them (an advantage of PhotoShelter or Flickr, etc.).
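The resize step in that plan is easy to sketch. This hypothetical helper only computes the target dimensions for a given long edge (the actual pixel resampling would be done in an image editor or a library like Pillow); the 3600 and 800 pixel targets come from the plan above, and the 4256 x 2832 source frame is just an illustrative 12 MP example:

```python
def resized_dimensions(width, height, long_edge):
    """Scale (width, height) so the longer side equals long_edge,
    preserving aspect ratio. Never upsizes smaller images."""
    longest = max(width, height)
    if longest <= long_edge:
        return (width, height)
    scale = long_edge / float(longest)
    return (int(round(width * scale)), int(round(height * scale)))

# A hypothetical 12 MP frame resized for the S3 archive set:
print(resized_dimensions(4256, 2832, 3600))  # (3600, 2395)
# ...and for the low-resolution laptop set:
print(resized_dimensions(4256, 2832, 800))   # (800, 532)
```

Computing the dimensions separately from the resampling makes it simple to batch both output sets from the same source files.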

So there you have it. If you’re fed up with random hard drive crashes and the hassles of keeping things organized in a local archive, consider Amazon’s S3 service as an additional measure for safekeeping your valuable digital images. While there are other online archiving options available, such as the fully-featured PhotoShelter site, S3 offers a simple, low cost, redundant and secure service for those looking to park images (or any other files) as a long term, low traffic archiving solution.

Happy Victoria Day

I think the last time I went to the Victoria Day fireworks display at the Dundas Driving Park was around 20 years ago… Fireworks at this time of year in Ontario can be hit or miss due to the often cold and/or wet weather, but this year was an exception with a relatively warm summer-like day (and very smoggy as well).

I guess I lost interest over the years – been there, done that. For whatever reason, I decided to make the effort this year, but I wanted to find a different vantage point than from within the park directly under the show.

My initial plan was to photograph it from Dundas Peak, but at the outlook point I consulted with a few ‘locals’ (guys who were 20-something in age, knocking back drinks before wandering over to the show). They confirmed my suspicion that the angle of the escarpment and trees would probably block some of the display, so I went with my Plan B location at the top of the hill in Greensville.

I arrived at 8:30 pm and thought I was going to have the place to myself, but slowly a small crowd gathered. The City of Hamilton’s event calendar indicated the show would be at dusk, between 9-10 pm. 9 pm came and went, the sky was still fairly bright, and there were a lot of cars driving down the hill into Dundas, which meant those people were more familiar with the show’s timing than I was. The show started at about 9:35 and, because it was my first time at this location, I was unsure how large or impressive the display would be. I’ll admit, I was somewhat underwhelmed. Unlike some of the big shows in Toronto or Montreal, where one has a great impression even from far away, the Dundas show was more down to earth, literally. While the fireworks were quite a distance from this vantage point (~200mm & 1.4x teleconverter for a relatively tight shot of just the fireworks), it worked well enough because I was able to add a human element to the foreground from the 20-odd crowd watching with me, giving the photos some depth and additional interest. I was initially concerned that the streetlights would overpower the foreground and cause too much flare shining into the lens, but it worked out well. Without the streetlights the people in the foreground would have been nearly invisible and I probably would have had to pop a flash a couple of times to add separation from the background.

A few technical points:

Not having photographed fireworks for their own sake in recent memory, I was unsure what the correct exposure would be. I started at ISO 200, 10 seconds at f/8, because that was roughly the correct exposure for the street-lit foreground. Luckily it was also a pretty good match for the fireworks. The only change I made was a longer 15-second capture at f/14. In hindsight, I should have tried some at 30 seconds. Because this show wasn’t large and overly dramatic, the photos you see here, most at 15 seconds, still don’t show a large number of bursts. The advantage of 10-15 seconds was not having to wait as long to move the camera to another vantage point (to change the foreground composition). I used a cable release, locked it, and set the camera to continuous shooting, so I was able to leave the camera alone and move around to scope out other positions while it made continuous exposures without me. Even then, with changes in position, composition and lenses, the roughly 20-25 minute show went by before I got to all of the vantage points I hoped to try. The lens was a 70-200, sometimes with the 1.4x teleconverter. The first image below was made with a 50mm, trying to show more of the foreground environment with the road up the hill, traffic, etc., though of course with the fireworks much smaller in the distance.
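As a rough sanity check on the two settings, standard exposure-value arithmetic can compare them. This is a generic sketch rather than anything from the shoot notes; EV here is log2(N²/t) at a fixed ISO, so a higher EV means less light recorded:

```python
import math

def exposure_value(f_number, shutter_seconds):
    """EV = log2(N^2 / t); higher EV means less exposure at the same ISO."""
    return math.log2(f_number ** 2 / shutter_seconds)

ev_a = exposure_value(8, 10)    # f/8, 10 s
ev_b = exposure_value(14, 15)   # f/14, 15 s
print(round(ev_b - ev_a, 2))    # ~1.03: the f/14 setting records about one stop less light
```

Stopping down from f/8 to f/14 loses about 1.6 stops, while stretching the shutter from 10 to 15 seconds regains only about 0.6, which is why the second setting nets out roughly a stop darker for the continuously lit foreground (the fireworks bursts themselves are governed mainly by aperture, not shutter length).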

Here are a few more:

Hamilton sign – a sign of the times?

A sign of Hamilton’s aspirations?

Hamilton is a city on the verge of a revival. You can feel it. But there are also plenty of issues that could result in setbacks. The current hot issue is the location of the athletics stadium for the 2015 Pan Am Games. The City wants it near downtown at the west harbourfront. The Tiger-Cats football team, who will become the stadium’s primary tenant after the Games, don’t like the location. With the 2015 Games organizers getting nervous, the issue is going to mediation. Hopefully a solution can be found, otherwise the city could lose Pan Am events, and without a new stadium to replace ancient Ivor Wynne, possibly also the Cats…

Confession: One word was digitally removed from the sign, but no words were added. Otherwise it’s a real sign in Hamilton.