Welcome to the personal home page of web developer Clark Rasmussen. I'm trying to blog more than I used to, but in addition to that, this is where you can find my portfolio (now part of the blog), my résumé, and a handful of other things. Look up in the header for ways to find me on social media.

My Blog

Exporting from Snagit to Amazon S3: Revisited

One of the first things I wrote about when I started this blog was my workaround solution for exporting from TechSmith Snagit to Amazon S3.  That worked okay for Windows, but I’ve been working on a Mac significantly more of late and I missed that functionality.  As such, I took another look at options, since TechSmith itself still hasn’t developed a Snagit-to-S3 output for either Windows or Mac.

I feel like exporting on Mac shouldn’t be a problem.  There’s no S3 Browser available but you could replace it with s3cmd and do the same thing.  The catch: There’s no Program Output option in Snagit Mac.  That’s right, on Windows you can essentially make your own outputs but on Mac you’re out of luck.

I came up with a workaround, though.  It’s not pretty but it works.  It also works on Windows, but with better options available I’m not sure there’s a reason to use it.

I use ExpanDrive to map my S3 buckets as a local drive.  Then I can save from Snagit straight to the location I want in S3.  That part’s great.  It’s pretty much seamless.  ExpanDrive is a really awesome tool.  Probably too expensive if all you’re using it for is Snagit exporting, but worth taking a look at if you’re working with S3 in other ways.

The problem is you don’t get the uploaded URL out of this.  That’s where it gets hacky.

I wrote a Chrome extension that gets me a list of the last five files uploaded to this particular S3 bucket.  So after saving my file, I have to go to my browser to get its URL.  Extra steps.  The bonus is that I can get the URL any time later.

A view of my Chrome extension, showing the last five files uploaded to my filebox S3 bucket.

Since the ExpanDrive part of it works out of the box, here’s the breakdown of my Chrome extension.

  $s3 = Aws\S3\S3Client::factory(array('key' => $access_key, 'secret' => $secret_key, 'region' => $region));

  $objects = array();
  $marker = null;

  // Page through the bucket; listObjects returns at most 1,000 keys per call.
  do {
    $params = array('Bucket' => $bucket_name);
    if ($marker) {
      $params['Marker'] = $marker;
    }
    $response = $s3->listObjects($params);

    foreach ($response['Contents'] as $item) {
      $objects[] = array('key' => $item['Key'], 'url' => ($bucket_url . $item['Key']), 'timestamp' => strtotime($item['LastModified']));
    }

    $marker = $response['NextMarker'];
  } while ($marker);

  // Sort oldest-first so the five newest can be popped off the end.
  usort($objects, function($a, $b) {
    return $a['timestamp'] - $b['timestamp'];
  });

  $display = array();
  while (count($display) < 5 && count($objects) > 0) {
    $display[] = array_pop($objects);
  }

  header('Content-type: application/json');
  echo json_encode($display);

I start with a script on the server side that uses the AWS SDK for PHP 2 to read in the files from my filebox, sort them by date, and grab the five most recent.  Those five are then spit out as JSON.

To access that file from the Chrome extension, it’s important to include that domain in the permissions section of the manifest.json file.  Also necessary is the clipboardWrite permission.  In addition to the required manifest file, the extension uses a single HTML page, a stylesheet (which I’ll skip here since how it looks doesn’t really matter), and a JavaScript file.  There are also some images but I’ll skip those, too.
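For reference, a minimal Manifest V2 manifest.json for an extension like this might look something like the following sketch.  The name, version, and file names are placeholders, and the permissions match pattern should be narrowed to the actual domain the PHP script lives on rather than the wide-open pattern shown here.

```json
{
  "manifest_version": 2,
  "name": "Online Filebox",
  "version": "1.0",
  "description": "Lists the last five files uploaded to my filebox S3 bucket.",
  "browser_action": {
    "default_icon": "img/icon-48.png",
    "default_popup": "popup.html"
  },
  "permissions": [
    "*://*/*",
    "clipboardWrite"
  ]
}
```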

<!doctype html>
<html>
  <head>
    <title>Online Filebox</title>
    <link rel="stylesheet" href="styles.css" />
  </head>
  <body>
    <div id="container">
      <div id="branding_icon">
        <img src="img/icon-48.png" alt="icon" />
      </div>
      <div id="main">
        <h1>Online Filebox</h1>

        <ul id="content"></ul>
      </div>
    </div>

    <script src="main.js"></script>
  </body>
</html>

The important things here are the UL element with the ID of content and the inclusion of main.js.  The UL will be targeted by our JS for dynamically adding elements.

var site_url = '';

function request_data () {
  var xhr = new XMLHttpRequest();'get', (site_url + 'path/to/script.php'), true);
  xhr.onload = populate_list;
  xhr.send();
}

function populate_list () {
  var obj = JSON.parse(this.responseText);

  // Empty the list before repopulating it.
  var container = document.getElementById('content');
  while (container.firstChild) {
    container.removeChild(container.firstChild);
  }

  for (var n = 0; n < obj.length; n++) {
    var li = document.createElement('li');
    var a = document.createElement('a');

    a.innerHTML = obj[n].key;
    a.setAttribute('href', obj[n].url);
    a.setAttribute('target', '_blank');

    // Wrap the handler in a closure so each link copies its own URL,
    // not whatever obj[n] points at once the loop has finished.
    a.onclick = (function (url) {
      return function () {
        copy_to_clipboard(url);
      };
    })(obj[n].url);

    li.appendChild(a);
    container.appendChild(li);
  }
}

function copy_to_clipboard (text) {
  const input = document.createElement('input'); = 'fixed'; = 0;
  input.value = text;

  document.body.appendChild(input);;
  document.body.removeChild(input);
}

window.addEventListener('load', request_data);

The request_data function wraps a call to the PHP script noted above.  When that data loads, we call populate_list.

The first thing we do in populate_list is parse the text we got from the PHP script into an actual JSON object.  Then we remove any list items we may have in our previously-mentioned UL.  We loop through each of the items in our JSON object and create new elements for them.  Each item gets an LI with an A inside it.  The A has an HREF of the item’s URL and the TARGET is set to _blank so it opens in a new window.  Additionally, we use the copy_to_clipboard method that I grabbed from someone’s GitHub to save that URL to the clipboard, setting it as an onclick event for the A tag.

I’m certain that this could be cleaned up and made more configurable and turned into a publicly-available extension but I’m not going to bother with it.  I figured I’d put this out and hope that it helps someone.

I will say that one idea I’m intrigued by is replacing the PHP script with an AWS Lambda function that triggers any time the S3 bucket is updated.  I’m not entirely certain how that would work but it seems possible.

My Projects


DetroitHockey.Net is my long-running tribute to the Detroit Red Wings, where I do a lot of writing about the team as well as post photos and stats.

Me on Twitter

I wrote a bit about exporting from Snagit Mac to Amazon S3.

10/13/2015 - 7:49 AM

"Mickey's comet is on its way! Time is running out!" Kids TV is not nearly as ominous as it sounds.

10/12/2015 - 7:21 PM