User comes to tech, tech solves problem, world moves on. Yawn.
Some months ago, I helped move someone’s large archive of digital photographs and clipart to Google Drive. That was easy enough in itself – we installed Google Drive on their Mac, moved everything from the appropriate folder into a suitable place inside their Google Drive folder, and let time and the Google Drive application do their work over a fast (100Mb/s symmetric) connection. “Job done…”
Problem 1: Images are over 2MB. Okay, so we’ll shrink them…
…Eeeeeexcept they wanted to use the images immediately, as-is, in Google Slides and other Google apps. The very first image dropped into their new presentation was too big. It was either over 2MB, or over some arbitrary pixel dimensions that the dialog box didn’t tell the user about. So back the user came to our team asking what the heck was going on…
Looking at the relevant Google Support page for Docs Editors (as at 10th April 2015, and still not fully populated on 9th November 2015), one might think that simply recompressing the images so they *just* squeeze under the 2MB size limit would be enough to comply. On that basis, and given the thousands of images affected, I synced a copy of the affected Google Drive account to a spare Mac, installed ImageMagick (along with its numerous dependencies) and wrote us a bash script. Scanning the fileset, I noticed that only the JPGs were over 2MB, so the script could simply look for any JPG over 2MB and use ImageMagick’s “convert” tool to recompress the file in-place, then delete the old file to save confusion at the user end. The basis of the conversion was this command:
convert image.jpg -define jpeg:extent=1970kb image.jpg.smaller
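Wrapped in a loop over the whole fileset, the first pass amounted to little more than find-and-convert. Here’s a minimal sketch of that idea – the function name and the sync-folder path are my own illustrations, not the original script:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Recompress every JPG over 2MB under a folder, aiming just under the limit,
# then swap the smaller copy in place of the original.
shrink_large_jpgs() {
    local dir="$1"
    find "$dir" -type f \( -iname '*.jpg' -o -iname '*.jpeg' \) -size +2M -print0 |
    while IFS= read -r -d '' img; do
        convert "$img" -define jpeg:extent=1970kb "$img.smaller" &&
        mv "$img.smaller" "$img"   # replace the original to avoid duplicates
    done
}

# shrink_large_jpgs "$HOME/Google Drive/Photos"   # illustrative sync location
```

The `-size +2M` test does the filtering up-front, so images already under the limit are never touched or recompressed.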
Sure, we sacrifice some overall quality to JPEG recompression, and the script needs to take care of some housekeeping along the way. But having compared a bunch of test images side-by-side, we decided the work involved and the results obtained were more than good enough for the intended use-case; indeed, any quality losses were barely detectable in more than 9 out of 10 cases, even when pixel-peeping on a decent calibrated monitor. So, after many test cases, on we went with the live dataset, thinking the job was done.
With the results looking good, off I went to tell the user that the file conversions were done, and to let us know of any problems. Meanwhile, we’d move on to the next tasks on our creaking lists.
Problem 2: File size alone wasn’t the issue – pixel dimensions mattered too! D’oh!
Then the dreaded email pinged. Our poor user had tried to insert the first of the newly recompressed images and, sure enough, it had failed to load – and again the same generic, helpless dialogue box appeared. Not because it was a bad image in itself: it was a perfectly valid JPG, and looked very nice despite the high pixel count and our fears over high compression rates and the known multiple recompression steps. These images are intended for end-use after all, not for further editing, and certainly not for anything other than on-screen use at low DPI, viewed at a distance.
I had to confirm the issue for myself. Dragging a JPG onto the insert-image tool’s “upload from computer” window finally got the Google Slides image tool to tell me that the image was too big – and, at last, it gave me the actual limits I was supposed to be working to.
Great. Now I need to go off and resize my images. Again.
So, off I went to find a new copy of the original images in their original folders (you do still keep backups of what’s on your cloud storage, right?). Then I worked on a new script that would resize the images to fit inside a 3500×2500 window, preserving aspect ratio, and would again handle GIF, JPG and PNG, since those are all supported by Google Docs. THEN I ran the same recompression script as before on any files that were still too big after downscaling. Overall the process took much the same amount of time as the first run, but came with the advantage that the end results looked much better, right up to the limits of the pixel dimensions and the file size.
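The second pass can be sketched much like the first. Again the function name and folder path are illustrative rather than the original script; the useful detail is ImageMagick’s `>` geometry flag, which only ever shrinks an image to fit the box – it never enlarges one that already fits:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Downscale any GIF/JPG/PNG under a folder to fit inside 3500x2500,
# preserving aspect ratio ('>' means shrink-only), then swap it into place.
fit_to_limits() {
    local dir="$1"
    find "$dir" -type f \( -iname '*.jpg' -o -iname '*.jpeg' \
                        -o -iname '*.png' -o -iname '*.gif' \) -print0 |
    while IFS= read -r -d '' img; do
        convert "$img" -resize '3500x2500>' "$img.resized" &&
        mv "$img.resized" "$img"   # swap the downscaled copy into place
    done
}

# fit_to_limits "$HOME/Google Drive/Photos"   # illustrative sync location
```

Downscaling first, then recompressing only the stragglers, is what keeps the quality up: most files come in under 2MB naturally once the pixel count drops, so far fewer images need the lossier `jpeg:extent` squeeze.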
Some time on our end testing the full end-to-end process would have saved both us and the user some time and hassle, for sure. My own lesson here is that a little short-sightedness for the sake of “getting back to other things” most certainly bit us all in the bum. In our defence, however, the process would have been *much* easier had the image dimensions, aspect ratio and file size limitations all been given up-front – NOT just at the point of the image throwing up an error, but on ALL appropriate import screens, and in accompanying documentation we could search before hitting the problem ourselves. A couple of lessons here for both developers and support folk! 🙂