On commercial remasters possibly issued without Dolby A decoding: mistakes, art, or…?

Some background…

I’ve commented on this blog before about the possibly questionable quality of some recent digital remasters. Common subjective complaints in online fan and hi-fi forums (complaints I’ve made myself, both here and among friends in person) are that some particular remasters might be too loud, too bright, and/or otherwise “overdone” given what we know or perceive of the original source material.  There might well be various artistic or marketing-related reasons for this, so I’m not here to argue for or against these issues.

Further complicating the issue for me, both as a fan and a professional, is that many of these stand-out features are seen overwhelmingly as positives by many fans, whether they are technically correct or not.  It would seem that the combination of perceived increases in detail and volume outweighs any issues of listening fatigue or known deviation from the presentation of the original master.

I’ve embarked professionally on remastering and restoration processes and have learned, from the coal-face so to speak, much of the reality of what’s involved. To onlookers it appears to be a black art – and believe me, from the inside, it can feel a lot like it too!  Sometimes I’m asked by a client or a reference-listener how or why I made a particular decision; and in some cases, especially those without technically verifiable information or logged conversations to go on, I have to go out on a limb and essentially say something to the effect of “well, because it ‘felt right’”, or “because it brings out the guitar *here*, which really flatters the piece”, or some other abstract quality.  At this point I just have to hope the client agrees.  If they don’t, it’s no big disaster; I’m rarely emotionally tied to the decision. I just need to pick up on the feedback, do what I can with it, and move on.  Looking at the process, I guess that’s partly why the word “abstract” appears in my trading name! 🙂

“Okay, so you know a bit about this from both sides, on with the subject already!”

There are two particular commercial albums in my digital collection, both hugely successful upon their original release, whose most recent remasters have bothered me. It’s not fair to name and shame them, especially not while I await confirmation from engineers/labels that my hunch is correct.  Anyways – I’m bothered not because they’re “bad” per se, but because I bought them, took them home, and from the moment I first heard them, something about them stood out to me as being not quite “right” from a technical perspective. One of them (Album A) was released in 2001, and the other (Album B) earlier this year, in 2015.

What these two albums have in common is that their tonal and dynamic balance is *significantly* different to the original releases, beyond the usual remastering techniques involved with repair, restoration and sweetening of EQ and dynamics carried out to sit alongside contemporary new releases.  The giveaway is that the top-end is both much brighter than usual, and much more compressed – and the result is unnecessarily fatiguing.

Where the two albums differ, then:

  • Album A has not suffered from the “loudness wars”.
    • Its overall dynamics are relatively untouched compared with the original.
    • It appears, looking at the waveform in a DAW, that the album material has been normalised to 0dBFS (so it fills the maximum dynamic range CD has to offer), but it rarely hits such high levels.
  • Album B however, despite never having been a “loud” album on original release, has suffered from the “loudness wars”.
    • Looking at its waveform, it’s clear that it has been maximised; this means that the material has been both compressed and limited such that the original dynamics have been squashed and gain applied such that almost the entire album waveform hits the 0dBFS point.
    • As a result, the album has lost its previous tidal ebb and flow, and while arguably some details are indeed much more audible than before, it no longer has the organic subtlety it once did.  Important instrumental details get masked and actually reduced in level as louder ones come into the foreground, because with that much compression going on, there’s nowhere else for them to go except lower in level.
    • Sure, it’ll play better on an iPod while travelling on the London Underground, or in the car, so it might open up a new market that way – but for the rest of us perhaps looking forward to a better quality transfer to listen to at home or anywhere else, we don’t get that choice.
    • I’ve heard the 2015 vinyl re-release of the latter album, and it seems to not have the same issues – or if it does, nowhere near to the same extremity. There are likely good technical and human reasons for that, but that’s an aside for another post.
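
To make the normalised-versus-maximised distinction concrete, here’s a minimal sketch in Python/NumPy of the two treatments as I understand them.  The function names and the toy signal are mine for illustration, and a real mastering limiter is far more sophisticated than a hard clip – but the effect on the quiet/loud contrast is the same in kind:

```python
import numpy as np

def peak_normalise(x, target_dbfs=0.0):
    """Scale the whole signal so its highest peak hits the target level.
    Dynamics are untouched; only the overall gain changes (Album A's treatment)."""
    peak = np.max(np.abs(x))
    return x * (10.0 ** (target_dbfs / 20.0) / peak)

def maximise(x, threshold_dbfs=-12.0):
    """Crude 'loudness-wars' maximiser: hard-clip everything above a threshold,
    then normalise back up so most of the waveform sits near 0dBFS
    (Album B's treatment)."""
    t = 10.0 ** (threshold_dbfs / 20.0)
    return peak_normalise(np.clip(x, -t, t))

# A toy "song": a quiet passage followed by a loud one, 18dB apart
quiet = 0.1 * np.sin(np.linspace(0.0, 20.0, 1000))
loud = 0.8 * np.sin(np.linspace(0.0, 20.0, 1000))
song = np.concatenate([quiet, loud])

norm = peak_normalise(song)   # quiet/loud peak ratio stays at 0.125
maxed = maximise(song)        # ratio rises to ~0.4: the contrast is squashed
```

Peak normalisation changes only the overall gain, so the quiet/loud relationship survives intact; maximisation clips the loud material down and then turns everything up, so the quiet parts end up proportionally much louder and the waveform “fills” the display edge-to-edge.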

Experiment 1:  Treating the common issues

Last week I had some downtime, and a hunch – a dangerous combination.

Neither album was famed in its day for brightness, except that the singer’s sibilants on Album A caused vinyl cutting and playback some serious headaches if alignment wasn’t quite right. Album B does carry a lot of detail in the top end, but being mostly synthetic, and certainly not a modern-sounding album, its spectral content is shifted much more toward the low-mids than anything we’d be producing post-1990.  So there will be some sheen and sparkle, but it should never be in your face, and never compressed.

Such clues told me two things: first, that Dolby A was likely not decoded from the master tape on transfer; second, that in the case of Album B, further dynamic compression had taken place on top of the un-decoded material.

So – out came a Dolby A decoder, and through it I fed a signal from each album in turn, bouncing the decoded signal back into my DAW for storage and further analysis.  Now please understand, it’s hard (if not impossible) to get a correct level-alignment without the test tones from the master tape, but those of us in the know can make some basic assumptions based on known recording practices of the time; and once we know what to listen for, we can also judge by the audible results, especially if we have a known-good transfer from the original tape to work with.

All that said, I’m not claiming here that even with all this processing and educated guesswork, I’m able to get back to the actual sound of the original tape! But I am able to get closer to what it ought to sound like…

The result? Instantly, for both albums, the top-end was back under control – and strangely both albums were suddenly sounding much more like the previous versions I’ve been hearing, be it from vinyl, CD or other sources. Album B’s synth percussion had space between the hits, Album A’s live drums had proper dynamics and “room” space. In both albums, stereo positioning was actually much more distinct. Reverb tails were more natural, easier to place, easier to separate reverb from the “dry” source, especially for vocals. Detail and timbre in all instruments was actually easier to pick out from within the mix.  To top it all off – the albums each sounded much more like their artists’ (and their producers’) work. Both albums were far less fatiguing to listen to, while still delivering their inherent detail; and perhaps some sonic gains over previous issues.

Experiment 2:  Fixing Album B’s over-compression

First things first – we can’t ever fully reverse what has been done to a damaged audio signal without some trace being left behind.  Something will be wrong, whether “audible”, noticeable or not.  But, again, an educated guess at the practices likely used, and an ear on the output helped me get somewhere closer to the original dynamics.  But how?

Well, it was quite simple.  One track from the album has a very insistent hi-hat throughout, which comes from a synth.  If we assume that synths of the time were not MIDI-controlled, and likely manually mixed, we can assume that it should essentially sit at a constant level throughout the piece, barring fade-in/fade-out moves.  And listening to an “original”, that’s pretty much what it does.  But in neither the clean nor my “decoded” version of the later album does it do so.  It bobs up and down in level whenever the other pads and swept instruments come and go.  It was more noticeable on my “decoded” version, but with the frequency and micro-dynamic blends being so much more pleasant, I knew that I’d made progress, and that the way forward was to fix the compression if I could.

Out came a simple expander plug-in.  Inserting this before the Dolby decoder, and tweaking various settings until I was happy that the hi-hat sat at a near-constant level throughout my chosen reference piece, restored the dynamics to something like the original.  In the end, we get something like 6-9dB of gain reduction, and the waveform looks far less squashed.  And sounds it, too.
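
For the curious, the principle can be sketched in a few lines of Python/NumPy.  This is a deliberately crude model with illustrative parameters of my own choosing – not the plug-in’s actual algorithm or the settings used: every dB the smoothed signal level falls below a threshold gets pushed down by a further (ratio - 1) dB, stretching squashed dynamics back out:

```python
import numpy as np

def expand(x, threshold_db=-12.0, ratio=2.0, fs=44100,
           attack_ms=5.0, release_ms=50.0):
    """Minimal downward expander.  Levels below the threshold are pushed
    down further, widening the dynamic range."""
    a_att = np.exp(-1.0 / (fs * attack_ms / 1000.0))
    a_rel = np.exp(-1.0 / (fs * release_ms / 1000.0))
    env = np.empty_like(x)
    level = 1e-6
    for i, s in enumerate(np.abs(x)):
        a = a_att if s > level else a_rel   # fast rise, slow fall
        level = a * level + (1.0 - a) * s
        env[i] = level                      # smoothed signal level
    env_db = 20.0 * np.log10(np.maximum(env, 1e-9))
    gain_db = np.where(env_db < threshold_db,
                       (env_db - threshold_db) * (ratio - 1.0), 0.0)
    return x * 10.0 ** (gain_db / 20.0)

# A toy "squashed" signal: quiet and loud sections only 14dB apart
squashed = np.concatenate([0.1 * np.ones(8000), 0.5 * np.ones(8000)])
restored = expand(squashed)
# The quiet section drops by a further 8dB (0.1 -> ~0.04) while the loud
# section is untouched, so the contrast between them grows
```

In practice, of course, the settings were tweaked by ear against that hi-hat reference rather than calculated.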

The trick then was to listen to all four versions (A, B, A restored, B restored) at similar overall loudness levels, and see which works better.  So far, in this house anyways, we’re happier with the restored versions, even among listeners who are unfamiliar with the artistic content.
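
Level-matching matters for that comparison, because louder almost always wins a casual A/B.  Here’s a minimal sketch of the idea using simple RMS matching; a proper job would use a perceptual loudness measure (such as ITU-R BS.1770), which this is not:

```python
import numpy as np

def match_rms(x, reference):
    """Scale x so its RMS level equals the reference's, so neither version
    wins the comparison just by being louder."""
    gain = np.sqrt(np.mean(reference ** 2) / np.mean(x ** 2))
    return x * gain

# e.g. a maximised remaster sitting ~12dB hotter than the original
original = 0.2 * np.sin(np.linspace(0.0, 100.0, 44100))
remaster = 0.8 * np.sin(np.linspace(0.0, 100.0, 44100))
matched = match_rms(remaster, original)   # remaster pulled down by 12dB
```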

Epilogue – Is this a mistake? And if so, how could it have happened?

When dealing with remasters, especially of older albums, we typically go back to playing the analogue tape. There are *many* things that can go wrong here at a technical level. We worry about whether the tape machine is aligned to the tape itself, whether both tape and machine are clean, whether the correct noise-reduction technology is being used, and whether we’re actually getting all the information we can off that tape.

Then there is the human element. I’ve lost count of the number of times, even in my small sample, that I’ve encountered a DAT or 1/2” reel labelled as being pre-EQ’d or Dolby-encoded with some system or another when in fact it wasn’t. Then there are the other labelling and human errors I’ve encountered: perhaps it wasn’t labelled as being Dolby-encoded and it really was. Or perhaps the “safety copy” was actually the clean master, and the “master copy” was actually the cruddy “safety” with a 10dB higher noise-floor, recorded at half-speed on lower-grade tape on an inferior machine that we know nothing about, with the channels swapped randomly due to a patching error in the studio.

Technology, and technicians, like questions with defined, logical answers: “0 or 1”, “yes or no”, “is this right or is this wrong?”. Unfortunately for us, when dealing with music, as with any other art, and so with the musicians, producers and other artists involved in the creation and production process, we soon find that the lines between “right” and “wrong” very quickly get blurred.

As an engineer, I’m also all too aware of the dichotomy between my *paying* client (usually the artist) and my *unpaying* client (the listener).  Most of the time the two agree on what is needed for a project, but sometimes they don’t. The usual issue is being asked for too little dynamic range (“can you turn it up a bit so it sounds as ‘loud’ as everything else?”), and the resulting sound is fatiguing even for me, the engineer, to work with, let alone the poor saps who’ll be invited to buy it. Sometimes I know that some sounds simply won’t process well to MP3/AAC (less of an issue these days, but it still happens).

Anyways – all that to say: if these albums both suffered the same mistake, if indeed it was one, then even without the myriad artistic issues creeping in, I can see how an unlabelled, undecoded Dolby A tape can slip through the net, blow the ears off an artist or engineer who’s been used to the previously released versions, and get people saying “YEAH, LET’S DO THAT ONE!” 🙂

CF

Review: Rega Carbon MM cartridge

(Rega Carbon MM conical cartridge; Image from Rega website)

In recent months I’d found our vinyl playback becoming increasingly distorted, especially on sibilants.  It seemed that our beloved Denon DL-160 MC cartridge’s tip had seen better days, and likely needed repair or replacement.  The problem was: with what should we replace it, even only for a short time while it’s away?

I had already kept a backup in our Ortofon DN165 with an OM-5 “generic” stylus, which never really seemed to fully “sing” up against the Denon; but a quick swap showed that it was indeed able to track inner grooves far better, and with far less sibilance than the Denon was showing, especially on the recent Pulp “Different Class” 180g reissue, which seems to be very densely packed towards the end of side 1.

But – the Ortofon really is no match in terms of tonality on our Dual 505-II for what the Denon could do with a new tip.  So, while we started to work out what to do with the Denon, I hit up some online forums to see what people think of the cheapest available cartridges.  This narrowed the choice mostly to Audio-Technicas, either the AT-91 or the AT-95E.  Then I came across the Rega Carbon, which was well regarded in these two reviews:

http://audiofi.net/2013/03/rega-carbon-cheerful-cheapie-cartridge/

http://theartofsound.net/forum/showthread.php?21594-Using-a-Rega-Carbon-cartridge

So – for about £27 including delivery, I ordered one on Amazon and was surprised to have it delivered by Sevenoaks Audio.  I mounted it within minutes of arrival and spun a few discs before leaving for a holiday.

First impressions…

…surprisingly good.  The overall balance was very similar to how I remembered the Denon DL-160 sounded when it was new to us.  Tracking ability of the deck was much improved – and it cleaned up many of the distorted sibilants in our rather well-loved first-run copies of Michael Jackson’s “Thriller”, and Al Stewart’s “Year of the Cat”.

Since our return, I’ve spun another varied and very enjoyable 10-15 discs with it, and am now sat enjoying a lovely rendition of an 80’s repressing of Pink Floyd’s DSOM.   So now I’m collecting some brief thoughts on how it now sounds after some 15-20 hours of playing time.

Longer term impressions…

It’s settled down – a lot.  The initial slightly brash treble presentation has become much smoother, and surprisingly detailed considering how I’d have expected a conical stylus to sound, based on my limited understanding of the physics involved.  It rarely sounds as if it’s missing any significant high-frequency detail, though it’s fair to say the useful upper limit of its frequency response is perhaps 1-2kHz lower than the Denon’s.

Surface noise is much-reduced compared to the tired Denon or mid-life Ortofon.  I’m therefore feeling much more able to just plunk a clean-looking disc down and get the needle stuck-in without spending significant cleaning time.

The overall sound is now much more balanced across the whole playing surface of any disc.  The balance change from “The Great Gig in the Sky” (end of Side 1 DSOM) to “Money” (beginning of Side 2 DSOM) is much less noticeable.  The latter sounds absolutely stunning in its detail, overall balance and sound-staging.  The tightness of the room reverb in the recording studio is now absolutely evident, with the background sounding “darker” than ever before.  The cymbals are absolutely crisp, as are the vocal sibilants.

Again sticking with DSOM as the example, while the apparent width of the soundstage feels narrower with the Carbon than with the Denon DL-160, the apparent depth of the soundstage feels much more accurate. Centre-panned voices seem to stand forward of the rest of the band. Individual instruments take on a definite space and are much more able to be followed than with the Ortofon.  Arguably in this more subjective respect, the cartridge does as good a job as the Denon ever did in our rig.  In some ways, it’s better – fine details seem much more apparent, and solid, than I’ve ever heard on this rig before.

“Us and Them” – the Rega pulls sparkle and space out of a dense mix in an increasingly tricky part of the disc.  It actually makes our rather tired copy sound brand-new. The huge chorus section has always sounded screechy with either of our previous cartridges – but with the Rega it just sounds big, and heavy and much cleaner.  Fine details of Sax placement, piano, organ and guitar riffs, complete with their acoustic space, are still audible even in the really heavy sections.  The synths, guitars and organ in the closing section perhaps have less sparkle than I remember, but their placement in the soundfield is much more assured, and much less distorted.

The overall impression is that this cartridge is a stunner – and it simply delivers *music* at whatever pace was intended. It delivers space and detail enough to communicate the message, if not always to convince you that the band is playing live right in front of you. And it does all of this without any apparent resonant tradeoff, nor any significant omission in any other area.

So – maybe I had a duff DL-160, and maybe our Ortofon had seen better days.  Maybe the DL-160 was a less-than-ideal match for our deck. But whatever the reasons for the differences I’m hearing, this cartridge absolutely *sings*, and it does so with a poise and fun-factor that I’d always heard vinyl was supposed to offer.  The Denon got us there for a good year or more, and when I add up its total known playing-time in our care, it’s really about time it was repaired or replaced.

Then I consider the price-tag, and I can only conclude that regardless of its peers, the Rega Carbon is an absolute gem and works incredibly well with our Dual 505-II, with its ultra-light original tonearm and (admittedly) customised heavy non-suspension base.

I’d tell any vinyl lover to just buy one to try for novelty-value, regardless of whatever other “prestige” cartridges you might also have. You might be surprised at how well it actually compares.  It’s always good to have a more-than-passable backup to a much better cartridge – but in our case, I’m suddenly in much less of a hurry to re-tip or replace our beloved Denon. I now have the time to get it right.

Oh, and if you need more evidence to commend this little gem – I can tell you one more thing:

Any good hifi component, or system, should make you want to listen to your music more.  Judging by the pile of played discs building up on my desk that need putting back onto the shelves, I can tell you that this has certainly got us listening to a *lot* more music, in a phase of life when we’ve had the least actual *time* to listen to it.

Some thoughts on using Google Docs

Following on from yesterday’s thoughts on using a Chromebook for an extended period, I thought it worth updating it (coming soon!), as well as jotting down some thoughts about Google Docs.  This got so big (and is relevant to all platforms, not just the Chromebook) that for the sake of clarity I decided to hive it off as a separate post.

Game-changing features

I think the main thing I’ve had to learn in terms of my expectation of what Google Docs can do, is to consider them as functions of a large and very advanced database.  From this perspective, the vague consideration of “wow – how do they even do that?” becomes much easier to resolve and put to rest.  With that in mind, I can now take a deep breath and present some major gains I’ve found with Google Docs as opposed to working in traditional desktop productivity apps like MS Office.

Never hit “Save” (or ctrl-S) again

This is a big one.  I type out a sentence, and then pause to look up to the toolbar… the word “Saving…” presents itself for a few seconds, before eventually changing to “All changes saved in Drive”. In theory, this means I can go into a document, type some stuff, then just navigate away from it in the knowledge that the changes were saved without my even having to worry about it.  Compare that with MS Office, where it’s quite normal to get completely sucked into writing an important document, have it crash while you fine-tune the formatting, and then find you didn’t manually save the last three hours of work; even the Autosave functionality often doesn’t keep up with important edits.  The Google Way™ seems so much better, and has saved many a draft.

Always available, on any computer in the world…

…provided that it has an Internet connection and a modern web browser.  This has massive implications for the freedom of users to roam the planet as they need and still have access to the information that’s important to them.  Obviously this doesn’t negate the need for backup of truly valuable data – but it does act as a less admin-intensive solution than providing a full roaming Windows/Mac network account, with all the security and software licensing hassles that creates.

Collaboration

It’s now routine for my boss and I to dump a load of notes into a Document, or run through entries on a spreadsheet, then have both of us view and edit the same document at the same time.  While we remain online and inside the document(s), we can each see who is doing what and where – even where the cursor is for each user.  This helps us greatly in documenting expenses, working through tricky wording of contracts, manuals, specifications and other basic project management tasks.  This feature alone, working across documents, spreadsheets and even presentations, has changed our working lives for the better.

Word processing

Generally, for any document created in Google Docs itself, everything pretty much works as expected – at least from a simple “type up some notes, edit them, make them look vaguely presentable, and print/email them” perspective.
That said, some foibles have stepped in the way of my making a more complete switch to Google Docs full-time and away from MS Office:

  • Previewing of MS Office documents does indeed (mostly) work, but Google Docs’ simpler headings, formatting and layout options mean that document fidelity with formal reports tends to suffer.
    • Sometimes inserted graphics disappear, or are rendered very badly, or appear in the wrong place with text wrapping mangled in the process.
    • Appendices and other numbered/customised headings tend to get lost – sometimes changing the implied meaning and flow of the incoming report.
    • To get around these issues, I tend to ask those reporting to us to submit reports (both final and draft) to me either as email body text (for informal reports), or as PDFs for more formal work.
  • Page layouts that preview well on-screen can end up with very different pagination, especially when printing to A4, or rendering to PDF.
  • Working with headers and footers is basic, but in fairness does allow insertion of tables, images etc for fine control over layout of logos, titles, author details, page numbers etc.
  • While I’m pleased to see that footnotes work, it’s not a full referencing system that can log and tabulate the source of each reference – again this makes full academic and some reporting use-cases awkward, and calls for migration to more powerful desktop software.
  • Table of Contents can be inserted, taking and automatically updating its entries from headings used throughout the document.  Good basic stuff, but:
    • No page numbers alongside the links.
    • No obvious control over which heading classes are included, nor over the specific formatting of the table entry.
    • Headings cannot be formatted with numbering in the way that MS Word or other word-processing apps handle it.  (Collaborative) drafting of formal proposals, reports or academic writing can be done in Google Docs, but really formal documents are best finished by copy/pasting the final text into MS Word or a more advanced desktop word processor or page-layout tool of your choice.
    • Table formatting is quite flexible, but there are not as many line styles or formatting options available as in MS Word.
      • Also, cell boundaries can only be moved when they are visible, e.g. when they have a border thickness greater than 0pt.
  • Printing and output
    • Page size is set to US Letter by default. This can be changed to any other supported paper size – A4 for me, please!
    • Equations entered through the Equation tool end up inconsistently placed and pixellated on both PDF and printed output.
    • Documents can be downloaded (or shared) as PDF
      • An example of the PDF output, combining these and yesterday’s posts, is here:  SamsungChromebook303Cusability (2)
      • Useful for sending out fixed versions of document files as a reference.
      • The PDF rendering engine can have some strange results, notably with changes to pagination.  Stray blank pages get inserted, and some placement changes made for the onscreen page preview end up looking different on paper.
      • A 20-page report (such as this one, according to the page count in the footers) on-screen ends up coming out as a PDF with 22 or more pages, depending on how and where simple page-breaks have been used.
      • Interestingly, automatically-generated page counts remain correct regardless of whether the document is viewed in the Docs editor, or as a PDF.
      • These are the kind of inconsistencies that most users I know find absolutely maddening for formal work – and a crucial limitation for users to be informed of. It’s like using a camera that takes a photo of the most beautiful mountain range in the world, at sunset, and when you download the photo to your home computer you find it actually gives you a photo of a discarded needle on a wet East London street-corner.
    • Documents can also be downloaded in common MS Office and other (more open) file formats.

Spreadsheets

My needs for spreadsheets tend to fall into one of two categories:

  1. Simple line-entries and basic summaries thereof, for things like expenses, inventory lists and the like.  This kind of use is so easy to cater for that I’ve yet to find any flaws – and the extra collaboration and availability of the files tends to win out over the bulk of a desktop application and opening an actual file from a disk.
  2. Complex mathematical data import, analysis and charting, with templates for print output of charts and tables to be included in other documents.  Such work tends to involve complex and obscure cell functions, and often (in Excel) some customised VBA code.  Such documents have previewed in Google Docs with reasonable fidelity, but there’s no way I’d expect anything other than MS Excel to understand the file, let alone work with it in any meaningful way or timeline.

Presentations

Rather than using presentations in teaching, I tend to use more of a show-and-tell approach, or even use a Google Doc (word processor) as a virtual blackboard to help explain what’s going on.  That said, when I want a simple pack of slides to summarise the points made, or to outline the plan for a day, the Presentations tool has done the job.
I’ve not played with the Presentations tool much beyond this, mostly because I expect problems even getting Powerpoint files to open and play out correctly on another copy of MS Powerpoint – let alone transferring them to another app such as Google Presentations.  

(Nearly) Two weeks with a Samsung Chromebook 303C

Scope of review

In the week before Christmas, we took delivery of a Samsung Chromebook Series 3 (303C), with the intention of reviewing it for suitability towards a distinct usergroup we administer.  To that end I’ve spent many hours using this machine in place of my usual MacBook Pro (for work) and occasionally for personal use in place of my usual Windows 8-based netbook.  I’ve taken some notes on the thoughts and issues provoked by daily use, which have been compiled into this review (itself written on the Chromebook in Google Docs) so others can see where I’ve got to with it and why.  Hopefully it will inform and comment rather than poke holes or fun.
Please note therefore that this review is neither an analysis of Google software/policy/infrastructure, nor is it an in-depth user manual for this machine or the Chrome OS it runs.  Others have these functions covered far better elsewhere.

Setting the scene

The computing market has been flooded with sub-£400 laptops in recent years, many in the small “netbook” form-factor.  Their primary intended use is consuming online content and getting simple tasks done: email, letter-writing, online banking and so on.  Most of these netbooks run full copies of Windows or Linux and offer enough power to run basic internet, office and even multimedia software – giving us a new class of affordable machines with surprising processing power and flexibility, despite being designed for much simpler tasks.  New models continue to be offered with Windows 8 and Intel/AMD x86-compatible processors.

Cheap, powerful computing – what it *can* be

I bought an Asus EeePC 1011PX to aid study and note-taking in 2011.  As my studies progressed beyond simple note-taking to writing up projects in Microsoft Office 2010, it has also been used for mixing multitrack audio on the move, as well as for room-acoustics analysis with a USB test mic.  That’s an amazing amount of processing power and flexibility for £230, even though that doesn’t include the extra hardware and software I now use with it.
To get the best out of such a small machine, I’ve had to carefully analyse my needs and find solutions that scale down appropriately.  Document compatibility issues finally pushed me to purchase and relearn Microsoft Office 2010.  To make that transition I ditched the dog-slow Windows 7 Starter Edition in favour of the two major consumer previews of Windows 8, enjoying both enough to finally upgrade to the release version of Windows 8 Pro.
I’ve also had to deal with what I feel was more than my fair share of maintenance.  Within 11 months of purchase both the fan and hard-drive failed, both of which were dealt with surprisingly quickly by the manufacturers’ UK repair agents.  No surprise that these moving parts needed replacement, but within 11 months?  The OS itself needs to update itself from time to time, as do most of the individual applications – albeit less often and usually without requiring a reboot.
So all this leads me to ask; what makes the Chromebook any better than what I know of an arguably similarly-specified Windows machine at a similar price point, and what can one expect from such a machine?

Software and hardware

First-off, a Chromebook comes preinstalled with enough of an operating system (OS) to run Google Chrome, and connect to the outside world via WiFi and Bluetooth wireless, alongside slots for USB and SD-cards.  Anything that can be done inside a web-browser can be done with a Chromebook.  This essentially makes it a Netbook in the most literal definition of the word.
Additional software is available, but only in the form of web-apps that can be installed inside Google Chrome itself.  This should ensure an increased level of OS security and stability compared with a full-blown Windows, Mac or Linux installation, since the user cannot fiddle with it.  It should also ensure that software updates are much more limited in scope and number, since there are fewer components on the Chromebook.
Installing Microsoft Office is out of the question, but that doesn’t mean that the machine can’t be useful for paper-based productivity – but instead of Office, Google would expect you to use their Docs/Drive package with a Google account.  Instead of Outlook, Gmail – this would include calendar and contacts functionality.

User data

A Chromebook typically comes with very little built-in storage.  The Samsung 303C tested here comes with a 16GB SSD which is seemingly used for both the built-in OS and any user-data such as downloads, etc.  With such limited onboard storage, multimedia options are limited to anything that can be downloaded from the Internet, or played directly from USB/SD media.
The idea of the Chromebook platform is that it acts as an interface to cloud-based storage and management of email and documents – and is clearly best used with a Google account.  If you don’t have one, the machine will allow you to create an account as part of the login process.

First impressions – hardware

 

  • Fast boot time (needs measuring)
  • Easy to get going with Google account credentials or as a guest user
  • Fast to sleep and to wake up.

Display

Pros

 

  • Surprisingly nice screen – compares well with existing Asus EeePC 1011PX netbook. Pixel size seems ideal for form-factor.
  • Text rendering looks surprisingly crisp – without being fatiguing.
  • Matte finish much nicer to use than the reflective shiny glass finish on Macs and some PC laptops.

Cons

 

  • HDMI connection to second monitor has yet to work with any DVI or HDMI-equipped TV or computer monitor I’ve tried – usually causing the laptop screen to go dark.  This might make presentations a problem.

Build

Pros

 

  • Thin
  • Light
  • Feels solid in the hand.

Cons

 

  • Fiddly to open one-handed, but too light and small to easily open two-handed.  Could easily have been solved by setting a bigger indent just under the trackpad to offer more grip.
  • Silver coating is really too easy to scratch. The underside of the machine is scratched up after a day’s use, and it’s only ever been on a clean desk, or inside a padded case.
  • The “G” from the Samsung lid decals has fallen off – not good since the unit has only ever travelled in my hand or a soft case!
  • While the machine feels solid enough in handling, the screen does seem to touch the keyboard when folded down, allowing dust and skin-grease to transfer, particularly from the spacebar, forming lines on the screen.  This is a problem common to all plastic-screened laptops and netbooks I’ve used.  Models such as recent MacBook Pros, with much more solid glass-faced screens, seem to flex less easily to begin with, and mark less easily than plastic if they do contact the keys.
  • Headphone socket is a very tight fit with most standard 3.5mm plugs encountered during the trial.  Really does feel like I’m going to break the machine if I push too hard.  This is the complete opposite case to most laptops I’ve ever encountered, whose headphone/line-out connections are generally too loose, causing nightmares for corporate presentations.

Keyboard

Pros

 

  • Full-size keyboard is very much like the MacBook (Pro) machines we’ve been using for the last five or more years.
  • Function keys well thought out with dedicated (and marked) keys for tab refresh, maximise, window cycle, brightness, volume mute/down/up, standby.
  • Typing longer documents (like this review, even) is a surprisingly comfortable experience – I’m finding it hard to feel any notable difference between this and a MacBook.
  • Dedicated “Search” button likely more useful to modern users than “Caps Lock”, but…

Cons

 

  • …Where’s the CAPSLOCK KEY SO I CAN SHOUT AT PEOPLE??!
    • Actually, Alt-Search has the same effect – makes sense since the search key is in the traditional place for the Caps Lock key, but this config could confuse new users who might not understand why their Chromebook “randomly” brings up a search function!
  • No “Delete” key, nor obvious way to replicate function.
  • Left and right arrow function keys would make most sense as a way of moving across tabs in the same window, but don’t appear to do anything?
  • No media keys – would be useful for YouTube, Google Play Music player, etc

Trackpad

 

  • Like many new machines, this was set a little slow by default. Soon fixed by adjusting settings (more on this later).
  • Right-clicking with two-finger tapping seems hit-and-miss.  Right side seems more sensitive/accurate to touch gestures than left.
  • Works best either with a firm thumb-push at the bottom (where buttons used to be before smooth trackpads became the “in thing”), or using tap-to-click. To this end-user, this feature seems no different to the glass Apple Trackpads fitted to aluminium unibody models.

Built-in software – in use

User accounts

 

  • Multiple user accounts can be set up on the same Chromebook.
  • “Admin” tools, suitable for remote control and corporate deployment are available as part of a Google Apps domain (how else?), but at a cost of something around $20 per year per machine at a quick glance.
  • Most users will likely be fine with a strong password and normal “user accounts”.
  • “Guest” (browser-only) access can be selected as an option at login/lock screens.
  • Accounts can be “locked” after sleep, requiring password (or switch to guest/alternate account) to wake – important for security.

Taskbar; a.k.a. Launcher

 

  • Seems to be fixed at the bottom of the screen – but can be set to auto-hide.
  • Left side shows currently-open apps
  • Apps can be pinned to the launcher, much like Windows.
    • Some apps open in their own window, some open in a new tab.
  • Right side shows clock, WiFi, battery and account avatar pic by default.  Also shows notifications of audio muting and Caps Lock.

Menus

 

  • Relatively few built into the OS itself.
  • Tend to be limited to a particular app (for the browser) or function (for things like WiFi, Bluetooth, etc.)

Network connectivity

This machine’s sole means of connectivity with the outside world is via WiFi, which supports WPA, WEP and unencrypted connections on both 2.4GHz (802.11b/g/n) and 5GHz (802.11a/n) networks.  Connectivity has been consistently good with a variety of Ruckus, Netgear and Apple access points.

Bluetooth connectivity

File transfer

Not attempted, as I couldn’t get the Bluetooth stack to connect with any phone supporting Bluetooth file-transfer profiles.

Keyboard/Mouse

Pairing an Apple keyboard/mouse set with the Chromebook was easy, once I’d remembered (searched Google for) the method to get the devices into a discoverable state.  Keymapping seemed reasonably logical – with volume, screen brightness, dashboard and windowing keys apparently behaving as expected.
Interesting discovery:  Playing a WAV file from a CF card (via USB card reader) brings up a built-in Music app – which does seem to respond even to the media keys on the Apple keyboard – impressive since there are no marked media keys on the built-in keyboard.  Nice little “easter egg” inserted to make developers’ lives easier perhaps?

Internet tethering

See “Interacting with Smartphones” below.

Windowing

Apps can be set (usually by right-clicking on them in the Launcher bar or menu) to the following windowing modes:

  1. As standard tab
  2. As pinned tab
  3. Maximised
  4. Fullscreen

In real use, the actual implementation (and terminology) seem confusing and inconsistent.  “Maximised” Gmail has a different (and more minimalist) window style to any other “maximised” tab.  Some other apps (Scratchpad, for example) seem to be able to use the same minimalist maximised style, but not everything.

File management

It’s bound to happen – at some point in using a Chromebook, you’ll find that you’ve got some file(s) from a camera or USB drive that need attaching to an email or uploading to cloud storage somewhere.
Essentially, anything presenting itself as a USB Mass Storage Device, when plugged into one of the USB ports on the back of the machine, will bring up the File Manager window and make the contents available.  Obviously not every file type can be opened directly on the machine, but all files can at least be copied, uploaded or attached to emails.
Pretty much all common disk formats are supported, with no problems found during testing when reading and writing to USB drives formatted to default Mac OS X or Windows 8 settings.  According to the relevant Google support page, common Linux filesystems are compatible too – so the average user should rarely get into a situation where a given USB drive is unreadable.

A note about photos

Inserting an SD card or USB drive full of pics straight from a camera gives access to the pictures via the file manager.

  • Photos can be viewed as a slideshow directly from the drive.
  • Opening a photo will view the photo fullscreen.
  • Once the photo is open, the file manager also includes some simple editing tools:
    • Editing mode is enabled by clicking on the pencil icon that appears in the bottom-right corner of the preview screen/window.
    • WARNING:  ANY EDITS ARE AUTOMATICALLY OVERWRITTEN BY DEFAULT!

Web browsing

This machine essentially is Google Chrome, with enough of an OS to run it.  So browsing the web is essentially the same as it would be on any other machine supporting the same version of Chrome.

Apps

The Apps menu links to various built-in apps by default, including an app for the webstore where additional software from Google and third-parties can be installed. Note that this doesn’t mean you can install standard Mac, Windows or Linux software on this machine at all, let alone expect it to run.
Any apps installed are essentially plugins that extend the functionality of the Chrome web browser.  If you sync your Chrome settings to your Google account, then all pre-existing bookmarks, settings and apps installed on other machines should find themselves synced on the Chromebook.

Settings

All machine settings are essentially available through the Settings tab of the Chrome browser itself – with some shortcuts (date/time, WiFi, Battery) on launcher.

Email

Online

Uses the normal web Gmail interface, just like any other browser.

Offline

Available via a free downloadable Offline Google Mail app, from the Chrome web store.

  • Interface looks more like Mail.app on iPad than the usual Gmail web interface.
  • Offline syncing can be set to cover up to a whole month’s worth of previous messages.
  • Some odd windowing issues when composing or filing messages.
  • Default zoom levels also needed reducing (e.g. press Ctrl and – to zoom out) to make the text in the “Apply” and “Cancel” buttons readable.

Smartphone interoperability

Given the cloud-based credentials of the Chromebook and Chrome OS, how does one get at photos, audio or video recorded on a smartphone?  It would seem that these should be synced to a suitable cloud-based service via some form of native app running directly on the device itself.  Once in the cloud, they’re accessed through a browser or web-app like any other web content.

Interaction with iPhone 3G (iOS 3.1.3)

 

  • No way to get photos or other content direct from device over USB.
  • No mobile Internet tethering via USB/Bluetooth. No WiFi tethering via iPhone 3G without jailbreaking the iPhone, which is untested as I don’t want to jailbreak my work phone!
  • All Google services accessible through Safari will be synced with same services accessed via Chromebook.

Interaction with iPhone 4 (iOS 6.0.1)

As iPhone 3G above, but:

  • WiFi hotspot may be possible, but I was unable to test it as the feature is locked out on my iPhone/plan.
  • All Google iOS apps, AND services available through Safari/any other browser app, will stay in sync with content accessed via the Chromebook.

 

 

Windows 8 Consumer Preview on Asus EeePC 1011px

I’ve just installed the Windows 8 Consumer Preview on my netbook to see what all the fuss was about, and first impressions are…

…strangely positive!

The install took about 20mins from booting from the USB installer to having a working desktop. From that desktop, I noted that all the components were immediately usable, including a reasonable driver for the Intel GMA3150 graphics chipset.

I then tried to play with the Metro apps, quickly finding that they all require a desktop resolution of 1024×768. Since this machine (and pretty much every other netbook I’ve encountered) has a small 1024×600 panel, none of the new apps work. Frankly, given that the Metro interface is most suited to such small displays, this seems to be a bit of an own-goal on Microsoft’s part, and something that I think ought to be fixed before the final release if MS wants to give an incentive for a lot of users like me to spend real money to upgrade.

Besides Metro, the rest of the desktop interface seems to make sense. The new Start interface seems to work, and I was soon able to remove entries for the Metro apps I won’t be using. In doing so, I noted that the tiling and grouping doesn’t seem to be as flexible as most users would like – I wasn’t able to choose a tile colour or size for other installed apps, nor was I able to change their labels.

After installing Chrome, Thunderbird (with Lightning and Google Address Book addons), LibreOffice and a couple of other apps to get real work done, I’ve found the rest of the interface informative and swift.

One surprise, as a former XP user who migrated to Mac OS X, is the ability to calibrate the display’s colour output using a built-in tool in the Control Panel. It’s simple but surprisingly effective, removing the blue tint. I tried this in Windows 7 when the netbook still had it, but it wasn’t terribly effective – perhaps a user error on my part.

It seems to me that for the kinds of admin, email, browsing and media consuming tasks I’d usually put this netbook to, 2Gb RAM is enough to keep Windows 8 happy – even with an email client, multiple Chrome tabs, iTunes, Dropbox, LibreOffice Write and some other apps open, memory use rarely topped 1.3Gb – better than my Dell Inspiron 6000 running similar workloads on XP.

So I’ll be sticking with this for a while, and will report back with more findings when I have time.

Finding the right OS for a basic Asus Netbook

Back in the fall of 2011 I found myself looking for a netbook-format computer, which I planned to use for a combination of basic online, office and media work. Online work covers the usual email, social networking, blogging and surfing duties. Nothing too heavy – I’m not expecting to use this as a media playback machine for video, nor for games. Office-related work for me is the usual emailing, document-writing, spreadsheet number-logging and number-crunching, and the occasional printed letter. Media work is the basic management and non-critical editing of a large photo library, along with occasional audio mastering work.

Getting the hardware right

Lacking the funds for the MacBook Air that I would want for such duties, I had to look at the netbook offerings from the rest of the market. All of them seemed to come with Windows 7 Starter Edition, and all seemed to offer the same three USB 2.0 ports, SD card slot and analogue video output over VGA.

Aside from the occasional Nikon RAW photo, nothing of the work I want to do with such a machine is terribly processor-intensive, but I decided that something like the 1.6GHz Intel N570 dual-core Atom processor would give a reasonable compromise between cost, battery life, speed and future-proofing.

In fairness, there’s not much user-configuration to do on a netbook beyond picking the right CPU/battery/storage for the job. All the netbooks I found were offered with only 1Gb RAM, which I thought would likely not be enough to get real work done in Windows 7, Starter Edition or not. I could easily see an upgrade to 2Gb on the cards, and was happy to see that all the netbooks I found offered easy access to the RAM bay to do this.

So – I tried typing on a few machines to see how the keyboard felt, and how responsive each machine was. No point buying a machine which is unable to keep up with my slow typing, from new! Within a few minutes I found myself gravitating to the EeePC line, whose out-of-the-box software was slim enough to not bog the machine down in real use, while having a keyboard I could comfortably type on without feeling like I’m constantly having to “switch modes” from my full-sized work machines, Mac and PC alike.

So – The EeePC 1011PX became my weapon of choice – mostly because it was the only machine I felt comfortable with, that also offered a dual-core processor, decent-enough battery and reasonable hard-drive space: 320Gb is a welcome improvement on the 120-160Gb I found in other machines at a similar price-point, and should give me room to spare even with a decent (compressed) music and photo library on board.

Experience with Windows 7 – Starter Edition

So – I got the machine home, and started out with everything as it came out of the box. Windows 7 Starter Edition was a welcome modernisation over the Windows XP PCs I’ve owned in the past. Coming back to Windows after using Macs for 6 years was rather a shock, I’ll admit – of the first 30 hours of real use, I ended up spending 20 of them waiting for updates to Windows, Office or other software. That’s not a good ratio, and the updates just never stopped. Absolutely hopeless.

When I was able to get real work done, I found the machine was paging to virtual memory on the hard drive pretty much constantly. Given that this only involved Google Chrome and/or LibreOffice, neither used for intensive tasks, it was pretty clear that a RAM upgrade was on the cards.

Asus says that this machine is capable of 2Gb RAM max, so that’s what I put in it for the princely sum of around £15 from a real bricks-and-mortar store. In Windows 7 the difference between 1Gb and 2Gb RAM was immediate, even under the lightest of use. No, it didn’t improve startup or application load times, but it was nice to finally have a machine that didn’t noticeably bog down over hours of use.

Over the next week I found a number of niggles with Windows 7 that led me to ditch it:

  • Limitations of Starter Edition:
    • Maximum of 3 simultaneous applications. It’s not uncommon for me to have a media player, spreadsheet, word processor and web browser open alongside each other. Bang – I’m over the limit already. None of these is intensive enough to bog down a netbook, so this really is a silly, arbitrary rule that gets in the way unnecessarily.
    • Use of screen space. Again, this was a silly thing, but I found the task-bar taking up too much space for the functionality it gives. Netbooks with small screens need some thought applied to them on the part of developers, so that the content takes up more space than the UI that displays and manipulates it. All the Windows-based software failed this test badly, especially MS Office. I was able to do some things about this like hiding some toolbars, setting the taskbar to auto-hide, but it still didn’t feel right.
    • Typing lag. As the software updates racked up, the machine bogged down. Turning off the bundled anti-virus software helped for a while, but the slowdown soon returned. There’s just no excuse for this kind of behaviour on any machine designed for real users.
    • Wallpaper. Yup – W7 Starter Edition doesn’t even let the user configure their wallpaper.
    • System backup/restore. I bought a 16Gb USB key to host a system-restore image because Asus, like pretty much every other manufacturer, doesn’t bundle even optical media to get the system reinstalled in the event of massive user error or hard-drive failure. It turned out that not one of the (confusing) array of built-in tools would create a bootable disk that would reinstall the system from scratch. The results were:
      • Software crash part-way through creation of the restoration media
      • Hardware crash during boot from restoration media
      • “Missing Operating System” error messages on booting from the restoration media
      • Once booted, the restoration software failed to see the backup image as a valid image, OR would refuse to recognise the machine as a valid installation target.
    • The results were repeatable across a variety of USB flash-drives, USB hard-drives and even DVD media created using an external drive plugged into this machine.

So, after wasting two days trying (and failing spectacularly) to get to a point where I was confident that I would be able to reinstall the system software in the event of a failure (which will happen one day), I took the decision to ditch the Windows install and look for something more suitable.

Alternative OSes

I briefly tried and reviewed the following alternative operating systems, and concluded:

Android x86 ports (releases 3 and 4 tried)

Pros:

  • Very, very fast, even running from SD card.
  • Nice, modern interface that works well on smaller screens.
  • Great battery life.
  • Small footprint.
  • Excellent synchronisation with Google mail, calendars and contacts.

Cons:

  • Software selection very limited.
  • Getting the machine to sleep needs some hacks.
  • The machine thinks it’s a phone, so its software doesn’t know how to interact with local file storage on a hard drive.
  • Too much reliance on a working Internet connection.
  • Stability issues.
  • Too much of a chore to get real work done, stored and sent out.

Notes: One to watch. I really wanted this to work out – I’m all for “unusual” solutions where they bring real benefits.

Ubuntu 11.10

Pros:

  • Well-known.
  • All hardware works immediately.
  • Stable.
  • Reasonable use of battery and other limited system resources.
  • Good selection of software bundled or in the repos.
  • iPhone Internet tethering worked out-of-the-box over USB.
  • Great online forum community.

Cons:

  • Long boot time.
  • Unity interface can get slow and glitchy.
  • Gnome Shell nice enough, but slow on mobile hardware.
  • Needed time to whittle down the UI to make efficient use of the display.
  • KDE too complex/fiddly for daily use, especially on a small screen.
  • Flash video really slow, especially for BBC iPlayer content.

Notes: Desktop/window managers tried: Unity (2D and 3D), Gnome 3, KDE 4, Openbox and LXDE.

#! – Crunchbang Linux

Pros:

  • Excellent speed.
  • Light on resources.
  • Highly customisable.
  • Hardware worked out-of-the-box.
  • Good range of software in the repos.
  • Great online forum community.

Cons:

  • iPhone tethering took a lot of work to get running, including compiling a new kernel and some drivers/pairing software.
  • Kernel and some other software running behind the times.

Notes: Recommended. My favourite out-of-the-box install, let down by driver support on newer hardware.

Fedora 16 and 17

Pros:

  • Faster than Ubuntu in general use.
  • Great online forum community covering a wide range of uses.

Cons:

  • Even more resource-heavy than Ubuntu when running comparable desktop/window-managers.
  • UI default settings not good on small screens.
  • Slower bootup than Ubuntu.

Notes: I loved releases 1–3 back in the day, but I think it’s been surpassed for most “normal” users by Ubuntu.

Haiku OS

Pros:

  • Fabulous speed and use of resources.
  • UI is efficient and great on small displays.

Cons:

  • Clearly not a “finished” solution.
  • Software and drivers not available.

Notes: One to watch. I was a fan of BeOS 5 back in the day, and would really like to see its community-driven successor succeed.

Joli OS (Jolicloud)

Pros:

  • Nice presentation of applications.
  • Online synchronisation of apps, settings and content is enticing.
  • Based on Ubuntu.

Cons:

  • iPhone tethering never worked correctly.
  • Dropbox integration doesn’t produce a local cache.
  • Integration with Google Docs needs a working Internet connection.
  • “Offline” operations are possible but not easy.
  • Application “store” not terribly intuitive.

Notes: I wanted this to work, but the silliness of having no offline cache or operability with the built-in apps made me run away screaming.

Chrome OS

Notes: Similar to Joli OS, and hampered in the same ways. Wasn’t able to try it on real hardware, as none of the available builds booted on this machine.

Pear OS

Pros:

  • Slicker than Ubuntu, and slightly quicker to boot.

Cons:

  • French localisation can’t entirely be turned off.
  • Some rough edges to the UI.
  • Some installable software didn’t work correctly.

Notes: One to watch, if it ever takes itself seriously enough to fix the rough edges. Watch out for a lawsuit from Apple – there are a lot of UI similarities and even straight copies of some elements. Good for Apple-savvy users, perhaps.

Peppermint OS Two

Pros:

  • Almost as quick as Crunchbang, both to boot and in use.
  • Quicker in use than Ubuntu.
  • All hardware worked out-of-the-box.
  • Default Openbox config works well on small screens as it comes.
  • Insane battery life compared with box-fresh Ubuntu or Windows 7 installs.

Cons:

  • Some fiddling required to make it look and operate like a modern OS.
  • iPhone Internet tethering worked, but only after installing the ipheth-pair software.

Notes: The all-round winner in my testing.

The above list is by no means complete, and clearly doesn’t cover every option out there, but I think it covers a good range of the different OS concepts available.

Building my workspace in Peppermint OS Two:

So far I’ve imported my documents, music and photos, and have installed:

  • Peppermint OS Two base installation
  • LibreOffice office suite, with toolbars set to “small” mode.
  • Evolution for email, contacts and calendar management, synchronised to Google account with built-in tools.
  • Dropbox for online document storage/backup.
  • xcompmgr for screen shadow and transparency effects.
  • Docky for Mac-OS-like dock. I’m a sucker for UI niceties, so long as they’re capable of getting out of the way when I’m trying to get real work done.
  • Ipheth-pair utility to get iPhone Internet tethering working.
  • Shotwell for photo library management and basic editing.
  • Audacity for basic sound editing.
  • Gimp for more advanced image processing/editing.
  • VLC media player.
  • Google Chrome browser. Its built-in bookmarks/app/settings synchronisation has been a genuine lifesaver while I’ve been trying to find the right OS/workspace for this machine, working for everything except Haiku OS and (strangely) Android.
  • Gwibber for basic access to Twitter.
  • Skype for transatlantic voice/video calls.
  • DOSBox for some light relief playing old games, such as:
    • Monkey Island
    • Simcity Classic
    • Simcity 2000
    • Lemmings
    • Pipe Dreams
    • Test Drive series
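The cosmetic pieces of this setup (compositor and dock) need starting in the right order each session. As a minimal sketch of how that can be wired up – the file path and the flag values here are my own assumptions, adjust to taste – a couple of lines in an X session startup file do the job:

```shell
# Sketch of an X session startup fragment (e.g. ~/.xprofile);
# the exact autostart file varies by distribution.

# Composite manager: -c draws soft shadows, -f fades windows in and out,
# -D sets the fade time-step in milliseconds.
xcompmgr -c -f -D 5 &

# Start the dock after the compositor, so it can use transparency.
docky &
```

In my experience the dock only offers its translucent themes when a compositor is already running, hence the ordering.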

Things to fix:

As I’ve typed this post, I’ve found that everything seems to be working well together, with LibreOffice Writer consistently keeping up with my (not exactly stellar) typing speed. There have been a couple of niggles though:

  • Backup
  • Trackpad – it works, but registers accidental taps while typing, sometimes invoking a click even when I’ve turned “tap to click” off.
  • Screen colour calibration – I’ve been spoiled by how easy this is to do (by eye) with the built-in tools on Mac OS X, and could do with finding a similar method on Linux.
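On the colour-calibration front, the nearest by-eye equivalent I’m aware of on a standard X11 desktop is the xgamma utility, which adjusts per-channel gamma. A sketch – the 0.9 blue value is purely an illustrative assumption for taming a blue-tinted panel:

```shell
# Per-channel gamma tweak, by eye (X11 only).
# Lowering blue gamma slightly warms up a blue-tinted panel.
xgamma -rgamma 1.0 -ggamma 1.0 -bgamma 0.9

# Reset all channels to neutral if it goes wrong.
xgamma -gamma 1.0
```

The setting doesn’t persist across sessions, so the chosen line would need to go in whatever file the session runs at login.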

NAD 3020B: Keeper or Clunker?

Been a while since I last posted on anything audio-related – I’m taking that as a good sign because I know I’ve been enjoying a *lot* of music lately.

NAD 3020 where it should be: In our rack!
Our NAD 3020B in use. (Please forgive the poor photo!)

Many an audiophile posting online has an extremely polarised attitude towards the humble NAD 3020 series of integrated amplifiers, which seem to be very much a “love ’em or hate ’em” box. I always thought I was in the “love ’em” camp, but until I inherited a 3020B from my father at the end of last year I never quite knew why. It’s not been the easiest of journeys, so please bear with me as I try to explain what I’ve found and what was going on at the time I found it.

If there’s any one lesson to glean from this experience, it’s that getting hifi sounding good is as much about the interaction of components working together as it is about finding well-engineered components and slinging them together according to a spec-sheet.  These are also differences that I feel can make or break a system over the long term, but may not be immediately identifiable in the typical demonstration arrangements most stores can offer.

When inheriting our current system, my intention had been to replace my existing components one by one, so I could check how the sound changed at each stage along the way. I first swapped the speakers, as mentioned in another post. I then started repairing and using the record deck – plenty of other posts cover that particular subject. With that now mostly bedded-in, I’ve come to the final part – using the 3020B.

Build quality

As a whole the unit feels well manufactured. Years of dust needed cleaning out of the phono contacts before connecting anything, but the speaker output binding posts are firm and accept 4mm banana plugs without modification – this amplifier was made in the generation(s) before the EU got their teeth into manufacturing regulations in the mid-90’s.

The source-select buttons are known on this series to be of slightly cheap construction, resulting in the plastic caps flying across the room when a new source is selected. The source input sockets are also somewhat loose.  This might be a result of their PCB flexing slightly when connections are made, or it might just be that the dimensional tolerance of the sockets themselves isn’t quite right. Again, this is a common flaw with amplifiers of this series, perhaps even of this generation.

The switches operate silently so far as the audio path is concerned, and the Bass, Treble, Balance and Volume pots/knobs also operate silently – rather impressive for such an old unit, especially if it’s ever been exposed to cigarette smoke, pets, small children and life’s little accidents as I know this one has.

Overall this unit is in better physical condition than I could have asked for – some surface grime aside, it’s basically unmarked except for the small hole drilled into its side, where an intruder-alarm line was once threaded through as a crime-prevention measure. It would be an extremely rare find on eBay that turns up in such good condition.

Sound quality – Take 1

Used with the Tannoy Mercury M20 loudspeakers it had been paired with in its previous home, the first impressions were that it is far warmer in tone than the 302 I was comparing it to, even with all tone controls at neutral and the loudness control off. Bass has more depth, stereo imaging is wider and deeper, but treble felt like a veil had been placed over the speakers.

Some experimentation with the Soft Clipping circuit showed no audible difference whether it was switched “in” or “out”.  I prefer to be safe rather than sorry, so I’ve left it “in” for now.

Another interesting experiment was to assess any audible differences between using the “Normal” (Low and High-pass-filtered) and “Lab” (Unfiltered) power amplifier inputs.  Theoretically the “Normal” input should be used, to filter out frequencies below 20Hz and above 20KHz, enabling the amplifier to use all its power in the audible frequency range and to run without interference.  The “Lab” input sounds better to my ear – soundstaging feels more solid, and the tonal balance a little more accurate throughout the entire frequency range. (See the first comment on this post for more about the correct selection of “Normal” vs “Lab” input).

Even having worked out which signal path to use, and to avoid the “Loudness” button, the amplifier was still not producing an overall sound I thought I could live with.  I therefore started to do some tweaking to work out where the “problem” was, if only to understand what was going on.

Experimenting with Pre/Power amp combinations

Both the 302 and 3020 have pre-out and power-in socket sets, allowing either to be used as the power amp for the other’s pre-amp section. First of all I wanted to see if the older 3020’s pre-amp section was the cause of the slightly muted treble. Some re-plugging later, I had both CD and LP feeding the 3020 pre-amp section, which in turn was wired to feed the power-amp of the 302. This combination had narrower imaging, slightly leaner bass, and still the soft treble that felt like it was hiding something.

Next I swapped the amp sections round, with the 302 pre-amp now feeding the much older power-amp section of the 3020, and everything seemed better. The soundstage was locked tight between the speakers for centred instruments and vocals, but there was much freer rein for anything panned between and even outside the speakers to be given space to do its thing. Either amplifier seemed equally capable of playing ‘depth’ information in recordings that have it, and so this was the way I left the units set up for some weeks while I got settled with the record deck and its cartridge.

Listening to the Tannoys through the 302 (using both its pre and power sections) I thought the sound was nicely tonally balanced, but it always felt like I was listening through an imaginary window that the box placed over the musical world being painted in front of me. Conversely, the 302-pre and 3020-power combo gave slightly more extreme bass and treble presence, and effectively took away that windowed effect while fixing the veiled treble of the older amplifier used on its own.

System changes – a second chance?

Having settled on using the 302 pre-amp and the 3020B power-amplifier, a couple of things changed. First off, I found the new complexity of the system somewhat frustrating, but was willing to live with it if that’s what was going to give us the best overall sound. Then came the other major shift in our listening: I upgraded the phono cartridge to a Denon DL-160 MC (high output), seeking more accuracy of sibilants and better soundstaging. This much I got, but many recordings were now too bright. Whether this was a result of longer-than-optimal running times on some discs, or perhaps due to an active mastering decision, I’ll likely never know.

Phono stages

With the new cartridge in place, switching between 302 and 3020 phono stages showed the differences between them were surprisingly subtle, but the older stage won out. It seems to reveal more midrange detail than the newer design, particularly with female vocals.  There’s also a lot more information being played from the background of mixes, better rendering things like room ambience and reverb tails. It also has better overall dynamics, and the soundstaging is a little deeper and wider.

This surprised me, since on paper the older design looks like it should perform worse than the new one. For one thing, the signal-to-noise ratio quoted by the manufacturer is slightly worse for the older design, and I would expect its component tolerances to have drifted enough with age and use by now to have a significant negative effect, most likely a loss of high-frequency detail and increased noise.

Just one side-note on the 3020 phono stage – it has two modes, one for MM (Moving Magnet) cartridges and the other for MC (Moving Coil) cartridges. MM carts typically have higher output levels than their MC siblings, but our MC is a “high output” model, compatible with conventional MM stages. Having tried the unit in both modes, I find that neither sounds different from the other, even when the setting is “wrong” for the kind of cartridge in use. The phono stage shows ample headroom – I did experiment with using the MM cartridge with the extra amplification of MC mode and could hear absolutely no evidence of added distortion, even with discs mastered with very high recording levels. Further, using the MC mode with its extra gain ought to bring more measurable background noise into the mix, but I’ve yet to hear this in practice.
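The gain-versus-headroom trade-off behind those two modes can be sketched with some back-of-envelope numbers. The gain and clipping figures below are generic assumptions for illustration (roughly 40dB is typical for an MM stage and 60dB for MC), not NAD’s published specs; the 1.6mV output is the nominal figure quoted for the DL-160:

```python
import math

def db_to_ratio(db):
    """Convert a voltage gain in dB to a linear ratio."""
    return 10 ** (db / 20)

# Illustrative assumptions only -- not NAD's actual specifications.
mm_gain_db = 40    # typical MM phono stage gain
mc_gain_db = 60    # typical MC phono stage gain
clip_mv    = 5000  # assumed pre-amp clipping level, 5 V RMS

carts = {
    "typical MM (5.0 mV)":        5.0,
    "Denon DL-160 HOMC (1.6 mV)": 1.6,
}

for name, out_mv in carts.items():
    for mode, gain_db in (("MM", mm_gain_db), ("MC", mc_gain_db)):
        level_mv = out_mv * db_to_ratio(gain_db)
        headroom_db = 20 * math.log10(clip_mv / level_mv)
        print(f"{name} into {mode} stage: "
              f"{level_mv:7.0f} mV out, {headroom_db:+6.1f} dB headroom")
```

On these assumed figures, a 5mV MM cartridge into the MC stage’s extra 20dB of gain would leave essentially no headroom at nominal level – which is exactly why added distortion seemed a plausible expectation, and why hearing none says something good about the stage’s real-world margins.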

The 3020B on its own – Take 2

I decided to give the amplifier a second chance to fly solo, with vinyl as the primary source. Soundstaging now sounds wonderful with well-mastered discs in good condition – Pink Floyd’s “Dark Side of the Moon” and Eric Clapton’s “Slowhand” show a lot of their natural recording ambiences.  Newer, more synthetic recordings like Enya’s “Watermark” or Jean Michel Jarre’s “Revolutions” sound as modern as their source material and production values suggest they should, with the end result always convincing and really very human. Every instrument and voice has its own space in the mix, with no particular instrument or frequency range standing out above any other.

Poorer or duller discs can easily be improved with the tone controls. Their effect is subtle but effective – I don’t feel that either circuit (Bass or Treble) touches any aspect of the sound beyond whatever I’m telling it to do. Bass-light recordings are usually too heavy in the treble, so a slight treble reduction brings things back into perspective. The inverse tends to be true of bass-heavy recordings – a slight treble boost usually evens things out.

Turning to digital sources, playback again felt like it was lacking some treble at first, and the soundstage was somewhat vague. For most TV and DVD content we watch this isn’t a bad thing, and it’s easily fixed with a slight adjustment to the treble control.

With playback of CD or downloaded content from our EMU 0202USB, bass and mid-range were coming through with much more timbre than I’d been used to, and a much more even tonal balance, but the high-frequency content was slightly reduced and felt a little hazy, if such a term can apply to audio.

Having noted a slight increase in treble response over the few weeks the system lived in this new state, I’d have been happy to leave it there, concluding that either the increased usage had brought some components and connections back within tolerance, or (more likely) my subconscious processing of what I’m hearing was adjusting to the new system.

But then I made a discovery: I could change the settings to run the DAC at a much-increased sample rate of 176.4kHz and 24-bit, with internal volume processing done in the computer at 32 bits. Overall this gave slightly more audible treble, but more importantly a lot more definition and control in the treble content.
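The benefit of doing volume processing at 32 bits can be shown with a toy example. The numbers here are made up for illustration, but the mechanism is real: attenuate a quiet sample digitally and round it back to integer steps at the narrow word length, and the detail can vanish entirely, whereas a wide (floating-point) intermediate keeps it, to be rounded only once at the DAC:

```python
# A quiet sample in 16-bit integer terms, attenuated by a digital
# volume control set to roughly -30 dB (a factor of 1/32).
sample_16bit = 25
attenuation  = 1 / 32

# Truncating straight back to 16-bit integer steps loses the signal:
narrow = int(sample_16bit * attenuation)   # rounds down to 0 -- gone

# A wide intermediate (as 32-bit internal processing provides)
# preserves the fractional value for later dithering/rounding:
wide = sample_16bit * attenuation          # 0.78125 -- detail kept

print(f"narrow path: {narrow}, wide path: {wide}")
```

This is a deliberately crude model – real players dither rather than simply truncate – but it shows why low-level content such as reverb tails and treble “air” survives better when the maths happens in a wider word length than the source.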

I’ll likely write separately about this transition, but it really does take the digital playback to a level that competes with the best of what our vinyl source can give us. Listening to Royksopp’s “Senior” album for example, bass frequencies go into (and possibly below) sub-bass territory and the system keeps up, resolving the basslines with good speed – at no time does any bass note feel like it’s stopping later than it should. Synthesised kick drums tend to have very short attack times, and these are resolved wonderfully, the tonality of each kick drum making even different synths identifiable.  This is something I’ve never experienced before.

Remastered recordings I’ve complained about before (Al Stewart’s “Year of the Cat” and Genesis’ “Trick of the Tail”) are still a little too treble-heavy for my tastes, but have huge amounts of spatial and vocal definition, and are finally on a par with the original vinyl releases of the same albums.

Conclusion?

Based on some very practical testing, done by ear and confirmed with others who were unaware of the tweaking going on behind the scenes except for the cartridge upgrade, I have concluded that my 3020B is very much “a keeper”. Its warm tonal balance is generally flattering and does not interfere with the finer details of dynamics, soundstaging and definition. It is certainly able to show up any flaws in the recordings and source devices it’s amplifying. I think it fair to surmise that it does a good job with entry-level devices as they come out of the box, but it does a truly great job when fed with higher-end devices, whatever form they take.