Access and use "technological measures" – a legal distinction without a technological difference?

Last Friday I had the opportunity to speak with a lawyer who was trying to understand the differences between “access controls” and “use controls” in the context of technological measures used by copyright holders. Bill C-61, in its definition of “technological measure”, makes a differentiation. In our discussions she observed that nearly every technological measure that controls the “use” of a copyrighted work restricts “access” to the work first. She then asked whether it was appropriate to differentiate between the two at all.

I’m a technical person, and while I love to talk to lawyers about technology law, I can’t answer why lawyers want to make this distinction. All I can do is share my technical knowledge, and hope that the legal community will author and interpret laws that make sense.

Here is what Bill C-61 says:

“technological measure” means any effective technology, device or component that, in the ordinary course of its operation,

(a) controls access to a work, to a performer’s performance fixed in a sound recording or to a sound recording and whose use is authorized by the copyright owner; or

(b) restricts the doing — with respect to a work, to a performer’s performance fixed in a sound recording or to a sound recording — of any act referred to in section 3, 15 or 18 and any act for which remuneration is payable under section 19.

Section 3 (copyright in works), section 15 (copyright in performer’s performance) and section 18 (copyright in sound recordings) list the activities which require permission from the copyright holder, and section 19 (right to remuneration) deals with activities which are under a compulsory license for performers and sound recording makers (see: Copyright: locks, levies, licensing or lawsuits? Part 2: levies).

Technological measures that “(a) controls access” are called “access controls”, and measures which “(b) restricts the doing” are called “use controls”.

From a technological point of view, we need to translate this legal speak into real-world technology which obeys the laws of physics. Content, whether digitally encoded or in an analog format, cannot itself make decisions or do things to itself (copy itself, “self destruct”, read itself out loud, etc.). For this you need either a human being (for human-observable content) or some sort of hardware and software combination that is able to “observe” the content and then make it observable to a human.

As I included in a handout for the meeting (OpenDocument, PDF), there are some things that can be done to content. You can use cryptography to convert ordinary content (plaintext) to gibberish (ciphertext) in a way that requires a decryption key to convert the gibberish back to the content. Cryptography can also be used to digitally sign the content, and watermarking can be used to identify content or embed hidden messages in the content.
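To make the plaintext/ciphertext idea concrete, here is a toy sketch in Python. The XOR “cipher” below is deliberately simplistic and offers no real security, and the keyed digest merely stands in for a proper digital signature; the sketch only illustrates that encrypted content is gibberish until the same key converts it back.

```python
import hmac
import os

def toy_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric 'encryption': XOR each byte with a repeating key.
    Applying it a second time with the same key recovers the plaintext."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = os.urandom(16)                         # the decryption key
plaintext = b"an ordinary copyrighted work"  # human-readable content
ciphertext = toy_cipher(plaintext, key)      # gibberish without the key
recovered = toy_cipher(ciphertext, key)      # the key restores the content

# A keyed digest stands in for a digital signature here: it lets a
# key holder verify that the content has not been altered.
signature = hmac.new(key, plaintext, "sha256").hexdigest()
```

Without the key, the ciphertext is just noise; with it, the round trip restores the original bytes exactly.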

These are the most robust types of technological measures used with modern digitally encoded content. In the past an additional technique was used, which involved creating deliberate defects in the media. The marketing claim was that it was possible to introduce defects which would cause content to not be accessed by some types of devices (for example, VCRs) but would work perfectly fine for other types of devices (for example, televisions). As anyone who remembers Macrovision on VHS cassettes will know, this never quite worked well. The defects would mess up many televisions and would not be noticed by some VCRs. This technique was tried on a variety of content, and even included putting laser holes in floppy disks for game software. All of these copy control methods were trivial to defeat for those who wanted to infringe copyright, and often made the content fail to work for legitimate customers of that content.

Side note: many people in the movie industry don’t think fondly of Macrovision. What they really created was a type of ‘tax’ that Macrovision could collect for every commercially produced VHS movie distributed and every ‘legal’ VCR. Fairly inexpensive off-the-shelf time base correction technology eliminated the Macrovision defects, bypassing this alleged copy control. It was expensive for the industry, annoying for legitimate customers, and easily circumventable; the only winner was Macrovision itself.

Beyond these techniques (cryptography, watermarks, deliberate media defects), everything else that is done with technological measures is done in software running on some computer hardware.

In the second handout I used (OpenDocument, PDF), I offered some details on 11 different scenarios involving technological measures. The most common real-world situation is this:

a) content is encrypted and only distributed/communicated in encrypted form, accessible only with the right decryption key;
b) the decryption key is embedded within specific devices or software, forcing customers of the content to use one of the “authorized” devices or software;
c) these devices and/or software are locked down to disallow their owners from controlling the device/software.
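The three steps above can be sketched in a few lines of Python. The names are hypothetical and the XOR routine stands in for real encryption; this illustrates the pattern, not any actual DRM system.

```python
def toy_cipher(data: bytes, key: bytes) -> bytes:
    # Stand-in for real encryption: XOR with a repeating key.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# (a) The work only ever leaves the distributor in encrypted form.
DEVICE_KEY = b"baked-into-the-player"   # (b) embedded in "authorized" players
encrypted_release = toy_cipher(b"the work itself", DEVICE_KEY)

class AuthorizedPlayer:
    """(c) The only software holding the key; it is locked down so its
    owner cannot inspect or change it (a private attribute gestures at this)."""
    __key = DEVICE_KEY

    def play(self, blob: bytes) -> bytes:
        return toy_cipher(blob, self.__key)

# A customer without an authorized player holds only gibberish;
# all access to the work passes through the player first.
viewed = AuthorizedPlayer().play(encrypted_release)
```

Notice that the lock on the content (a) is meaningless without the locked-down player (b, c): the scheme only works as a package.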

The first thing to notice is that from the perspective of the content, it is encrypted to only allow it to be accessed by authorized devices. In my mind as a technical person, that is an “access control” technological measure, as it controls access to the work.

There is also a technological measure applied to the hardware and/or software by the hardware manufacturer or software author. If there are restrictions on what people can do with content accessed by this hardware/software, they are encoded in rules authored by the software authors. This means that “use controls”, when they exist, are authored in software by software authors and executed on computer hardware. They are not things which can be applied to content alone.

Internationally renowned security technologist and author Bruce Schneier has said a few times that “trying to make digital files uncopyable is like trying to make water not wet”. This is common knowledge to everyone in the computer security field.

What can be done to content is to make it inaccessible without the right technology to access it. This means that any technology that claims to “(b) restricts the doing” starts with somehow forcing people to use “authorized” hardware and/or software, and then implements any use restrictions in that software.
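This can be shown in a short sketch: once the “authorized” software has decrypted the work (the access control), any “use control” is just an ordinary rule in that software, written by the software author. The rule names here are invented for illustration.

```python
def toy_cipher(data: bytes, key: bytes) -> bytes:
    # Stand-in for real encryption: XOR with a repeating key.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Rules authored by the software author: not a property of the content,
# and not directly set by the copyright holder.
RULES = {"allow_play": True, "allow_copy": False}

class Player:
    def __init__(self, key: bytes):
        self._key = key

    def play(self, blob: bytes) -> bytes:
        if not RULES["allow_play"]:
            raise PermissionError("playback disabled by the software author")
        return toy_cipher(blob, self._key)   # access control: decryption first

    def copy(self, blob: bytes) -> bytes:
        if not RULES["allow_copy"]:          # the "use control" is one if-statement
            raise PermissionError("copying disabled by the software author")
        return toy_cipher(blob, self._key)

player = Player(b"device-key")
locked = toy_cipher(b"the work", b"device-key")
player.play(locked)      # succeeds: access is granted through the player
# player.copy(locked)    # would raise PermissionError: the use control
```

The “use control” here is nothing more than an if-statement that runs after decryption, which is exactly why it cannot exist without the access control that forces the work through this software in the first place.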

Is it possible to have a ‘use control’ without an ‘access control’?

Use controls are accomplished in software running on hardware. If people are free to make their own hardware and software choices, then they will make choices of combinations that meet their own demands, not the demands of someone else. The user of software under their own control may choose to use the software in a lawful way and never infringe copyright. The software may even help them by making the intentions of the copyright holder clear to the owner of the technology. In this case we are talking about copyright being enforced by the law and courts rather than by technology disobeying the instructions of its owner.

Encrypting the content to try to revoke hardware and software choices from audiences is one common technique used by copyright holders. There is, however, an even worse situation: government regulation of technology.

In the USA, under the title of the “Broadcast Flag”, they have been discussing a regulatory regime where any hardware and software involved in the reception of broadcast signals is legally not allowed to be under the control of average citizens. When we consider TV tuner cards able to be plugged into generic computer equipment, this essentially disallows hardware and software choice for a massive amount of consumer electronics.

In both cases the goal is to prevent average citizens from owning and controlling (through software choice) their own communications technology, and in both cases it is radical changes in the law against the interests of technology owners that are the root of the problem.

Why this distinction with a difference matters

Close observers of the digital copyright debate will notice something important.

With few exceptions, the proponents of anti-circumvention legislation are thinking entirely about “digital locks” being applied to content, which in theory will protect the interests of the copyright holders of that content.

With few exceptions, the opponents of anti-circumvention legislation are focused on “digital locks” being applied to hardware and software, which oppose the legitimate interests of the owners of the hardware and users of the software.

This debate is hard to understand until you realize that these technological measures can be applied to many things (not just content!), and are being applied by someone other than the owner of whatever the technological measure is being applied to. Different participants in the debate are focused on the consequences (unintended or intended) of applying different types of technological measures to different things.

The fact that there are policy makers wanting to make a legal distinction between “access controls” and “use controls” in the law suggests that they may not be aware that in practice nearly every conceivable “use control” starts with an “access control”, unless we are talking about further government regulation against technology owners. They may also not be aware that the party who controls the “use control” technology is not the copyright holder, but the software author, and that this software author will have their own interests in mind when authoring the software, not the interests of their customers or copyright holders.

Related: Even in the “DRM” debate, Content is not King.
