Wednesday, April 23, 2014

Copyleft

The history of the "copyleft" movement is more or less coeval with the emergence of the Web as a means and mode of sourcing, creating, and publishing works, whether texts, images, music, or multi-media productions. Some trace the movement to the desire of some computer programmers in the 1970's to create versions of programming languages and software that could be developed independently of corporate-owned commercial products. Others see its roots in the xeroxed or mimeographed "zine" culture, or in pubk-rock designers' preferences for public-domain clipart. Whatever its exact origins, a signal moment arrived in 1988 when Richard Stallman created the first copyleft license, which he dubbed the EGPL (for Emacs General Public License), which evolved into the GNU General Public License in 1989. A version of this license was, until recently, used by the Wikipedia and many other wiki- and crowd-sourced sites; in essence, it declares that the material licensed may be shared by anyone as long as a) they indicate the source; and b) the same license to share is imposed upon all subsequent users. The GNU license originally didn't contemplate "remixing" in which the GNU-licensed material might be slight, nor did it account for problems that might occur when someone tried to copyright a longer text that quoted from GNU-licensed material. In 2009, ion part because of such concerns, the Wikimedia Foundation dropped the GNU license in favor of a Creative Commons license known as CC-BY-SA, which has similar attribution and share/alike, but is friendlier to quotation/remix, and doesn't requite reproducing the full text of the license in every work (Creative Commons maintains human-readable and legal versions of all its licenses).

Creative Commons licenses are now quite common, and have been used on blogs, text archives, wikis, streaming music and media, and downloadable media. I haven't yet seen one on a printed book, but it's certainly conceivable that one will appear on certain e-books or other reference works. They offer the advantage of a variety of licenses, which allow for a) copying; b) modifying; c) doing either with or without attribution; and d) commercial or non-commercial use. Creators of content can thus exercise as much, or as little, control over the re-use of their material as they like. Of course, no CC-licensed material has yet been at issue in a courtroom, and as a result there is no case law to indicate how effective these licenses may be when it comes to the vital question of whether and to what extent they are enforceable. And perhaps that's best, for now.
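
For readers who like to see the moving parts laid out, here is a minimal sketch, in Python, of the six standard CC licenses as combinations of the dimensions just described. It's an illustration only -- not an official Creative Commons tool -- and it ignores special cases such as the CC0 public-domain dedication:

from dataclasses import dataclass

@dataclass
class CCLicense:
    code: str
    share_alike: bool          # "SA": derivatives must carry the same license
    commercial_use: bool       # no "NC" element
    derivatives_allowed: bool  # no "ND" element
    # Attribution ("BY") is required by all six standard licenses.

LICENSES = [
    CCLicense("CC BY",       share_alike=False, commercial_use=True,  derivatives_allowed=True),
    CCLicense("CC BY-SA",    share_alike=True,  commercial_use=True,  derivatives_allowed=True),
    CCLicense("CC BY-NC",    share_alike=False, commercial_use=False, derivatives_allowed=True),
    CCLicense("CC BY-NC-SA", share_alike=True,  commercial_use=False, derivatives_allowed=True),
    CCLicense("CC BY-ND",    share_alike=False, commercial_use=True,  derivatives_allowed=False),
    CCLicense("CC BY-NC-ND", share_alike=False, commercial_use=False, derivatives_allowed=False),
]

def permits(lic: CCLicense, *, commercial: bool, remix: bool) -> bool:
    """Very rough check: does this license allow a given kind of re-use?"""
    if commercial and not lic.commercial_use:
        return False
    if remix and not lic.derivatives_allowed:
        return False
    return True

for lic in LICENSES:
    print(f"{lic.code:12s} commercial remix allowed: {permits(lic, commercial=True, remix=True)}")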

Wednesday, April 16, 2014

History of Copyright

The idea of an inherent right of the author of a written work to protect it from unauthorized copying is, in terms of western history, quite recent indeed. The 1709 "Statute of Anne" was the first legal recognition of the rights of an author. It presented itself as "an act for the encouragement of learning," with the implicit argument that allowing authors the exclusive right to publish their work for a limited term would enable them to earn some reward for their labors, while at the same time eventually allowing their work to be used freely. As with earlier systems of intellectual property, such as "Letters Patent," the Act's term was limited -- 14 years, which could be extended for 14 more, after which the rights of the author expired; it was understood then, as it is now, that authors, like inventors, quite frequently drew from the works of those who had come before them, and that preserving such rights indefinitely would stifle creativity. One thing that has certainly changed since 1709 is the term of copyright; US copyright eventually settled on a period twice as long as the Statute of Anne (28 years, renewable for 28 more years); revisions to this law in the past three decades have extended these 56 years to 80, 100, and even as many as 120 years; the last of these, the "Sonny Bono Copyright Term Extension Act," went further still, effectively freezing the date at which works could enter the public domain at 1923. Many creative artists feel that this law has exercised a stifling effect upon creativity; many of them joined in support of a legal case, Eldred v. Ashcroft, that challenged these extensions on the basis of the Constitution's stipulation that copyrights be for "limited Times." The Supreme Court eventually ruled against Eldred, saying in effect that Congress could establish any length of term it wanted, so long as it was not infinite. Could is, of course, not should.
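
To make the arithmetic of these ever-lengthening terms concrete, here is a minimal Python sketch. The regimes and term lengths are simplified from the paragraph above (real terms depend on renewal, the author's death date, and work-for-hire status), so treat it as an illustration rather than legal advice:

# Simplified comparison: how long a work published in 1923 stays out of the public domain.
REGIMES = {
    "Statute of Anne (14 + 14 years)":      14 + 14,
    "Earlier US law (28 + 28 years)":       28 + 28,
    "Current term for pre-1978 works (95)": 95,
}

def public_domain_year(publication_year: int, term_years: int) -> int:
    """A simplification: real terms now run to the end of the calendar year,
    which is why works from 1923 actually became free on January 1, 2019."""
    return publication_year + term_years

for name, term in REGIMES.items():
    print(f"{name:38s} -> a 1923 work enters the public domain around {public_domain_year(1923, term)}")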

The result has been, ironically, that in the very age when the ability of writers, artists, and musicians to draw upon, alter, and incorporate what the copyright office calls "previously existing works" is at its greatest, the legal barriers against doing so have been raised to the harshest and longest in the history of copyright protections. This is offset, to a degree, by two factors: 1) "fair use," a doctrine codified in the 1976 revision of the law, whereby a certain limited amount -- say, less than 10% of the original "work" -- may be used so long as it is not employed for profit, is used in an educational context, and/or is used spontaneously; and 2) simple lack of enforceability. It's quite impossible to police all the billions of web servers, web pages, and personal computers and devices to ensure that no copyrighted material has been taken or stored; enforcement, as a result, tends to be spotty if dramatic (as in the case of a woman in the Midwest who was assessed a fine of 1.5 million dollars because her son had shared 24 music files on his Napster account).

It needs to be noted that copyright also functions very differently depending on the medium in question. Printed texts are straightforward enough, but in the case of physical works such as a sculpture or a painting, possession of the physical object confers certain property rights, including the right -- if one desires -- to restrict or prohibit "derivative" works such as photographs of these works, although the issue of non-manipulated or "slavish" copies is a murky one. Music is the most complex form: there are at least four layers of copyright in a recorded song: 1) The composition itself, and its embodiment in sheet music; 2) The performance of that composition as captured in the recording, including the act of interpretation and any variations on the composition; 3) The physical embodiment, if any, of this performance, known as "mechanical" rights; and 4) The right to transmit the performance. All of these, of course, were once separate domains -- the sheet-music/print industry, the recording studio, the record company or "label," and radio stations -- but all are now merged into a single, complex activity that can be carried out on a single device, even a smartphone.

In Hip-hop, nearly all original samples were taken from vinyl, and most consisted of a few measures of the "break beat" -- far less than 10% of the original recording.  And yet, by cutting or looping this beat, it could in fact become the rhythm track for an entire recording.  How then to measure that, in terms of originality? (One recent legal formulation of this question -- "fragmented literal similarity" -- muddies the waters rather than clears them).

As things have turned out, neither of the two most significant cases of copyright litigation involving Hip-hop ended up dealing with this question.  In the first, Biz Markie's half-singing of the chorus from Gilbert O'Sullivan's "Alone Again (Naturally)" was represented by his lawyers as part of the natural creative practice of Hip-hop -- an argument which they lost spectacularly: the recording was quashed, all copies were ordered destroyed, and substantial damages were awarded.  In the other case, which reached the Supreme Court, Luther Campbell and 2 Live Crew were sued for their parodic Hip-hop version of "Pretty Woman," originally made famous by Roy Orbison.  Their lawyers quite wisely avoided the originality issue altogether, arguing that the 2 Live Crew version was a parody, an argument the "Supremes" accepted, ruling that such a parody could qualify as fair use.  After all, if we're going to use our free speech to parody or mock others, we'll have to be imitative, won't we?  It's too bad Biz's lawyers didn't make the same argument.

Saturday, April 12, 2014

Surveillance

We are being watched. And yet, although public security cameras are the most visible and obvious signs of surveillance, there are now many more efficient ways to follow an individual person. A few weeks ago, a German politician, Malte Spitz, made headlines when he asked for his records from Deutsche Telekom, and found that it had recorded his exact location, in terms of longitude and latitude, more than 35,000 times in one six-month period. And yet this is nothing new; cell phones only function when they can be located by the cellular system; the only news was that the information was retained. Indeed, many sociological studies have been made using anonymized cellphone data to examine traffic patterns, pedestrian flow, and other broader areas of human society. Such data exists, and it would be a small leap indeed to link it with personal information.

Imagine that, perhaps because a person was suspected of terrorist activity, the FBI issued one of its nefarious letters, and obtained the cooperation of all the relevant agencies. It would be easy to track such an individual: their cell phone would reveal their probable location 24 hours a day; their web use and browsing history could easily be obtained, and their e-mail intercepted. Radio tags on goods they purchased, surveillance footage from cameras in stores where they shopped, and so forth would add to the picture, and a simple search on Intelius or another such site would show tax records, criminal records, and sundry other background information. And yet, perversely, all this information might not, assuming the person really were up to no good, be sufficient to prevent their carrying out their plans -- and, even if the person were innocent, the information could still, according to recent court decisions, be retained indefinitely.

Fear seems to predominate, and although we may all have a creepy feeling of being followed, few want to rock the boat by complaining. In London, one of the most surveilled cities on earth, some have estimated that there is one camera for every 14 citizens -- 421,000 in London alone. And yet there have been relatively few instances of vandalism of these cameras. In more rural areas of England, in contrast, roadway cameras designed to catch speeders -- the hated "Gatsos" -- have been frequently vandalized, with the means varying from spray paint and hammers to, in a number of instances, bombs. Are rural Brits angrier than urban ones? Is a speeding ticket more hated than the idea of being watched while one shops?

Here in Providence, there is a series of outdoor, building-mounted cameras operated by the city, state, and federal governments, along with any number of private cameras, but few people seem to take notice. Traffic cams have provoked some outcry, but no direct attacks. Most recently, with the aid of a special act passed by the RI Legislature, cameras have been mounted on school buses to track those who pass while the lights are flashing. And, although I would swear I've never passed a school bus in my life, one of these cameras saw me going down Westminster Street past the entrance of Classical High School -- I was one of four cars that drove merrily along past such a bus. So, though I still don't see why a bus parked at curbside at a school would use its flashers, I had to pay my fine with the rest -- and, by contract, 75% of the fine goes to the private company that installed the cameras, while the city and state split the remaining quarter!

Thursday, April 3, 2014

Hand-held Media

I suspect that, given the topic for this week, you probably weren't expecting to see ... a "transistor radio." It was the latest in hand-held media technology when it débuted in 1954, in an era when a "radio" was a piece of furniture only slightly smaller than the sofa. It was co-produced by Texas Instruments, which would later be a pioneer in the field of computing; four years later, its laboratory would be the birthplace of the integrated circuit, and it would produce the first hand-held calculator in 1967 -- for the low, low price of $2,500! It was probably around that year that I first got my own transistor radio, complete with a single monaural ear-bud, and saw one of the TI calculators my dad had brought back from the lab at General Electric (he actually had to sign it out, since it was such an expensive piece of hardware).

Of course no one foresaw in these early days that there would come a slow, Frankenstein-like convergence which would create a new device that would serve not only as a radio and a calculator, but also as a camera, video camera, music player, and telephone. The sheer weight and size of all the devices and media that an iPhone or Android smartphone replaces would easily top a hundred pounds, and take up an entire living-room wall. Weighing in at an average of 140g (about 5 ounces) apiece, the 400 or so LPs that would be needed to equal a 32 GB iPhone loaded with music stack up to roughly 125 pounds, not counting the weight of the sleeves and covers!
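
For the curious, the back-of-the-envelope arithmetic behind that comparison looks like this (the 400-album figure and the 140-gram average are the same rough assumptions used above):

GRAMS_PER_LP = 140        # average weight of a vinyl LP, as stated above
LP_COUNT = 400            # rough number of albums filling a 32 GB iPhone
GRAMS_PER_POUND = 453.6

total_pounds = GRAMS_PER_LP * LP_COUNT / GRAMS_PER_POUND
print(f"{LP_COUNT} LPs at {GRAMS_PER_LP} g each weigh about {total_pounds:.0f} pounds")
# -> about 123 pounds, before adding the sleeves and covers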

The milestones along the way are worth remembering, even as they fade from our sight: the Walkman (1979), the Discman (1984), the first smartphone (IBM's Simon, demonstrated in 1992), and the first iPod (October 2001, scarcely a dozen years ago, if that's possible). One could very well ask, what could possibly be next?

Wednesday, April 2, 2014

Make Love Not Warcraft

Sometimes, there is a narrative which so perfectly casts into relief the connections and differences between various old and new media forms that it becomes a kind of date/time stamp for the history of media. Orson Welles' Mercury Theatre production of War of the Worlds, Gary Ross's 1998 film Pleasantville, or the recently viral video of a baby trying to use pinch and swoosh finger controls on books and magazines, are among those that come to mind. And to that very select list, one should really add episode 147 of South Park, Make Love Not Warcraft.

The episode's first stroke of genius was to collaborate with Blizzard Entertainment, which custom-produced the computer game segments, even adding features -- such as synchronizing mouth movements with speech -- not actually available in the game. The second stroke was pairing the South Park character voices with their Warcraft avatars (as the game would look and sound to those using software add-ons such as Roger Wilco or TeamSpeak), so that the kids' voices come out of their tough-looking, overbuilt Warcraft selves. Thus there is irony in every scene, the more so when a balding, beer-bellied, potato-chip-munching man wearing glasses and a carpal-tunnel brace turns out to be the big bad fellow who is "killing everyone in the game." Blizzard executives are shown being stunned to discover that this character has become so strong that he's even killing their admins -- he has been playing Warcraft all day every day since it came out, such that he must have no life whatsoever outside of the game. So, as one Blizzard exec asks in Master Po fashion, "How do you kill that which has no life?" The kids will show us the way.

Thursday, March 27, 2014

Social Media II

The range and size of social media networks has increased almost exponentially in the early years of the twenty-first century. We've gone from early forums in which only a few hundred people might participate, such as a BBS or a LISTSERV list, to truly mass media such as Facebook and Twitter, which have billions of users around the globe.

But much more than just size has changed. At a certain 'tipping point,' social media begin to function in ways that, when they were smaller, would have been impossible. Facebook and Twitter have been credited with playing roles in the "Arab Spring" in the Middle East, particularly in Egypt and Tunisia; Facebook's founder has been the subject of a major Hollywood film; and Twitter feeds and cell-phone photos have brought down politicians of every party, sometimes within a matter of mere hours. It certainly sounds as though these technologies have crossed some threshold, altering the fabric of reality itself -- but then, of course, one can look back at similar claims made about virtual-reality video helmets (anyone remember Lawnmower Man?) and wonder whether these revolutions will still seem so revolutionary a few years from now.

Three key developments have shaped this period: 1) Social media with "presence" -- a main page at which users can add or copy content and offer images, texts, or video of their own making or choosing; 2) Sites with instant linkability -- the ability of users to add (or subtract) active and immediate connections to other users; and 3) Sites that bundle essential tools (e-mail, instant messaging, and other software capabilities). Finally, all of the above, or at least the survivors in this highly competitive field, have gone multi-platform; no social medium of the future will thrive unless it is available on desktops, laptops, tablets, and smartphones, and has some system of synchronizing all its users' preferences and updates.

So what next? The spaghetti is still being hurled at the (virtual) refrigerator wall. Blippy, a site that enabled shoppers to instantly "share" posts about their purchases, was hacked and credit cards compromised -- so much for that! Google tried to launch its own "Wikipedia killer," dubbed Knol, but the site filled up with spam so quickly that it became almost useless, and Google discontinued it; it also failed to generate "Buzz," a social networking venture that irritated users with its auto-generated list of "contacts"; and Apple stumbled with Ping!, an addition to its popular iTunes platform meant to enable people to share news about music purchases and performances. The latest entry, Pinterest, allows users to "pin" content to one another, with a focus on bargain shopping, and has the unusual distinction that a majority of its users, in many surveys, are women. But will it go the way of the Lifetime network?

It may seem we've already "shared" too much in this era of TMI, and that these social media may be reaching their limits -- but I wouldn't bet on it.

Tuesday, March 25, 2014

Social Media I

The evolution of social media can be conceived of in many ways -- in one sense, it could be said that language itself was the first social medium. Even then, considering a "social medium" to be any means of transmitting or recording language over time and space, writing could well be seen as the earliest, followed swiftly by the development of the "letter" as a social form, which dates back to at least the seventh century BCE. The ancient Library of Ashurbanipal, King of Assyria from 668 to 627 BCE, included personal letters written in cuneiform on clay tablets.

The telegraph and telephone come next in line; even if, as a recent NY Times article noted, the phone is experiencing a slow decline, it remains our oldest electronic social medium. I'm old enough to remember the old "Reach out and touch someone" adverts for Ma Bell, and for a while there was nothing more direct and personal than a phone call. ARPANET, over which the first electronic mail protocols were developed, debuted in 1969, but e-mail did not become a common form of communication until the late 1980's; well before then, home computer users were setting up BBS sites where they could post notices and download simple programs. My home town of Cleveland had a huge site, Freenet, where you could also get medical advice from doctors at Case Western Reserve and University Hospitals. The WELL, a large social site based in San Francisco, was the first home of integrated mail, chatroom, and file services; perhaps not coincidentally, it was also the site of the first case of online impersonation that went to court (a man was sued by two women for pretending to be a different, older woman who was a mutual friend).

In academia, the LISTSERV protocol brought people together by field and interest, and made it possible, in effect, to send a message to hundreds of people at once in search of advice or response; LISTSERVs were often associated with archives where you could search through older messages. Early online game spaces, such as MUDs and MOOs, go back to the late 1970's, and many became highly social, with tens of thousands of "inhabitants" maintaining spaces there. All of these interactions were exclusively text-based, and the only "graphics" consisted of what could be cobbled together out of ASCII characters.

It wasn't until the arrival of the commercial internet in 1993, hard on the heels of the WWW protocol, that social media really took off; by the end of the decade, Six Degrees, LiveJournal, Blogger, and Epinions had launched. In 2003, Second Life offered its users a virtual retake on their first lives, albeit with a graphical interface that looks primitive by today's standards; that same year, MySpace became the first modern social networking platform, and a model for Facebook two years later. With half a billion users, including everyone from the President to the Pope to Adam West, Facebook certainly has the critical mass to change the face of human communication -- and yet, in recent years, the loss of many of its younger ("Millennial" generation) users has some people wondering whether it may someday go the way of MySpace.

Thursday, March 20, 2014

History of Computing II: Mouse forward

The move toward the possibility of a computer that could truly be called "personal" begins in many ways with Douglas Engelbart's question: "If in your office, you as an intellectual worker were supplied with a computer display backed up by a computer that was alive for you all day and was instantly responsive, how much value could you derive from that?" The question was posed on December 9, 1968, at what has come to be called the "Mother of All Demos," where Engelbart and his team from the Augmentation Research Center at Stanford Research Institute in Menlo Park, California, showed what they had built. Unlike anyone else in 1968, Engelbart had some concrete answers to this seemingly abstract question: he was seated at a console which included a chorded keyboard (not unlike that employed by the operator of the "Voder" at the 1939 World's Fair) as well as the first operational three-button mouse, which Engelbart had designed together with Bill English starting in 1963. Using this interface, as well as an audio and video projector, Engelbart demonstrated the other capacities of his system, which included collapsible and relational menus, a simple mapping system, a text editor, and basic programming tools.

Of course the computers that backed up Engelbart's console were still massive, and required a number of other human operators and technicians (he chats with several of them in the course of the demo). It would be nearly another sixteen years before advances in microchips, display screens, and hardware would enable the production of the Apple Macintosh, the first widely successful personal computer to incorporate a mouse along with a graphical user interface (GUI) and some degree of WYSIWYG (what you see is what you get) graphics. It was these technologies, much more than earlier screen-and-keyboard machines, that turned the modest interest in home computers into the revolution in personal computers that enabled the "Internet" age.

Interestingly, although Engelbart was actually depending on a remote set of machines connected over a cable, it would be a long time before the computer became not only an independent platform but also a means of communication. Early modems were slow and unreliable, and took hours to send long files; even then, they more often connected to a remote "host" which was itself isolated from the 'net, such as a BBS system. The earliest version of the Internet, known as ARPANET, was created from plans developed for the US Department of Defense by the RAND corporation, its architecture designed to link DoD facilities with contractors and research universities, with a "distributed" set of nodes chosen as the most likely to survive a Soviet nuclear attack. Even well into the late 1980's, when I sent my first e-mail (I was then a grad student doing a work-study job at Brown's Graduate School offices), 90% of the traffic on the Internet went from one big host computer to another at universities and research institutes. I remember sending a message to someone with an odd-sounding hostname, and finding out only later that the user was in Tel Aviv, Israel!

The Internet was not opened to commercial traffic of any kind until 1993; Sir Tim Berners-Lee had released his hypertext "world wide web" protocol a couple of years earlier, and around this time Mosaic, the first widely used browser, came into use. This software, because it enabled terminal-to-terminal communication using an interface which worked in much the same way as the GUIs of individual computers (well, Macs in any case!), was the key step toward the 'net becoming a true mass medium. And it was only then that the answer, or rather answers, to Engelbart's question became clear, with nearly two billion Internet users worldwide, and global e-commerce quickly becoming the dominant means of trade and exchange throughout the developed world. And, of course, the humble "intellectual worker" -- such as yours truly -- has derived, and continues to derive, great value from all this; in the case of my most recent book, which took about six years to research, I'd estimate that, without access to Internet-based historical materials, the project would have taken at least twice as long, and cost tens of thousands of dollars in airfare to travel to and search through archives around the world.

History of Computing I: The Colossi

The earliest notion of a "computer" that most people had in the nineteenth and twentieth centuries had one key feature: enormous size. Babbage's 1837 "Analytical Engine," widely regarded as the earliest ancestor of the computer, would -- had it ever been completed -- have filled a warehouse-sized room and weighed nearly 30,000 pounds. Babbage's designs used interlocking gears with various ratios to perform calculations, and his system contemplated a punch-card I/O unit, a calculating unit known as the "Mill" (a sort of CPU), and a storage unit he called the "Store." In his search for backers, he enlisted Lord Byron's daughter Ada Lovelace, who wrote an erudite explanation of the Engine's operations for a French journal; in 1983, a new computer language designed for the US Department of Defense was christened "Ada" in her honor. Two working models of his machine have been built in recent years; you can see one of them in action here.

Few significant advances in computing were made until the late 1930's and early 1940's, when military needs -- calculating target data, and (most importantly) breaking secret codes such as Germany's ENIGMA -- provided both the impetus and the funding. In the UK, researchers at Bletchley Park, led by the young computing genius Alan Turing, constructed machines they named "bombes," which used electrical relays and motors to run through hundreds of thousands of possible combinations of the wheels and wires of an Enigma machine. Later, they constructed a far more advanced, fully electronic machine, known as "Colossus," to break the even more complex Lorenz teleprinter cipher.

The first fully digital machine along these lines was ENIAC (short for Electronic Numerical Integrator And Computer) which used vacuum tubes -- more than 17,000 of them! -- as relays and switches. The machine had to be literally re-wired for each different kind of operation; the task was entrusted to a group of young women who, even though many of them had college degrees in mathematics and engineering, were regarded at first as little more than glorified switchboard operators. All of the units together weighed more than 60,000 pounds, and consumed 150 kilowatts of power -- all to perform roughly 5,000 calculations per second. While this was many times faster than any earlier machine, it's equivalent to a CPU speed of 5 kHz -- fifty million times slower than the average desktop computer of today.
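
To put that comparison in plain numbers: taking ENIAC's roughly 5,000 calculations per second and the "fifty million times" figure above at face value implies a modern aggregate throughput on the order of 250 billion operations per second. That modern figure is an assumption chosen here simply to match the post's estimate, not a measurement, but the arithmetic itself is easy to check:

ENIAC_OPS_PER_SEC = 5_000      # roughly 5,000 additions per second
MODERN_OPS_PER_SEC = 250e9     # assumed aggregate throughput of a modern desktop

ratio = MODERN_OPS_PER_SEC / ENIAC_OPS_PER_SEC
print(f"A modern desktop would be roughly {ratio:,.0f} times faster than ENIAC")
# -> roughly 50,000,000 times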

The key invention which began the change from room- and-building-sized machines to something that could actually fit in an office or a home was of course the transistor, developed at Bell Labs in 1947. The basic idea was to use a semiconductor sandwiched between more conductive materials; such a device, like a radio tube, could be used either as a signal amplifier or a switch. There were several key advantages: transistors produced less heat, were cheaper to manufacture, and -- even in their early state -- much smaller. Each of the women of ENIAC shown in the photo above is holding a unit with the same storage capacity; the first two decreases in size are due to smaller, specially-made tubes, but the last is due to transistors. A typical smart phone today has nearly a million times the number of transistors in this smallest unit.

With the war over, business demands drove the computer market. The first commercial computer for this market was the UNIVAC, introduced in the early 1950's. For around $750,000, you got a CPU speed of 1.9 kHz, about 1.5 Kb of memory, and tape drives, each the size of a small refrigerator, which held about 1.5 Kb per tape. A decade later, IBM introduced its 1401 system; with the top model, one could now have 16 Kb of memory, and perform almost 23,000 calculations per second -- 23 kHz. IBM did not sell the 1401, but you could lease one for around $2,500 a month. Home computing on a practical scale was still far in the future; although the SIMON and other build-it-yourself kit computers were available throughout this period for home hobbyists, their capacity -- a mere 8 binary switches -- made them useless for any but the most limited tasks.

Monday, March 3, 2014

3D Movies: Always Just Over the Horizon

From its first appearance in 1922 to the current wave of films today, 3D has always been hailed as a great technical advance which would bring the cinema closer to its future as an all-encompassing form of entertainment. This future, alas, has always remained just over the horizon, and the reason is plain to see: 3D has always required special, add-on technologies that have made films more expensive to produce, project, and view. That added cost has led to its being seen as a premium entertainment, which has in turn prevented it from becoming more widely used. Doubtless the current wave of 3D will fade, but in the meantime it might be instructive to take a look at Teleview, the very first 3D system for the cinema, as nearly all of the technological elements -- and all of the hurdles -- were there at the start, nearly ninety years ago.

Basically, there have always been two methods of achieving the effect of 3D -- one, as with Kinemacolor, is an active method using alternating frames of the film for left-eye and right-eye views; such systems then require either a polarizing filter (with the projected images also alternating in polarity) or a synchronized, electrical shutter for every viewer (the latter was the method of Teleview, and is seen in the diagram of the viewer above). Oddly, this is not only the earliest but also the latest system: 3D television similarly uses alternating frames, along with a special set of electronic glasses designed so that each eye sees only the frames made from "its" perspective (at $50 a pair, they're hardly cheap).

The other method, the passive one, is to project both left-eye and right-eye perspectives simultaneously, and use either red/blue or polarizing eyeglasses so that the overlapping images are "sorted out" by each eye. This has the advantage of cheap, disposable means of reception, but the disadvantage that the image on the screen will look poor to anyone without the eyewear. While we often associate this system and its red/blue glasses with the earlier heyday of 3D in the 1950's, polarizing glasses were in fact far more commonly used then, primarily because such films did not have to be printed on colored stock, or use color at all.
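
A toy sketch in Python (not tied to any real projector or TV interface) may make the difference between the two delivery methods clearer: the active method time-multiplexes the two views, while the passive method superimposes them and lets the glasses do the sorting.

def active_sequence(n_pairs):
    """Interleave left and right views frame by frame; shutter glasses open each
    eye only on 'its' frames, so each eye effectively sees half the frame rate."""
    frames = []
    for i in range(n_pairs):
        frames.append(("L", i))   # shown while the right shutter is closed
        frames.append(("R", i))   # shown while the left shutter is closed
    return frames

def passive_sequence(n_pairs):
    """Project both views at once; polarizing (or red/blue) glasses sort the
    superimposed image out, which is why an unaided viewer sees a doubled picture."""
    return [(("L", i), ("R", i)) for i in range(n_pairs)]

print("Active (Teleview-style):", active_sequence(2))
print("Passive (polarized):    ", passive_sequence(2))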

Today, converting a modern multiplex cinema to 3D costs about $300,000 a screen -- which, at some larger houses, would mean several million dollars. The practice has therefore been to convert only a few screens, which means that any film released in 3D will play on fewer screens, and even with a ticket premium will make less for both the studios and the exhibitors. The dwindling economic returns of such conversions, especially in the current recession, have caused some studios, such as Warner Brothers, to pull out of earlier commitments to making films, such as the last Harry Potter features, in 3D. The jury is still out on 3D TV, and my bet is that, before too long, we will once again associate 3D, that magnificent technology of the future, with the past.

Wednesday, February 26, 2014

Later Developments in Cinema

The history of the development of cinema after the early portion of the silent era is largely -- though not entirely -- a question of the gradual progress towards both sound and color. Each of these, as we've already seen, started much earlier than generally imagined; sound began with Dickson's "Experimental Sound Film" of 1894, and hand-painted color had already reached a high-water mark with Georges Méliès's 1900 version of Joan of Arc. With sound, the great problem was synchronization; there were all kinds of schemes for keeping sound -- as a phonograph record, an optical code, or some other pre-recorded substrate -- in time with the image. When it came to color, hand-painting -- even with stencils, and armies of (mostly female) colorists -- remained a premium mode without a premium payback. The main use of color in commercial film, in fact, was tinting, a process in which certain segments of the film were run through chemical dye baths before being edited together. An emotional scene might be bathed in red, while another encounter would be shown in blue or purple. The advantage of tinting was that all the varied colors could be achieved in post-production, at the director's discretion. Scenes such as the "mellow yellow" of the frame from an unknown film of this era were common indeed. In some cases, tinted prints survive and have been restored; in others, the indications for tinting have been recreated in restoration. For a particularly fine example of tinting, have a look at the Flicker Alley restoration of F.W. Murnau's Phantom (we may view a few scenes from this in class).

At the same time, efforts progressed toward a technology that would bring about the appearance (at least) of full color. The pioneer in this field was Charles Urban, an American expat in England who had already achieved success with his black-and-white films in the era of the "Cinema of Attractions." Urban realized that persistence of vision, the same principle that enabled the illusion of motion, could enable an illusion of color as well; this was the basis of his "Kinemacolor" system. Black-and-white film was shot through a special camera with a spinning filter wheel which exposed alternate frames through red and green filters. Once developed, the film was projected through alternating color filters, so that the "red" frames were tinted red and the "green" frames green; the result was something very close to the feeling of full color (though in fact the process missed part of the spectrum, with dark blue being very imperfectly reproduced). Urban's process also had the huge technical advantage that, although special cameras and projectors were needed, the film itself was just ordinary black-and-white stock. Urban promoted his system through ambitious, epic-sized films shown in specially built, luxurious cinemas. Unfortunately for Urban, he was sued by cinema pioneer William Friese-Greene, who (falsely) claimed he had had the idea for this kind of color alternation first. As has happened with modern patent lawsuits, the British judges had no grasp of the technology on which they were ruling, confusing concept with practical art, and Friese-Greene's scheme of staining alternate frames (which produced only a muddy mess) with Urban's far superior pictures. They ruled in favor of Friese-Greene, and Urban was eventually forced into bankruptcy. Friese-Greene was never able to bring his system to the point of commercial success, though his son Claude, using a process much more like Urban's than his father's, made a number of fine early color films.
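
For those who like to see the principle spelled out, here is a toy model in Python of Kinemacolor's additive trick: alternate black-and-white frames are exposed behind red and green filters, and on playback persistence of vision fuses each red/green pair into a single two-color impression. The pixel values are invented purely for illustration:

# Brightness values (0-255) for a tiny three-pixel strip, invented for illustration.
red_record   = [[10, 200, 120], [12, 198, 118]]   # frames exposed through the red filter
green_record = [[80,  40, 130], [82,  42, 128]]   # frames exposed through the green filter

def fuse(red_frame, green_frame):
    """Combine one red-record and one green-record frame into (R, G, B) pixels.
    Blue is simply absent -- which is why Kinemacolor reproduced dark blues so poorly."""
    return [(r, g, 0) for r, g in zip(red_frame, green_frame)]

perceived = [fuse(r, g) for r, g in zip(red_record, green_record)]
print(perceived[0])   # -> [(10, 80, 0), (200, 40, 0), (120, 130, 0)]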

Ironically, it was to be one of William Friese-Greene's original concepts -- dyed film which was glued or bonded together -- which would ultimately be the precursor of modern color processes. The Technicolor company started out with a red/green system much like Urban's; they called this "System 1." Films made with these early two-color systems have a haunting, greenish-yellowish hue which, while perfect for horror features such as "Dr. X" (1932), was less well suited to dramatic or comedic subjects. They next developed "System 2," a subtractive color process in which two dyed films were cemented together, but the finished film was prone to bubbling and cupping. A third system transferred the dyed prints to a fresh single film, but was still limited to two colors.

By the mid-1930's Technicolor had shifted to a three-strip system: scenes were shot on three separate films, which were then dyed and transferred to produce the final prints. This offered the first commercially successful full-color image, although red and green still had the most zing -- thus Victor Fleming's choice of ruby slippers and green witch's makeup for 1939's The Wizard of Oz. Not many people realize it, but "Color by Technicolor" was a licensed process not owned by the studios; directors had to hire Technicolor's camera operators and technical consultants, as well as entrust post-production to its facilities.

Now, as to sound: at nearly the same time, different technologies were being tried to synchronize sound with moving pictures. Emile Berliner was involved with a disc-based system; Edison offered a cylinder-based one, but neither achieved real success. All the various attempts at sound stumbled with the issue of synchronization until the development of optical soundtrack systems, which in turn had to wait until amplified electrical recording became possible in the mid-1920's. These, because they could be recorded on to the actual film, and duplicated along with it, were both reliable and economically feasible, though of course exhibitors would have to invest in new equipment. Although hailed as the first sound picture, 1927's "The Jazz Singer" in fact only had sound in certain portions of the film, and still relied on the old sound-on-disc system. Rival technologies -- RCA's "Photophone" system, Western Electric's variable density system -- vied for the new industry standard.

The introduction of sound to film brought with it a host of technical problems: microphones had limited range, and had to be hidden in potted plants and tableware; camera noise was too easily picked up, and cameras had to be encased in sound-proof coverings. Mary Pickford, one of the greatest stars of her day and a founder of United Artists, had a terrible experience with her 1929 sound film, "Coquette"; she had to strain her voice to get it picked up by the microphones, and the results were far from complimentary. Her UA partner Charlie Chaplin, though he eventually embraced the idea of using musical scores on his soundtracks, put off the use of voice; aside from a phonograph recording, a one-liner ("Get back to work!") and a nonsense song in 1936's "Modern Times," Chaplin did not use spoken dialogue in any of his films until "The Great Dictator" in 1940, though some years later he recorded narrative voice-overs for many of his early features. Nevertheless, sound, well before color, became a standard feature of film very soon after its introduction.

Next up: 3D film -- in 1922?!

Monday, February 24, 2014

Color Television

Color television in the United States had a protracted history due to conflicting technical systems vying for approval by the Federal Communications Commission for commercial use. Mechanically scanned color television was developed by John Logie Baird in 1928, and a more complex system was demonstrated by Bell Laboratories in June 1929 (it used three complete systems of photoelectric cells, amplifiers, glow-tubes, and color filters, with a series of mirrors to superimpose the red, green, and blue images into one full color image).

In the electronically scanned era, the first color television demonstration was on February 5, 1940, when RCA privately showed to members of the FCC at the RCA plant in Camden, New Jersey, a television receiver producing images in color by a field sequential color system. CBS began non-broadcast color experiments using film as early as August 28, 1940, and live cameras by November 12. The CBS "field sequential" color system was partly mechanical, with a disc made of red, blue, and green filters spinning inside the television camera at 1,200 rpm, and a similar disc spinning in synchronization in front of the cathode ray tube inside the receiver set. RCA's later "dot sequential" color system had no moving parts, using a series of dichroic mirrors to separate and direct red, green, and blue light from the subject through three separate lenses into three scanning tubes, and electronic switching that allowed the tubes to send their signals in rotation, dot by dot. These signals were sorted by a second switching device in the receiver set and sent to red, green, and blue picture tubes, and combined by a second set of dichroic mirrors into a full color image.

The first field test of color television was by NBC (owned by RCA) on February 20, 1941. CBS began daily color field tests on June 1, 1941. These color systems were not compatible with existing black and white television sets, and as no color television sets were available to the public at this time, viewership of the color field tests was limited to RCA and CBS engineers and the invited press. The War Production Board halted the manufacture of television and radio equipment for civilian use from April 1, 1942 to October 1, 1945, limiting any opportunity to introduce color television to the general public.

The post-war development of color television was dominated by three systems competing for approval by the FCC as the U.S. color broadcasting standard: CBS's field sequential system, which was incompatible with existing black and white sets without an adaptor; RCA's dot sequential system, which in 1949 became compatible with existing black and white sets; and CTI's system (also incompatible with existing black and white sets), which used three camera lenses, behind which were color filters that produced red, green, and blue images side by side on a single scanning tube, and a receiver set that used lenses in front of the picture tube (which had sectors treated with different phosphorescent compounds to glow in red, green, or blue) to project these three side by side images into one combined picture on the viewing screen.

After a series of hearings beginning in September 1949, the FCC found the RCA and CTI systems fraught with technical problems, inaccurate color reproduction, and expensive equipment, and so formally approved the CBS system as the U.S. color broadcasting standard on October 11, 1950. An unsuccessful lawsuit by RCA delayed the world's first network color broadcast until June 25, 1951, when a musical variety special titled simply Premiere was shown over a network of five east coast CBS affiliates. Viewership was again extremely limited: the program could not be seen on black and white sets, and Variety estimated that only thirty prototype color receivers were available in the New York area. Regular color broadcasts began that same week with the daytime series The World Is Yours and Modern Homemakers.

While the CBS color broadcasting schedule gradually expanded to twelve hours per week (but never into prime time), and the color network expanded to eleven affiliates as far west as Chicago, its commercial success was doomed by the lack of color receivers necessary to watch the programs, the refusal of television manufacturers to create adaptor mechanisms for their existing black and white sets, and the unwillingness of advertisers to sponsor broadcasts seen by almost no one. In desperation, CBS bought a television manufacturer, and on September 20, 1951, production began on the first and only CBS color television model. But it was too little, too late. Only 200 sets had been shipped, and only 100 sold, when CBS pulled the plug on its color television system on October 20, 1951, and bought back all the CBS color sets it could to prevent lawsuits by disappointed customers.

Starting before CBS color even got on the air, the U.S. television industry, represented by the National Television System Committee, worked in 1950-1953 to develop a color system that was compatible with existing black and white sets and would pass FCC quality standards, with RCA developing the hardware elements. When CBS testified before Congress in March 1953 that it had no further plans for its own color system, the path was open for the NTSC to submit its petition for FCC approval in July 1953, which was granted in December. The first publicly announced experimental TV broadcast of a program using the NTSC-RCA "compatible color" system was an episode of NBC's Kukla, Fran and Ollie on August 30, 1953.

NBC made the first coast-to-coast color broadcast when it covered the Tournament of Roses Parade on January 1, 1954, with public demonstrations given across the United States on prototype color receivers. A few days later, Admiral brought out the first commercially made color television set using the RCA standards, followed in March by RCA's own model. Television's first prime time network color series was The Marriage, a situation comedy broadcast live by NBC in the summer of 1954. NBC's anthology series Ford Theatre became the first color filmed series that October.

NBC was naturally at the forefront of color programming because its parent company RCA manufactured the most successful line of color sets in the 1950s, and by 1959 RCA was the only remaining major manufacturer of color sets. CBS and ABC, which were not affiliated with set manufacturers, and were not eager to promote their competitor's product, dragged their feet into color, with ABC delaying its first color series (The Flintstones) until 1962. The DuMont network, although it did have a television-manufacturing parent company, was in financial decline by 1954 and was dissolved two years later. Thus the relatively small amount of network color programming, combined with the high cost of color television sets, meant that as late as 1964 only 3.1 percent of television households in the U.S. had a color set. NBC provided the catalyst for rapid color expansion by announcing that its prime time schedule for fall 1965 would be almost entirely in color (the exception being I Dream of Jeannie). All three broadcast networks were airing full color prime time schedules by the 1966–67 broadcast season. But the number of color television sets sold in the U.S. did not exceed black and white sales until 1972, which was also the first year that more than fifty percent of television households in the U.S. had a color set.

(This article adapted from one I co-wrote for citizendium.org)

Thursday, February 13, 2014

Ghosts in the Machine: Early Television from 1928

The image to the left is a single frame from the earliest known television recording of a human face, made by the inventor John Logie Baird. The subject, a Mr. Wally Fowlkes, was a young lab assistant undistinguished save by his willingness to sit for lengthy periods under the bright, hot lights required to make television recordings. And, amazingly, these recordings were made almost entirely by mechanical means -- a giant disc with glass lenses was linked directly to a Columbia Records turntable equipped with a cutting stylus -- and they predate any electronic images of humans by several years! They were preserved on discs that look much like audio recordings, and the frequency of the image data is so low that, if played through speakers, a sound in the audible range is produced. Indeed, Baird claimed that he could distinguish, just by listening to them, a recording of a face from, say, a recording of a pair of scissors or a soccer ball. Baird called his process Phonovision, and although he abandoned it as capable of capturing only brief snippets, and as posing too many technical obstacles, it was nevertheless the first system of recorded television in history.
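
To see why a television signal could be cut into a disc meant for sound at all, a rough estimate helps. The figures below (12.5 frames per second and the tall, narrow 3:7 picture) are the standard parameters of Baird's 30-line system, and the formula is the usual half-a-cycle-per-picture-element approximation:

LINES = 30
FRAMES_PER_SECOND = 12.5   # Baird's 30-line frame rate
ASPECT_RATIO = 3 / 7       # width : height of the portrait-format picture

elements_per_line = LINES * ASPECT_RATIO           # about 13 picture elements across
elements_per_frame = LINES * elements_per_line     # about 386 elements per frame
max_video_freq_hz = 0.5 * elements_per_frame * FRAMES_PER_SECOND

print(f"Highest video frequency: about {max_video_freq_hz:.0f} Hz")
# -> roughly 2,400 Hz, comfortably inside the audible range an audio disc can carry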

These recordings were little-known until a few years ago, when recording engineer Donald McLean collected several of them, and transferred their analog signal into digital form. Once this was done, he was able to correct for all kinds of problems that plagued Baird's engineers -- mechanical resonance ("rumble"), pops and scratches on the disc, speed irregularities, and problems with frame registration. The earliest recordings are still quite primitive, but one can at least recognize the faces.

Even more remarkably, in addition to these laboratory discs, there exist home recordings, made using "Silvatone" aluminum discs (one of these was referenced recently in The King's Speech). Silvatone discs used a heavy, weighted cutting stylus, and could record any sort of signal, whether the human voice or a radio broadcast. And, due to the relatively low frequency of the signal, they could be used to record television broadcasts as well. During the brief period from the late 1920's through to the early 1930's, when Baird was able to send out television signals with the BBC's co-operation, a number of amateur recordings were made; these, too, have been restored by Mr. McLean. There are about a half-dozen different snippets: dancing girls (of course!), a marionette show, and a singer by the name of Betty Bolton. McLean actually located Miss Bolton, by then 92 years old, and she was able to personally identify herself as the subject of the recording!

During this era -- in 1930 -- the BBC broadcast the very first television drama, an adaptation of Pirandello's play "The Man with a Flower in his Mouth." Although this does not survive, there is a re-enacted version, using the exact same script, the original music and title cards, and an identical 30-line Baird camera system -- you can watch it here, along with comments on the original broadcast and the recreation.

Mr. McLean has kindly permitted me to show his restored original Baird recordings to you -- but in class only -- as he is concerned to protect his rights in the restored versions. So look for some haunting images at Tuesday's class!

SIDEBAR: Here's a chart I've prepared showing the relative frequency and bandwidth of television signals, from the days of the Baird discs to HDTV.

ADDITIONAL LINKS: The excellent Television History site, a film of the 1936 Radiolympia demonstration broadcast as well as the High-def opening ceremony later that year. Both feature versions of the commissioned theme song, with its curious lyrics:
A mighty maze, of mystic, magic rays
Is all about us in the blue
And in sight and sound they trace
Living pictures out of space
To bring this enchantment to you ...
Here also you can see a modern 32-line mechanical TV in action; a 1938 Nazi TV station ident (they named the station after Paul Nipkow, inventor of the Nipkow disc, so as to claim TV as an "Aryan" invention); and lastly, a TV advert for DuMont TV featuring Wally Cox, later a "Hollywood Squares" regular and the voice of Underdog.

Saturday, February 8, 2014

The Birth of Radio

Contrary to popular belief, the basic principle of radio was not discovered by Marconi in 1895, but by Nikola Tesla in 1893. Tesla, a brilliant, eccentric man who eschewed coffee in favor of whiskey and opined that gum chewing "leads to an early grave," was the inventor of the basic principle of alternating current, as well as the two-pole AC motor/generator -- inventions without which the long-distance transmission of electricity, and nearly all the electrical technology of the twentieth century and after, would have been impossible. When building his circuits, Tesla experimented not only with 60-cycle current (the eventual US standard) but with much higher frequencies. He noticed that, when a wire coil tuned to a certain frequency -- say 6,000 Hz -- was placed near a non-powered coil tuned to the same frequency, a large portion of the electrical energy from the charged coil went into the uncharged one -- the first transmission of power through the air (see here for a contemporary demonstration). Tesla realized that, by changing the frequency of the coils dynamically, he could transmit simple information such as a binary signal, and he built a small radio-controlled submarine that responded to wireless signals from an operator fifty yards away. He did not, initially, pursue radio as a means of mass transmission, but his US patents were eventually ruled by the US Supreme Court to have superseded the later Marconi patents.
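
The "tuning" Tesla relied on is simply electrical resonance. As a quick illustration, here is the standard resonant-frequency formula for a coil-and-capacitor circuit, in Python; the component values are invented, chosen only so that the answer lands near the 6,000 Hz mentioned above:

import math

L = 0.010    # inductance in henries (10 mH) -- illustrative value
C = 70e-9    # capacitance in farads (70 nF) -- illustrative value

# Resonant frequency of an LC circuit: f = 1 / (2 * pi * sqrt(L * C)).
# Two coils tuned to the same frequency exchange energy far more efficiently
# than untuned ones -- the principle behind Tesla's demonstration.
f = 1.0 / (2 * math.pi * math.sqrt(L * C))
print(f"Resonant frequency: about {f:.0f} Hz")   # ~6,000 Hz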

Marconi, unlike Tesla, was a man with a head for business. He demonstrated his radio system publicly, sold shares in his company, and established the trade name "Marconigram" for messages sent via his machines; the radio men aboard RMS "Titanic" used his transmitters. Initially, this was all accomplished via a narrow binary continuous-wave signal that could only send Morse code; the transmission of the human voice depended on the use of a vacuum tube for amplification, developed by Lee de Forest in 1906 (this same tube was essential to the later development of electrically recorded music in the 1920's). The first radio "stations" (so called because they used a stationary antenna with a fixed wavelength) were established around 1920.

The Early Radio History site has some valuable links and information about the early development of voice transmission via radio. Unfortunately, very few recordings of this early era have survived, but the following sites have some recordings that at least sketch out this period:

Documenting Early Radio (all pre-1932 recordings)
George V Speech from 1924 (recorded, then broadcast that evening)
LKY Radio Airchecks (includes one from the 1920's)
Radio Spirits (a go-to site for "Old Time Radio")
Orson Welles' Black Museum series
Other "Old Time Radio" files at archive.org