LETTERS TO THE EDITOR
In the first section of this page, immediately below, are some "letters to the editor" that I wrote and that were published. (These have been, or may soon be, removed from those publications' websites. They are preserved here for your reading pleasure.) In the second section of this page are some letters that I wrote to various publications that were not published, and which I consider deserving of a readership.
A. PUBLISHED LETTERS:
Alternate Endings for Tinseltown's Digital Drama
Business Week, August 4, 2003, referring to an article about declining CD sales (supposedly due to piracy) and the future threat to DVD sales:
The fact that CD sales fell during the period when digital audio technology became widely available does not prove causality. CD sales were kept at "unnatural" levels from 1983 to 1999 by the baby boomers, who gradually repurchased, in the new format, music already owned on vinyl. But by 1999, boomers had repurchased all the material they ever wanted, including many overlapping compilations containing many unwanted duplicates. (I count no less than eight copies of Hey Paula in my collection!)
Malcolm Hamer
New York
This was a rather slimmed-down version of the letter that I sent them, the full text of which was:
In “Stealing Hollywood”, July 14, you stated that the music industry is “nearly paralyzed by piracy” and implied that the burning of CDs and exchanging of MP3 files is central to the fall-off in CD sales in the last four years. However, the fact that CD sales have fallen over the same period that digital audio technology became widely available does not prove causality. Based on timing, one might also argue that digital audio technology has caused an increase in terrorism. The music industry’s assertion that a causal relationship exists is self-serving and should be questioned. If we could re-run history without digital audio technology, it is quite possible and, I believe, likely that the fall-off in sales would have happened anyway, for two reasons.
First, younger CD buyers have always tended to buy as a group – sharing their
purchases and making “compilation” tapes (now CDs) for one another. As a group
they have a certain number of dollars that they are prepared to spend on music.
This is more or less fixed, regardless of the extent of sharing within the
group. Digital sharing technology has not significantly changed this situation.
However, other demands on these buyers’ budgets have caused them to reduce the
amount allotted to CD purchases, notably mobile phone charges and, to some
extent, DVD purchases.
Second, CD sales were kept at “unnatural” levels from 1983 to 1999 by the baby
boomers, who gradually re-purchased, in the new format, music already owned in
vinyl format. The industry has known this all along and it has devoted great
energy to releasing new combinations of old material, with just enough new-to-CD
tracks and remixes to get the boomers to hit the “buy” button. But by 1999 the
boomers had re-purchased all the material they ever wanted, including many
overlapping compilations containing many unwanted duplicates. (I count no less
than 8 copies of “Hey Paula” in my collection!) That the boomers’ CD-buying has
fallen off dramatically is obvious if you compare the top-twenty charts of 2003 with those of, say, 1998.
While the music industry and Hollywood can, and should, take legal action
against organized pirates, their obsession with trying to use technology to
protect material sold to normal consumers is misplaced. It will not stop
organized pirates, who have the resources to crack any technical protection, and
it will take away our rights to fair use (as the industry tried, unsuccessfully,
to do via the courts when the VCR appeared on the scene in the early 1980s).
MPEG-4 And The HD-DVD Standard
Widescreen Review, November 2002
Dear Gary,
I was pleased to see that in your October (Issue 65) editorial you made mention of MPEG-4 (in connection with the hoped-for HD-DVD standard).
It seems to me that MPEG-4 is being treated by some players in the video
technology market as some sort of rogue technology, rather than the outcome of
years of serious work by the industry’s respected group of experts (i.e. the
MPEG). Steve Jobs and Apple seem to
be the only players openly supporting and praising MPEG-4.
Other players seem to want to brand it as “that thing that video
pirates are using to exchange ripped videos over the Internet”.
And Microsoft’s message to the MPEG regarding MPEG-4 seems to be
“Thank you for all your hard work, guys, but we’ll take it from here.” It is not clear what deals Microsoft is striking with the
other players, particularly the studios, but it seems possible that they are
along the lines of: “We will build rock-solid copy protection into the
compression algorithm and you will support the use of a Microsoft algorithm so
we can make billions of dollars in licensing fees, just as we did with
Windows.”
While I agree with you that the present red laser approach, with an average bit rate
of 5 Mbps, combined with the best possible compression algorithm, is not going
to give us the HD quality that we want (Warner’s approach), those other
manufacturers that are proposing to use Blu-ray discs in combination with MPEG-2
must be crazy. Why would anyone opt
to use MPEG-2 after the MPEG spent almost a decade developing the improvements
and additions to MPEG-2 that resulted in MPEG-4?
If the industry had followed similar logic in moving from laserdiscs to
DVDs, then DVDs would be using analog video!
Surely the HD-DVD standard should combine the best available hardware and
software elements, namely 20 to 30 Mbps bit rate Blu-ray discs with the MPEG-4
algorithm? In very rough terms this
would give us an overall quality improvement factor versus DVDs of something
like 16 to 24 (based on bit rate improvement of 4 to 6 multiplied by an
estimated compression efficiency improvement factor of 4).
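(To make the rough arithmetic above explicit, here is a small Python sketch. The 4x MPEG-4 efficiency factor is the letter's own estimate, not a measured value.)
# A rough sketch of the "16 to 24" estimate above; all factors are the letter's assumptions.
DVD_BITRATE_MBPS = 5.0                 # average bit rate of a standard DVD (MPEG-2)
BLU_RAY_BITRATES_MBPS = (20.0, 30.0)   # bit-rate range assumed for Blu-ray discs
MPEG4_EFFICIENCY_FACTOR = 4.0          # assumed compression-efficiency gain of MPEG-4 over MPEG-2

for blu_ray_mbps in BLU_RAY_BITRATES_MBPS:
    bit_rate_gain = blu_ray_mbps / DVD_BITRATE_MBPS          # 4x to 6x
    overall_gain = bit_rate_gain * MPEG4_EFFICIENCY_FACTOR   # 16x to 24x
    print(f"{blu_ray_mbps:.0f} Mbps Blu-ray with MPEG-4: roughly {overall_gain:.0f}x versus DVD")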
On this last point (compression efficiency) it seems that there is a general
reluctance to publish independent comparisons of MPEG-2 versus MPEG-4 using
useful bit rates. Claims in various
articles and on various websites about the efficiency of MPEG-4 versus MPEG-2
vary wildly, from 3:1 to 10:1, and are presumably not based on tests performed
under carefully controlled conditions. It
is important to know the real number so that we can have some idea what to
expect when we read about the bit rate of a particular proposed disc, cable,
satellite, or tape technology that is going to be used in combination with
MPEG-4. Interesting and useful
comparisons might be:
(a) MPEG-2 at 5 Mbps (the present DVD standard) versus MPEG-4 at various bit rates over the range 0.75 Mbps to 2 Mbps, in order to find the average bit rate at
which DVD-like quality is obtained using MPEG-4; and
(b) MPEG-2 at 28.2 Mbps with HD content viewed on an HD display (D-VHS) versus MPEG-4 with the same content at various bit rates from 5 Mbps upwards, to find the “yes, this really is HD” point for MPEG-4.
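(Purely as an illustration, the two proposed comparisons could be laid out as a test matrix, as in the Python sketch below; the bit-rate steps are my own, not a published test plan.)
comparisons = {
    "(a) DVD-like quality point for MPEG-4": {
        "reference": ("MPEG-2", 5.0),     # present DVD standard, Mbps
        "candidates": [("MPEG-4", r) for r in (0.75, 1.0, 1.25, 1.5, 1.75, 2.0)],
    },
    "(b) 'yes, this really is HD' point for MPEG-4": {
        "reference": ("MPEG-2", 28.2),    # D-VHS HD content, Mbps
        "candidates": [("MPEG-4", r) for r in (5.0, 7.5, 10.0, 12.5, 15.0)],
    },
}

for name, test in comparisons.items():
    ref_codec, ref_rate = test["reference"]
    print(f"{name}: reference is {ref_codec} at {ref_rate} Mbps")
    for codec, rate in test["candidates"]:
        print(f"  compare against {codec} at {rate} Mbps")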
Malcolm Hamer
New York
Editor-In-Chief Gary Reber Comments:
Malcolm, thank you for your thought-provoking letter. This is exactly the prying
that is needed to move the development of a HD version of DVD to the best
possible performance level. I hope that in future issues of Widescreen
Review we can explore some of the issues you raise and serve our readers by
expressing our desire for “the best that it can be” in a HD-DVD format that
will also be backward compatible with present-day DVD.
Big Data
Engineering & Technology Magazine, May 2013
‘Journey to the center of Big Data’ (E&T, April 2013) was an interesting account of the impact of information technology on our daily lives. However, the examples given depended very little on Big Data, but rather on data matching between traditional databases, supported by advances in data mapping and tagging, plus underpinning standards such as browser cookies.
To the extent that 'a Big Data database' is more than a marketing term, a useful definition would be 'a database about very long binary strings that, as data entities, are described by the database and, as data elements, are contained in the database'. Such binary strings can be email messages and other unstructured documents, images, sound files, or even video files. Of course, the length of such data elements necessarily means that such databases often hold a lot of bytes.
However, in understanding Big Data, it helps to remember that size isn’t
important.
Malcolm Hamer CEng MIET
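(As an aside, a minimal Python sketch of the definition given in the letter above; the field names are invented for illustration only.)
from dataclasses import dataclass, field

@dataclass
class BigDataRecord:
    # The very long binary string is both described by the database (as an entity)
    # and contained in it (as a data element). Field names are hypothetical.
    entity_id: str                                    # identifier of the described entity
    media_type: str                                   # e.g. "email", "image", "audio", "video"
    description: dict = field(default_factory=dict)   # structured metadata about the string
    payload: bytes = b""                              # the very long binary string itself

record = BigDataRecord(
    entity_id="msg-0001",
    media_type="email",
    description={"from": "alice@example.com", "subject": "Q3 figures"},
    payload=b"Received: from mail.example.com ...",   # in practice, megabytes or more
)
print(record.media_type, len(record.payload), "bytes")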
Winners in the Mobile Standards Game
Engineering & Technology Magazine, May 2017
William Webb’s Comment column, ‘Outdated strategies are the wrong approach to implementing 5G’ (March 2017) was refreshing in its frankness about the cellular standards game. Without substantial allocations of extra bandwidth to consumer mobile use, each new xG standard provides higher speeds only to the first few users active in a cell in the morning. When everyone is using the service, things are little better for individual users than with the previous xG standard. Try running a speed test at 4am then at 9:30am to see what I mean.
Without more bandwidth, the only way to really improve things is by shrinking
the cell size. But since cells must form a honeycomb grid, the carrier’s only
choice is to add sites midway between the existing ones. This means quadrupling
the number of sites, with 300% more capital investment and four times the
ongoing operational costs. With monthly fees falling, carriers simply cannot
afford to do this. So instead, they slip the latest xG electronics package into
the cell-site chassis, place a few TV ads, and let the device manufacturers do
the rest.
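(A quick back-of-the-envelope check of the cell-splitting arithmetic above, in Python; the coverage area and cell radii are arbitrary illustrative numbers.)
# Sites needed to cover a fixed area scale as 1/r^2, where r is the cell radius.
def sites_needed(area_km2, cell_radius_km):
    hex_cell_area = 2.598 * cell_radius_km ** 2    # area of a regular hexagon, (3*sqrt(3)/2)*r^2
    return area_km2 / hex_cell_area

area = 1000.0                                      # arbitrary coverage area, km^2
before = sites_needed(area, cell_radius_km=2.0)    # existing grid
after = sites_needed(area, cell_radius_km=1.0)     # new sites midway halve the radius
print(f"sites before: {before:.0f}, after: {after:.0f}, ratio: {after / before:.1f}x")
# The ratio is 4.0x: four times the sites, i.e. roughly 300% more capital investment.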
Manufacturers use creative advertising to make a pile of money selling another generation of devices, the carriers are slightly worse off, and the customer gets to enjoy the higher speeds only at 4am.
Malcolm Hamer CEng MIET
Noncoding DNA's Value
Scientific American, February 2013
In Stephen S. Hall’s interview with Ewan Birney [“Journey to the Genetic
Interior”], Birney deprecates the term “junk DNA” for sequences whose function
we do not know, but he is conservative on how much of nonprotein-coding DNA may
be functional (“between 9 and 80 percent”). The fact that the entire
genome is copied at every cell division suggests that close to 100 percent of
DNA must be functional. Had any significant portion of DNA been
nonfunctional in the past, evolutionary pressure to develop an editing-out
mechanism to increase the cell’s energy efficiency would have been tremendous.
Birney also uses the conservative term “regulation” to describe how the 98.8
percent of nonprotein-coding DNA interacts with the 1.2 percent of
protein-coding segments. It is more useful to describe the entire genome
as software: instructions for cells to build copies of themselves and assemble
cells into life-forms. In this view, the protein-coding segments are
thought of as fixed-value strings within the code.
If we could send a personal computer with Microsoft Excel and a copy of its
source code back in time to Alan Turing, it seems unlikely that on comparing the
screen output with the source code, Turing would conclude that fixed values such
as “File,” “Edit” and “View” were the essence of the software and that the other
99 percent merely “regulated” the operation of the fixed values.
Malcolm Hamer
New York City
Note: this was the third letter that I wrote to Scientific American on this topic. The first two (not published) appear immediately below.
_____________________________________________________________________
B. LETTERS THAT WERE NOT PRINTED BY THE PUBLICATION TO WHICH I SENT THEM:
The Unseen Genome: Gems among the Junk
The publication of the article “The Unseen Genome: Gems among the Junk”, by W. Wayt Gibbs (October 2003), may turn out to be a critical step in facing up to the possibility that almost the whole of every DNA strand in a cell is potentially executable “code”. The original “junk” hypothesis – that only the sections of code that form a template for protein construction represent “valid” code – was hastily arrived at, and led researchers to treat the remaining, non-protein-coding sections as uninteresting. Now the hypothesis is being tested and found to be, at least in part, wrong. Perhaps it is entirely wrong.
A useful analog for DNA might be a computer program. Suppose that we knew
absolutely nothing about a programming language, but had studied samples of code
and noticed that it contained strings of characters enclosed in quotation marks,
and that these strings could be matched with words in the program output, like
“RESULTS” and “ ACHIEVED” in:
1000 PRINT "RESULTS ";
1100 PRINT X1,X2,X3;
1200 PRINT " ACHIEVED"
1300 GOTO 700
Looking at this code, would we say that everything other than the printable
strings is “junk” that does nothing? Surely not. By analogy, we should think of
protein-defining sections of DNA as being like “the stuff in quotes”, and we
should expect the rest of the DNA to be just as important – possibly more
important. This is a much more logical conclusion. If the “junk” were really
junk, and cells were able to recognize and ignore junk when “executing” the DNA,
then cells that edit out the junk while copying the DNA would have evolved and
prospered, being more energy-efficient. Yet the “junk” is still there; so it is
probably not junk.
Of course, not all of the code in the DNA may be executed in a particular cell
at a particular point in its life. As in a computer program, we should expect
the code to contain conditional statements that cause branching, such that only
certain sections are executed in a given situation.
If researchers were to approach the task of understanding how DNA as a whole
operates, with the assumption that the majority of DNA is “potentially
executable code” that plays a vital role in cell construction (and multicellular
lifeform assembly), then they would be more likely to find out how it all works.
Malcolm Hamer
New York
Sent to Scientific American in 2004
The article “The Hidden Genetic Program”, by John S. Mattick (October 2004), is the latest in a series of articles which reveal the growing recognition that introns (also known as “junk DNA” – the large sections of DNA in between the protein-defining strings) are not, after all, “junk”. It is becoming clear that intronic DNA, through a number of mechanisms which we are only starting to observe and understand, plays a role in assembling proteins into operating structures.
Viewing DNA as something analogous to a computer program, rather than a series of protein-defining strings separated by junk, is much more likely to lead to a full understanding of how DNA works: it requires us to accept that at least some, and possibly most, of the intronic DNA consists of instructions about how to build a cell, or at least how to split one cell into two cells. Of course, not all of the intronic DNA code may be “executed” in a particular cell at a particular point in its life. As in a computer program, we should expect the intronic DNA “program” to contain conditional statements that cause branching, such that only certain sections are executed in a given situation.
One point that has not yet been covered in the articles on this topic is the need to search for two distinct classes of “program” within the intronic DNA of eukaryotes. The first class consists of programs that play a vital role in intracellular processes, particularly during cell division. The second class consists of programs that play a role in multicellular lifeform assembly. Overall, we should expect at least half of intronic DNA to be concerned with intracellular processes, rather than multicellular lifeform assembly.
Multicellular lifeforms, such as human beings, are clearly seen to be complex when studied with an optical microscope. However, the information required to assemble roughly 100 trillion cells into a human being (given a means of generating cells) is no more than, and is possibly less than, the information required to assemble about 100 trillion molecules of various compounds into a cell. (We are reminded of the remarkable complexity of cells only occasionally by molecular biologists.)
Congenital diseases, and susceptibility to other diseases, can arise from three distinct characteristics of a person’s DNA: (A) errors in protein-defining strings, (B) errors in the intracellular-process-defining intronic DNA, or (C) errors in the multicellular-lifeform-assembly intronic DNA.
Diseases related to Type (A) errors have been easy to identify. However,
in order to understand the relationships between Type (B) and Type (C) errors
and the diseases that they may cause, or make an individual susceptible to, it
is first necessary to identify which parts of intronic DNA are related to
intracellular processes, and which parts are related to multicellular lifeform
assembly.
Malcolm Hamer
New York
Sent to Widescreen Review in 2003
Dear Gary,
The article “DVI and HDMI” by Alen Koebel in Issue 69 gave an excellent review of the state of the DVI, HDMI and IEEE 1394 standards. I was particularly interested in Alen’s comment about the advantages of 1080p/24 in delivering movie frames without 3:2 pulldown – a point which I believe you have touched on in earlier issues.
Having noted that use of 1080p/24 is no more of a flicker
problem than 24fps film, and that flicker perception can be prevented by the
same technique in both cases (frame repetition), we can now look at how 1080p/24
measures up as a standard for storage and transport of content, versus
other possible formats like 1080p/30. In
the following I consider only 1080-line formats because these are, I think, true
HD. Anything less is not.
Also I consider only progressive scan.
Interlacing was a less-than-ideal expedient to eliminate flicker within
the technical constraints of 1950s analog technology and really has no place in
the processes of video display, transport, or storage today.
The storage of content (on disc or tape) and the transport of content (via cable, satellite, or broadcast) both consume scarce resources (such as space on a single disc, or bandwidth). In choosing a format for storage and transport we do not want to insert repeated frames. Repeated frames waste space or bandwidth. Of course, the basic signal will generally be compressed before storage/transport (using MPEG-2, or better still, MPEG-4, or a further development of MPEG-4). In comparing a repeated frame with the previous frame the compression algorithm will represent the second frame as a “zero difference” P-frame, which takes up a relatively small number of bytes. Nevertheless, this is still slightly wasteful. Also, using a pre-compression signal with repeated frames gets in the way of the user selecting the repetition rate himself or herself at the point of display. It makes a lot more sense to store/transport only the distinct frames that appear in the original content.
The Super Dimension-70 standard opens up the possibility that at least some content will start to be available with an actual frame rate of 48fps. In order to accommodate this we should demand that the HD-DVD standard be able to handle both 1080p/24 and 1080p/48 content, in the latter case preserving all 48 distinct frames each second. This could be accomplished by simple flags in the bitstream to indicate where 48fps is in effect. Note that using 48fps will reduce the number of minutes of content that will fit on one HD-DVD, but not halve it. The differences between adjacent frames will be smaller at 48fps than at 24fps, so the ratio of the 48fps post-compression bit rate to its 24fps equivalent will be significantly less than 2-to-1. (It would be interesting and useful to conduct tests to determine the exact ratio.)
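(A rough Python illustration of the capacity point above; the disc capacity, the 1080p/24 bit rate, and the 1.6x ratio are all assumptions on my part, pending the tests proposed in the letter.)
DISC_CAPACITY_GB = 25.0        # assumed capacity of a Blu-ray-class disc
BITRATE_24FPS_MBPS = 20.0      # assumed average post-compression bit rate for 1080p/24
ASSUMED_48FPS_RATIO = 1.6      # assumed 48fps/24fps bit-rate ratio (argued above to be below 2.0)

def minutes_on_disc(bitrate_mbps, capacity_gb=DISC_CAPACITY_GB):
    capacity_megabits = capacity_gb * 8 * 1000     # GB to megabits (decimal units)
    return capacity_megabits / bitrate_mbps / 60

print(f"1080p/24: about {minutes_on_disc(BITRATE_24FPS_MBPS):.0f} minutes per disc")
print(f"1080p/48: about {minutes_on_disc(BITRATE_24FPS_MBPS * ASSUMED_48FPS_RATIO):.0f} minutes per disc")
# Playing time falls with 48fps content, but by less than half, because adjacent frames differ less.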
Although I have referred to 1080p/24 and 1080p/48 mainly in
the context of HD-DVDs, ultimately the use of these formats might be extended to
cable and satellite transport. Their
use for HD-DVDs would probably encourage the operators to head in this
direction.
Moving on to what happens within our homes, we can consider what we, as consumers, should ask of the HD industry for HDMI interconnectivity (player-to-monitor, cable-box-to-monitor, cable-box-to-recorder, and so on). In contrast to the scarce resources used in storage and transport, bandwidth on the cables that interconnect devices has practically zero marginal cost. So, while we seek to optimize the efficiency of storage and transport (for example, using 1080p/24 for standard movies), we should seek the option to move the content between devices at any frame rate we want.
When I select, for example, 72fps, the player would simply be pumping out each frame it reads off the disc three times. The display device would be faithfully displaying each instance, without trying to do anything clever itself to manipulate the frames. Such an arrangement future-proofs my expensive display device. By contrast, if the manufacturers of display devices include signal processing circuitry which tries to take control of the display process (e.g. deciding when to repeat frames to eliminate flicker, or attempting its own interframe interpolation, or dropping repeated frames), our display devices will be less future-proof and the manufacturers will take away our ability to control our viewing experience.
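(A minimal Python sketch of the sending-end frame repetition described above; purely illustrative, with made-up frame names.)
def repeat_frames(source_frames, repeats):
    # Yield each distinct frame `repeats` times, e.g. repeats=3 turns 24fps content
    # into a 72fps output stream; the display simply shows every instance as-is.
    for frame in source_frames:
        for _ in range(repeats):
            yield frame

disc_frames = ["frame_001", "frame_002", "frame_003"]   # frames as read off the disc
print(list(repeat_frames(disc_frames, repeats=3)))      # each frame appears three times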
In summary, since most of us will spend a lot more time watching movies than watching made-for-TV content on our HD equipment, we should strongly support 1080p/24 as a basic storage and transport standard, especially on HD-DVDs (because this removes 3:2 pulldown from the end-to-end process entirely). Second, we should support an “enhanced format” of 1080p/48 to handle content that has been filmed at an actual frame rate of 48fps (or up-converted to 48fps by computer-based interframe interpolation). Third, we should not think “flicker problem” when we read “1080p/24”. Rather, we should think of flicker management as something that is handled at the time of content display. And fourth, we should ask manufacturers, as they implement HDMI, to place control of the frame rate in the hands of the sending-end device, not in the display device itself.
Malcolm Hamer
New York
Sent to Widescreen Review in 2004
Dear Gary,
I was very pleased to read your editorial about the problems
caused by discs being “naked”. I
am always horrified by the state of the DVDs that I rent, even ones that have
been in circulation for only a couple of weeks.
What on earth do people do with their rented DVDs?
They are almost always covered in scratches, finger marks, and other
gunk. I generally have to wipe them
thoroughly before they will play properly.
The naked design of the audio CD, in 1982, went against two
decades of development in consumer media, during which cassettization became a
universal approach to protecting the information-bearing surface (audio
cassette, mini-audio cassette, floppy disks, VHS, Video 8, MiniDisc, and so on).
The designers of the CD ignored this trend, probably because of a desire
to minimize the cost of CD manufacture, knowing that the error-correcting code
would mask read errors due to dirt and minor scratches.
Unfortunately, this decision set the scene for VCDs, and then DVDs, to be
naked also. In 1993, to make it
possible to use CD-production plant and drives for VCDs, the designers of VCD
technology stuck with the naked disc, in spite of the likelihood of a
significant rental market for VCDs and the consequent less-careful handling that
VCDs would be subjected to. During
the next three years the designers of the DVD settled on a naked disc,
presumably because of their desire to make DVD players able to play CDs and VCDs
(and possibly because they thought that consumers would expect a DVD to look
like a CD).
From the remarkable and largely unexpected success of the
DVD, it is clear that consumers will gladly pay for quality and convenience.
It is also clear that, contrary to initial fears, nobody really minds
having two, or even three, separate video-playing boxes in their video-viewing
room (a DVD player, a VCR, and a TiVo). Had
the designers of the DVD known what a runaway success the DVD would become, they
would have been much less timid about breaking away from the physical design of
the CD.
The opportunity now exists to make the right choice for the
design of the HD DVD, given that (a) the much higher information density on HD
DVDs will make them even more vulnerable to the effects of scratches and dirt,
and (b) consumers will probably be happy to add a fourth video-playing box to
their pile of boxes if it delivers high quality and greater convenience.
(An HD DVD design that places the disc in a plastic enclosure will
certainly be much easier to handle than a naked disc.)
I, for one, would appreciate the convenience of not having to
wipe the surfaces of rented HD DVDs before playing them; and I am certain that
the rental companies would welcome much lower damaged-disc losses.
Malcolm Hamer
New York
Sent to Business Week in 2003
Several U.S. corporations have had spectacular successes in the last thirty years as global leaders in technology, establishing standards that have been adopted worldwide (for example, IBM, Microsoft, and Cisco). However, U.S. corporations can never be expected to score 10 out of 10. Yet Business Week sometimes seems to shy away from mentioning cases where the U.S. is a technology follower and not a leader. One such case is mobile telephony. The global standard today is GSM and has been for almost a decade. The U.S. cellular service providers and equipment builders made a huge mistake in deciding not to join the “GSM club” early on.
GSM is more than just one of several options for the design of the radio transmission components of a mobile phone. Adoption of GSM by a network provider means that the provider can join the “global GSM club”, so that the provider’s customers can roam internationally in other GSM countries and overseas visitors can roam via the provider’s network. Outbound and inbound roaming bring effortless incremental revenues to the provider in both cases, especially the latter. Joining the global GSM club also means that the provider’s customers can exchange SMS messages with any GSM network subscriber globally, regardless of whether the sender or recipient is in his or her home country or is roaming in another country.
A large number of BW’s readers travel internationally from the U.S., as I do, or live in GSM countries. They must be as puzzled as I am when they read, almost weekly, in BW the phrase “Europe’s GSM standard” (most recently in “Go East, Young Chipmaker”, December 30/January 6 issue). Your message to readers seems to be that there are many regional and local standards and it is therefore quite understandable that the U.S. providers have their own standards.
GSM is not “Europe’s standard”. It is the most important global standard, adopted from the early 1990s in all major markets except the U.S. and Japan. It does not matter if theoretically more efficient standards exist. Like VHS (versus Betamax), it is where the world is at. Get over the fact that the U.S. was not the global leader on this one. In future, write “GSM, the global standard that the U.S. failed, initially, to adopt”.
In spite of the initial failure of most U.S. providers to recognize the importance of GSM, at least one brave provider saw GSM’s potential – VoiceStream (T-Mobile). VoiceStream, as most users still know it, built the first national GSM network in the U.S., operating in the 1900 MHz band (presumably because the 900 and 1800 MHz bands were already used for other purposes). They then persuaded the major phone manufacturers, such as Ericsson and Nokia, to add the 1900 MHz band to many of their models to create “tri-band” phones that could roam between the U.S. and the rest of the GSM world. Once these phones hit the market around the world, VoiceStream effortlessly tapped into lucrative international roaming revenue, as overseas visitors switched on their phones on arrival in the U.S., locked on to VoiceStream, and started making domestic and international calls.
AT&T Wireless has now seen the wisdom of VoiceStream’s strategy and has embarked upon a rapid roll-out of a national GSM network, providing its customers with promotional discounts to migrate from TDMA to GSM. This is a bold and visionary move which deserves praise and mention in BW. And please do not forget to say that they are migrating to the global GSM standard.
Malcolm Hamer
New York