by Jennifer Dunnam
From the Editor: Few things upset people as much as trying to change the things they love. Change the logo of the National Federation of the Blind, and a chorus of voices shouts that the older one was better. Change the convention schedule, and we are absolutely certain that the convention will never be the same. Tear down one of our buildings at the 1800 Johnson Street location to make way for the Jernigan Institute at 200 East Wells Street, and some are saddened by the loss of a smokestack.
But none of these outcries compares with the emotions that arise at the prospect of changing Braille. It is all too easy to say “If it ain’t broke, don’t fix it,” or “Leave my Braille alone,” but, as folksy and comfortable as these heartfelt statements are, it has become clear over the years that print is changing, and, if Braille is truly the closest tactile equivalent, the code blind people use can no more be set in stone than the visual representation it purports to express accurately.
Jennifer Dunnam represents the National Federation of the Blind on the board of the Braille Authority of North America. She helped to draft our resolution in 2002, but she has concluded that a decade of changing software and hardware has forced her to reevaluate the recommendations she would make to us about our primary means of reading and writing. Here is what she says:
Since 2004 it has been my honor to serve as the National Federation of the Blind's representative on the board of the Braille Authority of North America (BANA). In this capacity it is my responsibility to express and support the views of our organization to the best of my ability. It is also my responsibility to do all I can to ensure that members of the organization have full and complete information with which to formulate their views.
The three-part article by BANA entitled "The Evolution of Braille: Can the Past Help Plan the Future?", published in the May, October, and December 2011 issues of the Braille Monitor, communicated a great deal of information about the changing nature of print and Braille. I urge readers to read that essential information in order to understand the larger context of where things stand today regarding Braille codes in the U.S. In what follows, speaking only for myself, I will describe the process by which I am coming to understand these issues as well as the conclusions I have reached. Before I do so, I should address my background—not to hold myself out as an expert with all the answers, but to explain what informs my opinions.
First and foremost for this discussion, I am a Braille user. I began learning Braille in kindergarten, and I was fortunate to have all the books I needed through high school transcribed into Braille on paper. All of the books and tests for my mathematics classes were transcribed in the Nemeth code, and, although I was not a math genius, I was a competent math student. In high school I took basic computer programming classes in which the books were transcribed in computer code. I also studied music and foreign languages using Braille. During college, however, in the late 1980s, the stream of Braille dried up. None of my textbooks were available in Braille, and most of the Braille I read was of my own creation (generally the copious notes I took using a slate and stylus). In time, as Braille translation software, refreshable Braille displays, and sources of downloadable Braille books became more common, like many other Braille readers I have gained more access to Braille than I could have dreamed of during college. These days most of my Braille is in refreshable rather than paper format (although at times nothing beats full Braille pages on paper). In my job and in the rest of my life, I read and write Braille practically all day, every day, for all sorts of purposes. With a refreshable Braille display and a mobile phone, for example, I can have the previously unimaginable experience of reading the daily newspapers in Braille during my commute to work. I see firsthand the advantages and the limitations of machine-generated Braille translation and backtranslation.
Next, I am a Braille transcriber. Some ask how it is possible to be a transcriber if one is unable to see standard print. I would be glad to elaborate more on that another time, but for purposes of this discussion, suffice it to say that in addition to the other rigorous training needed to become a good transcriber, a blind transcriber must be skilled in the use of many different tools for discovering the exact content and format of the print (including pictures) and must clearly understand the limitations of every tool in order to use all of them to best effect. In the early 90s, perhaps as a reaction to the dearth of Braille available to me during college, I became very interested in learning to use Braille translation software (which is one of the tools a transcriber may use), and over the last twenty years I have become experienced in working with several different translation programs. I was also certified as a literary Braille transcriber by the Library of Congress and have detailed familiarity with the formats required for producing textbooks. For almost ten years I transcribed and taught others to transcribe materials into Braille for the University of Minnesota. That work familiarized me with the complex nature of today's textbooks and the problems of transferring the print contents onto a Braille page in a way that gives the blind reader the same information that the sighted reader gets. It also exposed me to the advantages and challenges of working with publishers' files as opposed to print on paper. In my current job I coordinate courses for those seeking certification as Braille transcribers and proofreaders.
I have also taught Braille to blind students. For four years I worked as the Braille instructor at BLIND, Inc., teaching Braille to adults—some who were newly blinded and some who did not receive the Braille instruction they should have received as blind children. I learned much from working with these students through their struggles and successes.
Since the late 90s I have followed the discussions about the unification of the Braille codes with interest. I was not part of BANA at the time, but I read everything I could get my hands on about the origins and progress of the development of a unified code and the controversy surrounding it. Keeping an open mind, I attended the workshops, evaluated the samplers, and talked to as many people as I could who knew something about the issues. Ultimately I reached the conclusion that, although a code bringing together the literary, math, and computer codes was a good idea in theory, the benefits of doing so were not as great as the problems it would cause. I therefore agreed wholeheartedly with the resolutions we passed in 2002 opposing any drastic changes to Braille.
Since that time more access to refreshable Braille, the changes in communication technology for everyone, further developments in Braille unification efforts, and my involvement with the work of BANA have caused me to revise my view. During the late 90s and into the early 2000s, many of us had little experience with reading dynamically generated content in refreshable Braille. The Braille notetakers at the time were only just beginning to include mainstream connectivity features, so most of what we read with these devices, if we had access at all, was content we had written ourselves or which had been created specifically for use in Braille. Fast forward ten years—it is now possible for us to read the screens of some mainstream mobile devices in Braille, and we can type in contracted Braille on these same mainstream mobile devices and computers. What we type using six keys is no longer just for ourselves to read: Braille users of any age can email it, text it, or even type it directly into a document shared with a sighted colleague. These technological developments have tremendous potential to boost support for the Braille literacy of blind children and to increase the utility of Braille for all of us. It is more apparent to me now that some changes to our Braille codes would help us realize that potential more fully.
Knowledgeable Braille transcribers are essential, but much of the Braille available today is not produced by transcribers. Teachers of blind students must often spend significant time preparing Braille for their students. Any number of people who know how to operate a computer but are not trained in Braille are called upon to prepare Braille materials.
Although the words, numbers, and punctuation in an electronic book or other document may look perfect when reviewed in print format, many errors are usually introduced when the same document is electronically translated into Braille. If a person is preparing the document, he or she must manipulate certain details to eliminate these errors. In reading contracted Braille on a refreshable Braille display, such as when reading an eBook downloaded from a mainstream bookseller or even browsing the web, human intervention is not part of the equation, so these same errors find their way right to our fingertips. For example, without human intervention, email and web addresses usually do not display in computer code, so it can sometimes be unclear which characters are intended. If a symbol does not exist in the current literary code, like the bullet or the "greater-than" symbol, that symbol is either written in words, skipped entirely, or displayed as a random unrelated Braille symbol. Dashes often show up as hyphens. The indicators required by Braille rules to show footnotes and endnotes do not appear, and the usual print superscripting of the numbers is ignored, so the numbers show up at the ends of sentences without spaces. The current method used to deal with punctuation occurring in the middle of words creates some ambiguity for the reader about the actual symbol intended.
I could go on and on with these examples. It is true that experienced Braille users can figure these things out and work around them (often by relying on speech output to clear things up). The work-arounds pose more problems for children, whose educational materials and other communication are increasingly on the web or in other electronic formats that reach the student without the intervention of a trained transcriber. A teacher who is not also blind often does not spend much time reading directly from a Braille display and may not be aware just how often errors and ambiguities appear in on-the-fly translation. Frequent exposure to such errors can undermine the process of learning correct Braille and of learning the material being read in Braille. Some changes to Braille itself would reduce the time and effort needed by people preparing Braille and by Braille readers themselves to deal with these fussy code details. Simply tweaking the current codes here and there would only introduce different errors, and the problems would grow worse, as they have over the last twenty years.
Problems in the current literary code--which, by the way, is officially called English Braille, American Edition (EBAE)--become even more apparent when a Braille user types Braille into a document and the Braille is backtranslated to print. Two categories of problems contribute here. First, even if the user follows the Braille code rules correctly, errors in backtranslation may still occur because of ambiguities in the code. Will the "dot 4 e" I typed backtranslate as an accented e of some sort or as a euro symbol? If I type a word like "FanNation," will the "dot 6 n" backtranslate as the intended capital N or as "ation"? What about "k4"--will it come out as "kbled"? If I want to send a text mentioning the performer Will.I.Am, how do I avoid having my text say "WillddIddAmdd" or even just "WddIddAmdd"? The developers of backtranslation software work hard to keep their programs as accurate as possible, but, because of the state of the current code, they often have to create a programming exception for each new brand name or word, since there is no systematic way to handle the problems. Yes, one can go through certain contortions to ensure that these and similar items turn out right in backtranslation, but one must first be aware of a possible problem and must be familiar with the work-arounds. One could also just do the typing on a QWERTY keyboard. However, writing is as important for developing literacy as reading is. Braille-reading children need to type much of their schoolwork in Braille while at the same time letting their teachers or peers read it in print. The potential exists for anyone to work entirely in Braille while communicating with non-Braille-readers; it would go much better if we eliminated unnecessary sources of error.
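For readers who work with software, the exception-list problem can be illustrated with a small sketch. This is not real translation software, and the notation is invented for the illustration ("6n" here stands for the dot 6 prefix followed by the letter n, and the word list is hypothetical), but the mechanism mirrors the per-word exceptions described above: because one cell sequence has two legitimate meanings, the program must fall back on a hand-maintained list of special cases.

```python
# A toy illustration (not real translation software) of why the
# "dot 6, n" ambiguity in EBAE forces per-word exception lists.
# "6n" is an invented shorthand for the dot 6 prefix plus the letter n.

# Default rule: "6n" backtranslates as the "ation" contraction.
DEFAULT_RULES = {"6n": "ation"}

# Exception list: words where "6n" must instead mean a capital N.
# This entry is a hypothetical example of the kind of special case
# the article says developers must add one by one.
EXCEPTIONS = {"fan6n": "FanNation"}

def backtranslate(word_cells):
    """Backtranslate one 'word' of cells, checking exceptions first."""
    if word_cells in EXCEPTIONS:
        return EXCEPTIONS[word_cells]
    out = word_cells
    for cells, text in DEFAULT_RULES.items():
        out = out.replace(cells, text)
    return out

print(backtranslate("fan6n"))   # exception applies: "FanNation"
print(backtranslate("cre6n"))   # default rule gives "creation"
print(backtranslate("6nelly"))  # intended "Nelly" comes out "ationelly"
```

The last line shows the failure mode: a name not yet in the exception list is silently mistranslated, which is why no amount of per-word patching can substitute for removing the ambiguity from the code itself.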
The second category of backtranslation trouble occurs if the user does not know and apply the rules of the code perfectly. When we wrote Braille primarily for our own use, exact observance of the rules mattered much less. Now the user must have a better grasp of the rules and the exceptions. I have been involved in the judging of a number of Braille contests in which fluent Braille users put forth their best work to try to win the prizes. Although these Braille users obviously knew Braille well, in a surprising number of instances they did not follow the rules, and, if the work had been backtranslated, errors would have resulted. For instance, confusion sometimes arises over spacing between the words "in" and "the." In correct Braille, there is no space between the words "into" and "the," and no space is left between words such as "and" and "the." In a number of cases Braille readers omit the space between the words "in" and "the," which is not permitted; a backtranslation would therefore also omit the space in print. For another example, certain contractions, such as the one for "con," are permitted to be used only at the beginning of words—with a few exceptions. Some Braille users, observing that the "con" contraction can be used within "O'Connor," sometimes use the "con" contraction after a prefix, such as in "inconvenient," yielding "inccvenient" in backtranslation. Again these are relatively small matters by themselves, but, if these things occur with fluent Braille readers, how much more must they occur for people still developing their Braille skills and working to apply what they are learning?
Some say that, if the Braille codes were made simpler, that might help teachers learn Braille better and be more inclined to teach it. I am fairly skeptical that code complexity has been a major barrier to Braille education. The negative attitudes about Braille in the education system (and in society in general) run much deeper. Of course I would love to be proven wrong in this belief and to see code changes cause more children to learn Braille, but I do not think we have enough evidence to assert improved Braille education as a reason to make code changes. However, some simplification of rules would be helpful to anyone who needs to write Braille that will be read in print.
Earlier I mentioned that some common print symbols currently have no representation in literary Braille. It seems baffling that, after all this time, we still have no consistent way to represent the + sign in EBAE. The reason is that the addition of any of the acceptable possibilities for this and other such symbols into the code as it currently exists would simply increase the conflicts and backtranslation problems we have just been discussing. For decades the BANA committee charged with updating the literary code—made up of Braille readers, teachers, and Braille producers with expertise in code development—has put in enormous amounts of time working to ensure that the literary code is adequate to express today's literary material without creating more conflicts within the literary code or with other codes currently in use. Since being more involved in BANA, I have been able to observe the struggles more closely. Their task is tremendous, and the current state of affairs remains unresolved, not because of the lack of effort and expertise, but because we have a code right now that is in a state much like a Scrabble board at the end of a game in which few if any openings are available to fit in new words.
While working on the literary code and various other projects such as tactile graphics guidelines, BANA has also continued to observe the development of Unified English Braille (UEB) as well as of the Nemeth Uniform Braille System (NUBS). Please see the BANA article mentioned earlier for more on the origins and history of these efforts. During the last ten years Braille readers from around the world have worked to make refinements and improvements to UEB. For example, the technical sampler that was distributed in the U.S. in the early 2000s is now out of date because of updates and improvements. On the website of the International Council on English Braille, the rule book for the code and guidelines for presentation of technical material are available for download by anyone. Although these publications are full of illustrative examples, remember that they are books about rules and are therefore not particularly compelling reading—just as is the case with our current EBAE rule book.
The basic characteristics of UEB were discussed in the BANA article. Another important feature, the misunderstanding of which has sometimes given people a negative impression of UEB, is its ability to show different typeforms, such as boldface, underlining, and the like. A common misconception is that these indicators would appear in everything, creating much distracting clutter. In fact, the UEB rules, like our current rules, call for most typeface indication in print to be ignored in Braille and used only when needed for emphasis and distinction. The different types of emphasis are needed for transcribing textbooks and other specific materials. These typeform indicators are also present in NUBS, with similar restrictions.
UEB is also capable of handling many types of technical material, even at the advanced level. The representation of these materials is, however, very different from what we use in the U.S. When the work on the Nemeth Uniform Braille System was essentially completed two years ago, I eagerly studied the sampler and the rule book, hoping that NUBS would be the answer we had been seeking—a way to minimize the difficulties with the current literary code while preserving our tried-and-true system for working with math and science material. The basic features of NUBS are also discussed in the BANA article, and I will not repeat all of them here. NUBS is not simply our current literary Braille with lower numbers. It is my view that, although NUBS takes a systematic approach to addressing the problems and would offer some real benefits for working with technical material, it introduces some possible difficulties into material Brailled for everyday use.
As discussed in the BANA article, most Braille codes, including our current ones, use "modes," in which a given Braille character has different meanings depending on which mode is in use. For the most part the everyday Braille reader need not pay much attention to modes because their application is quite intuitive. NUBS has two modes: narrative for normal literary material and notational for numeric and technical material. Punctuation is different in the two modes—the comma, period, and colon are completely different, and the other marks of punctuation require an extra indicator in notational mode. The Braille learner must grasp the concept of these two modes right from the beginning, because notational mode is not used just for math and technical material. It is used anytime a number is present—in a numbered list of spelling words, in references to time of day or money, in the mention of a year. It is also used in some cases where no numbers are present—email addresses and middle initials, among others. The Braille user must be mindful of the two modes when writing, or backtranslation errors will creep in. Having two sets of punctuation throughout all texts creates more complexity for everyone. Additionally, NUBS uses a method for indicating accented letters that requires a special symbols page in order to give the reader information about the accents while avoiding a great deal of clutter in the text. The code therefore misses an opportunity to improve the experience of students learning foreign languages and to give the general reader better information about the accented letters used in English.
As we look for the right path forward for Braille readers in the United States, we need a solution that addresses the problems of the current codes as much as possible while providing the agility and precision needed to represent both technical and non-technical material. None of the challenges with the current codes discussed above may seem like a major issue by itself, but together they make it clear that some change is necessary. The conversation about code change has been going on for more than twenty years now, while mainstream technology, speech-based solutions, and print in general continue to change and grow. We all want and need for Braille to remain a vital part of this equation, and, if we can lay the right infrastructure, it will do so. We need a balance between universality and flexibility in the Braille codes, and we need a solution that moves us to a better state of affairs than currently exists. We must try to improve what can be improved without disrupting what need not be disrupted.
Given all the complex issues to consider in this decision, the path that seems the most reasonable to me would encompass these three elements: 1) adopt Unified English Braille to replace English Braille, American Edition and the computer Braille code as the standard for general-purpose materials; 2) maintain the current Nemeth Code for use in mathematical and technical material; and 3) develop a gradual implementation plan involving a minimum of disruption to the education of blind children, taking into account the needs of Braille users of all ages and walks of life and providing clear guidance to Braille producers and teachers about when to use which code.
It may be tempting to reject this solution out of hand because it seems to undermine most of the original goals for unifying the codes. It may be easy to think of reasons why it would not work. Yet, imperfect as it may be, I think it worth careful consideration. Even with two separate codes, if the general-purpose code were optimized to meet the demands of today's "general purposes," and if the code optimized for math and science were maintained, we would be in a far better position than we are now. It would allow time and space to learn more about the things we do not know while fixing some known problems and maintaining things that are known to work well. This strategy would allow us to develop and use every tool to support blind students and professionals in STEM fields.
Lest it seem that the current Nemeth Code would be left behind in the digital age, note that at least one Nemeth backtranslator is now in use. Also, regardless of the code used for technical materials, transcribers are still very much needed. Our technology and our work with publishers' files have simply not advanced enough to eliminate the need. The educational materials produced for children must be accurate. We must still push to get school districts to understand the importance of certified transcribers.
I will not try to propose here what the implementation plan should look like, but the input of teachers, parents, transcribers, Braille readers, and others with an interest in Braille is needed. For a transcriber who has been trained in all the Braille rules and in how to manipulate those dots on the computer screen to get them into print, the perception of these issues will be vastly different from that of a student working with a Braille display, trying to type her assignment for French class so that the teacher can read it, or even just trying to type an email address into her mobile phone. The perception will be different for a person who reads mostly books and magazines from the library, does not have a refreshable Braille display, and writes most Braille on paper or labeling tape. Parents and teachers of a child in a school district with plenty of access to Braille transcribers may experience these issues differently from those of a child in a district where resources are few. Yet all of these perspectives are very important in making the decisions and crafting the implementation plan.
Let us not allow fear of change to hold us back but rather let us work together to use our energy to move Braille forward so that it remains an integral part of the work and lives of blind people for generations to come.