NRSI Update #11 – January 2000

Lorna A. Priest (ed.), 2000-01-01


Introducing Alan Conner

by Alan Conner

Nearly five years ago, on an assignment for Hewlett-Packard in Germany, I sat wearing headphones in a tri-lingual setting, struggling to understand as a presentation was shared and concurrently translated into English. It affected me deeply; I had no idea cross-cultural communication was so difficult. Unknowingly, that moment changed my direction toward working with SIL International and the NRSI.

I was born and grew up (though not taller) in the northwestern United States. Twenty-one years ago I met and married my wife Cyndi; together we have four boys: Ryan (19), Wes (17), Kurt (12) and Trenton (9). After more than 20 years in Boise, Idaho, the family moved to the Dallas area this past June to begin our first assignment with SIL International. We have purchased a home here and are all adjusting, slowly, to the climate and culture.

I graduated with an Advertising Design major from Utah State University in 1975, where I developed a love for letterforms, logos and visual communications. My professional career started at a design studio in San Diego, California, proceeded to an advertising agency in Boise, Idaho, and then continued for 18 years with the Hewlett-Packard Company (HP). At HP I had the privilege of being part of the advent of laser printing and fonts as we know (and love?) them today. I even had opportunities to “flip bits” on early 180- and 300-dpi characters long before Fontographer-like tools. I worked as a graphics designer, managed designers/typesetters and, for the last nine years there, served in laser-printer marketing communications and program management. It was interesting to have a front row seat in the transition to desktop typography.

After four years of preparation, my family and I are very happy to finally be here. We look forward to meeting and working with each of you associated with the NRSI team.

Introducing Dennis Drescher

by Dennis Drescher

My name is Dennis Drescher. My wife Nancy and daughter Leila (age 3) are settling into our new roles here in Dallas. I came to SIL as an auto mechanic with over 10 years' experience. Our first term of service with SIL, which started in 1990, was in Togo, West Africa. My original training and assignment were in construction and maintenance. I did some management, but the bulk of my first term revolved around administrative support. Because of various factors beyond the control of the branch, I was never able to establish an auto repair shop as originally planned.

On our first furlough I decided that I should get some training in Scripture publishing, as this was an area of great interest. I attended the International Publishing Services’ (IPub) DTP training course while we were here in Dallas. As we were in the midst of the adoption of our daughter Leila, our furlough was extended. We stayed in Dallas during that time and I was on loan to the NRSI.

Since then we have returned to complete our assignment with the Togo/Benin Branch and have now been reassigned to serve in the NRSI. My title is Publishing Systems Developer. I am currently engaged in two areas: first, establishing corporate-wide goals and strategies concerning Scripture publishing in light of Vision 2025; second, researching XML (eXtensible Markup Language) and its potential for use as a markup language for Scripture text publishing.

Introducing Lorna Priest

by Lorna Priest

In 1985 I joined SIL and took the first semester of SIL training at Norman, Oklahoma in 1986. I spent the first nine months of 1987 in IPub, training to be a typesetter, and I typeset two books during that time. I moved to Horsleys Green, England in September of that year and spent two years working in SIL’s typesetting office there. During that time I typeset a variety of materials from West Africa and also developed a bitmap font for a book in Mongolia’s modified Cyrillic alphabet (that was in the days before NRSI existed or I would never have attempted it!). Also during that time, we switched from using Scriptset for our publishing to Ventura Publisher. In August of 1989 I moved to Kenya and have been a member of the Eastern Africa Group since then. One of the challenges I enjoyed during those years was the typesetting of books from Ethiopia. While in the US in 1992 I helped out in the training section here at IPub and discovered that was something I enjoyed. A few years later (back in Kenya) I decided I needed a break from typesetting and so in 1996–1997 I taught computer courses to Kenyans, other Africans, and SIL members. This included Microsoft Office programs for office workers and SIL programs for linguists. I was the computer department manager during FY 1998.

About four years ago I became ill and was eventually diagnosed with Chronic Fatigue Syndrome. I was unable to get the help I needed in Kenya, and so returned to the US for an extended period to see if a doctor in this country could help me. I will be working part-time with time out for medical tests as necessary.

I have been asked to work with Dennis Drescher in developing a Non-Roman Script Publishing system. At the moment I am focussing on developing expertise in TeX and working towards being able to typeset Ethiopic material in TeX. I look forward to working in the NRSI with this great group of people.

New Tai Lue fonts released in public beta

by Victor Gaultney and Peter Constable

We are pleased to announce the public beta release of The SIL Xishuang Banna Fonts.

The SIL Xishuang Banna Fonts are a new rendering of the New Tai Lue (Xishuangbanna Dai) script. Two font families, differing only in weight, allow for a wide range of uses. The fonts are available for both Windows and Macintosh systems and include keyboard definitions.

The New Tai Lue script is used by approximately 700,000 people who speak the Tai Lue language in Yunnan, China. It is a simplification of the original Lanna script as used for the Tai Lue language for hundreds of years.

Although this is a beta release it contains all the features and components that will be in the future release. The only changes between releases will be the font names, minor design enhancements and bug fixes. Also, the name “Xishuang Banna” is strictly temporary and will be changed in the final release. Comments regarding the fonts and suggestions for names are welcome. We also invite feedback regarding bugs.

For additional information or to download the SIL Xishuang Banna Fonts package, see the SIL website.

We wish to thank the Chinese Academy of Social Sciences Institute for Nationality Studies, Beijing, for their assistance in developing this font package.

Some Recent Font and Rendering Technology Developments

by Peter Constable

In recent months, there have been some technology developments that are of interest to us:

  • In September, Apple and Microsoft developed extensions to the TrueType font specification to provide a mechanism for mapping Unicode characters defined by surrogate pairs to glyphs in the font. This is a key step in permitting use of all 1,000,000+ possible characters in Unicode. (Fonts are still limited to < 64K glyphs, however.)
  • Adobe has made available a new set of tools for font developers. (These tools focus on Postscript Type 1 format, however.)
  • Microsoft has made available to developers an evaluation version of its OpenType Services Library. They have also released a set of libraries that can be used by font tool developers; included is support for OpenType.
  • Microsoft has made available a beta of their Visual OpenType Layout Tool (VOLT), which can be used for visually creating and adding OpenType font tables to a TrueType font.
  • Microsoft has defined additional bits for the UnicodeRange field in TrueType fonts to support all scripts in Unicode 3.0.
  • Microsoft is continuing to work on adding support in Uniscribe/OpenType for additional scripts, and to enhance support for existing scripts. At least some of these should appear in Windows 2000. (The behaviour is still, to some extent, hardwired, and based upon the needs of major languages, however, and so still falls short of what is needed for minority languages.)
  • Microsoft Office 2000 is able to make use of Uniscribe/OpenType if these are installed on a machine. Uniscribe doesn’t come with Office 2000, but it can be added as an option when installing Internet Explorer 5 (see Errata). What this means is that it is not necessary to have international versions of Windows in order to get rendering support for scripts such as Arabic and Thai in apps like Word. (Note, however, that this does not provide any mechanisms for input—that has to be dealt with separately.)

Update on SDF-related technologies

by Tim Erickson

RENDER (current version 3.9) The SDF-based Rendering Engine

  • Handles left-to-right, right-to-left and mixed-direction scripts (i.e. those that use right-to-left direction for most text but left-to-right direction for numeric sequences).
  • Supports either ANSI/ASCII or UNICODE output.
  • Supports 16-bit or 32-bit operating systems.
  • Supports 2-form and 4-form contextualization, as well as word-final, word-initial and word-isolate forms.
  • Supports environment constraints.
  • Supports character classes in environment constraints.
  • Supports substitutions, including ligatures; supports reordering (when defined through sequence-to-sequence substitution).
  • Supports kerning.
  • Supports mouse clicks, caret positioning and text selection.
  • Registers a simple EDIT control with automatic rendering support for dialog boxes, etc.
  • High-level API includes functions for built-in “Insert character” and “Rendering options” dialogs.
  • Supports character literals, decimal codes and UNICODE hex values in script descriptions.
  • Supports custom line-break characters other than space and hyphen.
  • Optimized to run ten times faster than early versions.
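The 4-form contextualization listed above can be illustrated with a small sketch. This is not RENDER's actual algorithm or data (the joining class and form names here are hypothetical), but it shows the general idea: a renderer picks the isolate, initial, medial or final form of a glyph by looking at whether the neighbouring characters join to it.

```python
# Hedged sketch of 4-form contextual glyph selection. The JOINING set
# and form names are invented for illustration; RENDER's real behaviour
# is driven by its SDF script description.

JOINING = set("bmt")  # hypothetical letters that join on both sides

def contextual_forms(word):
    """Assign one of four positional forms to each character."""
    forms = []
    for i, ch in enumerate(word):
        if ch not in JOINING:
            forms.append("isolate")
            continue
        joins_prev = i > 0 and word[i - 1] in JOINING
        joins_next = i + 1 < len(word) and word[i + 1] in JOINING
        if joins_prev and joins_next:
            forms.append("medial")
        elif joins_next:
            forms.append("initial")
        elif joins_prev:
            forms.append("final")
        else:
            forms.append("isolate")
    return forms

print(contextual_forms("bmt"))  # ['initial', 'medial', 'final']
```

A 2-form script would collapse this to two cases (joined vs. unjoined); word-initial, word-final and word-isolate forms fall out of the same neighbour test.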

SDFE (current version 1.3) The SDF Editor

  • Simple, WYSIWYG editor for script descriptions.
  • French or English interface
  • Supports RENDER features except UNICODE output.
  • Integrated testbed for quick feedback on changes to SDFs.
  • Extensive online help

RTL for Windows.


  • enables keyboarding rendered text into any Windows program
  • automatic support for LTR, RTL or mixed-direction scripts
  • supports all RENDER features except UNICODE output

ScriptPad (current beta test version 0.19) Word processor

  • full-featured word processor
  • graphics, OLE objects, text boxes, lines and boxes supported
  • no fee / distribute freely
  • novice mode (greatly simplified interface)
  • advanced user mode (powerful feature set)
  • Spanish, French or English interface
  • supports all RENDER features, including UNICODE
  • saves files in Rich Text Format or as ASCII text
  • imports Standard Format text, converting fields to styles

Shoebox (current version 4.1) Database editor.


  • advanced linguistic/anthrop. database editor with extensive feature set
  • supports all RENDER features except UNICODE output
  • supports rendered export to Microsoft Word

LinguaLinks (current version 3.5, version 4.0 to be released by end January) Database editor.



  • can be customized to support any RENDER script, except those that use UNICODE output; see LinguaLinks team for assistance.

Paratext (current version 5.0) Scripture editing tool

  • The goal is that by the end of 2000, version 6.0 should fully support all RENDER features.

Eventual integration also planned with CARLA Studio and Santa Fe.

15th International Unicode Conference

by Peter Constable (with contributions from John Thomson and Dan Edwards)

The 15th International Unicode Conference was held in San Jose, California, August 30–September 2, 1999. Attending from SIL were John Thomson (Academic Computing), Dan Edwards (CCG) and I. As usual, there were two days of tutorial sessions followed by two days of conference, and as usual there were both plenary and parallel sessions (three separate sessions going concurrently most of the time). John, Dan and I tried to split up most of the time so as to cover as much as possible, though there were some sessions that two or all three of us wanted to attend, so a few sessions went uncovered.

Unicode 3.0

Version 3.0 of Unicode was completed last summer. (Publication of the new version of the standard is expected by the end of February, 2000.) There were sessions presenting what is new in version 3.0. Of particular interest, Unicode now includes support for several new scripts (though not always the entire repertoires used by minority languages), including Ethiopic, Myanmar, Khmer, Canadian Syllabics (used for Algonkian & Inuit languages), Cherokee, Syriac, Sinhala, Thaana, Mongolian, Ogham, Runic, Braille and Yi. There were also presentations on upcoming developments, including the addition of another 40,000-odd Han Chinese characters (to be added to “plane 2”—likely to be finalised within the next year).

Font Technologies

There were various sessions at which font technologies came into focus. In each of these sessions, and at other points in the conference, there were themes I kept hearing repeated that struck me as being the issues most concerning the font vendors and developers of font technology; they all had to do with providing fonts for the variety of scripts in Unicode, and for the large inventory of characters in Unicode. There’s one thing that many clients seem to be wanting: fonts that cover the full range of the latest version of Unicode. I heard that repeated many times. There’s also one thing that the font providers are really trying to avoid: fonts that cover the full range of Unicode.

People are wanting fonts to cover all of Unicode either because they want to market products that can be marketed globally, or because they are having to work in many places around the world. They see single fonts that cover all of Unicode as being the easiest solution. There are various reasons why the font vendors want to avoid this, however:

  • Fonts that cover all of Unicode are huge (Arial Unicode MS, which covers Unicode 2.1, is about 23MB) and severely tax OS resources, resulting in poor performance (I can confirm that to be true).
  • It is impossible to create single designs that cover different scripts well. For example, it was pointed out in a session discussing CJK that not only are there differences in how Han characters are structured for Simplified Chinese, Traditional Chinese, Japanese and Korean, but there are differences in design style preferences for each of these contexts. Bigelow & Holmes discussed the difficulties of creating single designs that worked well for two different scripts, let alone dozens of scripts.
  • It is definitely the exception for any single user to need glyphs for anything but a small portion of Unicode.

During the conference, I learned of some font technology developments, most notably that there is a new extension to the TrueType spec that allows for 32-bit cmap tables. This has already been implemented by Microsoft. What this provides is a mechanism for handling surrogate characters (the potential 1,000,000+ characters that lie beyond the basic set of 65K). Interestingly, though, TrueType fonts can still only handle up to 65K glyphs. (So it’s possible to refer to 1,000,000+ characters in a font, but it’s only possible to have glyphs for up to 65K.) What’s more, there are no plans to change this. The reasons are that it would take major redesign of the font format (16-bit glyph IDs are assumed throughout the many different tables in a TTF), and the concerns regarding huge fonts.
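The surrogate mechanism itself is simple arithmetic, and a short sketch (my own, not from the conference material) makes it concrete: a code point above U+FFFF is split into a high surrogate in the range D800 to DBFF and a low surrogate in the range DC00 to DFFF, and the extended cmap gives fonts a way to map such characters back to glyphs.

```python
# Sketch: splitting a supplementary-plane code point into its UTF-16
# surrogate pair, and recombining it. This is standard Unicode
# arithmetic, shown here for illustration.

def to_surrogate_pair(cp):
    """Split a code point above U+FFFF into (high, low) surrogates."""
    assert 0x10000 <= cp <= 0x10FFFF
    offset = cp - 0x10000            # 20 bits of payload
    high = 0xD800 + (offset >> 10)   # top 10 bits
    low = 0xDC00 + (offset & 0x3FF)  # bottom 10 bits
    return high, low

def from_surrogate_pair(high, low):
    """Recombine a surrogate pair into the original code point."""
    return 0x10000 + ((high - 0xD800) << 10) + (low - 0xDC00)

# U+10000, the first supplementary character:
print([hex(v) for v in to_surrogate_pair(0x10000)])  # ['0xd800', '0xdc00']
```

Since the 20-bit offset covers exactly 0x100000 code points, this is where the "1,000,000+ characters beyond the basic 65K" figure comes from.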

The font vendors and font technology developers are really concerned with coming up with ways to provide small fonts that meet people's needs. They feel the need to come up with some solutions pretty quickly. Within the next couple of years, we’ll probably start seeing things like virtual fonts: a collection of small fonts that are treated by the system as though they were, effectively, a single font.

Internationalisation and Localisation

Moving on from fonts, Richard Ishida of Xerox gave a presentation on I18N (internationalisation) design issues, i.e. how to design and implement software so that it’s easy to adapt it for use for different languages and international markets. Both John & I attended this, and the information presented will be useful in helping our developers make our language software more localisable (i.e. easier to provide user interfaces in more than just English).

Developing Unicode-conformant Software

There were various presentations either explaining technical details involved in developing Unicode-conformant software, or by developers giving their experience in developing Unicode-conformant software. It was beneficial for one of our key developers to be there for these. As a result of such sessions, John Thomson was able to identify some key areas of concern that will need to be considered in SIL software development (e.g. normalisation).

There were numerous sessions related to Java and Unicode, and John and Dan were able to attend several of these. John was also able to interact with others involved in open-source efforts to develop a set of Java object classes that can be used for processing Unicode text. These have potential to be useful to SIL, and this also presents an opportunity for SIL to have an impact on the software industry and, in that way, develop contacts and gain influence that may be able to serve our organisation’s goals.

Commercial Software

Once again, there were presentations by Microsoft on their latest versions of applications, particularly Internet Explorer 5.0 and Office 2000, which are Unicode conformant. Among developers of popular business applications, Microsoft is clearly furthest along in providing support for Unicode and multilingual text. At this point, they are wholly committed to Unicode, to I18N software development methodologies (writing code so that it is easily adapted for use in various international markets), and, in fact, to single binaries—to having a single version of software that works in all locales. (In the past, they had used a multiple-localised-binary approach—create software that works for English and then adapt separate versions to use a second language needed in a particular market.) They have made enormous progress in these regards over the past few years, and are still progressing. Of other major developers, Adobe is clearly moving in the same direction as Microsoft, though they haven’t made nearly the same progress. I met a rep from Corel at the conference, and it was clear that they are lagging in this area.

Unicode on the Internet and Web

There were various sessions discussing Unicode in relation to the Internet and the World Wide Web. The people involved in developing Internet technologies have worked hard to incorporate Unicode and I18N technologies into the Internet, so that a lot is technically feasible at this point. The internet consists of software, hardware and other resources controlled by millions of different parties, however, and many have not yet adopted the latest technologies. Still, progress is being made, and interest in e-commerce is helping drive things in the right direction (at least for major languages of the world).

Personal Interactions

Without question, one of the most useful aspects to the Unicode conference is the opportunity to interact with people in industry and to build relationships with them. Dan, John and I were able to have particularly valuable interactions with people from Microsoft, IBM, Netscape, Adobe, Apple, Xerox, Mitre, Library of Congress and the Unicode Technical Committee. Some of these contacts have already been of considerable help to our work.


While there were many sessions repeated from previous conferences, I still learned a lot by attending, and it was particularly beneficial to expose John and Dan to the conference. The contacts gained have been extremely valuable. Overall, our attendance at this conference should have very clear benefit to all of SIL and beyond.

ATypI Conference Report—October 7–10, 1999

by Victor Gaultney

Once a year the Association Typographique Internationale (ATypI) sponsors a conference for type designers, typographers and anyone else interested in fonts and type design. A historically European organization, ATypI often meets in Europe. This year, however, well-known type designers David Berlow and Matthew Carter (who both worked on SIL projects when they were with Bitstream many years ago) hosted the event in Boston. Since it was in the US I was able to attend for the first time.

About 400 people were registered, probably about half US and the rest European. I’d say 50–150 of these were directly involved in designing type, with the rest being design teachers, students and lots of people in the graphic design industry. David Berlow remarked that it was the largest gathering of type designers in the world.

Two other things struck me right away: the proportion of women in attendance was smaller than at the Apple Developer Conference (!), and the proportion of Macs to PCs was higher—yes, many Macs for every PC I saw. It was notable, though, that I saw far fewer computers in general. It was kind of refreshing to come to a conference where computing was not the center of attention!


The first technology-related session was a joint session between Adobe (David Lemon), Microsoft (Aiman Copty) and a guy from Bitstream. Although the Bitstream guy had a few minutes on his topics, the clear focus was on OpenType.

OpenType fonts are supported in the next major releases of ATM on both platforms, and are natively supported in Windows 2000. OT fonts work just like current fonts in “legacy” applications. Adobe InDesign ships with Tekton Pro—the first public OT font—and supports certain specific OT layout features (ligatures, alternates, etc.) on both platforms. It works because new Adobe apps use an identical type model (the proprietary CoolType) across platforms. More OT fonts are to come from Adobe (Garamond, Minion, Myriad) with ISO 8859-1 and 8859-2 coverage, some Greek and some Cyrillic support. Eventually all Adobe fonts will be OT fonts. Adobe is working with MS and Apple on native OT support. Adobe will also be releasing some OT font tools soon that will be text-based and free.

Microsoft is also continuing development of their tools. Typography Integrated Development Environment (TIDE) is a component software layer to support type development apps. Visual OpenType Layout Tool (VOLT) is now available. Visual TrueType 4.2 (VTT) now has hundreds of licensees, but is now going into no-maintenance mode since the main programmer (Greg Hitchcock) will be working on ClearType (LCD technologies for clear type display) for about 18 months. A new Font Digital Signature SDK was released. They also have plans to extend the Font Properties Editor to edit all font tables and validate them. Note that all of these tools (except VTT) are Windows-only. (Strangely enough, the VTT demo was done on a Mac running the Win version on Virtual PC!)

The general attitude among font designers toward OpenType was “so what?”. Designers are not interested in getting any more technical than they have to, and were extremely ambivalent about it. Adobe alone mentioned that they are doing OT development. I had a later conversation with David Lemon, asking if Adobe had any plans to let the type design community know what features InDesign and other Adobe apps will support. He said no, but I pressed him to at least release some sort of “basic features spec” stating the features Adobe will put in their fonts and that “the rest of us” could trust would be supported in Adobe apps. He was amazingly noncommittal and seemed uninterested.

The most interesting moment in the conference was when the emcee asked how many type designers were in the audience (about 50 raised our hands). He then asked how many of us were doing type development on Windows (and didn’t work for MS). Only a single hand remained up—someone who also happens to consult for MS. Very interesting. The emcee tried to get the MS guys to explain why all their tools were Win-only. No answer.

It seems that Adobe and MS are pushing OT but do not seem serious about releasing accessible tools (on the preferred platform) that would help the type design community build OT fonts. Hence the type design community is noncommittal. This attitude may change if and when tools are released and OT application support becomes widespread (which may be a long time).

In an interesting peripheral announcement, Adobe is killing off Multiple Master technology and has removed MM support from the OpenType spec. They said that their users were finding it too difficult to manage. I suspect the real story is that they could never get it working right in Windows (others agree with me). They will continue to sell current MM fonts, but will not release any new ones.

Apple had a pretty low profile, although four of the five type team members were there. In one session they mentioned their tools, reiterating their focus toward text-based tools. Peter Lofting said that text was the “ultimate tool interface”!


I never found anyone from Macromedia, and the general consensus was that Fontographer was a dead tool. Nevertheless, most people still use it. Pyrus (makers of FontLab) came on strong in the conference. They now have more people working on Mac tools than Windows ones. Most people agreed, however, that FontLab had odd, hard-to-use drawing tools and suffered from serious stability problems (especially on Mac). They will, however, be making a FontLab communications library available (with sample code) that will allow others to communicate with FL.

RoboFog had a lot of air time, but its weaknesses (hinting in particular) and technical orientation make it a niche tool. It also seems that the RoboFog folks have hit a wall in trying to fix some problems (due to the core limitations of the original Fog code), so they are beginning work on “Mark II”—a brand new app. But it will be a while.

TTX—a text-based font dumping/compiling tool—is now available. It dumps a TrueType font into an XML-based text syntax and can recompile it. It has a ways to go, though, as it does not yet have XML structures for many of the tables. It is being done by one of the RoboFog guys, and so is written in Python. This also means it works on all platforms.

The Type Design Process

Of all the meta-issues that wove their way throughout the conference, the most pervasive was the growing technical difficulty of type development. Like never before, type designers are faced with daunting technical issues just to get their fonts to work. It used to be very easy with Type 1 and ATM. Now there’s TrueType, OpenType, Unicode, growing cross-platform headaches, and a wide variety of tools—none of which does the complete job. It is getting almost impossible for many designers to handle the technical sophistication required. They all said that we were going back to the days when it took a whole team of people to make type—designers, hinters, programmers, etc.

I heard one designer say that he draws in Illustrator, then imports into Fontographer/RoboFog for more Bezier work and some special tweaking, then brings it into FontLab for hinting and some encoding, but then uses tools from MS and Apple to fix all the glitches FontLab causes. Eeeek! He was not the only one, either. Three different designers, on separate occasions, said basically the same thing.

On a more positive note, I watched three master designers walk through the process of scanning and drawing a font. They each described their philosophies with regard to point placement, testing, etc. I found out that I do it “the right way”!

Cross-Platform Fonts

Tom Rickner of Monotype (previously Apple) gave a session on creating fonts that work reliably cross-platform. His bottom line was that it was impossible! He (and some Apple folks) did give some pointers, though. He once counted over 160 different parameters in a TrueType font alone that had to be set correctly for the font to work right. He recommended getting the Windows fonts working first (the most difficult), then wrapping them in a Mac suitcase and tweaking the suitcase’s FOND resources.

Web Fonts

Web fonts seem to be a hot thing these days, but the bottom line is that no one is using them. There is no way to create web fonts that work on both Netscape and IE on the two major platforms. Lots of people complained that they look terrible (they used stronger words I’d rather not quote). Most type foundries are also scared of piracy. As a result, Bitstream is now beginning to license WEFT technology from MS (!) and will now respect the embedding bit, allowing font designers to truly restrict their typefaces from Web use (which 99% of them will do).

One of the presenters’ final words on Web fonts was: “get used to Arial”—which caused visible shudders through the crowd.


Looking back on the conference, it was a few days well spent. The greatest benefit was learning how my colleagues in the typeface industry do their work and what their key issues in development were. Although technology has made basic type design tools more available than ever before, that same technology has made type design and development more difficult than at any time in the past decade and a half.

Circulation & Distribution Information

The purpose of this periodic e-mailing is to keep you in the picture about current NRSI research, development and application activities.

© 2003-2024 SIL International, all rights reserved, unless otherwise noted elsewhere on this page.
Provided by SIL's Writing Systems Technology team (formerly known as NRSI).