
The UK TeX FAQ

Your 408 Questions Answered


version 3.16b-1, date 2006/10/04

October 5, 2006

NOTE

This document is an updated and extended version of the FAQ article that was
published as the December 1994 and 1995, and March 1999 editions of the UK TUG
magazine Baskerville (which weren’t formatted like this).
The article is also available via the World Wide Web.

Contents
A Introduction 9
B The Background 10
1 What is TeX? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
2 How should I pronounce “TeX”? . . . . . . . . . . . . . . . . . . . 10
3 What is MetaFont? . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
4 What is MetaPost? . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
5 How can I be sure it’s really TeX? . . . . . . . . . . . . . . . . . . . 11
6 Are TeX and friends Y2K compliant? . . . . . . . . . . . . . . . . . 11
7 What is e-TeX? . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
8 What is PDFTeX? . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
9 What is LaTeX? . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
10 What is LaTeX 2ε ? . . . . . . . . . . . . . . . . . . . . . . . . . . 12
11 How should I pronounce “LaTeX(2e)”? . . . . . . . . . . . . . . . . 13
12 Should I use Plain TeX or LaTeX? . . . . . . . . . . . . . . . . . . 13
13 How does LaTeX relate to Plain TeX? . . . . . . . . . . . . . . . . 13
14 What is ConTeXt? . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
15 What are the AMS packages (AMSTeX, etc.)? . . . . . . . . . . . . 14
16 What is Eplain? . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
17 What is Lollipop? . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
18 What is Texinfo? . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
19 If TeX is so good, how come it’s free? . . . . . . . . . . . . . . . . 15
20 What is the future of TeX? . . . . . . . . . . . . . . . . . . . . . . . 15
21 Reading (La)TeX files . . . . . . . . . . . . . . . . . . . . . . . . . 15
22 Why is TeX not a WYSIWYG system? . . . . . . . . . . . . . . . . . 16
23 TeX User Groups . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
C Documentation and Help 17
24 Books on TeX and its relations . . . . . . . . . . . . . . . . . . . . 17
25 Books on Type . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
26 Where to find FAQs . . . . . . . . . . . . . . . . . . . . . . . . . . 20
27 How to get help . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
28 Specialist mailing lists . . . . . . . . . . . . . . . . . . . . . . . . . 21
29 How to ask a question . . . . . . . . . . . . . . . . . . . . . . . . . 21
30 How to make a “minimum example” . . . . . . . . . . . . . . . . . 22
31 (La)TeX Tutorials, etc. . . . . . . . . . . . . . . . . . . . . . . . . . 23
32 Online introductions: Plain TeX . . . . . . . . . . . . . . . . . . . . 23
33 Online introductions: LaTeX . . . . . . . . . . . . . . . . . . . . . 23
34 Specialised (La)TeX tutorials . . . . . . . . . . . . . . . . . . . . . 23
35 Reference documents . . . . . . . . . . . . . . . . . . . . . . . . . 24
36 WIKI pages for TeX and friends . . . . . . . . . . . . . . . . . . . 24
37 Typography tutorials . . . . . . . . . . . . . . . . . . . . . . . . . . 25
38 Directories of (La)TeX information . . . . . . . . . . . . . . . . . . 25
39 Freely available (La)TeX books . . . . . . . . . . . . . . . . . . . . 25
40 Documentation of packages . . . . . . . . . . . . . . . . . . . . . . 25
41 Learning to write LaTeX classes and packages . . . . . . . . . . . . 26
42 MetaFont and MetaPost Tutorials . . . . . . . . . . . . . . . . . . . 26
43 BibTeX Documentation . . . . . . . . . . . . . . . . . . . . . . . . 27
44 Where can I find the symbol for . . . . . . . . . . . . . . . . . . . . . 27
45 The PiCTeX manual . . . . . . . . . . . . . . . . . . . . . . . . . . 28
D Bits and pieces of (La)TeX 28
46 What is a DVI file? . . . . . . . . . . . . . . . . . . . . . . . . . . 28
47 What is a driver? . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
48 What are PK files? . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
49 What are TFM files? . . . . . . . . . . . . . . . . . . . . . . . . . . 28
50 Virtual fonts . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
51 \special commands . . . . . . . . . . . . . . . . . . . . . . . . . 29
52 How does hyphenation work in TeX? . . . . . . . . . . . . . . . . . 29
53 What are LaTeX classes and packages? . . . . . . . . . . . . . . . . 30
54 Documented LaTeX sources (.dtx files) . . . . . . . . . . . . . . . 30
55 What are encodings? . . . . . . . . . . . . . . . . . . . . . . . . . . 31
56 What are the EC fonts? . . . . . . . . . . . . . . . . . . . . . . . . 32
57 What is the TDS? . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
58 What is “Encapsulated PostScript” (“EPS”) . . . . . . . . . . . . . . 33
59 Adobe font formats . . . . . . . . . . . . . . . . . . . . . . . . . . 33
60 What are “resolutions” . . . . . . . . . . . . . . . . . . . . . . . . . 34
61 What is the “Berry naming scheme” . . . . . . . . . . . . . . . . . 34
E Acquiring the Software 35
62 Repositories of TeX material . . . . . . . . . . . . . . . . . . . . . 35
63 What’s the CTAN nonfree tree? . . . . . . . . . . . . . . . . . . . 35
64 Contributing a file to the archives . . . . . . . . . . . . . . . . . . . 35
65 Finding (La)TeX files . . . . . . . . . . . . . . . . . . . . . . . . . 36
66 Finding new fonts . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
67 The TeX Live distribution . . . . . . . . . . . . . . . . . . . . . . . 37
F TeX Systems 37
68 (La)TeX for different machines . . . . . . . . . . . . . . . . . . . . 37
69 TeX-friendly editors and shells . . . . . . . . . . . . . . . . . . . . 39
70 Commercial TeX implementations . . . . . . . . . . . . . . . . . . 40
G DVI Drivers and Previewers 42
71 DVI to PostScript conversion programs . . . . . . . . . . . . . . . . 42
72 DVI drivers for HP LaserJet . . . . . . . . . . . . . . . . . . . . . . 42
73 Output to “other” printers . . . . . . . . . . . . . . . . . . . . . . . 42
74 DVI previewers . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
75 Generating bitmaps from DVI . . . . . . . . . . . . . . . . . . . . . 43
H Support Packages for TeX 43
76 Fig, a (La)TeX-friendly drawing package . . . . . . . . . . . . . . . 43
77 TeXCAD, a drawing package for LaTeX . . . . . . . . . . . . . . . 44
78 Spelling checkers for work with TeX . . . . . . . . . . . . . . . . . 44
79 How many words have you written? . . . . . . . . . . . . . . . . . . 44
I Literate programming 45
80 What is Literate Programming? . . . . . . . . . . . . . . . . . . . . 45
81 WEB systems for various languages . . . . . . . . . . . . . . . . . 45
J Format conversions 46
82 Conversion from (La)TeX to plain text . . . . . . . . . . . . . . . . 46
83 Conversion from SGML or HTML to TeX . . . . . . . . . . . . . . 46
84 Conversion from (La)TeX to HTML . . . . . . . . . . . . . . . . . 47
85 Other conversions to and from (La)TeX . . . . . . . . . . . . . . . . 48
86 Using TeX to read SGML or XML directly . . . . . . . . . . . . . . 49
87 Retrieving (La)TeX from DVI, etc. . . . . . . . . . . . . . . . . . . 49
88 Translating LaTeX to Plain TeX . . . . . . . . . . . . . . . . . . . . 50

K Installing (La)TeX files 50
89 Installing a new package . . . . . . . . . . . . . . . . . . . . . . . . 50
90 Where to put new files . . . . . . . . . . . . . . . . . . . . . . . . . 51
91 Installing MiKTeX “known packages” . . . . . . . . . . . . . . . . 52
92 “Temporary” installation of (La)TeX files . . . . . . . . . . . . . . . 52
93 “Private” installations of files . . . . . . . . . . . . . . . . . . . . . 53
94 Installing a new font . . . . . . . . . . . . . . . . . . . . . . . . . . 54
95 Installing a font provided as MetaFont source . . . . . . . . . . . . . 54
96 Installing a PostScript printer built-in font . . . . . . . . . . . . . . 54
97 Installing the Bluesky versions of the CM fonts . . . . . . . . . . . . 55
98 Installing a Type 1 font . . . . . . . . . . . . . . . . . . . . . . . . 55
L Fonts 57
L.1 MetaFont fonts 57
99 Getting MetaFont to do what you want . . . . . . . . . . . . . . . . 57
100 Which font files should be kept . . . . . . . . . . . . . . . . . . . . 58
101 Acquiring bitmap fonts . . . . . . . . . . . . . . . . . . . . . . . . 58
L.2 Adobe Type 1 (“PostScript”) fonts 59
102 Using PostScript fonts with TeX . . . . . . . . . . . . . . . . . . . 59
103 Previewing files using Type 1 fonts . . . . . . . . . . . . . . . . . . 59
104 TeX font metric files for PostScript fonts . . . . . . . . . . . . . . . 60
105 Deploying Type 1 fonts . . . . . . . . . . . . . . . . . . . . . . . . 60
106 Choice of scalable outline fonts . . . . . . . . . . . . . . . . . . . . 61
107 Weird characters in dvips output . . . . . . . . . . . . . . . . . . . . 65
L.3 Macros for using fonts 65
108 Using non-standard fonts in Plain TeX . . . . . . . . . . . . . . . . 65
L.4 Particular font families 66
109 Using the “Concrete” fonts . . . . . . . . . . . . . . . . . . . . . . 66
110 Using the Latin Modern fonts . . . . . . . . . . . . . . . . . . . . . 67
M Hypertext and PDF 68
111 Making hypertext documents from TeX . . . . . . . . . . . . . . . . 68
112 Making Acrobat PDF documents from (La)TeX . . . . . . . . . . . 68
113 Quality of PDF from PostScript . . . . . . . . . . . . . . . . . . . . 69
114 The wrong type of fonts in PDF . . . . . . . . . . . . . . . . . . . . 69
115 Fuzzy fonts because Ghostscript too old . . . . . . . . . . . . . . . 70
116 Fonts go fuzzy when you switch to T1 . . . . . . . . . . . . . . . . 70
117 Characters missing from PDF output . . . . . . . . . . . . . . . . . 71
118 Finding ‘8-bit’ Type 1 fonts . . . . . . . . . . . . . . . . . . . . . . 71
119 Replacing Type 3 fonts in PostScript . . . . . . . . . . . . . . . . . 72
120 Hyperref and repeated page numbers . . . . . . . . . . . . . . . . . 72
121 Searching PDF files . . . . . . . . . . . . . . . . . . . . . . . . . . 73
N Graphics 73
122 How to import graphics into (La)TeX documents . . . . . . . . . . . 73
123 Imported graphics in dvips . . . . . . . . . . . . . . . . . . . . . . . 74
124 Imported graphics in PDFLaTeX . . . . . . . . . . . . . . . . . . . 75
125 Imported graphics in dvipdfm . . . . . . . . . . . . . . . . . . . . . 75
126 Importing graphics from “somewhere else” . . . . . . . . . . . . . . 76
127 Portable imported graphics . . . . . . . . . . . . . . . . . . . . . . 77
128 Repeated graphics in a document . . . . . . . . . . . . . . . . . . . 77
129 Limit the width of imported graphics . . . . . . . . . . . . . . . . . 78
130 Top-aligning imported graphics . . . . . . . . . . . . . . . . . . . . 78
131 Displaying MetaPost output in ghostscript . . . . . . . . . . . . . . 79
132 Drawing with TeX . . . . . . . . . . . . . . . . . . . . . . . . . . . 80
133 Drawing Feynman diagrams in LaTeX . . . . . . . . . . . . . . . . 80
134 Labelling graphics . . . . . . . . . . . . . . . . . . . . . . . . . . . 81
O Bibliographies and citations 82

O.1 Creating bibliographies 82
135 Creating a BibTeX bibliography file . . . . . . . . . . . . . . . . . . 82
136 Creating a bibliography style . . . . . . . . . . . . . . . . . . . . . 82
137 Capitalisation in BibTeX . . . . . . . . . . . . . . . . . . . . . . . 83
138 Accents in bibliographies . . . . . . . . . . . . . . . . . . . . . . . 83
139 ‘String too long’ in BibTeX . . . . . . . . . . . . . . . . . . . . . . 83
140 BibTeX doesn’t understand lists of names . . . . . . . . . . . . . . 84
141 URLs in BibTeX bibliographies . . . . . . . . . . . . . . . . . . . . 84
142 Using BibTeX with Plain TeX . . . . . . . . . . . . . . . . . . . . . 85
143 Reconstructing .bib files . . . . . . . . . . . . . . . . . . . . . . . 85
144 BibTeX sorting and name prefixes . . . . . . . . . . . . . . . . . . . 86
145 Transcribed initials in BibTeX . . . . . . . . . . . . . . . . . . . . . 86
O.2 Creating citations 86
146 “Normal” use of BibTeX from LaTeX . . . . . . . . . . . . . . . . 86
147 Choosing a bibliography style . . . . . . . . . . . . . . . . . . . . . 87
148 Separate bibliographies per chapter? . . . . . . . . . . . . . . . . . 88
149 Multiple bibliographies? . . . . . . . . . . . . . . . . . . . . . . . . 88
150 Putting bibliography entries in text . . . . . . . . . . . . . . . . . . 90
151 Sorting and compressing citations . . . . . . . . . . . . . . . . . . . 90
152 Multiple citations . . . . . . . . . . . . . . . . . . . . . . . . . . . 91
153 References from the bibliography to the citation . . . . . . . . . . . 91
154 Sorting lists of citations . . . . . . . . . . . . . . . . . . . . . . . . 92
155 Reducing spacing in the bibliography . . . . . . . . . . . . . . . . . 92
156 Table of contents rearranges “unsrt” ordering . . . . . . . . . . . . . 92
157 Non-english bibliographies . . . . . . . . . . . . . . . . . . . . . . 93
158 Format of numbers in the bibliography . . . . . . . . . . . . . . . . 93
O.3 Manipulating whole bibliographies 94
159 Listing all your BibTeX entries . . . . . . . . . . . . . . . . . . . . 94
160 Making HTML of your Bibliography . . . . . . . . . . . . . . . . . 94
P Adjusting the typesetting 94
P.1 Alternative document classes 94
161 Replacing the standard classes . . . . . . . . . . . . . . . . . . . . . 94
162 Producing slides . . . . . . . . . . . . . . . . . . . . . . . . . . . . 95
163 Creating posters with LaTeX . . . . . . . . . . . . . . . . . . . . . 96
164 Formatting a thesis in LaTeX . . . . . . . . . . . . . . . . . . . . . 96
165 Setting papers for journals . . . . . . . . . . . . . . . . . . . . . . . 96
166 A ‘report’ from lots of ‘article’s . . . . . . . . . . . . . . . . . . . . 97
167 Curriculum Vitae (Résumé) . . . . . . . . . . . . . . . . . . . . . . 97
168 Letters and the like . . . . . . . . . . . . . . . . . . . . . . . . . . . 98
169 Other “document font” sizes? . . . . . . . . . . . . . . . . . . . . . 98
P.2 Document structure 99
170 The style of document titles . . . . . . . . . . . . . . . . . . . . . . 99
171 The style of section headings . . . . . . . . . . . . . . . . . . . . . 99
172 Appendixes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 100
173 Indent after section headings . . . . . . . . . . . . . . . . . . . . . 101
174 How to create a \subsubsubsection . . . . . . . . . . . . . . . . . 101
175 The style of captions . . . . . . . . . . . . . . . . . . . . . . . . . . 102
176 Alternative head- and footlines in LaTeX . . . . . . . . . . . . . . . 102
177 Wide figures in two-column documents . . . . . . . . . . . . . . . . 103
178 1-column abstract in 2-column document . . . . . . . . . . . . . . . 103
179 Really blank pages between chapters . . . . . . . . . . . . . . . . . 103
180 Balancing columns at the end of a document . . . . . . . . . . . . . 104
181 My section title is too wide for the page header . . . . . . . . . . . . 105
182 Page numbering “⟨n⟩ of ⟨m⟩” . . . . . . . . . . . . . . . . . . . . . 106
183 Page numbering by chapter . . . . . . . . . . . . . . . . . . . . . . 106

P.3 Page layout 106
184 Printer paper sizes . . . . . . . . . . . . . . . . . . . . . . . . . . . 106
185 Changing the margins in LaTeX . . . . . . . . . . . . . . . . . . . . 107
186 Packages to set up page designs . . . . . . . . . . . . . . . . . . . . 107
187 How to set up page layout “by hand” . . . . . . . . . . . . . . . . . 108
188 Changing margins “on the fly” . . . . . . . . . . . . . . . . . . . . 108
189 How to get rid of page numbers . . . . . . . . . . . . . . . . . . . . 109
190 \pagestyle{empty} on first page in LaTeX . . . . . . . . . . . . . 109
191 How to create crop marks . . . . . . . . . . . . . . . . . . . . . . . 110
192 ‘Watermarks’ on every page . . . . . . . . . . . . . . . . . . . . . . 110
193 Typesetting things in landscape orientation . . . . . . . . . . . . . . 110
194 Putting things at fixed positions on the page . . . . . . . . . . . . . 111
195 Preventing page breaks between lines . . . . . . . . . . . . . . . . . 112
196 Parallel setting of text . . . . . . . . . . . . . . . . . . . . . . . . . 113
197 Typesetting epigraphs . . . . . . . . . . . . . . . . . . . . . . . . . 114
198 (La)TeX PDF output prints at wrong size . . . . . . . . . . . . . . . 114
P.4 Spacing of characters and lines 115
199 Double-spaced documents in LaTeX . . . . . . . . . . . . . . . . . 115
200 Changing the space between letters . . . . . . . . . . . . . . . . . . 115
201 Setting text ragged right . . . . . . . . . . . . . . . . . . . . . . . . 116
202 Cancelling \ragged commands . . . . . . . . . . . . . . . . . . . . 116
P.5 Typesetting specialities 116
203 Including a file verbatim in LaTeX . . . . . . . . . . . . . . . . . . 116
204 Including line numbers in typeset output . . . . . . . . . . . . . . . 117
205 Code listings in LaTeX . . . . . . . . . . . . . . . . . . . . . . . . 117
206 Typesetting pseudocode in LaTeX . . . . . . . . . . . . . . . . . . . 118
207 Generating an index in (La)TeX . . . . . . . . . . . . . . . . . . . . 119
208 Typesetting URLs . . . . . . . . . . . . . . . . . . . . . . . . . . . 120
209 Typesetting music in TeX . . . . . . . . . . . . . . . . . . . . . . . 121
210 Zero paragraph indent . . . . . . . . . . . . . . . . . . . . . . . . . 122
211 Big letters at the start of a paragraph . . . . . . . . . . . . . . . . . 122
212 The comma as a decimal separator . . . . . . . . . . . . . . . . . . 122
213 Breaking boxes of text . . . . . . . . . . . . . . . . . . . . . . . . . 123
214 Overstriking characters . . . . . . . . . . . . . . . . . . . . . . . . 123
215 Realistic quotes for verbatim listings . . . . . . . . . . . . . . . . . 123
216 Printing the time . . . . . . . . . . . . . . . . . . . . . . . . . . . . 123
217 Redefining counters’ \the-commands . . . . . . . . . . . . . . . . 124
P.6 Tables of contents and indexes 124
218 The format of the Table of Contents, etc. . . . . . . . . . . . . . . . 124
219 Unnumbered sections in the Table of Contents . . . . . . . . . . . . 124
220 Bibliography, index, etc., in TOC . . . . . . . . . . . . . . . . . . . 125
221 Table of contents, etc., per chapter . . . . . . . . . . . . . . . . . . 126
222 Multiple indexes . . . . . . . . . . . . . . . . . . . . . . . . . . . . 126
P.7 Labels and references 127
223 Referring to things by their name . . . . . . . . . . . . . . . . . . . 127
224 Referring to labels in other documents . . . . . . . . . . . . . . . . 128
Q How do I do. . . ? 129
Q.1 Mathematics 129
225 Proof environment . . . . . . . . . . . . . . . . . . . . . . . . . . . 129
226 Roman theorems . . . . . . . . . . . . . . . . . . . . . . . . . . . . 129
227 Defining a new log-like function in LaTeX . . . . . . . . . . . . . . 130
228 Set specifications and Dirac brackets . . . . . . . . . . . . . . . . . 130
229 Cancelling terms in maths expressions . . . . . . . . . . . . . . . . 130
230 Adjusting maths font sizes . . . . . . . . . . . . . . . . . . . . . . . 130
231 Ellipses . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 131
232 Sub- and superscript positioning for operators . . . . . . . . . . . . 131
233 Text inside maths . . . . . . . . . . . . . . . . . . . . . . . . . . . 132
234 Re-using an equation . . . . . . . . . . . . . . . . . . . . . . . . . 133
235 Line-breaking in in-line maths . . . . . . . . . . . . . . . . . . . . . 134

Q.2 Lists 134
236 Fancy enumeration lists . . . . . . . . . . . . . . . . . . . . . . . . 134
237 How to reduce list spacing . . . . . . . . . . . . . . . . . . . . . . . 135
238 Interrupting enumerated lists . . . . . . . . . . . . . . . . . . . . . 136
Q.3 Tables, figures and diagrams 138
239 The design of tables . . . . . . . . . . . . . . . . . . . . . . . . . . 138
240 Fixed-width tables . . . . . . . . . . . . . . . . . . . . . . . . . . . 139
241 Variable-width columns in tables . . . . . . . . . . . . . . . . . . . 139
242 Spacing lines in tables . . . . . . . . . . . . . . . . . . . . . . . . . 140
243 Tables longer than a single page . . . . . . . . . . . . . . . . . . . . 141
244 How to alter the alignment of tabular cells . . . . . . . . . . . . . . 141
245 The thickness of rules in LaTeX tables . . . . . . . . . . . . . . . . 143
246 Flowing text around figures in LaTeX . . . . . . . . . . . . . . . . . 143
247 Diagonal separation in corner cells of tables . . . . . . . . . . . . . 144
248 How to change a whole row of a table . . . . . . . . . . . . . . . . . 144
249 Merging cells in a column of a table . . . . . . . . . . . . . . . . . 145
Q.4 Floating tables, figures, etc. 146
250 Floats on their own on float pages . . . . . . . . . . . . . . . . . . . 146
251 Extra vertical space in floats . . . . . . . . . . . . . . . . . . . . . . 146
252 Placing two-column floats at bottom of page . . . . . . . . . . . . . 147
253 Floats in multicolumn setting . . . . . . . . . . . . . . . . . . . . . 147
254 Facing floats on 2-page spread . . . . . . . . . . . . . . . . . . . . 148
255 Vertical layout of float pages . . . . . . . . . . . . . . . . . . . . . 148
Q.5 Footnotes 148
256 Footnotes in tables . . . . . . . . . . . . . . . . . . . . . . . . . . . 148
257 Footnotes in LaTeX section headings . . . . . . . . . . . . . . . . . 149
258 Footnotes in captions . . . . . . . . . . . . . . . . . . . . . . . . . 149
259 Footnotes whose texts are identical . . . . . . . . . . . . . . . . . . 150
260 More than one sequence of footnotes . . . . . . . . . . . . . . . . . 151
261 Footnotes numbered “per page” . . . . . . . . . . . . . . . . . . . . 151
Q.6 Document management 152
262 What’s the name of this file . . . . . . . . . . . . . . . . . . . . . . 152
263 All the files used by this document . . . . . . . . . . . . . . . . . . 153
264 Marking changed parts of your document . . . . . . . . . . . . . . . 153
265 Conditional compilation and “comments” . . . . . . . . . . . . . . . 154
266 Bits of document from other directories . . . . . . . . . . . . . . . . 156
267 Version control using RCS, CVS or Subversion . . . . . . . . . . . . 157
268 Makefiles for LaTeX documents . . . . . . . . . . . . . . . . . . . . 158
269 How many pages are there in my document? . . . . . . . . . . . . . 158
270 Including Plain TeX files in LaTeX . . . . . . . . . . . . . . . . . . 158
Q.7 Hyphenation 159
271 My words aren’t being hyphenated . . . . . . . . . . . . . . . . . . 159
272 Weird hyphenation of words . . . . . . . . . . . . . . . . . . . . . . 159
273 (Merely) peculiar hyphenation . . . . . . . . . . . . . . . . . . . . 160
274 Accented words aren’t hyphenated . . . . . . . . . . . . . . . . . . 160
275 Using a new language with Babel . . . . . . . . . . . . . . . . . . . 160
276 Stopping all hyphenation . . . . . . . . . . . . . . . . . . . . . . . 161
277 Preventing hyphenation of a particular word . . . . . . . . . . . . . 162
278 Hyphenation exceptions . . . . . . . . . . . . . . . . . . . . . . . . 162
Q.8 Odds and ends 163
279 Typesetting all those TeX-related logos . . . . . . . . . . . . . . . . 163
280 How to do bold-tt or bold-sc . . . . . . . . . . . . . . . . . . . . . . 163
281 Automatic sizing of minipage . . . . . . . . . . . . . . . . . . . . 164
R Symbols, etc. 165
282 Symbols for the number sets . . . . . . . . . . . . . . . . . . . . . . 165
283 Better script fonts for maths . . . . . . . . . . . . . . . . . . . . . . 166
284 Setting bold Greek letters in LaTeX . . . . . . . . . . . . . . . . . . 167
285 The Principal Value Integral symbol . . . . . . . . . . . . . . . . . 167
286 How to use the underscore character . . . . . . . . . . . . . . . . . 167
287 How to type an ‘@’ sign? . . . . . . . . . . . . . . . . . . . . . . . 168
288 Typesetting the Euro sign . . . . . . . . . . . . . . . . . . . . . . . 168
289 How to get copyright, trademark, etc. . . . . . . . . . . . . . . . . . 169
S Macro programming 169
S.1 “Generic” macros and techniques 169
290 Finding the width of a letter, word, or phrase . . . . . . . . . . . . . 169
291 Patching existing commands . . . . . . . . . . . . . . . . . . . . . 169
292 Comparing the “job name” . . . . . . . . . . . . . . . . . . . . . . 170
293 Is the argument a number? . . . . . . . . . . . . . . . . . . . . . . . 171
294 Defining macros within macros . . . . . . . . . . . . . . . . . . . . 172
295 Spaces in macros . . . . . . . . . . . . . . . . . . . . . . . . . . . . 173
296 How to break the 9-argument limit . . . . . . . . . . . . . . . . . . 174
297 Defining characters as macros . . . . . . . . . . . . . . . . . . . . . 175
298 Active characters in command arguments . . . . . . . . . . . . . . . 176
299 Defining a macro from an argument . . . . . . . . . . . . . . . . . . 177
300 Transcribing LaTeX command definitions . . . . . . . . . . . . . . 177
301 Detecting that something is empty . . . . . . . . . . . . . . . . . . 178
302 Am I using PDFTeX? . . . . . . . . . . . . . . . . . . . . . . . . . 179
303 Subverting a token register . . . . . . . . . . . . . . . . . . . . . . 180
S.2 LaTeX macro tools and techniques 180
304 Using Plain or primitive commands in LaTeX . . . . . . . . . . . . 180
305 \@ and @ in macro names . . . . . . . . . . . . . . . . . . . . . . . 181
306 What’s the reason for ‘protection’? . . . . . . . . . . . . . . . . . . 182
307 \edef does not work with \protect . . . . . . . . . . . . . . . . . 182
308 The definitions of LaTeX commands . . . . . . . . . . . . . . . . . 182
309 Optional arguments like \section . . . . . . . . . . . . . . . . . . 184
310 More than one optional argument . . . . . . . . . . . . . . . . . . . 184
311 Commands defined with * options . . . . . . . . . . . . . . . . . . 185
312 LaTeX internal “abbreviations”, etc. . . . . . . . . . . . . . . . . . 185
313 Defining LaTeX commands within other commands . . . . . . . . . 186
S.3 LaTeX macro programming 187
314 How to change LaTeX’s “fixed names” . . . . . . . . . . . . . . . . 187
315 Changing the words babel uses . . . . . . . . . . . . . . . . . . . . 187
316 Running equation, figure and table numbering . . . . . . . . . . . . 188
317 Making labels from a counter . . . . . . . . . . . . . . . . . . . . . 188
318 Finding if you’re on an odd or an even page . . . . . . . . . . . . . 189
319 How to change the format of labels . . . . . . . . . . . . . . . . . . 189
320 Adjusting the presentation of section numbers . . . . . . . . . . . . 190
321 There’s a space added after my environment . . . . . . . . . . . . . 191
322 Finding if a label is undefined . . . . . . . . . . . . . . . . . . . . . 191
323 Master and slave counters . . . . . . . . . . . . . . . . . . . . . . . 191
T Things are Going Wrong. . . 192
T.1 Getting things to fit 192
324 Enlarging TeX . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 192
325 Why can’t I load PiCTeX? . . . . . . . . . . . . . . . . . . . . . . . 192
T.2 Making things stay where you want them 193
326 Moving tables and figures in LaTeX . . . . . . . . . . . . . . . . . . 193
327 Underlined text won’t break . . . . . . . . . . . . . . . . . . . . . . 194
328 Controlling widows and orphans . . . . . . . . . . . . . . . . . . . 195
T.3 Things have “gone away” 195
329 Old LaTeX font references such as \tenrm . . . . . . . . . . . . . . 195
330 Missing symbol commands . . . . . . . . . . . . . . . . . . . . . . 195
331 Where are the msx and msy fonts? . . . . . . . . . . . . . . . . . . . 196
332 Where are the am fonts? . . . . . . . . . . . . . . . . . . . . . . . . 196
U Why does it do that? 196
U.1 Common errors 196
333 LaTeX gets cross-references wrong . . . . . . . . . . . . . . . . . . 196
334 Start of line goes awry . . . . . . . . . . . . . . . . . . . . . . . . . 197
335 Why doesn’t verbatim work within . . . ? . . . . . . . . . . . . . . . . 197
336 No line here to end . . . . . . . . . . . . . . . . . . . . . . . . . . . 199
337 Two-column float numbers out of order . . . . . . . . . . . . . . . . 200
338 Accents misbehave in tabbing . . . . . . . . . . . . . . . . . . . . 200
339 Package reports “command already defined” . . . . . . . . . . . . . 200
340 Why are my sections numbered 0.1 . . . ? . . . . . . . . . . . . . . . 201
341 Link text doesn’t break at end line . . . . . . . . . . . . . . . . . . . 201
342 Page number is wrong at start of page . . . . . . . . . . . . . . . . . 202
343 My brackets don’t match . . . . . . . . . . . . . . . . . . . . . . . 202
U.2 Common misunderstandings 203
344 What’s going on in my \include commands? . . . . . . . . . . . . 203
345 Why does it ignore paragraph parameters? . . . . . . . . . . . . . . 203
346 Case-changing oddities . . . . . . . . . . . . . . . . . . . . . . . . 204
347 Why does LaTeX split footnotes across pages? . . . . . . . . . . . . 204
348 Getting \marginpar on the right side . . . . . . . . . . . . . . . . . 205
349 Where have my characters gone? . . . . . . . . . . . . . . . . . . . 205
350 “Rerun” messages won’t go away . . . . . . . . . . . . . . . . . . . 206
351 Commands gobble following space . . . . . . . . . . . . . . . . . . 206
352 (La)TeX makes overfull lines . . . . . . . . . . . . . . . . . . . . . 207
353 Maths symbols don’t scale up . . . . . . . . . . . . . . . . . . . . . 208
354 Why doesn’t \linespread work? . . . . . . . . . . . . . . . . . . 208
355 Only one \baselineskip per paragraph . . . . . . . . . . . . . . . 209
356 Numbers too large in table of contents, etc. . . . . . . . . . . . . . . 209
357 Why is the inside margin so narrow? . . . . . . . . . . . . . . . . . 210
U.3 Why shouldn’t I? 210
358 Why use fontenc rather than t1enc? . . . . . . . . . . . . . . . . . . 210
359 Why bother with inputenc and fontenc? . . . . . . . . . . . . . . . . 211
360 Why not use eqnarray? . . . . . . . . . . . . . . . . . . . . . . . . 211
361 Why use \[ . . . \] in place of $$ . . . $$? . . . . . . . . . . . . . . . . 211
362 What’s wrong with \bf, \it, etc.? . . . . . . . . . . . . . . . . . . 212
363 What’s wrong with \newfont? . . . . . . . . . . . . . . . . . . . . 213
V The joy of TeX errors 213
364 How to approach errors . . . . . . . . . . . . . . . . . . . . . . . . 213
365 The structure of TeX error messages . . . . . . . . . . . . . . . . . 214
366 An extra ‘}’?? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 215
367 Capacity exceeded [semantic nest . . . ] . . . . . . . . . . . . . . . . 216
368 No room for a new ‘thing’ . . . . . . . . . . . . . . . . . . . . . . . 216
369 epsf gives up after a bit . . . . . . . . . . . . . . . . . . . . . . . . 217
370 Improper \hyphenation will be flushed . . . . . . . . . . . . . . . 217
371 “Too many unprocessed floats” . . . . . . . . . . . . . . . . . . . . 217
372 \spacefactor complaints . . . . . . . . . . . . . . . . . . . . . . 218
373 \end occurred inside a group . . . . . . . . . . . . . . . . . . . . . 218
374 “Missing number, treated as zero” . . . . . . . . . . . . . . . . . . . 219
375 “Please type a command or say \end” . . . . . . . . . . . . . . . . 219
376 “Unknown graphics extension” . . . . . . . . . . . . . . . . . . . . 220
377 “Missing $ inserted” . . . . . . . . . . . . . . . . . . . . . . . . . . 220
378 Warning: “Font shape . . . not available” . . . . . . . . . . . . . . . 221
379 Unable to read an entire line . . . . . . . . . . . . . . . . . . . . . . 221
380 “Fatal format file error; I’m stymied” . . . . . . . . . . . . . . . . . 222
381 Non-PDF special ignored! . . . . . . . . . . . . . . . . . . . . . . . 222
382 Mismatched mode ljfour and resolution 8000 . . . . . . . . . . . . . 223
383 “Too deeply nested” . . . . . . . . . . . . . . . . . . . . . . . . . . 223
384 Capacity exceeded — input levels . . . . . . . . . . . . . . . . . . . 224
385 PDFTeX destination . . . ignored . . . . . . . . . . . . . . . . . . . 224
386 Alignment tab changed to \cr . . . . . . . . . . . . . . . . . . . . . 224
387 Graphics division by zero . . . . . . . . . . . . . . . . . . . . . . . 225
388 Missing \begin{document} . . . . . . . . . . . . . . . . . . . . . 225
389 \normalsize not defined . . . . . . . . . . . . . . . . . . . . . . . 226
390 Too many math alphabets . . . . . . . . . . . . . . . . . . . . . . . 226
391 Not in outer par mode . . . . . . . . . . . . . . . . . . . . . . . . . 226
392 Perhaps a missing \item? . . . . . . . . . . . . . . . . . . . . . . . 227
393 Illegal parameter number in definition . . . . . . . . . . . . . . . . 228
394 Float(s) lost . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 228
395 Option clash for package . . . . . . . . . . . . . . . . . . . . . . . 229

W Current TeX-related projects 230
396 The LaTeX3 project . . . . . . . . . . . . . . . . . . . . . . . . . . 230
397 Future WWW technologies and (La)TeX . . . . . . . . . . . . . . . 230
398 Making outline fonts from MetaFont . . . . . . . . . . . . . . . . . 231
399 The TeX document preparation environment . . . . . . . . . . . . . 232
400 The ANT typesetting system . . . . . . . . . . . . . . . . . . . . . 233
401 The ExTeX project . . . . . . . . . . . . . . . . . . . . . . . . . . . 233
402 Omega and Aleph . . . . . . . . . . . . . . . . . . . . . . . . . . . 233
403 PDFTeX becomes LUATeX . . . . . . . . . . . . . . . . . . . . . . 233
404 The XeTeX project . . . . . . . . . . . . . . . . . . . . . . . . . . 234
X You’re still stuck? 234
405 You don’t understand the answer . . . . . . . . . . . . . . . . . . . 234
406 Submitting new material for the FAQ . . . . . . . . . . . . . . . . . 234
407 Reporting a LaTeX bug . . . . . . . . . . . . . . . . . . . . . . . . 234
408 What to do if you find a bug . . . . . . . . . . . . . . . . . . . . . . 235
§ § § § § § § § § § § § § § § § § §

A Introduction
This is a set of Frequently Asked Questions (FAQ) for English-speaking users of TeX.
The questions answered here cover a wide range of topics, but the actual typesetting
issues are mostly covered from the viewpoint of a LaTeX user.
You may use the FAQ

• by reading a printed document,


• by browsing a PDF file: copies, with hyperlinks to assist browsing, are to be
found on CTAN at help/uk-tex-faq/newfaq.pdf (formatted for A4 paper) or at
help/uk-tex-faq/letterfaq.pdf (formatted for North American “letter” pa-
per), or
• by using the FAQ’s web interface (base URL: http://www.tex.ac.uk/faq); this
version provides simple search capabilities, as well as a link to Google for a more
sophisticated search restricted to the FAQ itself.

Finding the Files


Unless otherwise specified, all files mentioned in this FAQ are available from a CTAN
archive, or from one of their mirrors. The answer “Repositories of TeX material” gives details of the CTAN archives, and how
to retrieve files from them. If you don’t have access to the Internet, the TeX Live
distribution offers off-line snapshots of the archives.
The reader should also note that the first directory name of the path name of every
file on CTAN has been elided from what follows, for the simple reason that it’s always
the same (tex-archive/).
To avoid confusion, we’ve also elided the full stop from the end of any sentence
whose last item is a path name (such sentences are rare, and only occur at the end of
paragraphs). Though the path names are set in a different font from running text, it’s
not easy to distinguish the font of a single dot!

Origins
The FAQ was originated by the Committee of the UK TeX Users’ Group (UK TUG)
as a development of a regular posting to the Usenet newsgroup comp.text.tex that
was maintained for some time by Bobby Bodenheimer. The first UK version was much
re-arranged and corrected from the original, and little of Bodenheimer’s work now
remains.
Most members of the committee of UK TUG, over the years since 1994, have con-
tributed to this FAQ to some extent. The following people, who have never been mem-
bers of the committee, have also contributed help or advice: Donald Arseneau, Bar-
bara Beeton, Karl Berry, Giuseppe Bilotta, Charles Cameron, Damian Cugley, Michael
Dewey, Michael Downes, Jean-Pierre Drucbert, Thomas Esser, Ulrike Fischer, An-
thony Goreham, Norman Gray, Eitan Gurari, William Hammond, Hartmut Henkel,
John Hobby, Morten Høgholm, Berthold Horn, Ian Hutchinson, Werner Icking, Reg-
nor Jernsletten, David Kastrup, Oleg Katsitadze, Isaac Khabaza, Ulrich Klauer, Markus
Kohm, Simon Law, Daniel Luecking, Sanjoy Mahajan, Andreas Matthias, Brooks
Moses, Iain Murray, Vilar Camara Neto, Ted Nieland, Hans Nordhaug, Pat Rau, Heiko
Oberdiek, Piet van Oostrum, Scott Pakin, Oren Patashnik, Steve Peter, Philip Ratcliffe,
José Carlos Santos, Walter Schmidt, Hans-Peter Schröcker, Joachim Schrod, Maarten
Sneep, James Szinger, Ulrik Vieth, Mike Vulis, Chris Walker, Peter Wilson, Rick Zac-
cone and Reinhard Zierke.

B The Background
1 What is TeX?
TeX is a typesetting system written by Donald E. Knuth, who says in the Preface to
his book on TeX (see books about TeX) that it is “intended for the creation of beautiful
books — and especially for books that contain a lot of mathematics”.
Knuth is Emeritus Professor of the Art of Computer Programming at Stanford Uni-
versity in California, USA. Knuth developed the first version of TeX in 1978 to deal
with revisions to his series “the Art of Computer Programming”. The idea proved pop-
ular and Knuth produced a second version (in 1982) which is the basis of what we use
today.
Knuth developed a system of ‘literate programming’ to write TeX, and he provides
the literate (WEB) source of TeX free of charge, together with tools for processing the
web source into something that can be compiled and something that can be printed;
there’s never any mystery about what TeX does. Furthermore, the WEB system pro-
vides mechanisms to port TeX to new operating systems and computers; and in order
that one may have some confidence in the ports, Knuth supplied a test by means of
which one may judge the fidelity of a TeX system. TeX and its documents are there-
fore highly portable.
TeX is a macro processor, and offers its users a powerful programming capabil-
ity. For this reason, TeX on its own is a pretty difficult beast to deal with, so Knuth
provided a package of macros for use with TeX called Plain TeX; Plain TeX is effec-
tively the minimum set of macros one can usefully employ with TeX, together with
some demonstration versions of higher-level commands (the latter are better regarded
as models than used as-is). When people say they’re “programming in TeX”, they
usually mean they’re programming in Plain TeX.
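As a purely illustrative sketch (this example is ours, not part of the FAQ's own text), a
complete Plain TeX document can be as small as the following; the \def line defines a
trivial macro of the kind the paragraph above alludes to:

    \def\hello{Hello, world}
    \hello. This sentence is typeset by plain \TeX.
    \bye

Running the tex program on such a file produces a DVI file (see “What is a DVI file?”
later in the FAQ).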
2 How should I pronounce “TeX”?
The ‘X’ is “really” the Greek letter Chi (χ, in lower case), and is pronounced by
English-speakers either a bit like the ‘ch’ in the Scots word ‘loch’ ([x] in the IPA)
or (at a pinch, if you can’t do the Greek sound) like ‘k’. It definitely is not pronounced
‘ks’ (the Greek letter with that sound doesn’t look remotely like the Latin alphabet
‘X’).
This curious usage derives from Knuth’s explanation in the TeXbook that the name
comes from the Greek word for ‘art’ or ‘craft’ (‘τέχνη’), which is the root of the
English word ‘technology’. Knuth’s logo for TeX is merely the uppercase version of
the first three (Greek) letters of the word, jiggled about a bit; we don’t use that logo
(and logos like it) in this FAQ (see Typesetting TeX-related logos).
3 What is MetaFont?
MetaFont was written by Knuth as a companion to TeX; whereas TeX defines the lay-
out of glyphs on a page, MetaFont defines the shapes of the glyphs and the relations
between them. MetaFont details the sizes of glyphs, for TeX’s benefit, and details
the rasters used to represent the glyphs, for the benefit of programs that will produce
printed output as post processes after a run of TeX.
MetaFont’s language for defining fonts permits the expression of several classes of
things: first (of course), the simple geometry of the glyphs; second, the properties of
the print engine for which the output is intended; and third, ‘meta’-information which
can distinguish different design sizes of the same font, or the difference between two
fonts that belong to the same (or related) families.
Knuth (and others) have designed a fair range of fonts using MetaFont, but font
design using MetaFont is much more of a minority skill than is TeX macro-writing.
The complete TeX-user nevertheless needs to be aware of MetaFont, and to be able to
run MetaFont to generate personal copies of new fonts.
4 What is MetaPost?
The MetaPost system (by John Hobby) implements a picture-drawing language very
much like that of MetaFont except that it outputs Encapsulated PostScript files in-
stead of run-length-encoded bitmaps. MetaPost is a powerful language for producing
figures for documents to be printed on PostScript printers, either directly or embed-
ded in (La)TeX documents. It includes facilities for directly integrating TeX text and
mathematics with the graphics. (Knuth tells us that he uses nothing but MetaPost for
diagrams in text that he is writing.)
Although PDFLaTeX cannot ordinarily handle PostScript graphics, the output of
MetaPost is sufficiently simple and regular that PDFLaTeX can handle it direct, using
code borrowed from ConTeXt — see graphics in PDFLaTeX.
Much of MetaPost’s source code was copied from MetaFont’s sources, with
Knuth’s permission.
A mailing list discussing MetaPost is available; subscribe via the TUG mailman
interface. The TUG website also hosts a MetaPost summary page.
5 How can I be sure it’s really TeX?
TeX (and MetaFont and MetaPost) are written in a ‘literate’ programming language
called Web which is designed to be portable across a wide range of computer systems.
How, then, is a new version of TeX checked?
Of course, any sensible software implementor will have his own suite of tests to
check that his software runs: those who port TeX and its friends to other platforms do
indeed perform such tests.
Knuth, however, provides a ‘conformance test’ for both TeX (trip) and Meta-
Font (trap). He characterises these as ‘torture tests’: they are designed not to check
the obvious things that ordinary typeset documents, or font designs, will exercise, but
rather to explore small alleyways off the main path through the code of TeX. They are,
to the casual reader, pretty incomprehensible!
Once an implementation of TeX has passed its trip test, or an implementation of
MetaFont has passed its trap test, then it may in principle be distributed as a working
version. (In practice, any distributor would test new versions against “real” documents
or fonts, too; trip and trap don’t actually test for real-world problems.)
6 Are TeX and friends Y2K compliant?
Crashing: None of TeX, MetaFont or MetaPost can themselves crash due to any
change whatever in the date of any sort.
Timestamps: In the original sources, a 2-digit year was stored as the creation time
for format files and that value is printed in logfiles. These items should not be of
general concern, since the only use of the date stored in the format file is to produce the log
output, and the log file is designed for human readers only.
Knuth announced in 1998 that implementors could alter this code without fear
of being accused of non-compliance. Nearly all implementations that are being
actively maintained had been modified to generate 4-digit years in the format file
and the log, by the end of 1998. The original sources themselves have now been
modified so that 4-digit year numbers are stored.
The \year primitive: Certification of a TeX implementation (see trip/trap testing)
does not require that \year return a meaningful value (which means that TeX
can, in principle, be implemented on platforms that don’t make the value of the
clock available to user programs). The TeXbook (see TeX-related books) defines
\year as “the current year of our Lord”, which is the only correct meaning for
\year for those implementations which can supply a meaningful value, which is
to say nearly all of them.
In short, TeX implementations should provide a value in \year giving the 4-digit
year Anno Domini, or the value 1776 if the platform does not support a date func-
tion.
Note that if the system itself fails to deliver a correct date to TeX, then \year will
of course return an incorrect value. TeX cannot be considered Y2K compliant, in
this sense, on a system that is not itself Y2K compliant.
Macros: TeX macros can in principle perform calculations on the basis of the value of
\year. The LaTeX suite performs such calculations in a small number of places;
the calculations performed in the current (supported) version of LaTeX are known
to be Y2K compliant.
Other macros and macro packages should be individually checked.
External software: Software such as DVI translators needs to be individually checked.

7 What is e-TeX?
While Knuth has declared that TeX will never change in any substantial way, there
remain things that one might wish had been done differently, or indeed implemented at
all.
The NTS project set out to produce an advanced replacement for TeX, to provide a
basis for developing such modifications: this “New Typesetting System” would share
Knuth’s aims, but would implement the work in a modern way taking account of the
lessons learned with TeX. While a first demonstrator NTS did appear, it wasn’t practi-
cally useful, and the project seems no longer active.
In parallel with its work on NTS itself, the project developed a set of extensions
that can be used with a (“true”) TeX system. Such a modified system is known as
an e-TeX system, and the concept has proved widely successful. Indeed, current TeX
distributions are delivered with most formats built with an e-TeX system (for those who
don’t want them, e-TeX’s extensions can be disabled, leaving a functionally standard
TeX system).
The extensions range from the simple (increasing the number of available registers
from 256 to 65536) through to extremely subtle programming support.
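As a small, hedged illustration (it assumes an e-TeX-enabled engine, which current
distributions provide by default), the following plain TeX fragment uses two of the
extensions just mentioned: the \numexpr arithmetic primitive and a count register
numbered beyond the classic limit of 255:

    \count300=\numexpr 6*7 \relax  % register 300 exists only with e-TeX
    \message{six times seven is \the\count300}
    \bye

On a strictly standard TeX the first line would be rejected; on an e-TeX system it
simply works.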
ConTeXt has required e-TeX for its operation for some time.
Some LaTeX packages already specify the use of e-TeX. Some such packages may
not work at all on a non-e-TeX system; others will work, but not as well as on an e-
TeX system. The LaTeX team has announced that future LaTeX packages (specifically
those from the team, as opposed to those individually contributed) may require e-TeX
for optimum performance.
8 What is PDFTeX?
PDFTeX has entered the main stream of TeX distributions: most LaTeX and ConTeXt
users nowadays use PDFTeX whether they know it or not (more precisely, they use an
amalgam of PDFTeX and e-TeX). So what is PDFTeX?
PDFTeX is a development of TeX that is capable of generating typeset PDF out-
put in place of DVI. PDFTeX has other capabilities, most notably in the area of fine
typographic detail (for example, its support for optimising line breaks), but its greatest
impact to date has been in the area of PDF output.
PDFTeX started as a topic for Hàn Thế Thành’s Master’s thesis, and seems first to
have been published in TUGboat 18(4), in 1997 (though it was certainly discussed at
the TUG’96 conference in Russia).
While the world was making good use of “pre-releases” of PDFTeX, Thành used
it as a test-bed for the micro-typography which was the prime subject of his Ph.D.
research. Since Thành was finally awarded his Ph.D., day-to-day maintenance and de-
velopment of PDFTeX 1.0 (and later) has been in the hands of a group of PDFTeX
maintainers (which includes Thành); the group has managed to maintain a stable plat-
form for general use.
9 What is LaTeX?
LaTeX is a TeX macro package, originally written by Leslie Lamport, that provides a
document processing system. LaTeX allows markup to describe the structure of a doc-
ument, so that the user need not think about presentation. By using document classes
and add-on packages, the same document can be produced in a variety of different
layouts.
Lamport says that LaTeX “represents a balance between functionality and ease of
use”. This shows itself as a continual conflict that leads to the need for such things as
FAQs: LaTeX can meet most user requirements, but finding out how is often tricky.
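A minimal sketch (illustrative only, not taken from the FAQ) of the structural markup
that the answer describes:

    \documentclass{article}
    \begin{document}
    \section{Introduction}
    The markup names the \emph{structure} of the text (sections,
    emphasis, lists, and so on); the document class and any add-on
    packages decide how that structure appears on the page.
    \end{document}

Changing \documentclass{article} to another class re-lays the same text in a different
design, without touching the body of the document.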
10 What is LaTeX 2ε ?
Lamport’s last version of LaTeX (LaTeX 2.09, last updated in 1992) was superseded
in 1994 by a new version (LaTeX 2ε) provided by the LaTeX team.
LaTeX 2ε is now the only readily-available version of LaTeX, and draws together sev-
eral threads of LaTeX development from the later days of LaTeX 2.09.
LaTeX 2ε has several enhancements over LaTeX 2.09, but they were all rather mi-
nor, with a view to continuity and stability rather than the “big push” that some had
expected from the team. LaTeX 2ε continues to this day to offer a compatibility mode
in which most files prepared for use with LaTeX 2.09 will run (albeit with somewhat
reduced performance, and subject to bitter complaints in the log file). Differences
between LaTeX 2ε and LaTeX 2.09 are outlined in a series of ‘guide’ files that are
available in every LaTeX distribution (the same directory also contains “news” about
each new release of LaTeX 2ε ).
LaTeX guides and news: macros/latex/doc
11 How should I pronounce “LaTeX(2e)”?
Lamport never recommended how one should pronounce LaTeX, but a lot of people
pronounce it ‘Lay TeX’ or perhaps ‘Lah TeX’ (with TeX pronounced as the program
itself; see the rules for TeX). It is definitely not to be pronounced in the same way as
the rubber-tree gum.
The ‘epsilon’ in ‘LaTeX 2ε ’ is supposed to be suggestive of a small improvement
over the old LaTeX 2.09. Nevertheless, most people pronounce the name as ‘LaTeX-
two-ee’.
12 Should I use Plain TeX or LaTeX?
There’s no straightforward answer to this question. Many people swear by Plain TeX,
and produce highly respectable documents using it (Knuth is an example of this, of
course). But equally, many people are happy to let someone else take the design deci-
sions for them, accepting a small loss of flexibility in exchange for a saving of brain
power.
The arguments around this topic can provoke huge amounts of noise and heat, with-
out offering much by way of light; your best bet is to find out what those around you
are using, and to go with the crowd. Later on, you can always switch your allegiance;
don’t bother about it.
If you are preparing a manuscript for a publisher or journal, ask them what markup
they want before you develop your own; many big publishers have developed their
own (La)TeX styles for journals and books, and insist that authors stick closely to their
markup.
13 How does LaTeX relate to Plain TeX?
LaTeX is a program written in the programming language TeX. (In the same sense, any
LaTeX document is also a program, which is designed to run ‘alongside’, or ‘inside’
LaTeX, whichever metaphor you prefer.)
Plain TeX is also a program written in the programming language TeX.
Both exist because writing your documents in ‘raw’ TeX would involve much rein-
venting of wheels for every document. They both serve as convenient aids to make
document writing more pleasant: LaTeX is a far more extensive aid.
LaTeX is close to being a superset of Plain TeX. Many documents designed for
Plain TeX will work with LaTeX with no more than minor modifications (though some
will require substantial work).
Interpretation of any (La)TeX program involves some data-driven elements, and
LaTeX has a wider range of such elements than does Plain TeX. As a result, the map-
ping from LaTeX to Plain TeX is far less clear than that in the other direction.
14 What is ConTeXt?
ConTeXt is a macro package developed by Hans Hagen, originally to serve the needs
of his (Dutch) firm Pragma. It was designed with the same general-purpose aims as
LaTeX, but (being younger) reflects much more recent thinking about the structure of
markup, etc. In particular, ConTeXt can customise its markup to an author’s language
(customising modules for Dutch, English and German are provided, at present).
ConTeXt is well integrated, in all of its structure, with the needs of hypertext
markup, and in particular with the facilities offered by PDFTeX. The default instal-
lation employs a version of PDFTeX built with the e-TeX extensions, which allows
further flexibility.
ConTeXt doesn’t yet have quite such a large developer community as does LaTeX,
but those developers who are active seem to have prodigious energy. Some support is
available via a WIKI site.
ConTeXt distribution: macros/context
15 What are the AMS packages (AMSTeX, etc.)?
AMSTeX is a TeX macro package, originally written by Michael Spivak for the Ameri-
can Mathematical Society (AMS) during 1983–1985 and is described in the book “The
Joy of TeX”. It is based on Plain TeX, and provides many features for
producing more professional-looking maths formulas with less burden on authors. It
pays attention to the finer details of sizing and positioning that mathematical publishers
care about. The aspects covered include multi-line displayed equations, equation num-
bering, ellipsis dots, matrices, double accents, multi-line subscripts, syntax checking
(faster processing on initial error-checking TeX runs), and other things.
As LaTeX increased in popularity, authors asked to submit papers to the AMS in
LaTeX, and so the AMS developed AMSLaTeX, which is a collection of LaTeX pack-
ages and classes that offer authors most of the functionality of AMSTeX. The AMS
no longer recommends the use of AMSTeX, and urges its users to use AMSLaTeX
instead.
AMSTeX distribution: macros/amstex
AMSLaTeX distribution: macros/latex/required/amslatex
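As a hedged illustration of the sort of facility AMSLaTeX offers (this example is ours,
not the FAQ's), the amsmath package from the collection handles aligned, automatically
numbered multi-line displays of the kind mentioned above:

    \documentclass{article}
    \usepackage{amsmath}
    \begin{document}
    \begin{align}
      (a+b)^2 &= a^2 + 2ab + b^2 \\
      (a-b)^2 &= a^2 - 2ab + b^2
    \end{align}
    \end{document}

Each line of the align environment receives an equation number; the starred form,
align*, suppresses the numbers.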
16 What is Eplain?
The Eplain macro package expands on and extends the definitions in Plain TeX. Eplain
is not intended to provide “generic typesetting capabilities”, as do LaTeX or Texinfo.
Instead, it defines macro tools that should be useful whatever commands you choose to
use when you prepare your manuscript.
For example, Eplain does not have a command \section, which would format sec-
tion headings in an “appropriate” way, as LaTeX’s \section does. The philosophy of
Eplain is that some people will always need or want to go beyond the macro designer’s
idea of “appropriate”. Such canned macros are fine — as long as you are willing to
accept the resulting output. If you don’t like the results, or if you are trying to match a
different format, you are out of luck.
On the other hand, almost everyone would like capabilities such as cross-referencing
by labels, so that you don’t have to put actual page numbers in the manuscript. The
authors of Eplain are not aware of any generally available macro packages that do not
force their typographic style on an author, and yet provide such capabilities.
Another useful feature of Eplain is the ability to create PDF files with hyper-
links. The cross-referencing macros can implicitly generate hyperlinks for the cross-
references, but you can also create explicit hyperlinks, both internal (pointing to a des-
tination within the current document) and external (pointing to another local document
or a URL).
Several LaTeX packages provide capabilities which Plain TeX users are lacking,
most notably text colouring and rotation provided by the graphics bundle (packages
color and graphics). Although the graphics bundle provides a Plain TeX “loader” for
some of the packages, it is not a trivial job to pass options to those packages under
Plain TeX, and much of the functionality of the packages is accessed through package
options. Eplain extends the loader so that options can be passed to the packages just as
they are in LaTeX. The following packages are known to work with Eplain: graphics,
graphicx, color, autopict (LaTeX picture environment), psfrag, and url.
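The following sketch shows the kind of thing the previous paragraph describes; it
assumes Eplain's documented wrapper for loading LaTeX packages with options
(\beginpackages . . . \endpackages), so consult the Eplain manual for the exact
commands and options supported by your version:

    \input eplain
    \beginpackages
      \usepackage[dvipsnames]{color}
    \endpackages
    Some text, then {\color{BrickRed}some coloured text}, in an
    otherwise plain-TeX document.
    \bye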
Eplain documentation is available online, and there’s also a mailing list — sign up,
or browse the list archives, via http://tug.org/mailman/listinfo/tex-eplain
Eplain distribution: macros/eplain
17 What is Lollipop?
Lollipop is a macro package written by Victor Eijkhout; it was used in the production
of his book “TeX by Topic” (see (La)TeX Tutorials). The manual says of it:

Lollipop is ‘TeX made easy’. Lollipop is a macro package that functions as a


toolbox for writing TeX macros. It was my intention to make macro writing
so easy that implementing a fully new layout in TeX would become a matter
of less than an hour for an average document, and that it would be a task that
could be accomplished by someone with only a very basic training in TeX
programming.

Lollipop is an attempt to make structured text formatting available for envi-
ronments where previously only WYSIWYG packages could be used because
adapting the layout is so much more easy with them than with traditional TeX
macro packages.

The manual goes on to talk of ambitions to “capture some of the LaTeX market
share”; it’s a very witty package, but little sign of it taking over from LaTeX is de-
tectable. . . An article about Lollipop appeared in TUGboat 13(3).
Lollipop distribution: nonfree/macros/lollipop
18 What is Texinfo?
Texinfo is a documentation system that uses one source file to produce both on-line
information and printed output. So instead of writing two different documents, one
for the on-line help and the other for a typeset manual, you need write only one doc-
ument source file. When the work is revised, you need only revise one document. By
convention, Texinfo source file names end with a .texi or .texinfo extension.
Texinfo is a macro language, somewhat similar to LaTeX, but with slightly less
expressive power. Its appearance is of course rather similar to any TeX-based macro
language, except that its macros start with @ rather than the \ that’s more commonly
used in TeX systems.
You can write and format Texinfo files into Info files within GNU emacs, and read
them using the emacs Info reader. You can also format Texinfo files into Info files using
makeinfo and read them using info, so you’re not dependent on emacs. The distribution
includes a Perl script, texi2html, that will convert Texinfo sources into HTML: the
language is a far better fit to HTML than is LaTeX, so that the breast-beating agonies
of converting LaTeX to HTML are largely avoided.
Finally, of course, you can also print the files, or convert them to PDF using PDF-
TeX.
Texinfo distribution: macros/texinfo/texinfo
19 If TeX is so good, how come it’s free?
It’s free because Knuth chose to make it so. He is nevertheless apparently happy that
others should earn money by selling TeX-based services and products. While several
valuable TeX-related tools and packages are offered subject to restrictions imposed by
the GNU General Public Licence (‘Copyleft’), TeX itself is not subject to Copyleft.
There are commercial versions of TeX available; for some users, it’s reassuring
to have paid support. What is more, some of the commercial implementations have
features that are not available in free versions. (The reverse is also true: some free
implementations have features not available commercially.)
This FAQ concentrates on ‘free’ distributions of TeX, but we do at least list the
major vendors.
20 What is the future of TeX?
Knuth has declared that he will do no further development of TeX; he will continue to
fix any bugs that are reported to him (though bugs are rare). This decision was made
soon after TeX version 3.0 was released; at each bug-fix release the version number
acquires one more digit, so that it tends to the limit π (at the time of writing, Knuth’s
latest release is version 3.141592). Knuth wants TeX to be frozen at version π when
he dies; thereafter, no further changes may be made to Knuth’s source. (A similar rule
is applied to MetaFont; its version number tends to the limit e, and currently stands at
2.71828.)
Knuth explains his decision, and exhorts us all to respect it, in a paper originally
published in TUGboat 11(4) (and reprinted in the NTG journal MAPS).
There are projects (some of them long-term projects: see, for example, the LaTeX3
project) to build substantial new macro packages based on TeX. For the even longer
term, there are various projects to build a successor to TeX; see, for example, the
Omega/Aleph and ExTeX projects.
21 Reading (La)TeX files
So you’ve been sent a TeX file: what are you going to do with it?

You can, of course, “just read it”, since it’s a plain text file, but the markup tags in
the document may prove distracting. Most of the time, even TeX experts will typeset a
TeX file before attempting to read it.
So, you need to typeset the document. The good news is that TeX systems are
available, free, for most sorts of computer; the bad news is that you need a pretty
complete TeX system even to read a single file, and complete TeX systems are pretty
large.
TeX is a typesetting system that arose from a publishing project (see what is TeX),
and its basic source is available free from its author. However, at its root, it is just a
typesetting engine: even to view or to print the typeset output, you will need ancillary
programs. In short, you need a TeX distribution — a collection of TeX-related
programs tailored to your operating system: for details of the sorts of things that are
available, see TeX distributions or commercial TeX distributions (for commercial dis-
tributions).
But beware — TeX makes no attempt to look like the sort of WYSIWYG system
you’re probably used to (see why is TeX not WYSIWYG): while many modern versions
of TeX have a compile–view cycle that rivals the best commercial word processors in
its responsiveness, what you type is usually markup, which typically defines a logical
(rather than a visual) view of what you want typeset.
However, in this context markup proves to be a blessing in disguise: a good pro-
portion of most TeX documents is immediately readable in an ordinary text editor. So,
while you need to install a considerable system to attain the full benefits of the TeX
document that you were sent, the chances are you can understand quite a bit of it with
nothing more than the ordinary tools you already have on your computer.
22 Why is TeX not a WYSIWYG system?
WYSIWYG is a marketing term (“What you see is what you get”) for a particular style
of text processor. WYSIWYG systems are characterised by two principal claims: that
you type what you want to print, and that what you see on the screen as you type is a
close approximation to how your text will finally be printed.
The simple answer to the question is, of course, that TeX was conceived long be-
fore the marketing term, at a time when the marketing imperative wasn’t perceived as
significant. However, that was a long time ago: why has nothing been done with the
“wonder text processor” to make it fit with modern perceptions?
There are two answers to this. First, the simple “things have been done” (but
they’ve not taken over the TeX world); and second, “there are philosophical reasons
why the way TeX has developed is ill-suited to the WYSIWYG style”. Indeed, there is
a fundamental problem with applying WYSIWYG techniques to TeX: the complexity of
TeX makes it hard to get the equivalent of TeX’s output without actually running TeX.
A celebrated early system offering “WYSIWYG using TeX” came from the VorTeX
project: a pair of (early) Sun workstations worked in tandem, one handling the user in-
terface while the other beavered away in the background typesetting the result. VorTeX
was quite impressive for its time, but the two workstations combined had hugely less
power than the average sub-thousand dollar Personal Computer nowadays, and its code
has not proved portable (it never even made the last ‘great’ TeX version change, at the
turn of the 1990s, to TeX version 3). Modern systems that are similar in their approach
are Lightning Textures (an extension of Blue Sky’s original TeX system for the Macin-
tosh), and Scientific Word (which can also cooperate with a computer algebra system);
both these systems are commercially available.
The issue has of recent years started to attract attention from TeX developers, and
several interesting projects addressing the “TeX document preparation environment”
are in progress.
Nevertheless, the TeX world has taken a long time to latch onto the idea of WYSIWYG.
Apart from simple arrogance (“we’re better, and have no need to consider the
petty doings of the commercial word processor market”), there is a real conceptual
difference between the word processor model of the world and the model LaTeX and
ConTeXt employ — the idea of “markup”. “Pure” markup expresses a logical model
of a document, where every object within the document is labelled according to what
it is rather than how it should appear: appearance is deduced from the properties of
the type of object. Properly applied, markup can provide valuable assistance when it
comes to re-use of documents.

Established WYSIWYG systems find the expression of this sort of structured markup
difficult; however, markup is starting to appear in the lists of the commercial world’s
requirements, for two reasons. First, an element of markup helps impose style on a
document, and commercial users are increasingly obsessed with uniformity of style;
and second, the increasingly pervasive use of XML-derived document archival formats
demands it. The same challenges must needs be addressed by TeX-based document
preparation support schemes, so we are observing a degree of confluence of the needs
of the two communities: interesting times may be ahead of us.
23 TeX User Groups
There has been a TeX User Group since very near the time TeX first appeared. That
first group, TUG, is still active and its journal TUGboat continues in regular publica-
tion with articles about TeX, MetaFont and related technologies, and about document
design, processing and production. TUG holds a yearly conference, whose proceedings
are published in TUGboat.
TUG’s web site is a valuable resource for all sorts of TeX-related matters, such as
details of TeX software, and lists of TeX vendors and TeX consultants. Back articles
from TUGboat are slowly (subject to copyright issues, etc.) making their way to the
site, too.
Some time ago, TUG established a “technical council”, whose task was to oversee
the development of TeXnical projects. Most such projects nowadays go on their way
without any support from TUG, but TUG’s web site lists its Technical Working Groups
(TWGs).
TUG has a reasonable claim to be considered a world-wide organisation, but there
are many national and regional user groups, too; TUG’s web site maintains a list of
“Local User Groups” (LUGs).
Contact TUG itself via:

TeX Users Group


1466 NW Front Avenue, Suite 3141
Portland, OR 97209
USA
Tel: +1 503-223-9994
Fax: +1 503-223-3960
Email: [email protected]
Web: http://www.tug.org/

C Documentation and Help


24 Books on TeX and its relations
While Knuth’s book is the definitive reference for TeX, there are other books covering
TeX:

The TeXbook by Donald Knuth (Addison-Wesley, 1984, ISBN 0-201-13447-0, paper-


back ISBN 0-201-13448-9)
A Beginner’s Book of TeX by Raymond Seroul and Silvio Levy, (Springer Verlag,
1992, ISBN 0-387-97562-4)
TeX by Example: A Beginner’s Guide by Arvind Borde (Academic Press, 1992, ISBN 0-
12-117650-9 — now out of print)
Introduction to TeX by Norbert Schwarz (Addison-Wesley, 1989, ISBN 0-201-51141-
X — now out of print)
A Plain TeX Primer by Malcolm Clark (Oxford University Press, 1993, ISBNs 0-198-
53724-7 (hardback) and 0-198-53784-0 (paperback))
A TeX Primer for Scientists by Stanley Sawyer and Steven Krantz (CRC Press, 1994,
ISBN 0-849-37159-7)
TeX by Topic by Victor Eijkhout (Addison-Wesley, 1992, ISBN 0-201-56882-9 —
now out of print, but see online books)
TeX for the Beginner by Wynter Snow (Addison-Wesley, 1992, ISBN 0-201-54799-6)

TeX for the Impatient by Paul W. Abrahams, Karl Berry and Kathryn A. Hargreaves
(Addison-Wesley, 1990, ISBN 0-201-51375-7 — now out of print, but see online
books)
TeX in Practice by Stephan von Bechtolsheim (Springer Verlag, 1993, 4 volumes,
ISBN 3-540-97296-X for the set, or Vol. 1: ISBN 0-387-97595-0, Vol. 2: ISBN 0-
387-97596-9, Vol. 3: ISBN 0-387-97597-7, and Vol. 4: ISBN 0-387-97598-5)
TeX: Starting from 1 (that is, ‘Starting from Square One’) by Michael Doob (Springer
Verlag, 1993, ISBN 3-540-56441-1 — now out of print)
The Joy of TeX by Michael D. Spivak (second edition, AMS, 1990, ISBN 0-821-
82997-1)
The Advanced TeXbook by David Salomon (Springer Verlag, 1995, ISBN 0-387-
94556-3)

A collection of Knuth’s publications about typography is also available:

Digital Typography by Donald Knuth (CSLI and Cambridge University Press, 1999,
ISBN 1-57586-011-2, paperback ISBN 1-57586-010-4).

and in late 2000, a “Millennium Boxed Set” of all 5 volumes of Knuth’s “Computers
and Typesetting” series (about TeX and MetaFont) was published by Addison Wesley:

Computers & Typesetting, Volumes A–E Boxed Set by Donald Knuth (Addison-Wesley,
2001, ISBN 0-201-73416-8).

For LaTeX, see:

LaTeX, a Document Preparation System by Leslie Lamport (second edition, Addison


Wesley, 1994, ISBN 0-201-52983-1)
Guide to LaTeX Helmut Kopka and Patrick W. Daly (fourth edition, Addison-Wesley,
2004, ISBN 0-321-17385-6)
The LaTeX Companion by Frank Mittelbach, Michel Goossens, Johannes Braams,
David Carlisle and Chris Rowley (second edition, Addison-Wesley, 2004, ISBN 0-
201-36299-6)
The LaTeX Graphics Companion: Illustrating documents with TeX and PostScript by
Michel Goossens, Sebastian Rahtz and Frank Mittelbach (Addison-Wesley, 1997,
ISBN 0-201-85469-4)
The LaTeX Web Companion: Integrating TeX, HTML and XML by Michel Goossens
and Sebastian Rahtz (Addison-Wesley, 1999, ISBN 0-201-43311-7)
TeX Unbound: LaTeX and TeX strategies for fonts, graphics, and more by Alan Hoenig
(Oxford University Press, 1998, ISBN 0-19-509685-1 hardback, ISBN 0-19-
509686-X paperback)
Math into LaTeX: An Introduction to LaTeX and AMSLaTeX by George Grätzer (third
edition Birkhäuser and Springer Verlag, 2000, ISBN 0-8176-4431-9, ISBN 3-
7643-4131-9)
Digital Typography Using LaTeX Incorporating some multilingual aspects, and use
of Omega, by Apostolos Syropoulos, Antonis Tsolomitis and Nick Sofroniou
(Springer, 2003, ISBN 0-387-95217-9).
First Steps in LaTeX by George Grätzer (Birkhäuser, 1999, ISBN 0-8176-4132-7)
LaTeX: Line by Line: Tips and Techniques for Document Processing by Antoni Diller
(second edition, John Wiley & Sons, 1999, ISBN 0-471-97918-X)
LaTeX for Linux: A Vade Mecum by Bernice Sacks Lipkin (Springer-Verlag, 1999,
ISBN 0-387-98708-8, second printing); a list of errata for the first printing is
available from: http://www.springer-ny.com/catalog/np/jan99np/0-387-98708-8.html


A sample of George Grätzer’s “Math into LaTeX”, in Adobe Acrobat format, and ex-
ample files for the three LaTeX Companions, and for Grätzer’s “First Steps in LaTeX”,
are all available on CTAN.
There’s a nicely-presented list of “recommended books” to be had on the web:
http://www.macrotex.net/texbooks/
The list of MetaFont books is rather short:
The MetaFontbook by Donald Knuth (Addison Wesley, 1986, ISBN 0-201-13445-4,
ISBN 0-201-52983-1 paperback)
Alan Hoenig’s ‘TeX Unbound ’ includes some discussion and examples of using Meta-
Font.
A book covering a wide range of topics (including installation and maintenance) is:
Making TeX Work by Norman Walsh (O’Reilly and Associates, Inc, 1994, ISBN 1-
56592-051-1)
The book is decidedly dated, and is now out of print, but a copy is available via
sourceforge and on CTAN, and we list it under “online books”.
This list only covers books in English: notices of new books, or warnings that
books are now out of print are always welcome. However, this FAQ does not carry
reviews of current published material.
Examples for First Steps in LaTeX: info/examples/FirstSteps
Examples for LaTeX Companion: info/examples/tlc2
Examples for LaTeX Graphics Companion: info/examples/lgc
Examples for LaTeX Web Companion: info/examples/lwc
Sample of Math into LaTeX: info/mil/mil.pdf
25 Books on Type
The following is a partial listing of books on typography in general. Of these,
Bringhurst seems to be the one most often recommended.
The Elements of Typographic Style by Robert Bringhurst (Hartley & Marks, 1992,
ISBN 0-88179-033-8)
Finer Points in the Spacing & Arrangement of Type by Geoffrey Dowding (Hartley &
Marks, 1996, ISBN 0-88179-119-9)
The Thames & Hudson Manual of Typography by Ruari McLean (Thames & Hudson,
1980, ISBN 0-500-68022-1)
The Form of the Book by Jan Tschichold (Lund Humphries, 1991, ISBN 0-85331-623-
6)
Type & Layout by Colin Wheildon (Strathmore Press, 2006, ISBN 1-875750-22-3)
The Design of Books by Adrian Wilson (Chronicle Books, 1993, ISBN 0-8118-0304-
X)
Optical Letter Spacing by David Kindersley and Lida Cardozo Kindersley (The Car-
dozo Kindersley Workshop 2001, ISBN 1-874426-139)
There are many catalogues of type specimens but the following books provide a
more interesting overall view of types in general and some of their history.
Alphabets Old & New by Lewis F. Day (Senate, 1995, ISBN 1-85958-160-9)
An Introduction to the History of Printing Types by Geoffrey Dowding (British Li-
brary, 1998, UK ISBN 0-7123-4563-9; USA ISBN 1-884718-44-2)
The Alphabet Abecedarium by Richard A. Firmage (David R. Godine, 1993, ISBN 0-
87923-998-0)
The Alphabet and Elements of Lettering by Frederick Goudy (Dover, 1963, ISBN 0-
486-20792-7)
Anatomy of a Typeface by Alexander Lawson (David R. Godine, 1990, ISBN 0-
87923-338-8)
A Tally of Types by Stanley Morison (David R. Godine, 1999, ISBN 1-56792-004-7)
Counterpunch by Fred Smeijers (Hyphen, 1996, ISBN 0-907259-06-5)
Treasury of Alphabets and Lettering by Jan Tschichold (W. W. Norton, 1992, ISBN 0-
393-70197-2)
A Short History of the Printed Word by Warren Chappell and Robert Bringhurst (Hart-
ley & Marks, 1999, ISBN 0-88179-154-7)
The above lists are limited to books published in English. Typographic styles are
somewhat language-dependent, and similarly the ‘interesting’ fonts depend on the par-
ticular writing system involved.
26 Where to find FAQs
Bobby Bodenheimer’s article, from which this FAQ was developed, used to be posted
(nominally monthly) to newsgroup comp.text.tex. The (long obsolete) last posted
copy of that article is kept on CTAN for auld lang syne.
A version of the present FAQ may be browsed via the World-Wide Web, and its
sources are available from CTAN.
This FAQ and others are regularly mentioned, on comp.text.tex and elsewhere,
in a “pointer FAQ” which is also saved at http://tug.org/tex-ptr-faq
A 2006 innovation from Scott Pakin is the “visual” LaTeX FAQ. This is a document
with (mostly rubbish) text formatted so as to highlight things we discuss here, and pro-
viding Acrobat hyper-links to the relevant answers in this FAQ on the Web. The visual
FAQ is provided in PDF format, on CTAN; it works best using Adobe Acrobat Reader
7 (or later); some features are missing with other readers, or with earlier versions of
Acrobat Reader.
Another excellent information source, available in English, is the (La)TeX naviga-
tor.
Both the Francophone TeX usergroup Gutenberg and the Czech/Slovak usergroup
CS-TUG have published translations of this FAQ, with extensions appropriate to their
languages.
Herbert Voß’s LaTeX tips and tricks provides excellent advice on most
topics one might imagine (though it’s not strictly a FAQ) — highly recommended for
most ordinary mortals’ use.
The Open Directory Project (ODP) maintains a list of sources of (La)TeX help,
including FAQs. View the TeX area at http://dmoz.org/Computers/Software/
Typesetting/TeX/
Other non-English FAQs are available (off-CTAN):
German Posted regularly to de.comp.tex, and archived on CTAN; the FAQ also ap-
pears at http://www.dante.de/faq/de-tex-faq/
French An interactive (full-screen!) FAQ may be found at http://www.frenchpro6.
com/screen.pdf/FAQscreen.pdf, and a copy for printing at http://frenchle.
free.fr/FAQ.pdf; A FAQ used to be posted regularly to fr.comp.text.tex,
and is archived on CTAN — sadly, that effort seems to have fallen by the wayside.
Spanish See http://apolo.us.es/CervanTeX/FAQ/
Czech See http://www.fi.muni.cz/cstug/csfaq/
Resources available on CTAN are:
Dante FAQ: help/de-tex-faq
French FAQ: help/LaTeX-FAQ-francaise
Sources of this FAQ: help/uk-tex-faq
Obsolete comp.text.tex FAQ: obsolete/help/TeX,_LaTeX,_etc.:
_Frequently_Asked_Questions_with_Answers
The visual FAQ: info/visualFAQ/visualFAQ.pdf
27 How to get help
So you’ve looked at all relevant FAQs you can find, you’ve looked in any books you
have, and scanned relevant tutorials. . . and still you don’t know what to do.
If you are seeking a particular package or program, look on your own system first:
you might already have it — the better TeX distributions contain a wide range of sup-
porting material. The CTAN Catalogue can also help: you can search it, or you can
browse it “by topic” (see http://www.tex.ac.uk/tex-archive/help/Catalogue/
bytopic.html). A catalogue entry has a description of the package, and links to any
documentation known on the net . . . when the entry is up-to-date.
Failing all that, look to see if anyone else has had the problem before; there are two
places where people ask: browse or search the newsgroup comp.text.tex via Google
groups, and the mailing list texhax via its archive.
If those “back question” searches fail, you’ll have to ask the world at large. To ask
a question on comp.text.tex, you can use your own news client (if you have one),
or use the “start a new topic” button on http://groups.google.com/group/comp.
text.tex. To ask a question on texhax, you may simply send mail to texhax@tug.
org, but it’s probably better to subscribe to the list (via http://tug.org/mailman/
listinfo/texhax) first — not everyone who answers will copy their reply to you as well as to the list.

28 Specialist mailing lists


The previous question, “getting help”, talked of the two major forums in which
(La)TeX, MetaFont and MetaPost are discussed; however, these aren’t the only ones
available.
The TUG web site offers many mailing lists other than just texhax via its mail list
management page.
The French national TeX user group, Gutenberg, offers a MetaFont (and, de facto,
MetaPost) mailing list, [email protected]: subscribe to it by sending a message

subscribe metafont

to [email protected]
Note that there’s also a MetaPost-specific mailing list available via the TUG lists;
in fact there’s little danger of becoming confused by subscribing to both.
Announcements of TeX-related installations on the CTAN archives are sent to
the mailing list ctan-ann. Subscribe to the list via its MailMan web-site https:
//lists.dante.de/mailman/listinfo/ctan-ann; list archives are available at
the same address. The list archives may also be browsed via http://www.mail-
archive.com/[email protected]/, and an RSS feed is also available: http:
//www.mail-archive.com/[email protected]/maillist.xml

29 How to ask a question


You want help from the community at large; you’ve decided where you’re going to ask
your question, but how do you phrase it?
Excellent “general” advice (how to ask questions of anyone) is contained in Eric
Raymond’s article on the topic. Eric’s an extremely self-confident person, and this
comes through in his advice; but his guidelines are very good, even for us in the un-
self-confident majority. It’s important to remember that you don’t have a right to advice
from the world, but that if you express yourself well, you will usually find someone
who will be pleased to help.
So how do you express yourself in the (La)TeX world? There aren’t any compre-
hensive rules, but a few guidelines may help in the application of your own common
sense.

• Make sure you’re asking the right people. Don’t ask in a TeX forum about printer
device drivers for the Foobar operating system. Yes, TeX users need printers, but
no, TeX users will typically not be Foobar systems managers.
Similarly, avoid posing a question in a language that the majority of the group
don’t use: post in Ruritanian to de.comp.text.tex and you may have a long wait
before a German- and Ruritanian-speaking TeX expert notices your question.
• If your question is (or may be) TeX-system-specific, report what system you’re
using, or intend to use: “I can’t install TeX” is as good as useless, whereas “I’m
trying to install the mumbleTeX distribution on the Grobble operating system”
gives all the context a potential respondent might need. Another common situ-
ation where this information is important is when you’re having trouble installing
something new in your system: “I want to add the glugtheory package to my
mumbleTeX v12.0 distribution on the Grobble 2024 operating system”.
• If you need to know how to do something, make clear what your environment is:
“I want to do x in Plain TeX”, or “I want to do y in LaTeX running the boggle
class”. If you thought you knew how, but your attempts are failing, tell us what
you’ve tried: “I’ve already tried installing the elephant in the minicar directory,
and it didn’t work, even after refreshing the filename database”.
• If something’s going wrong within (La)TeX, pretend you’re submitting a LaTeX
bug report, and try to generate a minimum failing example. If your example needs
your local xyzthesis class, or some other resource not generally available, be sure
to include a pointer to how the resource can be obtained.
• Be as succinct as possible. Your helpers don’t usually need to know why you’re
doing something, just what you’re doing and where the problem is.

30 How to make a “minimum example”


Our advice on asking questions suggests that you prepare a “minimum example” (also
commonly known as a “minimal example”) of failing behaviour, as a sample to post
with your question. If you have a problem in a two hundred page document, it may be
unclear how to proceed from this problem to a succinct demonstration of your problem.
There are two valid approaches to this task: building up, and hacking down.
The “building up” process starts with a basic document structure (for LaTeX,
this would have \documentclass, \begin{document}, \end{document}) and adds
things. First to add is a paragraph or so around the actual point where the problem
occurs. (It may prove difficult to find the actual line that’s provoking the problem. If
the original problem is an error, reviewing the structure of TeX errors may help.)
Note that there are things that can go wrong in one part of the document as a result
of something in another part: the commonest is problems in the table of contents (from
something in a section title, or whatever), or the list of ⟨something⟩ (from something in
a \caption). In such a case, include the section title or caption (the caption probably
needs the figure or table environment around it, but it doesn’t need the figure or
table itself).
If this file you’ve built up collapses already, then you’re done. Otherwise, try adding
packages; the optimum is a file with only one package in it, but you may find that the
guilty package only works at all with another package loaded (or only fails in the
context of another package).
The “hacking down” route starts with your entire document, and removes bits until
the file no longer fails (and then of course restores the last thing removed). Don’t
forget to hack out any unnecessary packages, but mostly, the difficulty is choosing
what to hack out of the body of the document; this is the mirror of the problem above,
in the “building up” route.
If you’ve added a package (or more than one), add \listfiles to the preamble
too: that way, LaTeX will produce a list of the packages you’ve used and their version
numbers. This information may be useful evidence for people trying to help you.
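
By way of illustration, a built-up minimum example often ends up looking something like this (the package name somepackage is purely illustrative — substitute whatever you suspect of causing the trouble):

\documentclass{article}
\listfiles                  % record the packages used, and their versions
\usepackage{somepackage}    % the suspect package, if any
\begin{document}
A paragraph or two surrounding the point at which the problem shows up.
\end{document}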
What if your document simply doesn’t fail in any of these cut-down scenarios?
Whatever you do, don’t post the whole of the document: if you can, it may be useful to
make a copy available on the web somewhere: people will probably understand if it’s
impossible . . . or inadvisable, in the case of something confidential.
If the whole document is necessary, it could be that it’s an overflow of some sort;
the best you can do is to post the code “around” the error, and (of course) the full text
of the error.
It may seem that all this work is rather excessive for preparing a simple post. There
are two responses to that, both based on the relative inefficiency of asking a question
on the internet.
First, preparing a minimum document very often leads you to the answer, without
all the fuss of posting and looking for responses.
Second, your prime aim is to get an answer as quickly as possible; a well-prepared
example stands a good chance of attracting an answer “in a single pass”: if the person
replying to your post finds she needs more information, you have to find that request,
post again, and wait for your benefactor to produce a second response.
All things considered, a good example file can save you a day, for perhaps half an
hour’s effort invested.
The above advice, differently phrased, may also be read on the web at http://
www.latex-einfuehrung.de/mini-en.html; source of that article may be found at
http://www.latex-einfuehrung.de/, in both German and English.

31 (La)TeX Tutorials, etc.
From a situation where every (La)TeX user had to buy a book if she was not to find
herself groping blindly along, we now have what almost amounts to an embarrassment
of riches of online documentation. The following answers attempt to create lists of
sources “by topic”.
First we have introductory manuals, for Plain TeX and LaTeX.
Next comes a selection of “specialised” (La)TeX tutorials, each of which concen-
trates on a single LaTeX topic.
A couple of reference documents (it would be good to have more) are listed in an
answer of their own.
Next comes the (rather new) field of TeX-related WIKIs.
A rather short list gives us a Typography style tutorial.
Finally, we have a set of links to Directories of (La)TeX information, and details of
some “books” that were once published conventionally, but are now available on-line.
32 Online introductions: Plain TeX
Michael Doob’s splendid ‘Gentle Introduction’ to Plain TeX (available on CTAN) has
been stable for a very long time.
Another recommendable document is D. R. Wilkins ‘Getting started with TeX’,
available on the web at http://www.ntg.nl/doc/wilkins/pllong.pdf
Gentle Introduction: info/gentle/gentle.pdf
33 Online introductions: LaTeX
Tobias Oetiker’s ‘(Not so) Short Introduction to LaTeX 2ε ’, is regularly updated, as
people suggest better ways of explaining things, etc. The introduction is available on
CTAN, together with translations into a rather large set of languages.
Peter Flynn’s “Beginner’s LaTeX” (which also started as course material) is a pleas-
ing read. A complete copy may be found on CTAN, but it may also be browsed over
the web (http://www.tex.ac.uk/tex-archive/info/beginlatex/html/).
Harvey Greenberg’s ‘Simplified Introduction to LaTeX’ was written for a lecture
course, and is also available on CTAN (in PostScript only, unfortunately).
Edith Hodgen’s LaTeX, a Braindump starts you from the ground up — giving a
basic tutorial in the use of Linux to get you going (rather a large file. . . ). Its parent site,
David Friggens’ documentation page is a useful collection of links in itself.
Andy Roberts’ introductory material is a pleasing short introduction to the use of
(La)TeX; some of the slides for actual tutorials are to be found on the page, as well.
Chris Harrison’s TeX book presents basic LaTeX with useful hints for extensions.
Nicola Talbot’s LaTeX for complete novices does what it claims: the author teaches
LaTeX at the University of East Anglia.
Nicola Talbot also provides a set of introductory tutorials, which include exercises
(with solutions). The page was developed as an extension to the LaTeX course Nicola
teaches at the University of East Anglia.
An interesting (and practical) tutorial about what not to do is l2tabu, or “A list of
sins of LaTeX 2ε users” by Mark Trettin, translated into English by Jürgen Fenn. The
tutorial is available from CTAN as a PDF file (though the source is also available).
Beginner’s LaTeX: info/beginlatex/beginlatex-3.6.pdf
Not so Short Introduction: info/lshort/english/lshort.pdf (in English, you
may browse for sources and other language versions at info/lshort)
Simplified LaTeX: info/simplified-latex/simplified-intro.ps
The sins of LaTeX users: info/l2tabu/english/l2tabuen.pdf; source also
available: info/l2tabu/english/l2tabuen.tex
34 Specialised (La)TeX tutorials
The AMS publishes a “Short Math Guide for LaTeX”, which is available (in several
formats) via http://www.ams.org/tex/short-math-guide.html
Herbert Voß is developing a parallel document, which is also very useful; it’s part of
his “tips and tricks” and a copy is maintained on CTAN.
Two documents written more than ten years apart about font usage in TeX are
worth reading: Essential NFSS by Sebastian Rahtz, and Font selection in LaTeX, cast

in the form of an FAQ, by Walter Schmidt. A general compendium of font information
(including the two above) may be found on the TUG web site.
Peter Smith’s “LaTeX for Logicians” covers a rather smaller subject area, but is
similarly comprehensive (mostly by links to documents on relevant topics, rather than
as a monolithic document).
Keith Reckdahl’s “Using Imported Graphics in LaTeX 2ε ” is an excellent introduc-
tion to graphics use. It’s available on CTAN, but the sources aren’t available (promised
“some time soon”).
Vincent Zoonekynd provides a set of excellent (and graphic) tutorials on the pro-
gramming of title page styles, chapter heading styles and section heading styles. In
each file, there is a selection of graphics representing an output style, and for each
style, the code that produces it is shown.
An invaluable step-by-step setup guide for establishing a “work flow” through your
(La)TeX system, so that output appears at the correct size and position on standard-
sized paper, and that the print quality is satisfactory, is Mike Shell’s testflow. The
tutorial consists of a large plain text document, and there is a supporting LaTeX file
together with correct output, both in PostScript and PDF, for each of A4 and “letter”
paper sizes. The complete kit is available on CTAN (distributed with the author’s
macros for papers submitted for IEEE publications). The issues are also covered in a
later FAQ answer.
Documentation of Japanese TeX use appears at least twice on the web: Haruhiko
Okumura’s page on typesetting Japanese with Omega (the parent page is in Japanese,
so outside the scope of this FAQ).
One “Tim” documents pTeX (a TeX system widely used in Japan) in his “English
notes on pTeX”.
Some university departments make their local documentation available on the web.
Most straightforwardly, there’s the simple translation of existing documentation into
HTML, for example the INFO documentation of the (La)TeX installation, of which
a sample is the LaTeX documentation available at http://www.tac.dk/cgi-bin/
info2www?(latex)
More ambitiously, some university departments have enthusiastic documenters who
make public record of their (La)TeX support. For example, Tim Love (of Cambridge
University Engineering Department) maintains his department’s pages at http://www-
h.eng.cam.ac.uk/help/tpl/textprocessing/
Graphics in LaTeX 2ε : the document is available in PostScript and PDF formats
as info/epslatex/english/epslatex.ps and info/epslatex/english/
epslatex.pdf respectively
testflow : macros/latex/contrib/IEEEtran/testflow
Herbert Voß’s Maths tutorial: info/math/voss/mathmode/Mathmode.pdf
35 Reference documents
For TeX primitive commands a rather nice quick reference booklet, by John W. Ship-
man, is available; it’s arranged in the same way as the TeXbook. By contrast, you can
view David Bausum’s list of TeX primitives alphabetically or arranged by “family”.
Either way, the list has a link for each control sequence, that leads you to a detailed
description, which includes page references to the TeXbook.
There doesn’t seem to be a reference that takes in Plain TeX as well as the primitive
commands.
Similarly, there’s no completely reliable command-organised reference to LaTeX,
but the NASA Hypertext Help with LaTeX is recently much improved. It still talks in
LaTeX 2.09-isms in places, but it’s been updated for current LaTeX; there are a number
of mirrors of the site, and it may be worth choosing a “local” one if you’re going to use
it a lot.
36 WIKI pages for TeX and friends
The WIKI concept can be a boon to everyone, if used sensibly. The “general” WIKI
allows anyone to add stuff, or to edit stuff that someone else has added: while there is
obvious potential for chaos, there is evidence that a strong user community can keep a
WIKI under control.
Following the encouraging performance of the ConTeXt WIKI, both a (Plain) TeX
WIKI and a LaTeX WIKI have been established. Both seem rather sparse, as yet, and
the LaTeX WIKI contains some suggestions that go counter to established advice (e.g.,
the LaTeX WIKI has details on the use of the eqnarray environment); however one
may hope that both will become useful resources in the longer term.
37 Typography tutorials
There’s also (at least one) typographic style tutorial available on the Web, the excellent
“Guidelines for Typography in NBCS”. In fact, its parent page is also worth a read:
among other things, it provides copies of the “guidelines” document in a wide variety
of primary fonts, for comparison purposes. The author is careful to explain that he
has no ambition to supplant such excellent books as Bringhurst’s, but the document
(though it does contain some Rutgers-local matter) is a fine introduction to the issues of
producing readable documents.
Peter Wilson’s manual for his memoir class has a lengthy introductory section on
typographic considerations, which is a fine tutorial, written by someone who is aware
of the issues as they apply to (La)TeX users.
memoir distribution: macros/latex/contrib/memoir

38 Directories of (La)TeX information


TUG India is developing a series of online LaTeX tutorials which can be strongly
recommended: select single chapters at a time from http://www.tug.org.in/
tutorials.html — the set comprises two parts, “Text” and “Graphics”, so far.
Herbert Voß’s LaTeX tips and tricks is an excellent source of small articles
on the use of LaTeX.
39 Freely available (La)TeX books
People have long argued for (La)TeX books to be made available on the web, and until
relatively recently this demand went un-answered.
The first to appear was Victor Eijkhout’s excellent “TeX by Topic” in 2001 (it had
been published by Addison-Wesley, but was long out of print). The book is currently
available at http://www.eijkhout.net/tbt/; it’s not a beginner’s tutorial but it’s a
fine reference.
Addison-Wesley have also released the copyright of “TeX for the Impatient” by
Paul W. Abrahams, Karl Berry and Kathryn A. Hargreaves, another book whose un-
availability many have lamented. The authors have re-released the book under the GNU
general documentation licence, and it is available from CTAN.
Norm Walsh’s “Making TeX Work” (originally published by O’Reilly) is also avail-
able (free) on the Web, at http://makingtexwork.sourceforge.net/mtw/; the
sources of the Web page are on CTAN. The book was an excellent resource in its
day, but while it is now somewhat dated, it still has its uses, and is a welcome addition
to the list of on-line resources. A project to update it is believed to be under way.
Making TeX Work: info/makingtexwork/mtw-1.0.1-html.tar.gz
TeX for the Impatient: info/impatient
40 Documentation of packages
These FAQs regularly suggest packages that will “solve” particular problems. In some
cases, the answer provides a recipe for the job. In other cases, or when the solution
needs elaborating, how is the poor user to find out what to do?
If you’re lucky, the package you need is already in your installation. If you’re par-
ticularly lucky, you’re using a distribution that gives access to package documentation
and the documentation is available in a form that can easily be shown. For example, on
a teTeX-based system, the texdoc command is usually useful, as in:

texdoc footmisc

which opens an xdvi window showing documentation of the footmisc package. Accord-
ing to the type of file texdoc finds, it will launch xdvi, a ghostscript-based PostScript
viewer or a PDF reader. If it can’t find any documentation, it may launch a Web browser
to look at its copy of the CTAN catalogue. The catalogue has an entry for package doc-
umentation, and since CTAN now encourages authors to submit documentation of their
packages, that entry may provide a useful lead.

If your luck (as defined above) doesn’t hold out, you’ve got to find documentation
by other means. This is where you need to exercise your intelligence: you have to find
the documentation for yourself. What follows offers a range of possible techniques.
The commonest form of documentation of LaTeX add-ons is within the .dtx file
in which the code is distributed (see documented LaTeX sources). Such files are sup-
posedly processable by LaTeX itself, but there are occasional hiccups on the way to
readable documentation. Common problems are that the package itself is needed to
process its own documentation (so must be unpacked before processing), and that the
.dtx file will not in fact process with LaTeX. In the latter case, the .ins file will
usually produce a .drv (or similarly-named) file, which you process with LaTeX in-
stead. (Sometimes the package author even thinks to mention this wrinkle in a package
README file.)
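
In the straightforward case, the job is no more than a couple of LaTeX runs (the package name mypkg is invented, for illustration):

latex mypkg.ins
latex mypkg.dtx

The first run unpacks the package (and may also generate a mypkg.drv); the second typesets the documentation (process mypkg.drv instead, if the .dtx itself won’t run).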
Another common form is the separate documentation file; particularly if a package
is “conceptually large” (and therefore needs a lot of documentation), the documentation
would prove a cumbersome extension to the .dtx file. Examples of such cases are
the memoir class (whose documentation, memman, is widely praised as an introduction
to typesetting concepts), the KOMA-script bundle (whose developers take the trouble
to produce detailed documentation in both German and English), and the fancyhdr
package (whose documentation derives from a definitive tutorial in a mathematical
journal). Even if the documentation is not separately identified in a README file, it
should not be too difficult to recognise its existence.
Documentation within the package itself is the third common form. Such docu-
mentation ordinarily appears in comments at the head of the file, though at least one
eminent author regularly places it after the \endinput command in the package. (This
is desirable, since \endinput is a ‘logical’ end-of-file, and (La)TeX doesn’t read be-
yond it: thus such documentation does not ‘cost’ any package loading time.)
The above suggestions cover most possible ways of finding documentation. If,
despite your best efforts, you can’t find it in any of the above places, there’s the awful
possibility that the author didn’t bother to document his package (on the “if it was hard
to write, it should be hard to use” philosophy). Most ordinary mortals will seek support
from some more experienced user at this stage, though it is possible to proceed in the
way that the original author apparently expected. . . by reading his code.
41 Learning to write LaTeX classes and packages
There’s nothing particularly magic about the commands you use when writing a
package, so you can simply bundle up a set of LaTeX \(re)newcommand and
\(re)newenvironment commands, put them in a file package.sty and you have
a package.
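
A minimal sketch of such a personal package (all the names here are invented) might read:

% mymacros.sty --- a trivial personal package
\ProvidesPackage{mymacros}[2006/10/04 v1.0 personal macros]
\newcommand{\keyword}[1]{\textbf{#1}}
\newenvironment{remark}{\begin{quote}\itshape}{\end{quote}}

and a document then loads it with \usepackage{mymacros}.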
However, any but the most trivial package will require rather more sophistication.
Some details of LaTeX commands for the job are to be found in ‘LaTeX 2ε for class
and package writers’ (clsguide, part of the LaTeX documentation distribution). Be-
yond this, a good knowledge of TeX itself is valuable: thus books such as the TeX-
book or TeX by topic are relevant. With good TeX knowledge it is possible to use the
documented source of LaTeX as reference material (dedicated authors will acquaint
themselves with the source as a matter of course). A complete set of the documented
source of LaTeX may be prepared by processing the file source2e.tex in the LaTeX
distribution, but individual files in the distribution may be processed separately with
LaTeX, like any well-constructed .dtx file.
Writing good classes is not easy; it’s a good idea to read some established ones
(classes.dtx, for example, is the documented source of the standard classes other
than Letter, and may itself be formatted with LaTeX). Classes that are not part of the
distribution are commonly based on ones that are, and start by loading the standard
class with \LoadClass — an example of this technique may be seen in ltxguide.cls.
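
A minimal sketch of a class built this way (the class name is invented) might read:

% myarticle.cls --- a small local variant of article
\NeedsTeXFormat{LaTeX2e}
\ProvidesClass{myarticle}[2006/10/04 v1.0 local article variant]
\DeclareOption*{\PassOptionsToClass{\CurrentOption}{article}}
\ProcessOptions\relax
\LoadClass{article}
\renewcommand{\abstractname}{Summary}

The \DeclareOption*/\ProcessOptions pair simply hands any class options on to article, so the new class behaves as article does except where it says otherwise.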
classes.dtx : macros/latex/base/classes.dtx
ltxguide.cls: macros/latex/base/ltxguide.cls
LaTeX documentation: macros/latex/doc
source2e.tex : macros/latex/base/source2e.tex
42 MetaFont and MetaPost Tutorials
Apart from Knuth’s book, there seems to be only one publicly-available tutorial for
MetaFont, by Christophe Grandsire (a copy in PDF form may be downloaded). Geof-
frey Tobin’s MetaFont for Beginners (see using MetaFont) describes how the MetaFont
system works and how to avoid some of the potential pitfalls.
There is also an article on how to run both MetaFont and MetaPost (the programs).
Peter Wilson’s Some Experiences in Running MetaFont and MetaPost offers the benefit
of Peter’s experience (he has designed a number of ‘historical’ fonts using MetaFont).
For MetaFont the article is geared towards testing and installing new MetaFont fonts,
while its MetaPost section describes how to use MetaPost illustrations in LaTeX and
PDFLaTeX documents, with an emphasis on how to use appropriate fonts for any text
or mathematics.
Hans Hagen (of ConTeXt fame) offers a MetaPost tutorial called MetaFun (which
admittedly concentrates on the use of MetaPost within ConTeXt). It may be found on
his company’s MetaPost page.
Other MetaPost tutorials that have appeared are two in English by André Heck and
Urs Oswald, and one in French (listed here because it’s clearly enough written that this
author understands it), by Laurent Chéno; all have been recommended for inclusion
in the FAQ.
Urs Oswald’s tutorial is accompanied by a super tool for testing little bits of Meta-
Post, which is an invaluable aid to the learner: http://www.tlhiv.org/cgi-bin/
MetaPostPreviewer/index.cgi
Vincent Zoonekynd’s massive set of example MetaPost files is available on CTAN;
the set includes a Perl script to convert the set to html, and the set may be viewed on the
web. While these examples don’t exactly constitute a “tutorial”, they’re most certainly
valuable learning material. Urs Oswald presents a similar document, written more as a
document, and presented in PDF.
Beginners’ guide: info/metafont/beginners/metafont-for-beginners.pdf
Peter Wilson’s “experiences”: info/metafont/metafp/metafp.pdf (PDF format)
Vincent Zoonekynd’s examples: info/metapost/examples
43 BibTeX Documentation
BibTeX, a program originally designed to produce bibliographies in conjunction with
LaTeX, is explained in Section 4.3 and Appendix B of Leslie Lamport’s LaTeX manual
(see TeX-related books). The document “BibTeXing”, contained in the file btxdoc.
tex, expands on the chapter in Lamport’s book. The LaTeX Companion (see TeX-
related books) also has information on BibTeX and writing BibTeX style files.
The document “Designing BibTeX Styles”, contained in the file btxhak.tex, ex-
plains the postfix stack-based language used to write BibTeX styles (.bst files). The
file btxbst.doc is the template for the four standard styles (plain, abbrv, alpha,
unsrt). It also contains their documentation. The complete BibTeX documentation
set (including the files above) is available on CTAN.
A useful tutorial of the whole process of using BibTeX is Nicolas Markey’s “Tame
the BeaST (The B to X of BibTeX)”, which may also be found on CTAN.
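
For orientation, the basic round trip looks something like this (the file name and the entry are invented): the document says

\bibliographystyle{plain}
\bibliography{refs}

and cites with \cite{example86}, where refs.bib contains entries such as

@ARTICLE{example86,
  author  = {A. N. Author},
  title   = {An Example Entry},
  journal = {Journal of Examples},
  year    = 1986
}

the whole being processed by running latex, then bibtex, then latex twice more, so that the citations and the bibliography come out consistent.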
BibTeX documentation: biblio/bibtex/distribs/doc
BibTeX documentation, in PDF: biblio/bibtex/contrib/doc
Tame the BeaST: info/bibtex/tamethebeast/ttb_en.pdf
44 Where can I find the symbol for . . .
There is a wide range of symbols available for use with TeX, most of which are not
shown (or even mentioned) in (La)TeX books. The Comprehensive LaTeX Symbol List
(by Scott Pakin et al.) illustrates over 2000 symbols, and details the commands and
packages needed to produce them.
Other questions in this FAQ offer specific help on kinds of symbols:

• Script fonts for mathematics
• Fonts for the number sets
• Typesetting the principal value integral

Symbol List: Browse info/symbols/comprehensive; there are processed


versions in both PostScript and PDF forms for both A4 and letter paper.

45 The PiCTeX manual
PiCTeX is a set of macros by Michael Wichura for drawing diagrams and pictures.
The macros are freely available; however, the PiCTeX manual itself is not free. Un-
fortunately, TUG is no longer able to supply copies of the manual (as it once did),
and it is now available only through Personal TeX Inc, the vendors of PCTeX (http:
//www.pctex.com/). The manual is not available electronically.
pictex : graphics/pictex

D Bits and pieces of (La)TeX


46 What is a DVI file?
A DVI file (that is, a file with the type or extension .dvi) is TeX’s main output file,
using TeX in its broadest sense to include LaTeX, etc. ‘DVI’ is supposed to be an
acronym for DeVice-Independent, meaning that the file can be printed on almost any
kind of typographic output device. The DVI file is designed to be read by a driver
(DVI drivers) to produce further output designed specifically for a particular printer
(e.g., a LaserJet) or to be used as input to a previewer for display on a computer screen.
DVI files use TeX’s internal coding; a TeX input file should produce the same DVI file
regardless of which implementation of TeX is used to produce it.
A DVI file contains all the information that is needed for printing or previewing
except for the actual bitmaps or outlines of fonts, and possibly material to be introduced
by means of \special commands.
The canonical reference for the structure of a DVI file is the source of dvitype.
dvitype: systems/knuth/texware/dvitype.web

47 What is a driver?
A driver is a program that takes as input a DVI file (DVI files) and (usually) produces
a file that can be sent to a typographic output device, called a printer for short.
A driver will usually be specific to a particular printer, although any PostScript
printer ought to be able to print the output from a PostScript driver.
As well as the DVI file, the driver needs font information. Font information may
be held as bitmaps or as outlines, or simply as a set of pointers into the fonts that the
printer itself ‘has’. Each driver will expect the font information in a particular form.
For more information on the forms of fonts, see PK files, TFM files, virtual fonts and
Using PostScript fonts with TeX.
48 What are PK files?
PK files (packed raster) contain font bitmaps. The output from MetaFont includes a
generic font (GF) file and the utility gftopk produces the PK file from that. There are a
lot of PK files, as one is needed for each font, that is each magnification (size) of each
design (point) size for each weight for each family. Further, since the PK files for one
printer do not necessarily work well for another, the whole set needs to be duplicated
for each printer type at a site. As a result, they are often held in an elaborate directory
structure, or in ‘font library files’, to regularise access.
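
By way of illustration, the usual route from MetaFont source to PK file for a single font runs something like the following (the mode ljfour and its 600dpi resolution are merely examples):

mf '\mode=ljfour; mag=1; input cmr10'
gftopk cmr10.600gf cmr10.600pk

The MetaFont run produces cmr10.600gf (and a TFM file); gftopk then packs the bitmaps into the PK file the driver wants.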
49 What are TFM files?
TFM stands for TeX Font Metrics; TFM files hold information about the sizes of the
characters of the font in question, and about ligatures and kerns within that font. One
TFM file is needed for each font used by TeX, that is for each design (point) size for
each weight for each family; one TFM file serves for all magnifications, so that there
are (typically) fewer TFM files than there are PK files. The important point is that TFM
files are used by TeX (LaTeX, etc.), but are not, generally, needed by the printer driver.
50 Virtual fonts
Virtual fonts for TeX were first implemented by David Fuchs in the early days of TeX,
but for most people they date from when Knuth redefined the format, and wrote some
support software, in 1989 (he published an article in TUGboat at the time, and a plain
text copy is available on CTAN).
Virtual fonts provide a way of telling TeX about something more complicated than
just a one-to-one character mapping. The entities you define in a virtual font look

like characters to TeX (they appear with their sizes in a font metric file), but the DVI
processor may expand them to something quite different.
Specifically, TeX itself only looks at a TFM file that contains details of how the
virtual font will appear: but of course, TeX only cares about the metrics of a character,
so its demands are pretty small. The DVI processor, however, has to understand
the details of what is in the virtual font, so as to know “what to draw, where”. So, for
every virtual font read by a DVI driver, there has to be a TFM file to be read by TeX.
(PDFTeX, of course, needs both the TFM and the translation of the virtual font, since
it does the whole job in the one program.)
You can use a virtual font:

• simply to remap the glyphs of a single font,


• to make a composite font with glyphs drawn from several different fonts, or
• to build up an effect in arbitrarily complicated ways (since a virtual font may
contain anything which is legal in a DVI file).

In practice, the most common use of virtual fonts is to remap Adobe Type 1 fonts
(see font metrics), though there has also been useful work building ‘fake’ maths
fonts (by bundling glyphs from several fonts into a single virtual font). Virtual Com-
puter Modern fonts, making a Cork encoded font from Knuth’s originals by using
remapping and fragments of DVI for single-glyph ‘accented characters’, were the first
“Type 1 format” versions available.
Virtual fonts are normally created in a single ASCII VPL (Virtual Property List)
file, which includes both sets of information. The vptovf program is then used to
create the binary TFM and VF files.
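
The conversion itself is a one-line job (the font name is invented):

vptovf myfont.vpl myfont.vf myfont.tfm

which writes both the VF file (for the DVI processor) and the matching TFM file (for TeX).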
A “how-to” document, explaining how to generate a VPL, describes the endless
hours of fun that may be had, doing the job by hand. Despite the pleasures to be had of
the manual method, the commonest way (nowadays) of generating VPL files is to use
the fontinst package, which is described in detail in PostScript font metrics. Qdtexvpl is
another utility for creating ad-hoc virtual fonts (it uses TeX to parse a description of
the virtual font, and qdtexvpl itself processes the resulting DVI file).
fontinst: fonts/utilities/fontinst
Knuth on virtual fonts: info/knuth/virtual-fonts
Virtual fonts “how to”: info/virtualfontshowto/virtualfontshowto.txt
qdtexvpl: fonts/utilities/qdtexvpl

51 \special commands
TeX provides the means to express things that device drivers can do, but about which
TeX itself knows nothing. For example, TeX itself knows nothing about how to include
PostScript figures into documents, or how to set the colour of printed text; but some
device drivers do.
Such things are introduced to your document by means of \special commands;
all that TeX does with these commands is to expand their arguments and then pass the
command to the DVI file. In most cases, there are macro packages provided (often
with the driver) that provide a comprehensible interface to the \special; for exam-
ple, there’s little point including a figure if you leave no gap for it in your text, and
changing colour proves to be a particularly fraught operation that requires real wiz-
ardry. LaTeX 2ε has standard graphics and colour packages that make figure inclusion,
rotation and scaling, and colour typesetting via \specials all easy.
The allowable arguments of \special depend on the device driver you’re using.
Apart from the examples above, there are \special commands in the emTeX drivers
(e.g., dvihplj, dviscr, etc.) that will draw lines at arbitrary orientations, and commands
in dvitoln03 that permit the page to be set in landscape orientation.
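
For a flavour of the difference, with a dvips-family driver one might write

\special{ps: 1 0 0 setrgbcolor}  % raw PostScript, driver-specific

whereas loading the color package and saying \textcolor{red}{like this} leaves the construction of the driver-specific \special to the package, and the document source stays portable (the package also takes care of awkward details such as restoring the colour after a page break).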
52 How does hyphenation work in TeX?
Everyone knows what hyphenation is: we see it in most books we read, and (if we’re
alert) often spot ridiculous mis-hyphenation from time to time (at one time, British
newspapers were a fertile source).
Hyphenation styles are culturally-determined, and the same language may be hy-
phenated differently in different countries — for example, British and American styles
of hyphenation of English are very different. As a result, a typesetting system that
is not restricted to a single language at a single locale needs to be able to change its
hyphenation rules from time to time.
TeX uses a pretty good system for hyphenation (originally designed by Frank
Liang), and while it’s capable of missing “sensible” hyphenation points, it seldom se-
lects grossly wrong ones. The algorithm matches candidates for hyphenation against
a set of “hyphenation patterns”. The candidates for hyphenation must be sequences of
letters (or other single characters that TeX may be persuaded to think of as letters) —
things such as TeX’s \accent primitive interrupt hyphenation.
Sets of hyphenation patterns are usually derived from analysis of a list of valid
hyphenations (the process of derivation, using a tool called patgen, is not ordinarily a
participatory sport).
The patterns for the languages a TeX system is going to deal with may only be
loaded when the system is installed. To change the set of languages, a partial reinstal-
lation is necessary.
TeX provides two “user-level” commands for control of hyphenation: \language
(which selects a hyphenation style), and \hyphenation (which gives explicit instruc-
tions to the hyphenation engine, overriding the effect of the patterns).
The ordinary LaTeX user need not worry about \language, since it is very thor-
oughly managed by the babel package; use of \hyphenation is discussed in hyphen-
ation failure.
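As a small example (a sketch; the words are arbitrary), explicit exceptions may be listed in the preamble:
\hyphenation{data-base data-bases manu-script}
Each word so listed will be hyphenated only at the marked points (or not at all, if no hyphen appears in it).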
53 What are LaTeX classes and packages?
Current LaTeX makes a distinction between the macros that define the overall layout of
a document, and the macros that tweak that layout (to one extent or another) to provide
what the author really wants.
The distinction was not very clear in LaTeX 2.09, and after some discussion (in the
later stages of development of current LaTeX) the names “class” and “package” were
applied to the two concepts.
The idea is that a document’s class tells LaTeX what sort of document it’s dealing
with, while the packages the document loads “refine” that overall specification.
On the disc, the files only appear different by virtue of their name “extension” —
class files are called *.cls while package files are called *.sty. Thus we find that the
LaTeX standard article class is represented on disc by a file called article.cls, while
the footmisc package (which refines article’s definition of footnotes) is represented on
disc by a file called footmisc.sty.
The user defines the class of his document with the \documentclass command
(typically the first command in a document), and loads packages with the \usepackage
command. A document may have several \usepackage commands, but it may have
only one \documentclass command. (Note that there are programming-interface ver-
sions of both commands, since a class may choose to load another class to refine its
capabilities, and both classes and packages may choose to load other packages.)
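For example (a minimal sketch using the files mentioned above), a document might begin:
\documentclass[11pt]{article}
\usepackage{footmisc}
\begin{document}
...
\end{document}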
54 Documented LaTeX sources (.dtx files)
LaTeX 2ε , and many support macro packages, are now written in a literate program-
ming style (literate programming), with source and documentation in the same file.
This format, known as ‘doc’, in fact originated before the days of the LaTeX project as
one of the “Mainz” series of packages. The documented sources conventionally have
the suffix .dtx, and should normally be stripped of documentation before use with
LaTeX. Alternatively you can run LaTeX on a .dtx file to produce a nicely formatted
version of the documented code. An installation script (with suffix .ins) is usually
provided, which needs the standard LaTeX 2ε docstrip package (among other things,
the installation process strips all the comments that make up the documentation for
speed when loading the file into a running LaTeX system). Several packages can be
included in one .dtx file, with conditional sections, and there are facilities for indices of
macros etc. Anyone can write .dtx files; the format is explained in The LaTeX Com-
panion, and a tutorial is available from CTAN (which comes with skeleton .dtx and
.ins files).
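In practice (a sketch; mypkg is a hypothetical package distributed in this form), installation and documentation generation look like:
latex mypkg.ins
latex mypkg.dtx
The first run strips out mypkg.sty (and whatever else the .ins file generates); the second typesets the documented source.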
Composition of .dtx files is supported in emacs by Matt Swift’s swiftex system: it
provides a doc-tex mode which treats .dtx files rather better than AUC-TeX manages.
Another useful way of generating .dtx files is to write the documentation and the
code separately, and then to combine them using the makedtx system. This technique
has particular value in that the documentation file can be used separately to generate
HTML output; it is often quite difficult to make LaTeX to HTML conversion tools deal
with .dtx files, since they use an unusual class file.
.dtx files are not used by LaTeX after they have been processed to produce .sty
or .cls (or whatever) files. They need not be kept with the working system; however,
for many packages the .dtx file is the primary source of documentation, so you may
want to keep .dtx files elsewhere.
An interesting sideline to the story of .dtx files is the docmfp package, which
extends the model of the doc package to MetaFont and MetaPost, thus permit-
ting documented distribution of bundles containing code for MetaFont and MetaPost
together with related LaTeX code.
clsguide.pdf : macros/latex/doc/clsguide.pdf
docmfp.sty : macros/latex/contrib/docmfp
docstrip.tex : Part of the LaTeX distribution
DTX tutorial: info/dtxtut
makedtx : support/makedtx
swiftex.el: support/emacs-modes/swiftex
55 What are encodings?
Let’s start by defining two concepts, the character and the glyph. The character is the
abstract idea of the ‘atom’ of a language or other dialogue: so it might be a letter in an
alphabetic language, a syllable in a syllabic language, or an ideogram in an ideographic
language. The glyph is the mark created on screen or paper which represents a char-
acter. Of course, if reading is to be possible, there must be some agreed relationship
between the glyph and the character, so while the precise shape of the glyph can be
affected by many other factors, such as the capabilities of the writing medium and the
designer’s style, the essence of the underlying character must be retained.
Whenever a computer has to represent characters, someone has to define the rela-
tionship between a set of numbers and the characters they represent. This is the essence
of an encoding: it is a mapping between a set of numbers and a set of things to be rep-
resented.
TeX of course deals in encoded characters all the time: the characters presented to
it in its input are encoded, and it emits encoded characters in its DVI (or PDF) output.
These encodings have rather different properties.
The TeX input stream was pretty unruly back in the days when Knuth first imple-
mented the language. Knuth himself prepared documents on terminals that produced
all sorts of odd characters, and as a result TeX contains some provision for translating
the input encoding to something regular. Nowadays, the operating system translates
keystrokes into a code appropriate for the user’s language: the encoding used is often
a national or international standard, though many operating systems use “code pages”
defined by Microsoft. These standards and code pages often contain characters that
can’t appear in the TeX system’s input stream. Somehow, these characters have to be
dealt with — so an input character like “é” needs to be interpreted by TeX in a way
that at least mimics the way it interprets “\’e”.
The TeX output stream is in a somewhat different situation: characters in it are to be
used to select glyphs from the fonts to be used. Thus the encoding of the output stream
is notionally a font encoding (though the font in question may be a virtual one — see
virtual font). In principle, a fair bit of what appears in the output stream could be
direct transcription of what arrived in the input, but the output stream also contains the
product of commands in the input, and translations of the input such as ligatures like
fi⇒“fi”.
Font encodings became a hot topic when the Cork encoding appeared, because
of the possibility of suppressing \accent commands in the output stream (and hence
improving the quality of the hyphenation of text in inflected languages, which is in-
terrupted by the \accent commands — see “how does hyphenation work”). To take
advantage of the diacriticised characters represented in the fonts, it is necessary to
arrange that whenever the command sequence “\’e” has been input (explicitly, or im-
plicitly via the sort of mapping of input mentioned above), the character that codes the
position of the “é” glyph is used.
Thus we could have the odd arrangement that the diacriticised character in the
TeX input stream is translated into TeX commands that would generate something
looking like the input character; this sequence of TeX commands is then translated
back again into a single diacriticised glyph as the output is created. This is in fact
precisely what the LaTeX packages inputenc and fontenc do, if operated in tandem on
(most) characters in the ISO Latin-1 input encoding and the T1 font encoding. At first
sight, it seems eccentric to have the first package do a thing, and the second precisely
undo it, but it doesn’t always happen that way: most font encodings can’t match the
corresponding input encoding nearly so well, and the two packages provide the sort of
symmetry the LaTeX system needs.
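The tandem operation just described is requested in the preamble along these lines (a sketch; latin1 is just one possible input encoding):
\usepackage[latin1]{inputenc}
\usepackage[T1]{fontenc}
With these in place, an “é” typed in the source is mapped to the command \’e on input, and back to the single T1-encoded “é” glyph as the output is created.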
56 What are the EC fonts?
A font consists of a number of glyphs. In order that the glyphs may be printed, they
are encoded, and the encoding is used as an index into tables within the font. For
various reasons, Knuth chose deeply eccentric encodings for his Computer Modern
family of fonts; in particular, he chose different encodings for different fonts, so that
the application using the fonts has to remember which font of the family it’s using
before selecting a particular glyph.
When TeX version 3 arrived, most of the excuses for the eccentricity of Knuth’s en-
codings went away, and at TUG’s Cork meeting, an encoding for a set of 256 glyphs,
for use in TeX text, was defined. The intention was that these glyphs should cover
‘most’ European languages that use Latin alphabets, in the sense of including all ac-
cented letters needed. (Knuth’s CMR fonts missed things necessary for Icelandic and
Polish, for example, but the Cork fonts have them. Even Cork’s coverage isn’t com-
plete: it misses letters from Romanian, Eastern and Northern Sami, and Welsh, at
least. The Cork encoding does contain “NG” glyphs that allow it to support Southern
Sami.) LaTeX refers to the Cork encoding as T1, and provides the means to use fonts
thus encoded to avoid problems with the interaction of accents and hyphenation (see
hyphenation of accented words).
The only MetaFont-fonts that conform to the Cork encoding are the EC fonts. They
look CM-like, though their metrics differ from CM-font metrics in several areas. The
fonts are now regarded as ‘stable’ (in the same sense that the CM fonts are stable: their
metrics are unlikely ever to change). Their serious disadvantages for the casual user
are their size (each EC font is roughly twice the size of the corresponding CM font),
and there are far more of them than there are CM fonts. The simple number of fonts
has acted as a disincentive to the production of Adobe Type 1 versions of the fonts, but
several commercial suppliers offer EC or EC-equivalent fonts in type 1 or TrueType
form — see commercial suppliers. Free auto-traced versions (the CM-super and the
LGC fonts), and the Latin Modern series (rather directly generated from MetaFont
sources), are available.
Note that the Cork encoding doesn’t cover mathematics (and neither do “T1-
encoded” font families, of course). If you’re using Computer-Modern-alike fonts, this
doesn’t actually matter: your system will have the original Computer Modern fonts,
which cover ‘basic’ TeX mathematics; more advanced mathematics are likely to need
separate fonts anyway. Suitable mathematics fonts for use with other font families are
discussed in “choice of scalable fonts”.
The EC fonts are distributed with a set of ‘Text Companion’ (TC) fonts that provide
glyphs for symbols commonly used in text. The TC fonts are encoded according to the
LaTeX TS1 encoding, and are not viewed as ‘stable’ in the same way as the EC
fonts are.
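To use them in a LaTeX document, the usual recipe (a sketch) is:
\usepackage[T1]{fontenc}
\usepackage{textcomp}
The first line switches text fonts to the Cork (T1) encoding; the second makes the Text Companion symbols available through commands such as \texteuro.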
The Cork encoding is also implemented by virtual fonts provided in the PSNFSS
system, for PostScript fonts, and also by the txfonts and pxfonts font packages (see
“choice of scalable fonts”).
CM-super fonts: fonts/ps-type1/cm-super
CM-LGC fonts: fonts/ps-type1/cm-lgc
EC and TC fonts: fonts/ec
Latin Modern fonts: fonts/lm
57 What is the TDS?
TDS stands for the TeX Directory Structure, which is a standard way of organising all
the TeX-related files on a computer system.
Most modern distributions conform to the TDS, which provides for both a ‘stan-
dard’ and a (set of) ‘local’ hierarchies of directories containing TeX-related files. The
TDS reserves the name texmf as the name of the root directory (folder) of the hier-
archies. Files supplied as part of the distribution are put into the standard hierarchy.
The location of the standard hierarchy is system dependent, but on a Unix system it
might be at /usr/local/texmf, or /usr/local/share/texmf, or /opt/texmf, or a
similar location, but in each case the TeX files will be under the /texmf subdirectory.
There may be more than one ‘local’ hierarchy in which additional files can be stored.
In the extreme an installation can have a local hierarchy and each user can also have
an individual local hierarchy. The location of any local hierarchy is not only system
dependent but also user dependent. Again, though, all files should be put under a local
/texmf directory.
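For example (a sketch; the package name is hypothetical), a LaTeX package installed by hand in a local hierarchy might live at
.../texmf/tex/latex/mypkg/mypkg.sty
after which the file-name database should be refreshed (for instance by running texhash or mktexlsr on teTeX-based systems).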
The TDS is published as the output of a TUG Technical Working Group. You may
browse an on-line version of the standard, and copies in
several other formats (including source) are available on CTAN.
TDS specification: tds
58 What is “Encapsulated PostScript” (“EPS”)
PostScript has over the years become a lingua franca of powerful printers; since
PostScript is also a powerful graphical programming language, it is commonly used as
an output medium for drawing (and other) packages.
However, since PostScript is such a powerful language, some rules need to be im-
posed, so that the output drawing may be included in a document as a figure without
“leaking” (and thereby destroying the surrounding document, or failing to draw at all).
Appendix H of the PostScript Language Reference Manual (second and subsequent
editions), specifies a set of rules for PostScript to be used as figures in this way. The
important features are:
• certain “structured comments” are required; important ones are the identification
of the file type, and information about the “bounding box” of the figure (i.e., the
minimum rectangle enclosing it);
• some commands are forbidden — for example, a showpage command will cause
the image to disappear, in most TeX-output environments; and
• “preview information” is permitted, for the benefit of things such as word proces-
sors that don’t have the ability to draw PostScript in their own right — this preview
information may be in any one of a number of system-specific formats, and any
viewing program may choose to ignore it.
A PostScript figure that conforms to these rules is said to be in “Encapsulated
PostScript” (EPS) format. Most (La)TeX packages for including PostScript are
structured to use Encapsulated PostScript; which of course leads to much hilarity
as exasperated (La)TeX users struggle to cope with the output of drawing software
whose authors don’t know the rules.
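A minimal conforming file starts with structured comments along these lines (a sketch; the bounding-box numbers are arbitrary):
%!PS-Adobe-3.0 EPSF-3.0
%%BoundingBox: 0 0 200 100
%%EndComments
... drawing code (with no showpage) ...
%%EOF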
59 Adobe font formats
Adobe has specified a number of formats for files to represent fonts in PostScript files;
this question doesn’t attempt to be encyclopaedic, but we’ll discuss the two formats
most commonly encountered in the (La)TeX context, types 1 and 3.
Adobe Type 1 format specifies a means to represent outlines of the glyphs in a font.
The ‘language’ used is closely restricted, to ensure that the font is rendered as quickly
as possible. (Or rather, as quickly as possible with Adobe’s technology at the time the
specification was written: the structure could well be different if it were specified now.)
The format has long been the basis of the digital type-foundry business, though things
are showing signs of change.
In the (La)TeX context, Type 1 fonts are extremely important. Apart from their
simple availability (there are thousands of commercial Type 1 text fonts around), the
commonest reader for PDF files has long (in effect) insisted on their use (see PDF
quality).
Type 3 fonts have a more forgiving specification. A wide range of PostScript oper-
ators is permissible, including bitmap operators. Type 3 is therefore the natural format
to be used for programs such as dvips when they auto-generate something to represent
MetaFont-generated fonts in a PostScript file. It’s Adobe Acrobat Viewer’s treatment
of bitmap Type 3 fonts that has made direct MetaFont output increasingly unattractive,
in recent years. If you have a PDF document in which the text looks fuzzy and uneven
in Acrobat Reader, ask Reader for the File→Document Properties→Fonts ...,
and it will show some font or other as “Type 3” (usually with encoding “Custom”).
(This problem has disappeared with version 6 of Acrobat Reader.)
Type 3 fonts should not entirely be dismissed, however. Acrobat Reader’s failure
with them is entirely derived from its failure to use the anti-aliasing techniques common
in TeX-ware. Choose a different set of PostScript graphical operators, and you can
make pleasing Type 3 fonts that don’t “annoy” Reader. For example, you may not
change colour within a Type 1 font glyph, but there’s no such restriction on a Type 3
font, which opens opportunities for some startling effects.
60 What are “resolutions”
“Resolution” is a word that is used with little concern for its multiple meanings, in
computer equipment marketing. The word suggests a measure of what an observer
(perhaps the human eye) can resolve; yet we regularly see advertisements for printers
whose resolution is 1200dpi — far finer than the unaided human eye can distinguish.
The advertisements are talking about the precision with which the printer can place
spots on the printed image, which affects the fineness of the representation of fonts,
and the accuracy of the placement of glyphs and other marks on the page.
In fact, there are two sorts of “resolution” on the printed page that we need to
consider for (La)TeX’s purposes:
• the positioning accuracy, and
• the quality of the fonts.
In the case where (La)TeX output is being sent direct to a printer, in the printer’s “na-
tive” language, it’s plain that the DVI processor must know all such details, and must
take detailed account of both types of resolution.
In the case where output is being sent to an intermediate distribution format, that
has potential for printing (or displaying) we know not where, the final translator, which
connects directly to the printer or display, has the knowledge of the device's proper-
ties: the DVI processor need not know, and should not presume to guess.
Both PostScript and PDF output are in this category. While PostScript is used less
frequently for document distribution nowadays, it is regularly used as the source for
distillation into PDF; and PDF is the workhorse of an enormous explosion of document
distribution.
Therefore, we need DVI processors that will produce “resolution independent”
PostScript or PDF output; of course, the independence needs to extend to both forms
of independence outlined above.
Resolution-independence of fonts is forced upon the world by the feebleness of
Adobe's Acrobat Reader at dealing with bitmap fonts: a sequence of answers starting
with one aiming at the quality of PDF from PostScript addresses the problems that
arise.
Resolution-independence of positioning is more troublesome: dvips is somewhat
notorious for insisting on positioning to the accuracy of the declared resolution of the
printer. One commonly-used approach is to declare a resolution of 8000 (“better than
any device”), and this is reasonably successful though it does have its problems.
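For instance (a sketch of the approach just mentioned; the file name is hypothetical), dvips may be invoked as
dvips -Ppdf -D8000 myfile.dvi -o myfile.ps
to produce PostScript, destined for distillation into PDF, at the 8000dpi pseudo-resolution.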
61 What is the “Berry naming scheme”
In the olden days, (La)TeX distributions were limited by the feebleness of file systems’
ability to represent long names. (The MS-DOS file system was a particular bugbear:
fortunately any current Microsoft system allows rather more freedom to specify file
names. Sadly, the ISO 9660 standard for the structure of CD-ROMs has a similar
failing, but that too has been modified by various extension mechanisms.)
One area in which this was a particular problem was that of file names for Type 1
fonts. These fonts are distributed by their vendors with pretty meaningless short names,
and there’s a natural ambition to change the name to something that identifies the font
somewhat precisely. Unfortunately, names such as “BaskervilleMT” are already far
beyond the abilities of the typical feeble file system, and add the specifier of a font
shape or variant, and the difficulties spiral out of control.
Thus arose the Berry naming scheme.
The basis of the scheme is to encode the meanings of the various parts of the
file’s specification in an extremely terse way, so that enough font names can be ex-
pressed even in impoverished file spaces. The encoding allocates one letter to the font
“foundry”, two to the typeface name, one to the weight, and so on. The whole scheme
is outlined in the fontname distribution, which includes extensive documentation and a
set of tables of fonts whose names have been systematised.
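For example (reading the scheme's tables), the name ptmr8t unpacks as p (Adobe), tm (Times), r (regular) and 8t (the Cork/T1 encoding); ptmri8t is the corresponding italic.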
fontname distribution: info/fontname
E Acquiring the Software
62 Repositories of TeX material
To aid the archiving and retrieval of TeX-related files, a TUG working group devel-
oped the Comprehensive TeX Archive Network (CTAN). Each CTAN site has identical
material, and maintains authoritative versions of its material. These collections are ex-
tensive; in particular, almost everything mentioned in this FAQ is archived at the CTAN
sites (see the lists of software at the end of each answer).
The CTAN sites are currently dante.ctan.org (Germany), cam.ctan.org (UK)
and tug.ctan.org (USA). The organisation of TeX files on all CTAN sites is identical
and starts at tex-archive/. Each CTAN node may also be accessed via the Web
at URLs http://www.dante.de/tex-archive, http://www.tex.ac.uk/tex-
archive and http://www.ctan.org/tex-archive respectively, but not all CTAN
mirrors are Web-accessible. As a matter of course, to reduce network load, please use
the CTAN site or mirror closest to you. A complete and current list of CTAN sites and
known mirrors is available as file CTAN.sites on the archives themselves.
For details of how to find files at CTAN sites, see “finding (La)TeX files”.
The TeX user who has no access to any sort of network may buy a copy of the
archive as part of the TeX Live distribution.
63 What’s the CTAN nonfree tree?
The CTAN archives are currently restructuring their holdings so that files that are ‘not
free’ are held in a separate tree. The definition of what is ‘free’ (for this purpose)
is influenced by, but not exactly the same as the Debian Free Software Guidelines
(DFSG).
Material is placed on the nonfree tree if it is not freely-usable (e.g., if the material
is shareware, commercial, or if its usage is not permitted in certain domains at all,
or without payment). Users of the archive should check that they are entitled to use
material they have retrieved from the nonfree tree.
The Catalogue (one of the prime sources for finding TeX-related material via web
search — web search) lists the licence details in each entry in its lists. For details of
the licence categories, see its list of licences.
64 Contributing a file to the archives
You have something to submit to the archive — great! Before we even start, here’s a
check-list of things to sort out:
1. Licence: in the spirit of TeX, we hope for free software; in the spirit of today’s
lawyer-enthralled society, CTAN provides a list of “standard” licence statements.
2. Documentation: it’s good for users to be able to browse documentation before
downloading a package. You need at least a plain text README file (exactly that
name); best is a PDF file of the package documentation, prepared for screen read-
ing.
3. Name: endless confusion is caused by name clashes. If your package has the same
name as one already on CTAN, or if your package installation generates files of
the same name as something in a “normal” distribution, the CTAN team will delay
installation while they check that you’re doing the right thing: they may nag you
to change the name, or to negotiate a take-over with the author of the original
package. Browse the archive to ensure uniqueness.
The name you choose should also (as far as possible) be somewhat descriptive of
what your submission actually does; while “descriptiveness” is to some extent in
the eye of the beholder, it’s clear that names such as mypackage or jiffy aren’t
suitable.
If you are able to use anonymous ftp, get yourself a copy of the file
README.uploads from any CTAN archive. The file tells you where to upload, what
to upload, and how to notify the CTAN management about what you want doing with
your upload.
You may also upload via the Web: each of the principal CTAN sites offers an
upload page — choose from http://www.ctan.org/upload.html, http://www.
dante.de/CTAN/upload.html or http://www.tex.ac.uk/upload.html; the pages
lead you through the process, showing you the information you need to supply.
If you can use neither of these methods, ask advice of the CTAN management: if
the worst comes to the worst, it may be possible to mail a contribution.
If it’s appropriate (if your package is large, or regularly updated), the CTAN man-
agement can arrange to mirror your contribution direct into the archive. At present this
may only be done if your contribution is available via ftp, and of course the directory
structure on your archive must ‘fit’.
README.uploads: README.uploads
65 Finding (La)TeX files
Modern TeX distributions contain a huge array of support files of various sorts, but
sooner or later most people need to find something that’s not in their present system (if
nothing else, because they’ve heard that something has been updated).
But how to find the files?
Some sources, such as these FAQ answers, provide links to files: so if you’ve learnt
about a package here, you should be able to retrieve it without too much fuss.
Otherwise, the CTAN sites provide searching facilities, via the web. The simplest
search, locating files by name, is to be found on the Dante CTAN at http://www.
dante.de/cgi-bin/ctan-index; the script scans a list of files (FILES.byname —
see below) and returns a list of matches, arranged very neatly as a series of links to
directories and to individual files.
The UK and USA CTANs offer a search page that provides
• a file-name search similar to the Dante machine’s (above);
• a keyword search of the archive catalogue (see below): this is a pretty powerful
tool: the results include links to the catalogue “short descriptions”, so you can
assure yourself that the package you’ve found is the one you want; and
• a search form that allows you to use Google to search CTAN.
An alternative way to scan the catalogue is to use the catalogue’s “by topic” index;
this lists a series of topics, and (La)TeX projects that are worth considering if you’re
working on matters related to the topic.
In fact, Google, and other search engines, can be useful tools. Enter your search
keywords, and you may pick up a package that the author hasn’t bothered to submit to
CTAN. If you’re using Google, you can restrict your search to CTAN by entering
site:ctan.org tex-archive <search term(s)>
in Google’s “search box”. You can also enforce the restriction using Google’s “ad-
vanced search” mechanism; other search engines (presumably) have similar facilities.
Many people avoid the need to go over the network at all, for their searches, by
downloading the file list that the archives’ web file searches use. This file, FILES.
byname, presents a unified listing of the archive (omitting directory names and cross-
links). Its companion FILES.last07days is also useful, to keep an eye on the changes
on the archive. Since these files are updated only once a day, a nightly automatic
download (perhaps using rsync) makes good sense.
FILES.byname: FILES.byname
FILES.last07days: FILES.last07days
66 Finding new fonts
A comprehensive list of MetaFont fonts used to be posted to comp.fonts and to comp.
text.tex, roughly every six weeks, by Liam Quin.
Nowadays, authors of new material in MetaFont are few and far between (and
mostly designing highly specialised things with limited appeal to ordinary users). Most
new fonts that appear are prepared in some scalable outline form or other (see “choice
of scalable fonts”), and they are almost all distributed under commercial terms.
MetaFont font list: info/metafont-list
67 The TeX Live distribution
If you don’t have access to the Internet, there are obvious attractions to TeX collections
on a disc. Even those with net access will find large quantities of TeX-related files to
hand a great convenience.
The TeX Live distribution provides this, together with a ready-to-run TeX system.
The TeX Live installation disc offers teTeX for use on Unix-like systems, and ProTeXt
for use on Windows systems. There is also a ‘demonstration’ disc and an archive
snapshot (all on CD- or DVD-ROMs). TeX-Live was originally developed under the
auspices of a consortium of User Groups (notably TUG, UK TUG and GUTenberg).
All members of several User Groups receive copies free of charge. Some user groups
will also sell additional copies: contact your local user group or TUG.
Details of TeX Live are available from its own web page on the TUG site.
F TeX Systems
68 (La)TeX for different machines
We list here the free or shareware packages; see “commercial TeX implementations” for details of commercial packages.
Unix Instructions for retrieving the web2c Unix TeX distribution via anonymous ftp
are to be found in unixtex.ftp, though nowadays the sensible installer will take
(and possibly customise) one of the packaged distributions such as teTeX, or the
TeX Live distribution.
To compile and produce a complete teTeX distribution, you need a .tar.gz file
for each of teTeX-src, teTeX-texmf and teTeX-texmfsrc.
No sets of teTeX binaries are provided on CTAN; however, compilation of teTeX
is pretty stable, on a wide variety of platforms. If you don’t have the means to
compile teTeX yourself, you will find that most “support” sites carry compiled
versions in their “free area”, and the TeX-live discs also carry a wide range of
binary distributions.
There’s a mailing list for teTeX installation problems (and the like): subscribe
by sending mail to [email protected] containing nothing more
than “subscribe tetex”. The list is archived at http://www.mail-archive.
com/[email protected]/, and an RSS feed is available at the same
site: http://www.mail-archive.com/[email protected]/maillist.
xml
During periods when teTeX is itself under development, a “teTeX-beta” is avail-
able. Before proceeding with the β-release, check the ANNOUNCE files in the two
directories on CTAN: it may well be that the β-release doesn't offer you anything
new that you need.
MacOS X users should refer to the information below, under item “Mac”.
tetex : Browse systems/unix/teTeX/current/distrib
tetex-beta: systems/unix/teTeX-beta
unixtex.ftp: systems/unix/unixtex.ftp
web2c: systems/web2c
Linux Linux users may use teTeX (see above).
The most recent offering is a free version of the commercial VTeX (see VTeX),
which among other things, specialises in direct production of PDF from (La)TeX
input.
tetex : Browse systems/unix/teTeX/current/distrib
vtex : systems/vtex/linux
vtex required common files: systems/vtex/common
PC: Win32 MiKTeX, by Christian Schenk, is also a comprehensive distribution, de-
veloped separately from the teTeX work. It has its own previewer, YAP, which is
itself capable of printing, though the distribution also includes a port of dvips. The
current version is available for file-by-file download (the HTML files in the direc-
tory offer hints on what you need to get going). The MiKTeX developers provide a
ready-to-run copy of the distribution, on CD-ROM (for purchase) via the MiKTeX
web site; otherwise the setup executable is available on CTAN, together with all
the optional packages.
ProTeXt, by Thomas Feuerstack, is a further option for installing MiKTeX. It bun-
dles a MiKTeX setup with some further useful utilities, together with a PDF file
which contains clickable links for the various installation steps, along with expla-
nations. Again, it is freeware, and copies are distributed with the TeX-live CD
set.
XEmTeX, by Fabrice Popineau (he who created the excellent, but now defunct, fp-
TeX distribution), is an integrated distribution of TeX, LaTeX, ConTeXt, XEmacs
and friends for Windows. All programs have been compiled natively to take the
best advantage of the Windows environment. Configuration is provided so that the
resulting set of programs runs out-of-the-box.
The (Japanese) W32TEX distribution was motivated by the needs of Japanese
users (Japanese won’t fit in a “simple” character set like ASCII, but TeX is based
on a version of ASCII). Despite its origins, W32TEX is said to be a good bet for
Western users, notably those whose disks are short of space: the minimum docu-
mented download is as small as 95 MBytes. Investigate the distribution at
http://www.fsci.fuk.kindai.ac.jp/kakuto/win32-ptex/web2c75-e.html
A further (free) option arises from the “CygWin” bundle, which presents a Unix-
like environment over the Win32 interface; an X-windows server is available. If
you run CygWin on your Windows machine, you have the option of using teTeX,
too (you will need the X-server, to run xdvi). Of course, teTeX components will
look like Unix applications (but that’s presumably what you wanted), but it’s also
reputedly somewhat slower than native Win32 implementations such as MiKTeX
or XEmTeX. teTeX is available as part of the CygWin distribution (in the same
way that a version is available with most Linux distributions, nowadays), and you
may also build your own copy from the current sources.
BaKoMa TeX, by Basil Malyshev, is a comprehensive (shareware) distribution,
which focuses on support of Acrobat. The distribution comes with a bunch of
Type 1 fonts packaged to work with BaKoMa TeX, which furthers the focus.
bakoma: nonfree/systems/win32/bakoma
miktex : Acquire systems/win32/miktex/setup/setup.exe (also available
from the MiKTeX web site), and read installation instructions from the
MiKTeX installation page
protext.exe: systems/texlive/Images/protext.exe
tetex : systems/unix/teTeX/current/distrib
PC: MS-DOS or OS/2 EmTeX, by Eberhard Mattes, includes LaTeX, BibTeX, pre-
viewers, and drivers, and is available as a series of zip archives. Documentation is
available in both German and English. Appropriate memory managers for using
emTeX with 386 (and better) processors and under Windows, are included in the
distribution. EmTeX will operate under Windows, but Windows users are better
advised to use a distribution tailored for the Windows environment.
A version of emTeX, packaged to use a TDS directory structure, is separately avail-
able as an emTeX ‘contribution’. Note that neither emTeX itself, nor emTeXTDS,
is maintained. Most users of Microsoft operating systems, who want an up-to-date
(La)TeX system, need to migrate to Win32-based systems.
emtex : systems/msdos/emtex
emtexTDS : obsolete/systems/os2/emtex-contrib/emtexTDS
PC: MS-DOS The most recent MS-DOS offering is a port of the Web2C 7.0 imple-
mentation, using the GNU djgpp compiler. While this package is more recent than
emTeX, it still implements a rather old instance of (La)TeX.
djgpp: systems/msdos/djgpp
PC: OS/2 OS/2 may also use a free version of the commercial VTeX (see VTeX),
which specialises in direct production of PDF from (La)TeX input.
vtex : systems/vtex/os2
vtex required common files: systems/vtex/common
Windows NT, other platforms Ports of MiKTeX for NT on Power PC and AXP are
available. Neither version has been updated for version 1.2 (or later) of MiK-
TeX — they may not be satisfactory.
miktex for AXP : obsolete/systems/win32/miktex-AXP
miktex for Power PC : obsolete/systems/win32/miktexppc
Mac OzTeX, by Andrew Trevorrow, is a shareware version of TeX for the Macintosh.
A DVI previewer and PostScript driver are also included.
UK TUG prepays the shareware fee for its members, so that they may acquire
the software without further payment. Questions about OzTeX may be directed to
[email protected]
Another partly shareware program is CMacTeX, put together by Tom Kiffe. This is
much closer to the Unix TeX setup (it uses dvips, for instance). CMacTeX includes
a port of a version of Omega.
Both OzTeX and CMacTeX run on either MacOS X or on a sufficiently recent
MacOS with CarbonLib (v1.3 for OzTeX, v1.4 for CMacTeX). MacOS X users
also have the option of gwTeX, by Gerben Wierda (which is based on teTeX).
This is naturally usable from the command line, just like any other Unix-based
system, but it can also be used Mac-style as the engine behind Richard Koch’s
(free) TeXShop, which is an integrated TeX editor and previewer.
From its 2005 release, the TeX-Live disc set includes “MacTeX”, a CD-ROM
image that contains MacOS X teTeX (the Gerben Wierda set mentioned above),
TeXshop, and XeTeX. Details (and a downloadable distribution set) may be found
on the TUG web site; the distribution is also on CTAN.
A useful resource for Mac users has a news and ‘help’ section, as well as details
of systems and tools.
cmactex : nonfree/systems/mac/cmactex
mactex : systems/mac/mactex
oztex : nonfree/systems/mac/oztex
MacOS X teTeX : ftp://ftp.nluug.nl/pub/comp/macosx/tex-gs/
TeXShop: http://darkwing.uoregon.edu/~koch/texshop/texshop.html
OpenVMS TeX for OpenVMS is available.
OpenVMS : systems/OpenVMS/TEX97_CTAN.ZIP
Atari TeX is available for the Atari ST.
If anonymous ftp is not available to you, send a message containing the line
‘help’ to [email protected]
Atari TeX : systems/atari
Amiga Full implementations of TeX 3.1 (PasTeX) and MetaFont 2.7 are available.
PasTeX : systems/amiga
TOPS-20 TeX was originally written on a DEC-10 under WAITS, and so was easily
ported to TOPS-20. A distribution that runs on TOPS-20 is available via anony-
mous ftp from ftp.math.utah.edu in pub/tex/pub/web
69 TeX-friendly editors and shells
There are good TeX-writing environments and editors for most operating systems;
some are described below, but this is only a personal selection:
Unix Try GNU emacs or XEmacs, and the AUC-TeX bundle (available from CTAN).
AUC-TeX provides menu items and control sequences for common constructs,
checks syntax, lays out markup nicely, lets you call TeX and drivers from within
the editor, and everything else like this that you can think of. Complex, but very
powerful.
Many who fail to find the versions of emacs attractive, prefer vim, a highly config-
urable editor (also available for Windows and Macintosh systems). Many plugins
are available to support the needs of the (La)TeX user, including syntax highlight-
ing, calling TeX programs, auto-insertion and -completion of common (La)TeX
structures, and browsing LaTeX help. The scripts auctex.vim and bibtex.vim
seem to be the most common recommendations.
The editor NEdit is also free and programmable, and is available for Unix systems.
An AUC-TeX-alike set of extensions for NEdit is available from CTAN.
LaTeX4Jed provides much enhanced LaTeX support for the jed editor. LaTeX4Jed
is similar to AUC-TeX: menus, shortcuts, templates, syntax highlighting, docu-
ment outline, integrated debugging, symbol completion, full integration with ex-
ternal programs, and more. It was designed with both the beginner and the ad-
vanced LaTeX user in mind.
The Kile editor that is provided with the KDE window manager provides GUI
“shell-like” facilities, in a similar way to the widely-praised Winedt (see below);
details (and downloads) are available from the project’s home on SourceForge. A
newer system (by Kile's original author), texmaker, doesn't rely on KDE's facili-
ties, and so may be more easily portable.
MS-DOS TeXshell is a simple, easily-customisable environment, which can be used
with the editor of your choice.
You can also use GNU emacs and AUC-TeX under MS-DOS.
Windows ’9x, NT, etc. TeXnicCenter is a (free) TeX-oriented development system,
uniting a powerful platform for executing (La)TeX and friends with a configurable
editor.
Winedt, a shareware package, is also highly spoken of. It too provides a shell for
the use of TeX and related programs, as well as a powerful and well-configured
editor.
Both emacs and vim are available in versions for Windows systems.
OS/2 epmtex offers an OS/2-specific shell.
Macintosh The commercial Textures provides an excellent integrated Macintosh en-
vironment with its own editor. More powerful still (as an editor) is the shareware
Alpha which is extensible enough to let you perform almost any TeX-related job.
It works well with OzTeX.
For MacOS X users, the tool of choice appears to be TeXShop, which combines
an editor and a shell with a coherent philosophy of dealing with (La)TeX in the
OS X environment.
Vim is available for use on Macintosh systems.
Atari, Amiga and NeXT users also have nice environments. LaTeX users looking for
make-like facilities should try latexmk.
While many (La)TeX-oriented editors can support work on BibTeX files, there are
many systems that provide specific “database-like” access to your BibTeX files —
“creating a bibliography file”.
alpha: nonfree/systems/mac/support/alpha
auctex : support/auctex
epmtex : systems/os2/epmtex
latexmk : support/latexmk
LaTeX4Jed : support/jed
Nedit LaTeX support: support/NEdit-LaTeX-Extensions
TeXnicCenter : systems/win32/TeXnicCenter
TeXshell: systems/msdos/texshell
TeXtelmExtel: systems/msdos/emtex-contrib/TeXtelmExtel
winedt: systems/win32/winedt/winedt32.exe
70 Commercial TeX implementations
There are many commercial implementations of TeX. The first appeared not long after
TeX itself appeared.
What follows is probably an incomplete list. Naturally, no warranty or fitness for
purpose is implied by the inclusion of any vendor in this list. The source of the infor-
mation is given to provide some clues to its currency.
In general, a commercial implementation will come ‘complete’, that is, with suit-
able previewers and printer drivers. They normally also have extensive documentation
(i.e., not just the TeXbook!) and some sort of support service. In some cases this is a
toll free number (probably applicable only within the USA and/or Canada), but others
also have email, and normal telephone and fax support.
PC; TrueTeX Runs on all versions of Windows.
Richard J. Kinch
TrueTeX Software
7890 Pebble Beach Court
Lake Worth FL 33467
USA
Tel: +1 561-966-8400
Email: [email protected]
Web: http://www.truetex.com/
Source: Mail from Richard Kinch, August 2004.
pcTeX Long-established: pcTeX32 is a Windows implementation.
Personal TeX Inc
725 Greenwich Street, Suite 210
San Francisco, CA 94133
USA
Tel: 800-808-7906 (within the USA)
Tel: +1 415-296-7550
Fax: +1 415-296-7501
Email: [email protected]
Web: http://www.pctex.com/
Source: Personal TeX Inc web site, December 2004
PC; VTeX DVI, PDF and HTML backends, Visual Tools and Type 1 fonts
MicroPress Inc
68-30 Harrow Street
Forest Hills, NY 11375
USA
Tel: +1 718-575-1816
Fax: +1 718-575-8038
Email: [email protected]
Web: http://www.micropress-inc.com/
Source: Mail from MicroPress, Inc., July 1999
PC; Scientific Word Scientific Word and Scientific Workplace offer a mechanism for
near-WYSIWYG input of LaTeX documents; they ship with TrueTeX from Kinch
(see above). Queries within the UK and Ireland should be addressed to Scien-
tific Word Ltd., others should be addressed directly to the publisher, MacKichan
Software Inc.
Dr Christopher Mabb
Scientific Word Ltd.
20 Bankpark Crescent
Tranent
East Lothian, EH33 1AS
UK
Tel: 0845 766 0340 (within the UK)
Tel: +44 1875 616516
Fax: 01875 613513 (within the UK)
Email: [email protected]
Web: http://www.sciword.demon.co.uk
MacKichan Software Inc.
19307 8th Avenue, Suite C
Poulsbo WA 98370-7370
USA
Tel: +1 360 394 6033
Tel: 877 724 9673 (within the USA)
Fax: +1 360 394 6039
Email: [email protected]
Web: http://www.mackichan.com
Source: Mail from Christopher Mabb, November 2004
Macintosh; Textures “A TeX system ‘for the rest of us’ ”; also gives away a MetaFont
implementation and some font manipulation tools.
Blue Sky Research
534 SW Third Avenue
Portland, OR 97204
USA
Tel: 800-622-8398 (within the USA)
Tel: +1 503-222-9571
Fax: +1 503-222-1643
Email: [email protected]
Web: http://www.bluesky.com/
Source: TUGboat 15(1) (1994)
AmigaTeX A full implementation for the Commodore Amiga, including full, on-
screen and printing support for all PostScript graphics and fonts, IFF raster graph-
ics, automatic font generation, and all of the standard macros and utilities.
Radical Eye Software
PO Box 2081
Stanford, CA 94309
USA
Source: Mail from Tom Rokicki, November 1994
Note that the company Y&Y has gone out of business, and Y&Y TeX (and support
for it) is therefore no longer available. Users of Y&Y systems may care to use the self-
help mailing list that was established in 2003; the remaining usable content of Y&Y’s
web site is available at http://www.tug.org/yandy/
G DVI Drivers and Previewers
71 DVI to PostScript conversion programs
The best public domain DVI to PostScript conversion program, which runs under many
operating systems, is Tom Rokicki’s dvips. dvips is written in C and ports easily. All
current development is in the context of Karl Berry’s kpathsea library, and sources are
available from the TeX live repository, and versions are available in all TeX distribu-
tions that recognise the use of PostScript.
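Basic use is straightforward (a sketch; the file name is hypothetical):
dvips myfile.dvi -o myfile.ps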
A VMS version is available as part of the CTAN distribution of TeX for VMS.
A precompiled version to work with emTeX is available on CTAN.
MS-DOS and OS/2: systems/msdos/dviware/dvips
VMS distribution: systems/OpenVMS/TEX97_CTAN.ZIP
72 DVI drivers for HP LaserJet
The emTeX distribution (see TeX systems) contains a driver for the LaserJet, dvihplj.
Version 2.10 of the Beebe drivers supports the LaserJet. These drivers will compile
under Unix, VMS, and on the Atari ST and DEC-20s.
For Unix systems, Karl Berry’s dviljk uses the same path-searching library as
web2c. dviljk is no longer distributed as a separate source, but the teTeX distribution
holds a copy under the name dvilj.
Beebe drivers: dviware/beebe
73 Output to “other” printers
In the early years of TeX, there were masses of DVI drivers for any (then) imaginable
kind of printer, but the steam seems rather to have gone out of the market for production
of such drivers for printer-specific formats. There are several reasons for this, but the
primary one is that few formats offer the flexibility available through PostScript, and
ghostscript is so good, and has such a wide range of printer drivers (perhaps this is
where the DVI output driver writers have all gone?).
The general advice, then, is to generate PostScript, and to process that with
ghostscript set to generate the format for the printer you actually have. If you are
using a Unix system of some sort, it’s generally quite easy to insert ghostscript into the
print spooling process.
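As an illustration (a sketch; the device name and file names are examples only), converting PostScript for an HP LaserJet 4 might look like:
gs -dBATCH -dNOPAUSE -sDEVICE=ljet4 -sOutputFile=myfile.prn myfile.ps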
ghostscript: Browse nonfree/support/ghostscript
74 DVI previewers
EmTeX for PCs running MS-DOS or OS/2, MiKTeX and XEmTeX for PCs running
Windows and OzTeX for the Macintosh, all come with previewers that can be used on
those platforms. EmTeX’s previewer can also be run under Windows 3.1.
Commercial PC TeX packages (see commercial vendors) have good previewers for
PCs running Windows, or for Macintoshes.
For Unix systems, there is one ‘canonical’ viewer, xdvi. Xdvik is a version of xdvi
using the web2c libraries; it is now built from the same distribution as xdvi. Unix
TeX distributions (such as teTeX) include a version of xdvik using the same version of
web2c as the rest of the distribution.
Alternatives to previewing include
• conversion to ‘similar’ ASCII text (see converting to ASCII) and using a conven-
tional text viewer to look at that,
• generating a PostScript version of your document and viewing it with a Ghostscript-
based previewer (see previewing PostScript files), and
• generating PDF output, and viewing that with Acrobat Reader or one of the sub-
stitutes for that.
xdvi: dviware/xdvi
75 Generating bitmaps from DVI
In the last analysis, any DVI driver or previewer is generating bitmaps: bitmaps for
placing tiny dots on paper via a laser- or inkjet-printer, or bitmaps for filling some por-
tion of your screen. However, it’s usually difficult to extract any of those bitmaps any
way other than by screen capture, and the resolution of that is commonly lamentable.
Why would one want separate bitmaps? Most often, the requirement is for some-
thing that can be included in HTML generated from (La)TeX source — not everything
that you can write in (La)TeX can be translated to HTML (at least, portable HTML that
may be viewed in ‘most’ browsers), so the commonest avoiding action is to generate a
bitmap of the missing bit. Examples are maths (a maths extension to the *ML family is
available but not widely used), and ‘exotic’ typescripts (ones that you cannot guarantee
your readers will have available). Other common examples are generation of sample
bitmaps, and generation for insertion into some other application’s display — to insert
equations into Microsoft PowerPoint, or to support the enhanced-emacs setup called
preview-latex (preview-latex).
In the past, the commonest way of generating bitmaps was to generate a PostScript
file of the DVI and then use ghostscript to produce the required bitmap format (possibly
by way of PNM format or something similar). This is an undesirable procedure (it is
very slow, and requires two or three steps) but it has served for a long time.
(La)TeX users may now take advantage of two bitmap ‘drivers’. The longest-
established, dvi2bitmap, will generate XBM and XPM formats, the long-deprecated
GIF format (which is now obsolescent, but is finally, in Summer 2003, to be relieved
of the patent protection of the LZW compression it uses), and also the modern (ISO-
standardised) PNG format.
Dvipng started out as a PNG renderer; from version 1.2 it can also render to the
GIF format. It is designed for speed, in environments that generate large numbers of
PNG files: the README mentions preview-latex, LyX, and a few web-oriented environ-
ments. Note that dvipng gives high-quality output even though its internal operations
are optimised for speed.
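A typical invocation (a sketch; file name and resolution are arbitrary) is
dvipng -D 120 -T tight -o page%d.png myfile.dvi
which renders each page of the DVI file as a tightly-cropped PNG at 120 dots per inch.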
dvi2bitmap: dviware/dvi2bitmap
dvipng : dviware/dvipng
H Support Packages for TeX
76 Fig, a (La)TeX-friendly drawing package
(X)Fig is a menu driven tool that allows you to draw objects on the screen of an X
workstation; transfig is a set of tools which translate the code fig produces. The list of export
formats is very long, and includes MetaFont and MetaPost, Encapsulated PostScript
and PDF, as well as combinations that wrap a graphics format in a LaTeX import file.
There’s no explicit port of xfig to windows (although it is believed to work under
cygwin with their X-windows system). However, the program jfig is thought by many
to be an acceptable substitute, written in Java.
xfig : graphics/xfig
transfig : graphics/transfig
77 TeXCAD, a drawing package for LaTeX
TeXCAD is a program for the PC which enables the user to draw diagrams on screen
using a mouse or arrow keys, with an on-screen menu of available picture-elements. Its
output is code for the LaTeX picture environment. Optionally, it can be set to include
lines at all angles using the emTeX driver-family \specials. TeXCAD is part of the
emTeX distribution.
A Unix port of the program (xtexcad) has been made.
emtex : systems/msdos/emtex
xtexcad : nonfree/graphics/xtexcad/xtexcad-2.4.1.tar.gz
78 Spelling checkers for work with TeX
For Unix, ispell has long been the program of choice; it is well integrated with emacs, and
deals with some TeX syntax. However, it is increasingly challenged by aspell, which
was designed as a successor, and certainly performs better on most metrics; there re-
mains some question as to its performance with (La)TeX sources.
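Both can be told to respect TeX markup when run from the command line (a sketch; the file name is hypothetical):
ispell -t mydoc.tex
aspell --mode=tex check mydoc.tex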
For Windows, there is a good spell checker incorporated into many of the shell/editor
combinations that are available. The spell checker from the (now defunct) 4AllTeX
shell remains available as a separate package, 4spell.
For the Macintosh, Excalibur is the program of choice. It will run in native mode
on both sorts of Macintosh. The distribution comes with dictionaries for several lan-
guages.
The VMS Pascal program spell makes special cases of some important features of
LaTeX syntax.
For MS-DOS, there are several programs. Amspell can be called from within an
editor, and jspell is an extended version of ispell.
4spell: support/4spell
amspell: support/amspell
aspell: Browse support/aspell — choose just those language dictionaries
(under subdirectory dict/) that you need.
excalibur : systems/mac/support/excalibur/Excalibur-4.0.2.sit.hqx
ispell: support/ispell/ispell-3.2.06.tar.gz
jspell: support/jspell
VMS spell: support/vmspell
winedt: systems/win32/winedt/winedt32.exe
79 How many words have you written?
One often has to submit a document (e.g., a paper or a dissertation) under some sort of
constraint about its size. Sensible people set a constraint in terms of numbers of pages,
but there are some that persist in limiting the numbers of words you type.
A simple solution to the requirement can be achieved following a simple observa-
tion: the powers that be are unlikely to count all the words of a document submitted
to them. Therefore, a statistical method can be employed: find how many words there
are on a full page; find how many full pages there are in the document (allowing for
displays of various sorts, this number will probably not be an integer); multiply the
two. However, if the document to be submitted is to determine the success of the rest
of one’s life, it takes a brave person to thumb their nose at authority quite so compre-
hensively. . .
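(By way of illustration, with assumed numbers: at roughly 350 words per full page and a document equivalent to 42.5 full pages, the estimate would be 350 × 42.5 = 14,875 words.)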
The simplest method is to strip out the (La)TeX markup, and to count what’s left.
On a Unix-like system, this may be done using detex and the built-in wc:
detex <filename> | wc -w
The latexcount script does the same sort of job, in one “step”; being a perl script, it is
in principle rather easily configured (see documentation inside the script). Winedt (see
editors and shells) provides this functionality direct in the Windows environment.
Simply stripping (La)TeX markup isn’t entirely reliable, however: that markup
itself may contribute typeset words, and this could be a problem. The wordcount pack-
age contains a Bourne shell (i.e., typically Unix) script for running a LaTeX file with
a special piece of supporting TeX code, and then counting word indications in the log
file. This is probably as accurate automatic counting as you can get.
detex : support/detex
wordcount: macros/latex/contrib/wordcount
I Literate programming
80 What is Literate Programming?
Literate programming is the combination of documentation and source together in a
fashion suited for reading by human beings. In general, literate programs combine
source and documentation in a single file. Literate programming tools then parse the
file to produce either readable documentation or compilable source. The WEB style of
literate programming was created by D. E. Knuth during the development of TeX.
The “documented LaTeX” style of programming is regarded by some as a form
of literate programming, though it only contains a subset of the constructs Knuth used.
Discussion of literate programming is conducted in the newsgroup comp.programming.
literate, whose FAQ is stored on CTAN. Another good source of information is
http://www.literateprogramming.com/
Literate Programming FAQ: help/comp.programming.literate_FAQ
81 WEB systems for various languages
TeX is written in the programming language WEB; WEB is a tool to implement the
concept of “literate programming”. Knuth’s original implementation will be in any
respectable distribution of TeX, but the sources of the two tools (tangle and weave),
together with a manual outlining the programming techniques, may be had from CTAN.
CWEB, by Silvio Levy, is a WEB for C programs.
Spidery WEB, by Norman Ramsey, supports many languages including Ada, awk,
and C and, while not in the public domain, is usable without charge. It is now super-
seded by noweb (also by Norman Ramsey) which incorporates the lessons learned in
implementing spidery WEB, and which is a simpler, equally powerful, tool.
FWEB, by John Krommes, is a version for Fortran, Ratfor, and C.
SchemeWEB, by John Ramsdell, is a Unix filter that translates SchemeWEB into
LaTeX source or Scheme source.
APLWEB is a version of WEB for APL.
FunnelWeb is a version of WEB that is language independent.
Other language-independent versions of WEB include nuweb (which is written in ANSI
C).
Tweb is a WEB for Plain TeX macro files, using noweb.
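By way of illustration, the classic WEB cycle for a hypothetical Pascal program foo.web (the file name is invented) runs:
tangle foo.web
weave foo.web
tex foo.tex
The first command extracts the compilable Pascal source (foo.p); the second writes foo.tex, the typeset documentation, which the third then processes in the usual way. The other systems listed above follow the same pattern, though the command names differ.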
aplweb: web/apl/aplweb
cweb: web/c_cpp/cweb
funnelweb: web/funnelweb
fweb: web/fweb
noweb: web/noweb
nuweb: web/nuweb
schemeweb: web/schemeweb
spiderweb: web/spiderweb
tangle: systems/knuth/web
tweb: web/tweb
weave: systems/knuth/web

J Format conversions
82 Conversion from (La)TeX to plain text
The aim here is to emulate the Unix nroff , which formats text as best it can for the
screen, from the same input as the Unix typesetting program troff .
Converting DVI to plain text is the basis of many of these techniques; sometimes
the simple conversion provides a good enough response. Options are:

• dvi2tty (one of the earliest);
• crudetype; and
• catdvi, which is capable of generating Latin-1 (ISO 8859-1) or UTF-8 encoded
output. Catdvi was conceived as a replacement for dvi2tty, but can’t (quite) be
recommended as a complete replacement yet.
A common problem is the hyphenation that TeX inserts when typesetting something:
since the output is inevitably viewed using fonts that don’t match the original, the
hyphenation usually looks silly.
Ralph Droms provides a txt bundle of things in support of ASCII generation, but
it doesn’t do a good job with tables and mathematics. An alternative is the screen
package.
Another possibility is to use the LaTeX-to-ASCII conversion program, l2a, al-
though this is really more of a de-TeXing program.
The canonical de-TeXing program is detex, which removes all comments and con-
trol sequences from its input before writing it to its output. Its original purpose was
to prepare input for a dumb spelling checker, and it’s only usable for preparing useful
ASCII versions of a document in highly restricted circumstances.
Tex2mail is slightly more than a de-TeXer — it’s a Perl script that converts TeX
files into plain text files, expanding various mathematical symbols (sums, products,
integrals, sub/superscripts, fractions, square roots, . . . ) into “ASCII art” that spreads
over multiple lines if necessary. The result is more readable to human beings than the
flat-style TeX code.
Another significant possibility is to use one of the HTML-generation solutions
(HTML-generation solutions), and then to use a browser such as lynx to dump the
resulting HTML as plain text.
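For example (the file names are merely illustrative), either of these produces a rough plain-text rendering on a Unix-like system:
dvi2tty foo.dvi > foo.txt
lynx -dump foo.html > foo.txt
the second assuming you have already made foo.html with one of the HTML converters; check each program's documentation for options controlling line width and the like.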
catdvi: dviware/catdvi
crudetype: dviware/crudetype
detex : support/detex
dvi2tty : nonfree/dviware/dvi2tty
l2a: support/l2a
screen.sty : macros/latex209/contrib/misc/screen.sty
tex2mail: support/tex2mail
txt: support/txt

83 Conversion from SGML or HTML to TeX
SGML is a very important system for document storage and interchange, but it has no
formatting features; its companion ISO standard DSSSL (see http://www.jclark.com/dsssl/)
is designed for writing transformations and formatting, but this has not
yet been widely implemented. Some SGML authoring systems (e.g., SoftQuad Author/
Editor) have formatting abilities, and there are high-end specialist SGML typesetting
systems (e.g., Miles33’s Genera). However, the majority of SGML users probably
transform the source to an existing typesetting system when they want to print. TeX is
a good candidate for this. There are three approaches to writing a translator:

1. Write a free-standing translator in the traditional way, with tools like yacc and lex;
this is hard, in practice, because of the complexity of SGML.
2. Use a specialist language designed for SGML transformations; the best known are
probably Omnimark and Balise. They are expensive, but powerful, incorporating
SGML query and transformation abilities as well as simple translation.

3. Build a translator on top of an existing SGML parser. By far the best-known (and
free!) parser is James Clark’s nsgmls, and this produces a much simpler output
format, called ESIS, which can be parsed quite straightforwardly (one also has the
benefit of an SGML parse against the DTD). Two good public domain packages
use this method:
• David Megginson’s sgmlspm, written in Perl 5.
• Joachim Schrod and Christine Detig’s STIL, (‘SGML Transformations in
Lisp’).
Both of these allow the user to write ‘handlers’ for every SGML element, with
plenty of access to attributes, entities, and information about the context within
the document tree.
If these packages don’t meet your needs for an average SGML typesetting job, you
need the big commercial stuff.

Since HTML is simply an example of SGML, we do not need a specific system for
HTML. However, Nathan Torkington developed html2latex from the HTML parser in
NCSA’s Xmosaic package. The program takes an HTML file and generates a LaTeX
file from it. The conversion code is subject to NCSA restrictions, but the whole source
is available on CTAN.
Michel Goossens and Janne Saarela published a very useful summary of SGML,
and of public domain tools for writing and manipulating it, in TUGboat 16(2).
html2latex source: support/html2latex

84 Conversion from (La)TeX to HTML
TeX and LaTeX are well suited to producing electronically publishable documents.
However, it is important to realize the difference between page layout and functional
markup. TeX is capable of extremely detailed page layout; HTML is not, because
HTML is a functional markup language not a page layout language. HTML’s exact
rendering is not specified by the document that is published but is, to some degree, left
to the discretion of the browser. If you require your readers to see an exact replication
of what your document looks like to you, then you cannot use HTML and you must
use some other publishing format such as PDF. That is true for any HTML authoring
tool.
TeX’s excellent mathematical capabilities remain a challenge in the business of
conversion to HTML. There are only two generally reliable techniques for generating
mathematics on the web: creating bitmaps of bits of typesetting that can’t be trans-
lated, and using symbols and table constructs. Neither technique is entirely satisfac-
tory. Bitmaps lead to a profusion of tiny files, are slow to load, and are inaccessible
to those with visual disabilities. The symbol fonts offer poor coverage of mathemat-
ics, and their use requires configuration of the browser. The future of mathematical
browsing may be brighter — see future Web technologies.
For today, possible packages are:
LaTeX2HTML a Perl script package that supports LaTeX only, and generates math-
ematics (and other “difficult” things) using bitmaps. The original version was
written by Nikos Drakos for Unix systems, but the package now sports an illustri-
ous list of co-authors and is also available for Windows systems. Michel Goossens
and Janne Saarela published a detailed discussion of LaTeX2HTML, and how to
tailor it, in TUGboat 16(2).
A mailing list for users may be found via http://tug.org/mailman/listinfo/latex2html
TtH a compiled program that supports either LaTeX or Plain TeX, and uses the
font/table technique for representing mathematics. It is written by Ian Hutchin-
son, using flex. The distribution consists of a single C source (or a compiled
executable), which is easy to install and very fast-running.
Tex4ht a compiled program that supports either LaTeX or Plain TeX, by processing
a DVI file; it uses bitmaps for mathematics, but can also use other technologies
where appropriate. Written by Eitan Gurari, it parses the DVI file generated when
you run (La)TeX over your file with tex4ht’s macros included. As a result, it’s
pretty robust against the macros you include in your document, and it’s also pretty
fast.

TeXpider a commercial program from MicroPress (see Micropress), which is de-
scribed on http://www.micropress-inc.com/webb/wbstart.htm; it uses
bitmaps for equations.
Hevea a compiled program that supports LaTeX only, and uses the font/table tech-
nique for equations (indeed its entire approach is very similar to TtH). It is writ-
ten in Objective CAML by Luc Maranget. Hevea isn’t archived on CTAN; de-
tails (including download points) are available via http://pauillac.inria.fr/~maranget/hevea/

An interesting set of samples, including conversion of the same text by the four
free programs listed above, is available at http://www.mayer.dial.pipex.com/samples/example.htm;
a linked page gives lists of pros and cons, by way of comparison.
The World Wide Web Consortium maintains a list of “filters” to HTML, with
sections on (La)TeX and BibTeX — see http://www.w3.org/Tools/Word_proc_filters.html
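By way of a sketch (the file name foo.tex is invented, and you should check each tool's documentation for options), the free converters are all driven from the command line:
latex2html foo.tex
htlatex foo.tex
tth < foo.tex > foo.html
where htlatex is one of the convenience scripts distributed with tex4ht.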
latex2html: Browse support/latex2html
tex4ht: support/TeX4ht/tex4ht-all.zip
tth: nonfree/support/tth/dist/tth_C.tgz
85 Other conversions to and from (La)TeX
troff Tr2latex assists in the translation of a troff document into LaTeX 2.09 format. It
recognises most -ms and -man macros, plus most eqn and some tbl preprocessor
commands. Anything fancier needs to be done by hand. Two style files are pro-
vided. There is also a man page (which converts very well to LaTeX. . . ). Tr2latex
is an enhanced version of the earlier troff-to-latex (which is no longer available).
WordPerfect wp2latex has recently been much improved, and is now available either
for MS-DOS or for Unix systems, thanks to its current maintainer Jaroslav Fojtik.
PC-Write pcwritex.arc is a print driver for PC-Write that “prints” a PC-Write V2.71
document to a TeX-compatible disk file. It was written by Peter Flynn at University
College, Cork, Republic of Ireland.
runoff Peter Vanroose’s rnototex conversion program is written in VMS Pascal. The
sources are distributed with a VAX executable.
refer/tib There are a few programs for converting bibliographic data between BibTeX
and refer/tib formats. The collection includes a shell script converter from BibTeX
to refer format as well. The collection is not maintained.
RTF Rtf2tex, by Robert Lupton, is for converting Microsoft’s Rich Text Format to
TeX. There is also a converter to LaTeX by Erwin Wechtl, called rtf2latex. The
latest converter, by Ujwal Sathyam and Scott Prahl, is rtf2latex2e; this system
seems rather good already, and is still being improved.
Translation to RTF may be done (for a somewhat constrained set of LaTeX doc-
uments) by TeX2RTF, which can produce ordinary RTF, Windows Help RTF, or
HTML (see conversion to HTML). TeX2RTF is supported on various Unix platforms
and under Windows 3.1.
Microsoft Word A rudimentary (free) program for converting MS-Word to LaTeX
is wd2latex, which runs on MS-DOS. Word2TeX and TeX2Word are shareware
translators from Chikrii Softlab; users’ reports are very positive.
If cost is a constraint, the best bet is probably to use an intermediate format such
as RTF or HTML. Word outputs and reads both, so in principle this route may be
useful.
Another, unlikely, intermediate form is PDF: Acrobat Reader for Windows (ver-
sion 5.0 and later) will output rather feeble RTF that Word can read.
Excel Excel2Latex converts an Excel file into a LaTeX tabular environment; it comes
as a .xls file which defines some Excel macros to produce output in a new format.
Wilfried Hennings’ FAQ, which deals specifically with conversions between TeX-
based formats and word processor formats, offers much detail as well as tables that
allow quick comparison of features.
A group at Ohio State University (USA) is working on a common document format
based on SGML, with the ambition that any format could be translated to or from
this one. FrameMaker provides “import filters” to aid translation from alien formats
(presumably including TeX) to FrameMaker’s own.
excel2latex : support/excel2latex/xl2latex.zip
pcwritex.arc: support/pcwritex
refer and tib tools: biblio/bibtex/utils/refer-tools
rnototex : support/rnototex
rtf2latex : support/rtf2latex
rtf2latex2e: support/rtf2latex2e
rtf2tex : support/rtf2tex
tex2rtf : support/tex2rtf
tr2latex : support/tr2latex
wd2latex : dviware/wd2latex
wp2latex : support/wp2latex
Word processor FAQ (source): help/wp-conv
86 Using TeX to read SGML or XML directly
This can nowadays be done, with a certain amount of clever macro programming.
David Carlisle’s xmltex is the prime example; it offers a practical solution to typesetting
XML files.
One use of a TeX that can typeset XML files is as a backend processor for XSL
formatting objects, serialized as XML. Sebastian Rahtz’s PassiveTeX uses xmltex to
achieve this end.
xmltex : macros/xmltex/base
passivetex : macros/xmltex/contrib/passivetex

87 Retrieving (La)TeX from DVI, etc.
The job just can’t be done automatically: DVI, PostScript and PDF are “final” formats,
supposedly not susceptible to further editing — information about where things came
from has been discarded. So if you’ve lost your (La)TeX source (or never had the
source of a document you need to work on) you’ve a serious job on your hands. In
many circumstances, the best strategy is to retype the whole document, but this strategy
is to be tempered by consideration of the size of the document and the potential typists’
skills.
If automatic assistance is necessary, it’s unlikely that any more than text retrieval
is going to be possible; the (La)TeX markup that creates the typographic effects of the
document needs to be recreated by editing.
If the file you have is in DVI format, many of the techniques for converting (La)TeX
to ASCII (converting (La)TeX to ASCII) are applicable. Consider dvi2tty, crudetype
and catdvi. Remember that there are likely to be problems finding included material
(such as included PostScript figures, that don’t appear in the DVI file itself), and math-
ematics is unlikely to convert easily.
To retrieve text from PostScript files, the ps2ascii tool (part of the ghostscript dis-
tribution) is available. One could try applying this tool to PostScript derived from an
PDF file using pdf2ps (also from the ghostscript distribution), or Acrobat Reader itself;
an alternative is pdftotext, which is distributed with xpdf .
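A plausible command sequence (the file names are invented; consult the tools' documentation for options) would be:
ps2ascii foo.ps foo.txt
pdf2ps foo.pdf foo.ps
pdftotext foo.pdf foo.txt
the first working on an existing PostScript file, the second and third starting from a PDF file.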
Another avenue available to those with a PDF file they want to process is offered
by Adobe Acrobat (version 5 or later): you can tag the PDF file into a structured doc-
ument, output thence to well-formed XHTML, and import the results into Microsoft
Word (2000 or later). From there, one may convert to (La)TeX using one of the tech-
niques discussed in converting to and from (La)TeX.
The result will typically (at best) be poorly marked-up. Problems may also arise
from the oddity of typical TeX font encodings (notably those of the maths fonts), which
Acrobat doesn’t know how to map to its standard Unicode representation.
catdvi: dviware/catdvi
crudetype: dviware/crudetype
dvi2tty : nonfree/dviware/dvi2tty
ghostscript: Browse nonfree/support/ghostscript
xpdf : Browse support/xpdf
88 Translating LaTeX to Plain TeX
Unfortunately, no “general”, simple, automatic process is likely to succeed at this task.
See “How does LaTeX relate to Plain TeX” (How does LaTeX relate to Plain TeX) for
further details.
Translating a document designed to work with LaTeX into one designed to
work with Plain TeX is likely to amount to carefully including (or otherwise re-
implementing) all those parts of LaTeX, beyond the provisions of Plain TeX, which
the document uses.

K Installing (La)TeX files
89 Installing a new package
The first step in installing a new package for your LaTeX system is usually to find
where it is and then to get it, usually from CTAN. However MiKTeX, since version
2.1, offers a simpler procedure than that described here, for packages it knows about.
Ordinarily, you should download the whole distribution directory; the only occa-
sion when this is not necessary is when you are getting something from one of the
(La)TeX contributed “misc” directories on CTAN; these directories contain collections
of single files, which are supposedly complete in themselves.
A small package <smallpack> might be just a single .sty file (typically smallpack.
sty) with the usage instructions either included as comments in the file or in a separate
user manual or README file. More often a package <pack> will come as a pair of files,
pack.ins and pack.dtx, written to be used with the LaTeX doc system. The package
code must be extracted from these files. If there is a README file as part of the package
distribution, read it!
In the doc system, the user manual and documented package code are in the .dtx
file, and the .ins file contains LaTeX instructions on what code should be extracted
from the .dtx file. To unpack a doc package <pack>, do the following:
• Run latex on pack.ins. This will generate one or more files (normally a pack.sty
file but there may be others depending on the particular package).
• Run latex on pack.dtx as a start to getting the user manual and possibly a com-
mented version of the package code.
• Run latex again on pack.dtx, which should resolve any references and generate a
Table of Contents if it was called for.
• LaTeX may have said “No file pack.ind”; this file holds the sorted index of
commands. If you want the index, process the raw index material with:
makeindex -s gind.ist pack
and run LaTeX again.
• Print and read pack.dvi
Sometimes a user manual is supplied separately from the .dtx file. Process this after
doing the above, just in case the user manual uses the package it is describing.
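Gathering those steps together, the whole job for a hypothetical package pack might look like this at the command prompt:
latex pack.ins
latex pack.dtx
latex pack.dtx
makeindex -s gind.ist pack
latex pack.dtx
the repeated LaTeX runs being needed to resolve cross-references and (after makeindex) to incorporate the command index.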
Almost the final stage of the installation is to put the package file(s) ‘where LaTeX
can find them’. Where the magic place is, and how you put the files there depends
on your particular LaTeX system and how it is set up (see the TeX directory structure
standard for general principles, where to put files for specific advice).
The final stage is to tell LaTeX that there is a new file, or files, that it should be
able to go and find. Most free LaTeX systems maintain a database of the names and
locations of latex-related files to enable faster searching. In these systems the database
must be updated, using the script or program provided with the distribution for this
purpose.
teTeX Run:
texhash
web2c On a current web2c distribution, texhash ought to work; if it doesn’t, run
mktexlsr
MiKTeX On a MiKTeX distribution earlier than v2.0, click Start→Programs→
MiKTeX→Maintenance→Refresh filename database,
or get a DOS window and run:
initexmf --update-fndb
On a MiKTeX distribution v2.0 or later, do:
Start→Programs→MiKTeX 2→MiKTeX Options, and press the Refresh now
button (Update filename database in earlier versions of MiKTeX).

Remember that a \usepackage{pack} command must be put in the preamble of
each document in which you want to use the pack package.
90 Where to put new files
Where precisely you put files that you have downloaded does depend on what TeX
distribution you have. However, assuming that you have one of the modern TDS-
compliant distributions (such as teTeX or MiKTeX) there are some general rules that
you can follow:
(1) Always install new files in a local texmf tree. The root directory will be named
something like:
teTeX: /usr/share/texmf-local/ or
/usr/local/share/texmf/
fpTeX: c:\Programs\TeXLive\texmf-local\
MiKTeX: c:\localtexmf\
(In fact, a teTeX system can be asked to tell you what its local root is; on a Unix system,
the command to use is:

kpsewhich -expand-var "\$TEXMFLOCAL"

the output being the actual path.)


Let’s write $TEXMF for this root, whatever it is for your system.
(2) In your local texmf tree, imitate the directory structure in your main tree. Here are
some examples of where files of given extensions should go:
.sty, .cls or .fd: $TEXMF/tex/latex/<package>/
.dvi, .ps or .pdf: $TEXMF/doc/latex/<package>/
.mf: $TEXMF/fonts/source/<supplier>/<font>/
.tfm: $TEXMF/fonts/tfm/<supplier>/<font>/
.vf: $TEXMF/fonts/vf/<supplier>/<font>/
.afm: $TEXMF/fonts/afm/<supplier>/<font>/
.pfb: $TEXMF/fonts/type1/<supplier>/<font>/
.ttf: $TEXMF/fonts/truetype/<supplier>/<font>/
.pool, .fmt, .base or .mem: $TEXMF/web2c
and for modern systems (distributed in 2005 or later, such as teTeX 3.0, using TDS
v1.1 layouts):
.map: $TEXMF/fonts/map/<syntax>/<bundle>/
.enc: $TEXMF/fonts/enc/<syntax>/<bundle>/
Where of course <package>, <font> and <supplier> depend upon what’s appropriate for
the individual file. The <syntax> (for .map and .enc files) is a categorisation based
on the way the files are written; typically, it’s the name of a program such as dvips or
pdftex.
“Straight” (La)TeX input can take other forms than the .sty, .cls or .fd listed
above, too. Examples are .sto and .clo for package and class options, .cfg for
configuration information, and so on.
Note that <font> may stand for a single font or an entire family: for example, files
for all of Knuth’s Computer Modern fonts are to be found in .../public/cm, with
various prefixes as appropriate.
The font “supplier” public is a sort of hold-all for “free fonts produced for use with
(La)TeX”: as well as Knuth’s fonts, public’s directory holds fonts designed by others
(originally, but no longer exclusively, in MetaFont).
Some packages have configuration files (commonly with file suffix .cfg), and
occasionally other run-time files. The package documentation should mention these
things, but sometimes doesn’t. A common exception is the .drv file, used by some
packages as part of the documentation building process; this is a hang-over from the
pre-LaTeX 2ε predecessor of the package documentation process.
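As a concrete illustration (the names foo and bar are invented): a package foo distributed as foo.sty with documentation foo.pdf would be installed as
$TEXMF/tex/latex/foo/foo.sty
$TEXMF/doc/latex/foo/foo.pdf
while the metric and Type 1 files of a hypothetical free font family bar would go to
$TEXMF/fonts/tfm/public/bar/
$TEXMF/fonts/type1/public/bar/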

91 Installing MiKTeX “known packages”
MiKTeX 2.1 (and later) maintains a database of packages it “knows about”, together
with (coded) installation instructions that enable it to install the packages with minimal
user intervention.
If MiKTeX does know about a package you need installed, it’s worth using the
system.
First, open the MiKTeX packages window: click on Start→Programs→MiKTeX→
MiKTeX Options, and select the Packages tab.
On the tab, there is an Explorer-style display of packages. Right-click on the root
of the tree, “MiKTeX Packages”, and select “Search”: enter the name of the package
you’re interested in, and press the “Search” button. If MiKTeX knows about your
package, it will open up the tree to show you a tick box for your package: check that
box.
If necessary, repeat to select other packages, and then press “OK”; MiKTeX tells you
how many packages you have selected — if you’re happy, press “OK” again. MiKTeX
will go off, download the package (as a .cab file), if necessary, install the files of the
package, and then refresh the filename database so that the files will be found.
If you prefer a command-line utility, there’s mpm. Open a command shell, and
type:
mpm --install=<package>
(which of course assumes you know the name by which MiKTeX refers to your
package).
If MiKTeX doesn’t know about the package you’re interested in, you have to use
the long-winded procedure outlined elsewhere in this FAQ.
92 “Temporary” installation of (La)TeX files
Operating systems and applications need to know where to find files: many files that
they need are “just named” — the user doesn’t necessarily know where they are, but
knows to ask for them. The commonest case, of course, is that of the commands whose
names you type to a shell (yes, even Windows’ “MS-DOS prompt” uses a shell to read
what you type): many of the commands simply involve loading and executing a file, and
the PATH variable tells the shell where to find those files.
Modern TeX implementations come with a bunch of search paths built in to them.
In most circumstances these paths are adequate, but one sometimes needs to extend
them to pick up files in strange places: for example, we may wish to try a new bundle
of packages before installing them ‘properly’ (installing them ‘properly’). To do this,
we need to change the relevant path as TeX perceives it. However, we don’t want to
throw away TeX’s built-in path (all of a sudden, TeX won’t know how to deal with all
sorts of things).
To extend a TeX path, we define an operating system environment variable in ‘path
format’, but leaving a gap which TeX will fill with its built-in value for the path. The
commonest case is that we want to place our extension in front of the path, so that our
new things will be chosen in preference, so we leave our ‘gap to be filled’ at the end of
the environment variable. The syntax is simple (though it depends which shell you’re
actually using): so on a Unix-like system, using the bash shell, the job might be done
like:
export TEXINPUTS=/tmp:
while in a Windows system, within a MS-DOS window, it would be:
set TEXINPUTS=C:/temp;
In either case, we’re asking TeX to load files from the temporary directory at the root
of the disc; in the Unix case, the “empty slot” is designated by putting the path separator ‘:’
on its own at the end of the line, while in the Windows case, the technique is the same,
but the path separator is ‘;’.
Note that in either sort of system, the change will only affect instances of TeX that
are started from the shell where the environment variable was set. If you run TeX from
another window, it will use the original input path. To make a change of input path
that will “stick” for all windows, set the environment variable in your login script or
profile (or whatever) in a Unix system and log out and in again, or in autoexec.bat
in a Windows system, and reboot the system.

While all of the above has talked about where TeX finds its macro files, it’s appli-
cable to pretty much any sort of file any TeX-related program reads — there are lots
of these paths, and of their corresponding environment variables. In a web2c-based
system, the copious annotations in the texmf.cnf system configuration file help you
to learn which path names correspond to which type of file.
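For example, on a Unix-like system using bash, you might check that the extension has “taken” by asking kpsewhich (part of web2c) where a file would be found; the package name is invented:
export TEXINPUTS=/tmp:
kpsewhich mynewpackage.sty
If mynewpackage.sty is sitting in /tmp, kpsewhich should report /tmp/mynewpackage.sty; if it reports a path in one of the standard trees, your setting hasn't reached the process you're running.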
93 “Private” installations of files
It sometimes happens that you need a new version of some macro package or font,
but that the machine you use is maintained by someone who’s unwilling to update
and won’t give you privileges to do the job yourself. A “temporary” installation is
sometimes the correct approach, but if there’s the slightest chance that the installation
will be needed on more than one project, temporary installations aren’t right.
In circumstances where you have plenty of quota on backed-up media, or adequate
local scratch space, the correct approach is to create a private installation of (La)TeX
which includes the new stuff you need; this is the ideal, but is not generally possible.
So, since you can’t install into the public texmf tree, you have to install into a
texmf of your own; fortunately, the TDS standard allows for this, and teTeX 2.0 ac-
tually makes provision for it, defining an internal variable HOMETEXMF which points
to the directory $HOME/texmf. (TeTeX 1.0 had the definition, but suppressed it with
comment markers.)
So, install your new package (or whatever) in the correct place in a tree based on
$HOME/texmf, and generate an index of that tree

texhash $HOME/texmf

(the argument specifies which tree you are indexing: it’s necessary since you don’t, by
hypothesis, have access to the main tree, and texhash without the argument would try
to write the main tree).
There are two wrinkles to this simple formula: first, the installation you’re using
may not define HOMETEXMF (teTeX 1.0 didn’t, for example), and second, there may be
some obstruction to using $HOME/texmf as the default name. In either case, a good
solution is to have your own texmf.cnf — an idea that sounds more frightening than
it actually is. The installation’s existing file may be located with the command:

kpsewhich texmf.cnf

Take a copy of the file and put it into a directory of your own; this could be any di-
rectory, but an obvious choice is the web2c directory of the tree you want to create,
i.e., $HOME/texmf/web2c or the like. Make an environment variable to point to this
directory:

TEXMFCNF=$HOME/texmf/web2c
export TEXMFCNF

(for a Bourne shell style system), or

setenv TEXMFCNF $HOME/texmf/web2c

(for a C-shell style system). Now edit the copy of texmf.cnf.


There will be a line in the existing file that defines the tree where everything
searches; the simplest form of the line is:

TEXMF = !!$TEXMFMAIN

but, as teTeX 1.0 is distributed, there are several alternative settings behind comment
markers (“%”), and the person who installed your system may have left them there.
Whatever, you need to modify the line that’s in effect: change the above to three lines:

HOMETEXMF = $HOME/texmf
TEXMF = {$HOMETEXMF,!!$TEXMFMAIN}
% TEXMF = !!$TEXMFMAIN

the important point being that $HOMETEXMF must come before whatever was there be-
fore, inside the braces. For example, if the original was

TEXMF = {!!$LOCALTEXMF,!!$TEXMFMAIN}
it should be converted to:
HOMETEXMF = $HOME/texmf
TEXMF = {$HOMETEXMF,!!$LOCALTEXMF,!!$TEXMFMAIN}
% TEXMF = {!!$LOCALTEXMF,!!$TEXMFMAIN}

(retaining the original, as a comment, is merely an aide-memoire in case you need to
make another change, later). The !! signs tell the file-searching library that it should
insist on a texhash-ed directory tree; if you can count on yourself remembering to run
texhash on your new tree every time you change it, then it’s worth adding the marks to
your tree:
TEXMF = {!!$HOMETEXMF,!!$LOCALTEXMF,!!$TEXMFMAIN}

as this will make (La)TeX find its files marginally faster.


Having made all these changes, (La)TeX should “just use” files in your new tree,
in preference to anything in the main tree — you can use it for updates to packages in
the main tree, as well as for installing new versions of things.
94 Installing a new font
Fonts are really “just another package”, and so should be installed in the same sort
of way as packages. However, fonts tend to be more complicated than the average
package, and as a result it’s sometimes difficult to see the overall structure.
Font files may appear in any of a large number of different formats; each format has
a different function in a TeX system, and each is stored in its own sub-tree of the
installation’s TDS tree; all these sub-trees have the directory $TEXMF/fonts as
their root. A sequence of answers, below, describes the installation of fonts. Other
answers discuss specific font families — see, for example, using the concrete fonts.
95 Installing a font provided as MetaFont source
MetaFont fonts are (by comparison with other sorts of font) rather pleasingly simple.
Nowadays, they are mostly distributed just as the MetaFont source, since modern TeX
distributions are able to produce everything the user needs “on the fly”; however, if the
distribution does include TFM files, do install them, too, since they save a little time
and don’t occupy much disc space. Always distrust distributions of PK font bitmap
files: there’s no way of learning from them what printer they were generated for, and
naming schemes under different operating systems are another source of confusion.
“Where to install files” specifies where the files should go.
Further confusion is introduced by font families whose authors devise rules for
automatic generation of MetaFont sources for generating fonts at particular sizes; the
installation has to know about the rules, as otherwise it cannot generate font files.
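To test such an installation without typesetting a whole document, a web2c-based system can be asked to exercise the “on the fly” machinery directly; for a hypothetical font foo10.mf (the name is invented) something like
mktextfm foo10
should run MetaFont, produce foo10.tfm, and report where it has put the result, provided the filename database has been refreshed so that the source can be found.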
96 Installing a PostScript printer built-in font
There is a “standard” set of fonts that has appeared in every PostScript printer since the
second generation of the type. These fonts (8 families of four text fonts each, and three
special-purpose fonts) are of course widely used, because of their simple availability.
The set consists of:
• Times family (4 fonts)
• Palatino family (4 fonts)
• New Century Schoolbook family (4 fonts)
• Bookman family (4 fonts)
• Helvetica family (4 fonts)
• Avant Garde (4 fonts)
• Courier family (4 fonts)
• Utopia family (4 fonts)
• Zapf Chancery (1 font)
• Zapf Dingbats (1 font)
• Symbol (1 font)
All these fonts are supported, for LaTeX users, by the psnfss set of metrics and support
files in the file lw35nfss.zip on CTAN. Almost any remotely modern TeX system
will have some version of psnfss installed, but users should note that the most recent
version has much improved coverage of maths-with-Times and -Palatino, as well as a
more reliable set of font metrics.
The archive lw35nfss.zip is laid out according to the TDS, so in principle, instal-
lation consists simply of “unzipping” the file at the root of a texmf tree.
Documentation of the psnfss bundle is provided in psnfss2e.pdf in the distribu-
tion.
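On a Unix-like teTeX system, for example, the installation might amount to no more than the following (the tree name is one of those discussed in “where to put new files”; adjust to suit your system):
cd /usr/local/share/texmf
unzip lw35nfss.zip
texhash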
psnfss bundle: macros/latex/required/psnfss

97 Installing the Bluesky versions of the CM fonts
This is a specialised case of installing a font (installing a font), but it comes easier than
most, since the font metrics are installed in every (La)TeX system before you even start.
Indeed, most recent systems will have the Type 1 fonts themselves already installed, so
that the job is already done, and all you need is to start using them: so the first thing
to do is to just try it. On a system that uses dvips (most systems nowadays do), try the
sequence:
latex sample2e
dvips -Pcmz -Pamz -o sample2e.ps sample2e
at a “command prompt” (shell, in a Unix-style system, “DOS box” in a Windows sys-
tem).
If the command works at all, the console output of the command will include a
sequence of Type 1 font file names, listed as <cmr10.pfb> and so on; this is dvips
telling you it’s including the Type 1 font, and you need do no more.
If the test has failed, you need to install your own set of the fonts.
The CTAN directories listed below contain compressed archives of the Type 1 files
for various architectures, both for the Computer Modern fonts and for the AMS fonts
of mathematical and other useful things. Download the archives that are appropriate
for your architecture, and extract the files — you only actually need the contents of the
pfb directories, since you already have the fonts installed in the “ordinary” way, so that
the TFM files are already present. (You don’t need the PostScript font metric — AFM
and PFM — files in any case.)
The files should go into your local texmf tree (texmf.local, texmf-local,
localtexmf, or whatever). Create directories at offsets fonts/type1/bluesky/cm
and fonts/type1/bluesky/ams, and copy the pfb files into them.
Now you need to tell dvips, PDFTeX, etc., that the fonts are available. This is done
by use of a map file, which lists font name (as TeX understands it), font name (as it
appears in the type 1 file itself), and where the program will find the file. Map files are
provided in the download bundles for the AMS fonts; for the CM fonts, map files are
available separately.
The set of map files includes files config.*; each of these contains an instruction
to load a single map file. For ordinary use, you instruct dvips to load the “detailed”
map of the CM fonts by use of the command:
dvips -Pcmz myfile
The same can be done with the AMS fonts, and you may invoke both sets of fonts with:
dvips -Pcmz -Pamz myfile
Alternatively, the contents of config.cmz and config.amz could be combined into a
single file, perhaps config.bluesky, loaded by the command
dvips -Pbluesky myfile
Remember, after all such changes, the file-name database must be refreshed (file-
name database must be refreshed).
AMS fonts: Browse fonts/amsfonts/ps-type1
CM fonts: Browse fonts/cm/ps-type1/bluesky
CM font maps: fonts/cm/ps-type1/bluesky-contrib/dvips

98 Installing a Type 1 font
The process of installing a Type 1 font set is rather convoluted, but it may be separated
into a modest set of stages.

• Acquire the font. A very small set of Type 1 fonts is installed in most PostScript
printers you will encounter. For those few (whose use is covered by the basic
PSNFSS package), you don’t need the Type 1 font itself, to be able to print using
the font.

For all the myriad other Type 1 fonts, to be able to print using the font you need the
Type 1 file itself. Some of these are available for free (they’ve either been donated
to the public domain, or were developed as part of a free software project), but the
vast majority are commercial products, requiring you to spend real money.
• Acquire the fonts’ AFM files. AFM files contain information from the font
foundry, about the sizes of the characters in the font, and how they fit together.
One measure of the quality of a font-supplier is that they provide the AFM files
by default: if the files are not available, you are unlikely to be able to use the font
with (La)TeX.
• Rename the AFM files and the Type 1 files to match the Berry font naming scheme.
• Generate TeX metric files from the AFM files. The commonest tool for this task
is fontinst; the package documentation helps, but other guides are available (see
below). The simplest possible script to pass to fontinst is:
\latinfamily{xyz}{}
\bye
where xyz is the Berry name of the font family. This simple script is adequate
for most purposes: its output covers the font family in both T1 and OT1 font
encodings. Nevertheless, with fancier fonts, more elaborate things are possible
with fontinst: see the documentation for details.
Fontinst also generates map files, and LaTeX font definition (.fd) files.
• Install the files, in your texmf tree. All the strictures about installing non-standard
things apply here: be sure to put the files in the local tree. The list gives reasonable
destinations for the various files related to a font whose Berry name is <bname>:
.pfb, .pfa .../fonts/type1/<foundry>/<bname>
.tfm .../fonts/tfm/<foundry>/<bname>
.vf .../fonts/vf/<foundry>/<bname>
.fd .../tex/latex/fontinst/<foundry>/<bname>
The irregular things being .map files: in teTeX 3.0 and later, these should be placed
according to the revised TDS as
.map .../fonts/map/dvips/<foundry>
and in other (earlier) systems as
.map .../dvips/fontinst/<foundry>
• Regenerate the file indexes (as described in package installation).
• Update the dvips and other maps:
– On a teTeX system earlier than version 2.0, edit the file $TEXMF/dvips/
config/updmap and insert an absolute path for the <xyz>.map just after the line
that starts extra_modules=" (and before the closing quotes).
– On a teTeX version 2.0 (or later), execute the command
updmap --enable Map <xyz>.map
– On a MiKTeX system earlier than version 2.2, the “Refresh filename database”
operation, which you performed after installing files, also updates the system’s
“PostScript resources database”.
– On a MiKTeX system, version 2.2 or later, update updmap.cfg which is
described in MiKTeX online documentation. Then execute the command
initexmf --mkmaps, and the job is done.

The whole process is very well (and thoroughly) described in Philipp Lehman’s guide
to font installation, which may be found on CTAN.
fontinst.sty : fonts/utilities/fontinst
Type 1 installation guide: info/Type1fonts/fontinstallationguide/
fontinstallationguide.pdf

L Fonts
L.1 MetaFont fonts
99 Getting MetaFont to do what you want
MetaFont allows you to create your own fonts, and most TeX users will never need
to use it. MetaFont, unlike TeX, requires some customisation: each output device for
which you will be generating fonts needs a mode associated with it. Modes are defined
using the mode_def convention described on page 94 of The MetaFontbook (see TeX-
related books). You will need a file, conventionally called local.mf, containing
all the mode_defs you will be using. If local.mf doesn’t already exist, Karl Berry’s
collection of modes (modes.mf) is a good starting point (it can be used as a ‘local.mf’
without modification in a ‘big enough’ implementation of MetaFont). Lists of settings
for various output devices are also published periodically in TUGboat (see TUG). Now
create a plain base file using inimf , plain.mf, and local.mf:

% inimf
This is METAFONT. . .
**plain                      (you type ‘plain’)
(output)
*input local                 (you type this)
(output)
*dump                        (you type this)
Beginning to dump on file plain. . .
(output)

This will create a base file named plain.base (or something similar; for example,
it will be PLAIN.BAS on MS-DOS systems) which should be moved to the directory
containing the base files on your system (note that some systems have two or more
such directories, one for each ‘size’ of MetaFont used).
Now you need to make sure MetaFont loads this new base when it starts up. If
MetaFont loads the plain base by default on your system, then you’re ready to go.
Under Unix (using the default web2c distribution, in which the command name is
symbolically linked to virmf, and virmf loads command_name.base) this does indeed
happen, but we could for instance define a command mf which executes virmf &plain,
loading the plain base file.
The usual way to create a font with plain MetaFont is to start it with the line

\mode=<mode name>; mag=<magnification>; input <font file name>

in response to the ‘**’ prompt or on the MetaFont command line. (If <mode name> is
unknown or omitted, the mode defaults to ‘proof’ and MetaFont will produce an out-
put file called <font file name>.2602gf) The <magnification> is a floating point
number or ‘magstep’ (magsteps are defined in The MetaFontbook and The TeXbook).
If mag=<magnification> is omitted, then the default is 1 (magstep 0). For example,
to generate cmr10 at 12pt for an epson printer you would type
mf \mode=epson; mag=magstep 1; input cmr10
Note that under Unix the \ and ; characters must usually be quoted or escaped, so this
would typically look something like
mf ’\mode=epson; mag=magstep 1; input cmr10’
If you don’t have inimf or need a special mode that isn’t in the base, you can put its
commands in a file (e.g., ln03.mf) and invoke it on the fly with the \smode command.
For example, to create cmr10.300gf for an LN03 printer, using the file
% This is ln03.mf as of 1990/02/27
% mode_def courtesy of John Sauter
proofing:=0;
fontmaking:=1;
tracingtitles:=0;
pixels_per_inch:=300;
blacker:=0.65;

fillin:=-0.1;
o_correction:=.5;
(note the absence of the mode_def and enddef commands), you would type
mf \smode="ln03"; input cmr10
This technique isn’t one you should regularly use, but it may prove useful if you acquire
a new printer and want to experiment with parameters, or for some other reason are
regularly editing the parameters you’re using. Once you’ve settled on an appropriate
set of parameters, you should use them to rebuild the base file that you use.
Other sources of help are mentioned in MetaFont and MetaPost Tutorials.
modes.mf : fonts/modes/modes.mf

100 Which font files should be kept
MetaFont produces from its run three files, a metrics (TFM) file, a generic font (GF)
file, and a log file; all of these files have the same base name as does the input (e.g.,
if the input file was cmr10.mf, the outputs will be cmr10.tfm, cmr10.nnngf and
cmr10.log; note that the GF file name may be transmuted by operating systems, such
as MS-DOS, that don’t permit long file names).
For TeX to use the font, you need a TFM file, so you need to keep that. However,
you are likely to generate the same font at more than one magnification, and each time
you do so you’ll (incidentally) generate another TFM file; these files are all the same,
so you only need to keep one of them.
To preview or to produce printed output, the DVI processor will need a font raster
file; this is what the GF file provides. However, while there used (once upon a time) to
be DVI processors that could use GF files, modern processors use packed raster (PK)
files. Therefore, you need to generate a PK file from the GF file; the program gftopk
does this for you, and once you’ve done that you may throw the GF file away.
The log file should never need to be used, unless there was some sort of problem in
the MetaFont run, and need not be ordinarily kept.
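For example, if a MetaFont run has just left cmr10.300gf in the current directory,
gftopk cmr10.300gf cmr10.300pk
creates the packed version, after which the GF file may be deleted (the names are illustrative; your system may place and name such files automatically).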
101 Acquiring bitmap fonts
When CTAN was young, most people would start using TeX with a 300 dots-per-inch
(dpi) laser printer, and sets of Computer Modern bitmap fonts for this resolution are
available on CTAN. (There are separate sets for write-black and write-white printers,
as well as sets at 120 dpi and 240 dpi.)
Over the years, there have been regular requests that CTAN should hold a wider
range of resolutions, but they were resisted for two reasons:

• The need to decide which printers to generate fonts for. The broad-brush approach
taken for 300 dpi printers was (more or less) justified back then, given the dom-
inance of certain printer ‘engines’, but nowadays one could not make any such
assumption.
• Given the above, justifying the space taken up by a huge array of bitmap fonts.

Fortunately, (La)TeX distribution technology has put a stop to these arguments: most
(if not all) current distributions generate bitmap fonts as needed, and cache them for
later re-use. The impatient user, who is determined that all bitmap fonts should be
created once and for all, may be supported by scripts such as allcm (distributed with
teTeX, at least; otherwise such a person should consult "the use of MetaFont").
If your output is to a PostScript-capable device, you are advised to switch to using
Type 1 versions of the CM fonts. Two free sets are currently available; the older
(bakoma) is somewhat less well produced than the bluesky fonts, which were origi-
nally professionally produced and sold, but were then donated to the public domain
by their originators Y&Y and Bluesky Research, in association with the AMS. Unfor-
tunately, the coverage of the sets is slightly different, but you are advised to use the
bluesky set except when bakoma is for some reason absolutely unavoidable. In recent
years, several other ‘MetaFont’ fonts have been converted to Type 1 format; it’s com-
mon never to need to generate bitmap fonts for any purpose other than previewing (see
“previewing documents with Type 1 fonts”).
bakoma: fonts/cm/ps-type1/bakoma

bluesky : Browse fonts/cm/ps-type1/bluesky
cm fonts (write-black printers): fonts/cm/pk/pk300.zip
cm fonts (write-white printers): fonts/cm/pk/pk300w.zip

L.2 Adobe Type 1 (“PostScript”) fonts
102 Using PostScript fonts with TeX
In order to use PostScript fonts, TeX needs metric (called TFM) files. Several sets of
metrics are available from the archives; for mechanisms for generating new ones, see
metrics for PostScript fonts. You also need the fonts themselves; PostScript printers
come with a set of fonts built in, but to extend your repertoire you almost invariably
need to buy from one of the many commercial font vendors (see, for example, “choice
of fonts”).
If you use LaTeX 2ε , the best way to get PostScript fonts into your document is
to use the PSNFSS package maintained by Walter Schmidt. The LaTeX3 project team
declare that PSNFSS is “required”, and bug reports may be submitted via the LaTeX
bugs system. PSNFSS gives you a set of packages for changing the default roman,
sans-serif and typewriter fonts; e.g., the mathptmx package will set up Times Roman
as the main text font (and introduces mechanisms to typeset mathematics using Times
and various more-or-less matching fonts), while package avant changes the sans-serif
family to AvantGarde, and courier changes the typewriter font to Courier. To go with
these packages, you need the font metric files and font description (.fd) files for each
font family you want to use. For convenience, metrics for the ‘common 35’ PostScript
fonts found in most PostScript printers are provided with PSNFSS, packaged as the
“Laserwriter set”.
For older versions of LaTeX there are various schemes, of which the simplest to
use is probably the PSLaTeX macros distributed with dvips.
For Plain TeX, you load whatever fonts you like; if the encoding of the fonts is
not the same as Computer Modern it will be up to you to redefine various macros and
accents, or you can use the font re-encoding mechanisms available in many drivers and
in ps2pk and afm2tfm.
Victor Eijkhout’s Lollipop package (Lollipop package) supports declaration of font
families and styles in a similar way to LaTeX’s NFSS, and so is easy to use with
PostScript fonts.
Some common problems encountered are discussed elsewhere (see problems with
PS fonts).
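As a minimal sketch (the choice of packages merely echoes the examples above), a LaTeX 2ε document using these substitutions might begin:
\documentclass{article}
\usepackage{mathptmx}  % Times for text and mathematics
\usepackage{avant}     % AvantGarde as the sans-serif family
\usepackage{courier}   % Courier as the typewriter family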
Metrics for the ‘Laserwriter’ set of 35 fonts: macros/latex/required/
psnfss/lw35nfss.zip
lollipop: nonfree/macros/lollipop
psnfss: macros/latex/required/psnfss

103 Previewing files using Type 1 fonts
Until recently, free TeX previewers have only been capable of displaying bitmap PK
fonts, but current versions of xdvi sport a Type 1 font renderer.
Other (free) previewers of the current generation offer automatic generation of the
requisite PK files (using gsftopk, or similar, behind the scenes). If your previewer isn’t
capable of this, you have three options:

• Convert the DVI file to PostScript and use a PostScript previewer. Some sys-
tems offer this capability as standard, but most people will need to use a separate
previewer such as ghostscript itself, a ghostscript-based viewer such as ghostview, or
the shareware offering GSview.
• Under Windows on a PC, or on a Macintosh, let Adobe Type Manager display the
fonts (textures, on the Macintosh, works like this). (See commercial suppliers for
details.)
• If you have the PostScript fonts in Type 1 format, use ps2pk or gsftopk (designed
for use with the ghostscript fonts) to make PK bitmap fonts which your previewer
will understand. This can produce excellent results, also suitable for printing with
non-PostScript devices. Check the legalities of this if you have purchased the
fonts.

ghostscript: Browse nonfree/support/ghostscript
ghostview : Browse support/ghostscript/gnu/ghostview
gsftopk : fonts/utilities/gsftopk
GSview : Browse nonfree/support/ghostscript/ghostgum
ps2pk : fonts/utilities/ps2pk
xdvi: dviware/xdvi

104 TeX font metric files for PostScript fonts
Reputable font vendors such as Adobe supply metric files for each font, in AFM
(Adobe Font Metric) form; these can be converted to TFM (TeX Font Metric) form.
Most modern distributions have prebuilt metrics which will be more than enough for
many people; but you may need to do the conversion yourself if you have special needs
or acquire a new font. One important question is the encoding of (Latin character)
fonts; while we all more or less agree about the position of about 96 characters in fonts
(the basic ASCII set), the rest of the (typically) 256 vary. The most obvious prob-
lems are with floating accents and special characters such as the ‘pounds sterling’ sign.
There are three ways of dealing with this: either you change the TeX macros which
reference the characters (not much fun, and error-prone); or you change the encoding
of the font (easier than you might think); or you use virtual fonts to pretend to TeX that
the encoding is the same as it is used to. LaTeX 2ε has facilities for dealing with fonts
in different encodings; read the LaTeX Companion for more details. In practice, if you
do much non-English (but Latin script) typesetting, you are strongly recommended to
use the fontenc package with option ‘T1’ to select ‘Cork’ encoding. A useful alternative
is Y&Y’s “private” LY1 encoding, which is designed to sit well with “Adobe standard”
encoded fonts. Basic support of LY1 is available on CTAN: note that the “relation with
Adobe’s encoding” means that there are no virtual fonts in the LY1 world.
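In a LaTeX 2ε document this amounts to no more than a line in the preamble:
\usepackage[T1]{fontenc}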
Alan Jeffrey’s fontinst package is an AFM to TFM converter written in TeX; it
is used to generate the files used by LaTeX 2ε ’s PSNFSS package to support use of
PostScript fonts. It is a sophisticated package, not for the faint-hearted, but is powerful
enough to cope with most needs. Much of its power relies on the use of virtual fonts.
For slightly simpler problems, Rokicki’s afm2tfm, distributed with dvips, is fast
and efficient; note that the metrics and styles that come with dvips are not currently
LaTeX 2ε compatible.
For the Macintosh (classic), there is a program called EdMetrics which does the
job (and more). EdMetrics comes with the (commercial) Textures distribution, but is
itself free software, and is available on CTAN.
dvips: dviware/dvips
EdMetrics: systems/mac/textures/utilities/EdMetrics.sea.hqx
fontinst: fonts/utilities/fontinst
LY1 support: macros/latex/contrib/psnfssx/ly1

105 Deploying Type 1 fonts
For the LaTeX user trying to use the PSNFSS (PSNFSS) package, three questions may
arise.
First, you have to declare to the DVI driver that you are using PostScript fonts; in
the case of dvips, this means adding lines to the psfonts.map file, so that dvips will
know where the proper fonts are, and won’t try to find PK files. If the font isn’t built
into the printer, you have to acquire it (which may mean that you need to purchase the
font files).
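For illustration only (in practice such lines are generated by fontinst or supplied with PSNFSS), a psfonts.map entry for a re-encoded Times Roman looks something like
ptmr8r Times-Roman "TeXBase1Encoding ReEncodeFont" <8r.enc
with an additional <ptmr8a.pfb at the end if the Type 1 file itself must be downloaded rather than relying on a copy resident in the printer.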
Second, your previewer must know what to do with the fonts: see previewing type
1 fonts.
Third, the stretch and shrink between words is a function of the font metric; it
is not specified in AFM files, so different converters choose different values. The
PostScript metrics that come with PSNFSS used to produce quite tight setting, but they
were revised in mid 1995 to produce a compromise between American and European
practice. Sophisticated users may not find even the new values to their taste, and may
want to override them. Even the casual user may find more hyphenation or overfull
boxes than Computer Modern produces; but CM is extremely generous.

106 Choice of scalable outline fonts
If you are interested in text alone, you can in principle use any of the huge numbers
of text fonts in Adobe Type 1, TrueType or OpenType formats. The constraint is, of
course, that your previewer and printer driver should support such fonts (TeX itself
only cares about metrics, not the actual character programs).
If you also need mathematics, then you are severely limited by the demands that
TeX makes of maths fonts (for details, see the paper by B.K.P. Horn in TUGboat 14(3)).
For maths, then, there are relatively few choices (though the list is at last growing).
There are several font families available that are based on Knuth’s original designs,
and some that complement other commercial text font designs; one set (MicroPress’s
‘informal math’) stands alone. “Free” font families that will support TeX mathematics
include:
Computer Modern (75 fonts — optical scaling) Donald E. Knuth
The CM fonts were originally designed in MetaFont, but are also now available in
scalable outline form. There are commercial as well as public domain versions,
and there are both Adobe Type 1 and TrueType versions. A set of outline versions
of the fonts was developed as a commercial venture by Y&Y and Blue Sky Re-
search; they have since assigned the copyright to the AMS, and the fonts are now
freely available from CTAN. Their quality is such that they have become the de
facto standard for Type 1 versions of the fonts.
AMS fonts (52 fonts, optical scaling) The AMS
This set of fonts offers adjuncts to the CM set, including two sets of symbol fonts
(msam and msbm) and Euler text fonts. These are not a self-standing family, but
merit discussion here (not least because several other families mimic the sym-
bol fonts). Freely-available Type 1 versions of the fonts are available on CTAN.
The eulervm package permits use of the Euler maths alphabet in conjunction with
text fonts that do not provide maths alphabets of their own (for instance, Adobe
Palatino or Minion).
mathpazo version 1.003 (5 fonts) Diego Puga
The Pazo Math fonts are a family of type 1 fonts suitable for typesetting maths in
combination with the Palatino family of text fonts. Four of the five fonts of the
distribution are maths alphabets, in upright and italic shapes, medium and bold
weights; the fifth font contains a small selection of “blackboard bold” characters
(chosen for their mathematical significance). Support under LaTeX 2ε is avail-
able in PSNFSS (PSNFSS); the fonts are licensed under the GPL, with legalese
permitting the use of the fonts in published documents.
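A minimal preamble using this family (assuming the fonts and the PSNFSS support files are installed) might read:

\usepackage[T1]{fontenc}
\usepackage{mathpazo}

which selects Palatino-based text together with Pazo mathematics.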
Fourier/Utopia (15 fonts) Michel Bovani
Fourier is a family built on Adobe Utopia (which has been released for usage
free of charge by Adobe). The fonts provide the basic Computer Modern set of
mathematical symbols, and add many of the AMS mathematical symbols (though
you are expected to use some from the AMS fonts themselves). There are also
several other mathematical and decorative symbols. The fonts come with a fourier
package for use with LaTeX; text support of OT1 encoding is not provided — you
are expected to use T1.
MathDesign (3 entire families. . . so far) Paul Pichareau
This (very new: first release was in April 2005) set so far offers mathematics fonts
to match Adobe Utopia, URW Garamond and Bitstream Charter (all of which are
separately available, on CTAN, in Type 1 format). There has been a little comment
on these fonts, but none from actual users posted to the public forums. Users,
particularly those who are willing to discuss their experiences, would obviously
be welcome. Browse the CTAN directory and see which you want: there is a
wealth of documentation and examples.
Belleek (3 fonts) Richard Kinch
Belleek is the upshot of Kinch’s thoughts on how MetaFont might be used in the
future: they were published simultaneously as MetaFont source, as Type 1 fonts,
and as TrueType fonts. The fonts act as “drop-in” replacements for the basic Math-
Time set (as an example of “what might be done”).
The paper outlining Kinch’s thoughts, proceeding from considerations of the ‘in-
tellectual’ superiority of MetaFont to evaluations of why its adoption is so limited
and what might be done about the problem, is to be found at http://truetex.
com/belleek.pdf (the paper is a good read, but exhibits the problems discussed
in getting good PDF — don’t try to read it on-screen in Acrobat reader).
mathptmx Alan Jeffrey, Walter Schmidt and others.
This set contains maths italic, symbol, extension, and roman virtual fonts, built
from Adobe Times, Symbol, Zapf Chancery, and the Computer Modern fonts. The
resulting mixture is not entirely acceptable, but can pass in many circumstances.
The real advantage is that the mathptm fonts are (effectively) free, and the resulting
PostScript files can be freely exchanged. Support under LaTeX 2ε is available in
PSNFSS.
Computer Modern Bright Free scalable outline versions of these fonts do exist; they
are covered below together with their commercial parallels.

Fonts capable of setting TeX mathematics, that are available commercially, include:

BA Math (13 fonts) MicroPress Inc.


BA Math is a family of serif fonts, inspired by the elegant and graphically perfect
font design of John Baskerville. BA Math comprises the fonts necessary for math-
ematical typesetting (maths italic, math symbols and extensions) in normal and
bold weights. The family also includes all OT1 and T1 encoded text fonts of vari-
ous shapes, as well as fonts with most useful glyphs of the TS1 encoding. Macros
for using the fonts with Plain TeX, LaTeX 2.09 and current LaTeX are provided.
For further details (including samples) see
http://www.micropress-inc.com/fonts/bamath/bamain.htm
CH Math (15 fonts) MicroPress Inc.
CH Math is a family of slab serif fonts, designed as a maths companion for
Bitstream Charter. (The distribution includes four free Bitstream text fonts, in
addition to the 15 hand-hinted MicroPress fonts.) For further details (including
samples) see
http://www.micropress-inc.com/fonts/chmath/chmain.htm
Computer Modern Bright (62 fonts — optical scaling) Walter Schmidt
CM Bright is a family of sans serif fonts, based on Knuth’s CM fonts. It comprises
the fonts necessary for mathematical typesetting, including AMS symbols, as well
as text and text symbol fonts of various shapes. The collection comes with its own
set of files for use with LaTeX. The CM Bright fonts are supplied in Type 1 format
by MicroPress, Inc. The hfbright bundle offers free Type 1 fonts for text using the
OT1 encoding — the cm-super set for use with T1 texts doesn’t (yet) offer support
for mathematics.
For further details of Micropress’ offering (including samples) see
http://www.micropress-inc.com/fonts/brmath/brmain.htm
Concrete Math (25 fonts — optical scaling) Ulrik Vieth
The Concrete Math font set was derived from the Concrete Roman typefaces de-
signed by Knuth. The set provides a collection of math italics, math symbol, and
math extension fonts, and fonts of AMS symbols that fit with the Concrete set, so
that Concrete may be used as a complete replacement for Computer Modern. Since
Concrete is considerably darker than CM, the family may be particularly attractive for
use in low-resolution printing or in applications such as posters or transparencies.
Concrete Math fonts, as well as Concrete Roman fonts, are supplied in Type 1
format by MicroPress, Inc.
For further information (including samples) see
http://www.micropress-inc.com/fonts/ccmath/ccmain.htm
HV Math (14 fonts) MicroPress Inc.
HV Math is a family of sans serif fonts, inspired by the Helvetica (TM) type-
face. HV Math comprises the fonts necessary for mathematical typesetting (maths
italic, maths symbols and extensions) in normal and bold weights. The family also
includes all OT1 and T1 encoded text fonts of various shapes, as well as fonts
with most useful glyphs of the TS1 encoding. Macros for using the fonts with
Plain TeX, LaTeX 2.09 and current LaTeX are provided. Bitmapped copies of the
fonts are available free, on CTAN.
For further details (and samples) see
http://www.micropress-inc.com/fonts/hvmath/hvmain.htm
Informal Math (7 outline fonts) MicroPress Inc.
Informal Math is a family of fanciful fonts loosely based on Adobe’s Tek-
ton (TM) family, fonts which imitate handwritten text. Informal Math comprises
the fonts necessary for mathematical typesetting (maths italic, maths symbols and
extensions) in normal weight, as well as OT1 encoded text fonts in upright and
oblique shapes. Macros for using the fonts with Plain TeX, LaTeX 2.09 and cur-
rent LaTeX are provided.
For further details (including samples) see
http://www.micropress-inc.com/fonts/ifmath/ifmain.htm
Lucida Bright with Lucida New Math (25 fonts) Chuck Bigelow and Kris Holmes
Lucida is a family of related fonts including seriffed, sans serif, sans serif fixed
width, calligraphic, blackletter, fax, Kris Holmes’ connected handwriting font, etc;
they’re not as ‘spindly’ as Computer Modern, with a large x-height, and include a
larger set of maths symbols, operators, relations and delimiters than CM (over 800
instead of 384: among others, it also includes the AMS msam and msbm symbol
sets). ‘Lucida Bright Expert’ (14 fonts) adds seriffed fixed width, another hand-
writing font, smallcaps, bold maths, upright ‘maths italic’, etc., to the set. Support
under LaTeX is available under the auspices of the PSNFSS, and pre-built metrics
are also provided.
TUG has recently (November 2005) acquired the right to distribute these fonts;
the web site “Lucida and TUG” has details.
Adobe Lucida, LucidaSans and LucidaMath (12 fonts)
Lucida and LucidaMath are generally considered to be a bit heavy. The three
maths fonts contain only the glyphs in the CM maths italic, symbol, and extension
fonts. Support for using LucidaMath with TeX is not very good; you will need to
do some work reencoding fonts etc. (In some sense this set is the ancestor of the
LucidaBright plus LucidaNewMath font set, which are not currently available.)
MathTime Pro Publish or Perish (Michael Spivak)
This latest instance of the MathTime family covers all the weights (medium, bold
and heavy) and symbols of previous versions of MathTime. In addition it has a
much extended range of symbols, and many typographic improvements that make
for high-quality documents. The fonts are supported under both Plain TeX and
LaTeX 2ε , and are exclusively available for purchase from Personal TeX Inc.
For a sample, see http://www.pctex.com/mtpdemo.pdf
PA Math PA Math is a family of serif fonts loosely based on the Palatino (TM) type-
face. PA Math comprises the fonts necessary for mathematical typesetting (maths
italics, maths, calligraphic and oldstyle symbols, and extensions) in normal and
bold weights. The family also includes all OT1, T1 encoded text fonts of various
shapes, as well as fonts with the most useful glyphs of the TS1 encoding. Macros
for using the fonts with Plain TeX, LaTeX 2.09 and current LaTeX are provided.
For further details (and samples) see
http://www.micropress-inc.com/fonts/pamath/pamain.htm
TM Math (14 fonts) MicroPress Inc.
TM Math is a family of serif fonts, inspired by the Times (TM) typeface. TM Math
comprises the fonts necessary for mathematical typesetting (maths italic, maths
symbols and extensions) in normal and bold weights. The family also includes
all OT1 and T1 encoded text fonts of various shapes, as well as fonts with most
useful glyphs of the TS1 encoding. Macros for using the fonts with Plain TeX,
LaTeX 2.09 and current LaTeX are provided. Bitmapped copies of the fonts are
available free, on CTAN.
For further details (and samples) see
http://www.micropress-inc.com/fonts/tmmath/tmmain.htm

Two other font sets should be mentioned, even though they don’t currently produce
satisfactory output — their author is no longer working on them, and several problems
have been identified:
pxfonts set version 1.0 (26 fonts) by Young Ryu
The pxfonts set consists of
• virtual text fonts using Adobe Palatino (or the URW replacement used by
ghostscript) with modified plus, equal and slash symbols;
• maths alphabets using Palatino;
• maths fonts of all symbols in the computer modern maths fonts (cmsy, cmmi,
cmex and the Greek letters of cmr)
• maths fonts of all symbols corresponding to the AMS fonts (msam and msbm);
• additional maths fonts of various symbols.
The text fonts are available in OT1, T1 and LY1 encodings, and TS1 encoded sym-
bols are also available. The sans serif and monospaced fonts supplied with the
txfonts set (see below) may be used with pxfonts; the txfonts set should be installed
whenever pxfonts are. LaTeX, dvips and PDFTeX support files are included. Doc-
umentation is readily available.
The fonts are licensed under the GPL; use in published documents is permitted.
txfonts set version 3.1 (42 fonts) by Young Ryu
The txfonts set consists of
• virtual text fonts using Adobe Times (or the URW replacement used by
ghostscript) with modified plus, equal and slash symbols;
• matching sets of sans serif and monospace (‘typewriter’) fonts (the sans serif
set is based on Adobe Helvetica);
• maths alphabets using Times;
• maths fonts of all symbols in the computer modern maths fonts (cmsy, cmmi,
cmex and the Greek letters of cmr)
• maths fonts of all symbols corresponding to the AMS fonts (msam and msbm);
• additional maths fonts of various symbols.
The text fonts are available in OT1, T1 and LY1 encodings, and TS1 encoded sym-
bols are also available. Documentation is readily available.
The fonts are licensed under the GPL; use in published documents is permitted.
Finally, one must not forget:
Proprietary fonts Various sources.
Since having a high quality font set in scalable outline form that works with TeX
can give a publisher a real competitive advantage, there are some publishers that
have paid (a lot) to have such font sets made for them. Unfortunately, these sets are
not available on the open market, despite the likelihood that they’re more complete
than those that are.
We observe a very limited selection of commercial maths font sets; a maths font has to
be explicitly designed for use with TeX, which is an expensive business, and is of little
appeal in other markets. Furthermore, the TeX market for commercial fonts is minute
by comparison with the huge sales of other font sets.
Text fonts in Type 1 format are available from many vendors including Adobe,
Monotype and Bitstream. However, be careful with cheap font “collections”; many of
them dodge copyright restrictions by removing (or crippling) parts of the font programs
such as hinting. Such behaviour is both unethical and bad for the consumer. The fonts
may not render well (or at all, under ATM), may not have the ‘standard’ complement
of 228 glyphs, or may not include metric files (which you need to make TFM files).
TrueType remains the “native” format for Windows. Some TeX implementations
such as TrueTeX use TrueType versions of Computer Modern and Times Maths fonts
to render TeX documents in Windows without the need for additional system software
like ATM. (When used on a system running Windows XP, TrueTeX can also use Type 1
fonts.)
When choosing fonts, your own system environment may not be the only one of
interest. If you will be sending your finished documents to others for further use, you
should consider whether a given font format will introduce compatibility problems.
Publishers may require TrueType exclusively because their systems are Windows-
based, or Type 1 exclusively, because their systems are based on the early popularity
of that format in the publishing industry. Many service bureaus don’t care as long as
you present them with a finished print file (PostScript or PDF) for their output device.
CM family collection: Browse fonts/cm/ps-type1/bluesky
AMS font collection: Browse fonts/amsfonts/ps-type1
Belleek fonts: fonts/belleek/belleek.zip
CM-super collection: fonts/ps-type1/cm-super
eulervm.sty and supporting metrics: fonts/eulervm
fourier (including metrics and other support for utopia): fonts/fourier-GUT
hfbright collection: fonts/ps-type1/hfbright
hvmath (free bitmapped version): fonts/micropress/hvmath
Lucida Bright/Math metrics: fonts/psfonts/bh/lucida
Lucida PSNFSS support: macros/latex/contrib/psnfssx/lucidabr
MathDesign collection: fonts/mathdesign
pxfonts: fonts/pxfonts
tmmath (free bitmapped version): fonts/micropress/tmmath
txfonts: fonts/txfonts
utopia fonts: fonts/utopia

107 Weird characters in dvips output


You’ve innocently generated output, using dvips, and there are weird transpositions
in it: for example, the fi ligature has appeared as a £ symbol. This is an unwanted
side-effect of the precautions about generating PostScript for PDF outlined in gener-
ating PostScript for PDF. The -G1 switch discussed in that question is appropriate for
Knuth’s text fonts, but doesn’t work with text fonts that don’t follow Knuth’s patterns
(such as fonts supplied by Adobe).
If the problem arises, suppress the -G1 switch: if you were using it explicitly, don’t;
if you were using -Ppdf, add -G0 to suppress the implicit switch in the pseudo-printer
file.
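For example, if you had been typing (file name illustrative)

dvips -Ppdf myfile -o myfile.ps

change it to

dvips -Ppdf -G0 myfile -o myfile.ps

so that the explicit -G0 overrides the -G1 implied by the pdf pseudo-printer file.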
The problem has been corrected in dvips v 5.90 (and later versions), which should
be available in any recent TeX distribution.

L.3 Macros for using fonts


108 Using non-standard fonts in Plain TeX
Plain TeX (in accordance with its description) doesn’t do anything fancy with fonts: it
sets up the fonts that Knuth found he needed when writing the package, and leaves you
to do the rest.
To use something other than Knuth’s defaults, the basic mechanism is to use the
\font primitive:

\font\foo=nonstdfont
...
\foo
Text set using nonstdfont ...

The name you use (nonstdfont, above) is the name of the .tfm file for the font you
want.
If you want to use an italic version of \foo, you need to use \font again:
\font\fooi=nonstdfont-italic
...
\fooi
Text set using nonstdfont italic...

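If you need the same design at a different size, the \font primitive also accepts an “at” clause; continuing the illustrative name above,

\font\foobig=nonstdfont at 14pt

loads the same metric file scaled to a 14pt design size.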
This is all very elementary stuff, and serves for simple use of fonts. However, there
are wrinkles, the most important of which is the matter of font encodings. Unfortu-
nately, many fonts that have appeared recently simply don’t come in versions using
Knuth’s eccentric font encodings — but those encodings are built into Plain TeX, so
that some macros of Plain TeX need to be changed to use the fonts. LaTeX gets around
all these problems by using a “font selection scheme” — this ‘NFSS’ (‘N’ for ‘new’, as
opposed to what LaTeX 2.09 had) carries around with it separate information about the
fonts you use, so the changes to encoding-specific commands happen automagically.
If you only want to use the EC fonts, you can in principle use the ec-plain bundle,
which gives you a version of Plain TeX which you can run in the same way that you
run Plain TeX using the original CM fonts, by invoking tex. (Ec-plain also extends the
EC fonts, for reasons which aren’t immediately clear, but which might cause problems
if you’re hoping to use Type 1 versions of the fonts.)
The font_selection package provides a sort of halfway house: it provides font face
and size, but not family selection. This gives you considerable freedom, but leaves you
stuck with the original CM fonts. It’s a compact solution, within its restrictions.
Other Plain TeX approaches to the problem (packages plnfss, fontch and ofs) break
out of the Plain TeX model, towards the sort of font selection provided by ConTeXt and
LaTeX — font selection that allows you to change family, as well as size and face. The
remaining packages all make provision for using encodings other than Knuth’s OT1.
Plnfss has a rather basic set of font family details; however, it is capable of us-
ing font description (.fd) files created for LaTeX. (This is useful, since most modern
mechanisms for integrating outline fonts with TeX generate .fd files in their process.)
Fontch has special provision for T1 and TS1 encodings, which you select by arcane
commands, such as:

\let\LMTone\relax
\input fontch.tex

for T1.
Ofs seems to be the most thoroughly thought-through of the alternatives, and can
select more than one encoding: as well as T1 it covers the encoding IL2, which is
favoured in the Czech Republic and Slovakia. Ofs also covers mathematical fonts,
allowing you the dubious pleasure of using fonts such as the pxfonts and txfonts.
ec-plain: macros/ec-plain
fontch: macros/plain/contrib/fontch
font_selection: macros/plain/contrib/font_selection
ofs: macros/generic/ofs
plnfss: macros/plain/contrib/plnfss

L.4 Particular font families


109 Using the “Concrete” fonts
The Concrete Roman fonts were designed by Don Knuth for a book called “Concrete
Mathematics”, which he wrote with Graham and Patashnik (the Patashnik, of BibTeX
fame). Knuth only designed text fonts, since the book used the Euler fonts for mathe-
matics. The book was typeset using Plain TeX, of course, with additional macros that
may be viewed in a file gkpmac.tex, which is available on CTAN.
The packages beton, concmath, and ccfonts are LaTeX packages that change the
default text fonts from Computer Modern to Concrete. Packages beton and ccfonts
also slightly increase the default value of \baselineskip to account for the rather
heavier weight of the Concrete fonts. If you wish to use the Euler fonts for mathemat-
ics, as Knuth did, there’s the euler package which has been developed from Knuth’s
own Plain TeX-based set: these macros are currently deprecated (they clash with many
things, including AMSLaTeX). The independently-developed eulervm bundle is there-
fore preferred to the euler package. (Note that installing the eulervm bundle involves
installing a series of virtual fonts. While most modern distributions seem to have the
requisite files installed by default, you may find you have to install them. If so, see the
file readme in the eulervm distribution.)
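A minimal preamble following that advice (Concrete text with the Euler virtual maths fonts) might therefore be:

\usepackage{beton}
\usepackage{eulervm}
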
A few years after Knuth’s original design, Ulrik Vieth designed the Concrete Math
fonts. Packages concmath and ccfonts also change the default math fonts from Com-
puter Modern to Concrete and use the Concrete versions of the AMS fonts (this last
behaviour is optional in the case of the concmath package).
There are no bold Concrete fonts, but it is generally accepted that the Computer
Modern Sans Serif demibold condensed fonts are an adequate substitute. If you are
using concmath or ccfonts and you want to follow this suggestion, then use the package
with the boldsans class option (despite the fact that the concmath documentation calls
it the sansbold option). If you are using beton, add
\renewcommand{\bfdefault}{sbc}

to the preamble of your document.


Type 1 versions of the fonts are available. For OT1 encoding, they are available
from MicroPress. The CM-Super fonts contain Type 1 versions of the Concrete fonts
in T1 encoding.
beton.sty : macros/latex/contrib/beton
ccfonts.sty : macros/latex/contrib/ccfonts
CM-Super fonts: fonts/ps-type1/cm-super
concmath.sty : macros/latex/contrib/concmath
Concmath fonts: fonts/concmath
Concrete fonts: fonts/concrete
euler.sty : macros/latex/contrib/euler
eulervm bundle: fonts/eulervm
gkpmac.tex : systems/knuth/local/lib/gkpmac.tex

110 Using the Latin Modern fonts


The lm fonts are an exciting addition to the armoury of the (La)TeX user: high quality
outlines of fonts that were until recently difficult to obtain, all in a free and relatively
compact package. However, the spartan information file that comes with the fonts
remarks “It is presumed that a potential user knows what to do with all these files”.
This answer aims to fill in the requirements: the job is really not terribly difficult.
Note that teTeX distributions, from version 3.0, already have the lm fonts: all you
need do is use them. The fonts may also be installed via the package manager, in a
current MiKTeX system. The remainder of this answer, then, is for people who don’t
use such systems.
The font (and related) files appear on CTAN as a set of single-entry TDS trees —
fonts, dvips, tex and doc. The doc subtree need not be copied (it’s really just a
pair of sample files), but copy the other three into your existing Local $TEXMF tree, and
update the filename database.
Now, incorporate the fonts in the set searched by PDFLaTeX, dvips, dvipdfm, your
previewers and Type 1-to-PK conversion programs, by
• On a teTeX system earlier than version 2.0, edit the file $TEXMF/dvips/config/
updmap and insert an absolute path for the lm.map just after the line that starts
extra_modules=" (and before the closing quotes).
• On a teTeX version 2.0 (or later), execute the command
updmap --enable Map lm.map
• On a MiKTeX system earlier than version 2.2, the “Refresh filename database”
operation, which you performed after installing files, also updates the system’s
“PostScript resources database”.
• On a MiKTeX system, version 2.2 or later, update updmap.cfg as described
in the MiKTeX online documentation. Then execute the command initexmf
--mkmaps, and the job is done.

To use the fonts in a LaTeX document, you should write


\usepackage{lmodern}

This will make the fonts the default for all three LaTeX font families (“roman”, “sans-
serif” and “typewriter”). You also need
\usepackage[T1]{fontenc}

for text, and


\usepackage{textcomp}

if you want to use any of the TS1-encoding symbols. There is no support for using
fonts according to the OT1 encoding.
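Putting those pieces together, a typical preamble using the fonts reads:

\usepackage{lmodern}
\usepackage[T1]{fontenc}
\usepackage{textcomp}
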
Latin Modern fonts: fonts/lm
M Hypertext and PDF
111 Making hypertext documents from TeX
If you want on-line hypertext with a (La)TeX source, probably on the World Wide Web,
there are four technologies to consider:

• Direct (La)TeX conversion to HTML ((La)TeX conversion to HTML);


• Use Texinfo, and use the info viewer, or convert the texinfo to HTML;
• Use Adobe Acrobat, which will preserve your typesetting perfectly (see Making
Acrobat documents from LaTeX);
• The hyperTeX conventions (standardised \special commands); there are sup-
porting macro packages for Plain TeX and LaTeX.

The HyperTeX project extended the functionality of all the LaTeX cross-referencing
commands (including the table of contents) to produce \special commands which
are parsed by DVI processors conforming to the HyperTeX guidelines; it provides
general hypertext links, including those to external documents.
The HyperTeX specification says that conformant viewers/translators must recog-
nize the following set of \special commands:

href: html:<a href = "href_string">


name: html:<a name = "name_string">
end: html:</a>
image: html:<img src = "href_string">
base_name: html:<base href = "href_string">

The href, name and end commands are used to do the basic hypertext operations of
establishing links between sections of documents.
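For illustration only (in practice a macro package generates these commands for you), a hand-coded link to an external document might look like:

\special{html:<a href="http://www.tug.org/">}%
the TeX Users Group%
\special{html:</a>}

(the trailing % signs prevent spurious spaces creeping into the typeset output).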
Further details are available on http://arXiv.org/hypertex/; there are two
commonly-used implementations of the specification, a modified xdvi and (recent re-
leases of) dvips. Output from the latter may be used in recent releases of ghostscript or
Acrobat Distiller.
112 Making Acrobat PDF documents from (La)TeX
There are three general routes to Acrobat output: Adobe’s original ‘distillation’ route
(via PostScript output), conversion of a DVI file, and the use of a direct PDF gener-
ator such as PDFTeX (see PDFTeX) or MicroPress’s VTeX (which comes both as a
commercial version for Windows PCs, and as a ‘free’ version for Linux systems).
For simple documents (with no hyper-references), you can either

• process the document in the normal way, produce PostScript output and distill it;
• (on a Windows or Macintosh machine with the appropriate Adobe tools installed)
pass the output through the PDFwriter in place of a printer driver (this route is a
dead end: the PDFwriter cannot create hyperlinks);
• process the document in the normal way and generate PDF direct from the DVI
with dvipdfm; or
• process the document direct to PDF with PDFTeX or VTeX. PDFTeX has the
advantage of availability for a wide range of platforms, VTeX (available com-
mercially for Windows, or free of charge for Linux or OS/2) has wider graphics
capability, dealing with encapsulated PostScript and some in-line PostScript.

To translate all the LaTeX cross-referencing into Acrobat links, you need a La-
TeX package to suitably redefine the internal commands. There are two of these for
LaTeX, both capable of conforming to the HyperTeX specification (see Making hyper-
text documents from TeX): Heiko Oberdiek’s hyperref , and Michael Mehlich’s hyper.
Hyperref uses a configuration file to determine how it will generate hypertext; it can operate
using PDFTeX primitives, the hyperTeX \specials, or DVI driver-specific \special
commands. Dvips translates the DVI with these \special commands into PostScript
acceptable to Distiller, and dvipdfm has \special commands of its own.
(In practice, almost everyone uses hyperref ; hyper hasn’t been updated since 2000.)
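Loading the package is usually as simple as

\usepackage{hyperref}

(a sketch only: hyperref normally detects whether it is running under PDFTeX, but you may force matters with a driver option such as dvips or pdftex; it is usually best loaded after your other packages).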
If you use Plain TeX, the Eplain macro package can help you create Acrobat documents
with hyper-references (see what is Eplain). It can operate using PDFTeX primitives, or
\special commands for the dvipdfm DVI driver.

There is no free implementation of all of Adobe Distiller’s functionality; however, any but
the very oldest versions of ghostscript provide pretty reliable distillation (but beware
of the problems discussed in dvips output for distillation). In fact, Distiller itself is now
remarkably cheap (for academics at least).
For viewing (and printing) the resulting files, Adobe’s Acrobat Reader is available
for a fair range of platforms; for those for which Adobe’s reader is unavailable, re-
motely current versions of ghostscript combined with ghostview or GSview can display
and print PDF files.
In many circumstances, ghostscript combined with a viewer application is actually
preferable to Acrobat Reader. For example, on Windows Acrobat Reader locks the
.pdf file it’s displaying: this makes the traditional (and highly effective) (La)TeX de-
velopment cycle of “Edit→Process→Preview” become incredibly clumsy — GSview
doesn’t make the same mistake.
Acrobat Reader: browse ftp://ftp.adobe.com/pub/adobe/acrobatreader
dvipdfm : dviware/dvipdfm
ghostscript: Browse nonfree/support/ghostscript
ghostview : Browse support/ghostscript/gnu/ghostview
GSview : Browse nonfree/support/ghostscript/ghostgum
hyper.sty : macros/latex/contrib/hyper
hyperref.sty : macros/latex/contrib/hyperref

113 Quality of PDF from PostScript


Any reasonable PostScript, including any output of dvips, may be converted to PDF,
using (for example) a sufficiently recent version of ghostscript, Frank Siegert’s (share-
ware) PStill, or Adobe’s (commercial) Distiller.
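For example, using the ps2pdf script that comes with ghostscript (file names illustrative):

ps2pdf myfile.ps myfile.pdf
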
But, although the job may (almost always) be done, the results are often not accept-
able: the most frequent problem is bad presentation of the character glyphs that make
up the document. The following answers offer solutions to this (and other) problems
of bad presentation. Issues covered are:
• Wrong type of fonts used (Wrong type of fonts used), which is the commonest
cause of fuzzy text.
• ghostscript too old (ghostscript too old), which can also result in fuzzy text.
• Switching to font encoding T1, which is yet another possible cause of fuzzy text.
• Another problem — missing characters — arises from an aged version of Adobe Distiller.
• Finally, there’s the common confusion that arises from using the dvips configura-
tion file -Ppdf: the weird characters.

It should be noted that Adobe Reader 6 (released in mid-2003) and later versions do
not exhibit the “fuzziness” that so many of the answers below address. This is of course
good news: however, it will inevitably be a long time before every user in the world
has this (or later) versions, so the remedies below are going to remain for some time to
come.
The problems are also discussed, with practical examples, in Mike Shell’s testflow
package, which these FAQs recommend as a “specialised tutorial”.
testflow : macros/latex/contrib/IEEEtran/testflow

114 The wrong type of fonts in PDF


This is by far the commonest problem: the symptom is that text in the document looks
“fuzzy”.
Most people use Adobe Acrobat Reader to view their PDF: Reader is distributed
free of charge, and is widely available, for all its faults. One of those faults is its failure
to deal with bitmap fonts (at least, in all versions earlier than the recently released
version 6).
So we don’t want bitmap fonts in our PostScript: with them, characters show up
in Reader’s display as blurred blobs which are often not recognisable as the original
letter, and are often not properly placed on the line. Nevertheless, even now, most TeX
systems have dvips configured to use .pk files (.pk files) in its output. Even PDFTeX
will use .pk files if it can see no alternative for a font in the document it is processing.

Our remedy is to use “Adobe Type 1” (Adobe Type 1) versions of the fonts we
need. Since Adobe are in the business of selling Type 1 fonts, Reader was of course
made to deal with them really rather well, from the very beginning.
Of course, if your document uses nothing but fonts that came from Adobe in the
first place — fonts such as Times that appear in pretty much every PostScript printer,
or such as Adobe Sabon that you pay extra for — then there’s no problem.
But most people use Computer Modern to start with, and even those relative so-
phisticates who use something as exotic as Sabon often find themselves using odd
characters from CM without really intending to do so. Fortunately, rather good ver-
sions of the CM fonts are available from the AMS (who have them courtesy of Blue
Sky Research — Blue Sky Research and Y&Y).
Most modern systems have the fonts installed ready to use; and any system installed
less than 3 years ago has a dvips configuration file ‘pdf’ that signals the use of the
CM fonts, and also sets a few other parameters to improve dvips’ output. Use this
configuration as:
dvips -Ppdf myfile -o myfile.ps
This may produce a warning message about failing to find the configuration file:
dvips: warning: no config file for ‘pdf’
or something similar, or about failing to find a font file:
dvips: ! Couldn’t find header file cmr10.pfb
Either of these failures signals that your system doesn’t have the fonts in the first place.
A way of using the fonts that doesn’t involve the sophistication of the -Ppdf mech-
anism is simply to load maps:
dvips -Pcmz -Pamz myfile -o myfile.ps
You may encounter the same warning messages as listed above.
If your system does not have the fonts, it won’t have the configuration file either;
however, it might have the configuration file without the fonts. In either case, you need
to install the fonts (install the fonts).
115 Fuzzy fonts because Ghostscript too old
So you’ve done everything the FAQ has told you that you need, correct fonts properly
installed and appearing in the dvips output, but still you get fuzzy character output after
distilling with ghostscript.
The problem could arise from too old a version of ghostscript, which you may
be using directly, or via a script such as ps2pdf (distributed with ghostscript itself),
dvipdf , or similar. Though ghostscript was capable of distillation from version 5.50,
that version could only produce bitmap Type 3 output of any font other than the funda-
mental 35 fonts (Times, Helvetica, etc.). Later versions added ‘complete’ distillation,
but it wasn’t until version 6.50 that one could rely on it for everyday work.
So, if your PDF output still looks fuzzy in Acrobat Reader, upgrade ghostscript.
The new version should be at least version 6.50, of course, but it’s usually good policy
to go to the most recent version (version 8.12 at the time of writing — 2003).
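If you are unsure which version you have, ghostscript will report it (the executable may be called gswin32c on Windows systems):

gs --version
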
116 Fonts go fuzzy when you switch to T1
You’ve been having problems with hyphenation, and someone has suggested that you
should use “\usepackage[T1]{fontenc}” to help sort them out. Suddenly you find
that your final PDF has become fuzzy. The problem may arise whether you are using
PostScript output and then distilling it, or you are using PDFTeX for the whole job.
In fact, this is the same problem as most others about the quality of PDF (quality of
PDF): you’ve abandoned your previous setup using Type 1 versions of the CM fonts,
and dvips has inserted Type 3 versions of the EC fonts into your document output. (See
Adobe font types for details of these font types; also, note that the font encoding T1
has nothing directly to do with the font format Type 1).
However, as noted in 8-bit Type 1 fonts, Type 1 versions of CM-like fonts in T1 (or
equivalent) encoding are now available, both as “real” fonts, and as virtual font sets.
One solution, therefore, is to use one of these alternatives.
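As a sketch of that route, using the Latin Modern fonts discussed elsewhere in these FAQs:

\usepackage{lmodern}
\usepackage[T1]{fontenc}

gives T1-encoded, CM-like text set entirely with Type 1 fonts.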
The alternative is to switch font family altogether, to something like Times (as a no-
thought default) or one of the many more pleasing Adobe-encoded fonts. The default
action of fontinst, when creating metrics for such a font, is to create settings for both
OT1 and T1 encodings, so there’s little change in what goes on (at the user level) even
if you have switched to T1 encoding when using the fonts.
117 Characters missing from PDF output
If you’re using Acrobat Distiller to create your PDF output, you may find characters
missing. This may manifest itself as messed-up maths equations (missing “−” signs,
for example), or bits missing from large symbols. Early versions of Distiller used to
ignore character positions 0–31 and 128–159 of every font: Adobe’s fonts never use
such positions, so why should Distiller?
Well, the answer to this question is “because Adobe don’t produce all the world’s
fonts” — fonts like Computer Modern were around before Adobe came on the scene,
and they use positions 0–31. Adobe don’t react to complaints like that in the previous
sentence, but they do release new versions of their programs; and Distiller, since at
least version 4.0, has recognised the font positions it used to shun.
Meanwhile, TeX users with old versions of Distiller need to deal with their fonts.
Dvips comes to our aid: the switch -G1 (“remap characters”), which moves the of-
fending characters out of the way. The PDF configuration file (-Ppdf), recommended
above, includes the switch.
The switch is not without its problems; pre-2003 versions of dvips will apply it
to Adobe fonts as well, causing havoc (havoc), but fortunately that problem is usually
soluble. However, a document using both CM and Adobe-specified fonts is stuck. The
only real solution is either to upgrade dvips, or to spend money to upgrade Distiller.
118 Finding ‘8-bit’ Type 1 fonts
Elsewhere, answers to these FAQs recommend that you use an ‘8-bit’ font to permit
accentuation of inflected languages, and also recommend the use of Type 1 fonts to
ensure that you get good quality PDF. These recommendations used to be contradic-
tory: one could not just “switch” from the free CM fonts to free Cork- (or similarly)
encoded Type 1 fonts. The first approach that started to alleviate these problems, was
the development of virtual fonts that make a good approach to the Cork encoding (see
below). Now, however, we have “true” Type 1 fonts available: as always, we have
an embarrassment of riches with four free alternatives, and one commercial and one
shareware version.
CM-super is an auto-traced set which encompasses all of the T1 and TS1 encodings
as well as the T2* series (the family of encodings that cover languages based on Cyrillic
alphabets). These fonts are pretty easy to install (the installation instructions are clear),
but they are huge: don’t try to install them if you’re short of disc space.
CM-LGC is a similar “super-font” set, but of much more modest size; it covers T1,
TS1 and T2A encodings (as does CM-super), and also covers the LGR encoding (for
typesetting Greek, based on Claudio Beccari’s MetaFont sources). CM-LGC manages
to be small by going to the opposite extreme from CM-super, which includes fonts at
all the sizes supported by the original EC (a huge range); CM-LGC has one font per
font shape, getting other sizes by scaling. There is an inevitable loss of quality inherent
in this approach, but for the disc-space-challenged machine, CM-LGC is an obvious
choice.
Tt2001 is a simple scan of the EC and TC fonts, and has some virtues — it’s no-
ticeably smaller than CM-super while being less stark than CM-LGC.
Latin Modern is produced using the program MetaType1. The Latin Modern set
comes with T1, TS1 and LY1 encoded variants (as well as a variant using the Polish QX
encoding); for the glyph set it covers, its outlines seem rather cleaner than those of
CM-super. Latin Modern is more modest in its disc space demands than is CM-super,
while not being nearly as stark in its range of design sizes as is CM-LGC — Latin
Modern’s fonts are offered in the same set of sizes as the original CM fonts. It’s hard
to argue with the choice: Knuth’s range of sizes has stood the test of time, and is one
of the bases on which the excellence of the TeX system rests.
Virtual fonts help us deal with the problem, since they allow us to map “bits of
DVI file” to single characters in the virtual font; so we can create an “é” character by
recreating the DVI commands that would result from the code “\’e”. However, since
this involves two characters being selected from a font, the arrangement is sufficient to
fool Acrobat Reader: you can’t use the program’s facilities for searching for text that
contains inflected characters, and if you cut text from a window that contains such a
character, you’ll find something unexpected (typically the accent and the ‘base’ char-
acters separated by a space) when you paste the result. However, if you can live with
this difficulty, virtual fonts are a useful and straightforward solution to the problem.
There are two virtual-font offerings of CM-based 8-bit fonts — the ae (“almost
EC”) and zefonts sets; the zefonts set has wider coverage (though the ae set may be
extended to offer guillemets by use of the aeguill package). Neither offers characters
such as eth and thorn (used, for example, in Icelandic), but the aecompl package
works with the ae fonts to provide the missing characters from the EC fonts (i.e., as
bitmaps).
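A sketch of this virtual-font route, using the ae set together with its aecompl supplement:

\usepackage[T1]{fontenc}
\usepackage{ae,aecompl}
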
The sole remaining commercial CM-like 8-bit font comes from Micropress, who
offer the complete EC set in Type 1 format, as part of their range of outline versions of
fonts that were originally distributed in MetaFont format. See “commercial distribu-
tions”.
The shareware BaKoMa TeX distribution offers a set of Type 1 EC fonts, as an extra
shareware option. (As far as the present author can tell, these fonts are only available
to users of BaKoMa TeX: they are stored in an archive format that seems not to be
publicly available.)
Finally, you can use one of the myriad text fonts available in Type 1 format (with
appropriate PSNFSS metrics for T1 encoding, or metrics for some other 8-bit encoding
such as LY1). However, if you use someone else’s text font (even something as simple
as Adobe’s Times family) you have to find a matching family of mathematical fonts,
which is a non-trivial undertaking — “choice of scalable fonts”.
ae fonts: fonts/ae
aecompl.sty : Distributed with fonts/ae
aeguill.sty : macros/latex/contrib/aeguill
BaKoMa fonts: Browse nonfree/systems/win32/bakoma/fonts
CM-LGC fonts: fonts/ps-type1/cm-lgc
CM-super fonts: fonts/ps-type1/cm-super (beware: very large download)
Latin Modern fonts: fonts/lm
tt2001 fonts: fonts/ps-type1/tt2001
zefonts: fonts/zefonts
119 Replacing Type 3 fonts in PostScript
One often comes across a PostScript file generated by dvips which contains embedded
PK fonts; if you try to generate PDF from such a file, the quality will be poor.
Of course, the proper solution is to regenerate the PostScript file, but if neither the
sources nor the DVI file are available, one must needs resort to some sort of patching
to replace the bitmap fonts in the file by outline fonts.
The program pkfix (by Heiko Oberdiek) will do this patching, for files created by
“not too old versions” of dvips: it finds the fonts to be replaced by examining the
PostScript comments dvips has put in the file. For each font, pkfix puts appropriate
TeX commands in a file, which it then processes and runs through dvips (with switch
-Ppdf) to acquire an appropriate copy of the font; these copies are then patched back
into the original file.
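Invocation is straightforward (file names illustrative):

pkfix document.ps document-fixed.ps
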
Yet another option is Frank Siegert’s (shareware) PStill, which is capable of pro-
cessing the PostScript it is distilling, and one option is to replace bitmap fonts in the
file with Type 1 versions.
pkfix : support/pkfix

120 Hyperref and repeated page numbers


The book class (and its friends and relations) automatically changes the display of page
numbers in the frontmatter of the document to lower-case roman. This is fine for human
readers, but it confuses hyperref since there are pages which seem (to hyperref ) to have
the same page number. Fortunately, there are configuration options to make hyperref
“do the right thing”.
The two options in question are:

plainpages=false Make page anchors using the formatted form of the page number.
With this option, hyperref writes different anchors for pages ‘ii’ and ‘2’. (If the
option is set ‘true’ — the default — hyperref writes page anchors as the arabic
form of the absolute page number, rather than the formatted form.)
pdfpagelabels Set PDF page labels; i.e., write the value of \thepage to the PDF file
so that Acrobat Reader can display the page number as (say) ‘ii (4 of 40)’ rather
than simply ‘4 of 40’.

The two should be used whenever page numbering is not just ‘1..n’; they may be used
independently, but usually are not.
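So a typical loading, for a book-like document, is:

\usepackage[plainpages=false,pdfpagelabels]{hyperref}
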
The recipe isn’t perfect: it relies on \thepage being different for every page in the
document. A common problem arises when there is an unnumbered title page, after
which page numbers are reset: the PDFTeX warning of “duplicate destinations” will
happen in this case, regardless of the options.
hyperref.sty : macros/latex/contrib/hyperref

121 Searching PDF files


In principle, you can search a PDF file: the text of the file is available to the viewer, and
at least some viewers provide a search facility. (It’s not the fastest thing in the world,
but it does help in some circumstances.)
However, there is a problem: the viewer wants to look at Unicode text, but no or-
dinary TeX-based system deals in Unicode text. Fortunately for us Anglophones, this
is hardly ever a problem for our text, since even Knuth’s “OT1” encoding matches
ASCII (and hence the lowest 128 characters of Unicode) for most things printable.
However, using the inflected characters of Continental European languages, or any-
thing that doesn’t use a Latin alphabet, there is potential for problems, since TeX’s
view of what a font is doesn’t map onto PDF’s, and the reader won’t understand. . .
. . . Unless you use the cmap package with PDFLaTeX, that is. The package will in-
struct PDFTeX to load character maps into your PDF for output fonts encoded accord-
ing to the T1 (Western European Languages), T2A, T2B, or T2C (Cyrillic Languages),
or T5 (Vietnamese) encodings. If your document uses such encodings, viewers that can
search will use the maps to interpret what they find in the file.
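Usage is simple (the package is best loaded early, before the font encoding is selected):

\usepackage{cmap}
\usepackage[T1]{fontenc}
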
Unfortunately, the package only works with fonts that are directly encoded, such
as the cm-super distribution. Fonts like Adobe Times Roman (which are encoded for
(La)TeX use via virtual fonts) are not amenable to this treatment.
cmap.sty : macros/latex/contrib/cmap
cm-super fonts: fonts/ps-type1/cm-super

N Graphics
122 How to import graphics into (La)TeX documents
Knuth, when designing the current version of TeX back in the early 1980s, could dis-
cern no “standard” way of expressing graphics in documents. He reasoned that this
state could not persist for ever, but that it would be foolish for him to define TeX prim-
itives that allowed the import of graphical image definitions. He therefore deferred the
specification of the use of graphics to the writers of DVI drivers; TeX documents would
control the drivers by means of \special commands (\special commands).
There is therefore a straightforward way for anyone to import graphics into their
document: read the specification of the \special commands your driver uses, and
‘just’ code them. This is the hair-shirt approach: it definitely works, but it’s not for
everyone.
Over the years, therefore, “graphics inclusion” packages have sprung up; most were
designed for inclusion of Encapsulated PostScript graphics (Encapsulated PostScript
graphics) — which has become the lingua franca of graphics inclusion over the last
decade or so.
Notable examples are the epsf package (distributed with dvips) and the psfig pack-
age. (Both of these packages were designed to work well with both Plain TeX and
LaTeX 2.09; they are both still available.) All such packages were tied to a particu-
lar DVI driver (dvips, in the above two cases), but their code could be configured for
others.
The obvious next step was to make the code configurable dynamically. The LaTeX
standard graphics package and its derivatives made this step: it is strongly preferred
for all current work. It can also be used (with the help of the miniltx “LaTeX emulator”
and the graphicx.tex front-end) in documents written in Plain TeX.
The graphics package takes a variety of “driver options” — package options that
select code to generate the commands appropriate to the DVI driver in use. In most
cases, your (La)TeX distribution will provide a graphics.cfg file that will select the
correct driver for what you’re doing (for example, a distribution that provides both
LaTeX and PDFLaTeX will usually provide a configuration file that determines whether
PDFLaTeX is running, and selects the definitions for it if so).
The graphics package provides a toolkit of commands (insert graphics, scale a
box, rotate a box), which may be composed to provide most facilities you need; the
basic command, \includegraphics, takes one optional argument, which specifies
the bounding box of the graphics to be included.
The graphicx package uses the facilities of graphics behind a rather more sophis-
ticated command syntax to provide a very powerful version of the \includegraphics
command. graphicx’s version can combine scaling and rotation, viewporting and clip-
ping, and many other things. While this is all a convenience (at some cost of syntax),
it is also capable of producing noticeably more efficient PostScript, and some of its
combinations are simply not possible with the graphics package version.
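A typical use of the graphicx version (file name and dimensions illustrative) is:

\usepackage{graphicx}
...
\includegraphics[width=0.75\textwidth,angle=90]{diagram}

which combines scaling and rotation in a single command (the keys are processed in the order given).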
The epsfig package provides the same facilities as graphicx, but via a \psfig com-
mand (also known as \epsfig), capable of emulating the behaviour (if not the bugs)
the old psfig package. Epsfig also supplies homely support for former users of the epsf
package. However, there’s a support issue: if you declare you’re using epsfig, any po-
tential mailing list or usenet helper has to clear out of the equation the possibility that
you’re using “old” epsfig, so that support is slower coming than it would otherwise be.
There is no rational reason to stick with the old packages, which have never been
entirely satisfactory in the LaTeX context. (One irrational reason to leave them be-
hind is that their replacement’s name tends not to imply that it’s exclusively related to
PostScript graphics. The reasoning also excludes epsfig, of course.)
A wide variety of detailed techniques and tricks have been developed over the years,
and Keith Reckdahl’s epslatex outlines them in compendious detail: this highly recom-
mendable document is available from CTAN. An invaluable review of the practicalities
of exchanging graphics between sites, “Graphics for Inclusion in Electronic Docu-
ments” has been written by Ian Hutchinson; the document isn’t on CTAN, but may
also be browsed on the Web.
epsf.tex : macros/generic/epsf/epsf.tex
epsfig.sty : Part of the macros/latex/required/graphics bundle
epslatex.pdf : info/epslatex/english/epslatex.pdf; the document is also
available in PostScript format as info/epslatex/english/epslatex.ps
graphics.sty : macros/latex/required/graphics
graphicx.sty : Part of the macros/latex/required/graphics bundle
miniltx.tex : macros/plain/graphics
psfig.sty : nonfree/graphics/psfig

123 Imported graphics in dvips


Dvips, as originally conceived, can only import a single graphics format: encapsulated
PostScript (.eps files, encapsulated PostScript). Dvips also deals with the slightly
eccentric EPS that is created by MetaPost.
Apart from the fact that a depressing proportion of drawing applications produce
corrupt EPS when asked for such output, this is pretty satisfactory for vector graphics
work.
To include bitmap graphics, you need some means of converting them to PostScript;
in fact many standard image manipulators (such as ImageMagick’s convert) make a
good job of creating EPS files (but be sure to ask for output at PostScript level 2 or
higher). (Unix users should beware of xv’s claims: it has a tendency to downsample
your bitmap to your screen resolution.)
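For example, using ImageMagick’s convert (the eps2: prefix requests PostScript level 2 output; file names illustrative):

convert photo.jpg eps2:photo.eps
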
Special purpose applications jpeg2ps (which converts JPEG files using PostScript
level 2 functionality) and bmeps (which converts both JPEG and PNG files), and
a2ping/sam2p (which convert a bewildering array of bitmap formats to EPS or PDF
files; sam2p is one of the engines that a2ping uses) are also considered “good bets”.

Bmeps comes with patches to produce your own version of dvips that can cope
with JPEG and PNG direct, using bmeps’s conversion library. Dvips, as distributed by
MiKTeX, comes with those patches built-in.
a2ping : graphics/a2ping
bmeps: support/bmeps
jpeg2ps: nonfree/support/jpeg2ps
sam2p: graphics/sam2p

124 Imported graphics in PDFLaTeX


PDFTeX itself has a rather wide range of formats that it can “natively” incorporate
into its output PDF stream: JPEG (.jpg files) for photographs and similar images,
PNG files for artificial bitmap images, and PDF for vector drawings. Old versions of
PDFTeX (prior to version 1.10a) supported TIFF (.tif files) format as an alternative
to PNG files; don’t rely on this facility, even if you are running an old enough version
of PDFTeX. . .
In addition to the ‘native’ formats, the standard PDFLaTeX graphics package setup
causes Hans Hagen’s supp-pdf macros to be loaded: these macros are capable of
translating the output of MetaPost to PDF “on the fly”; thus MetaPost output (.mps
files) may also be included in PDFLaTeX documents.
The commonest problem users encounter, when switching from TeX, is that there
is no straightforward way to include EPS files: since PDFTeX is its own “driver”, and
since it contains no means of converting PostScript to PDF, there’s no direct way the
job can be done.
The simple solution is to convert the EPS to an appropriate PDF file. The epstopdf
program will do this: it’s available either as a Windows executable or as a Perl script
to run on Unix and other similar systems. A LaTeX package, epstopdf , can be used to
generate the requisite PDF files “on the fly”; this is convenient, but requires that you
suppress one of TeX’s security checks: don’t allow its use in files from sources you
don’t entirely trust.
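For example (file name illustrative),

epstopdf diagram.eps

produces diagram.pdf alongside the original.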
A similar package, pst-pdf , permits other things than ‘mere’ graphics files in its
argument. Pst-pdf operates (the authors suggest) “like BibTeX” — you process your
file using PDFLaTeX, then use LaTeX, dvips and ps2pdf in succession, to produce a
secondary file to input to your next PDFLaTeX run. (Scripts are provided to ease the
production of the secondary file.)
An alternative solution is to use purifyeps, a Perl script which uses the good offices
of pstoedit and of MetaPost to convert your Encapsulated PostScript to “Encapsulated
PostScript that comes out of MetaPost”, and can therefore be included directly. Sadly,
purifyeps doesn’t work for all .eps files.
Good coverage of the problem is to be found in Herbert Voß’ PDF support page,
which is targeted at the use of pstricks in PDFLaTeX, and also covers the pstricks-
specific package pdftricks.
epstopdf : Browse support/epstopdf
epstopdf.sty : Distributed with Heiko Oberdiek’s packages macros/latex/
contrib/oberdiek
pdftricks.sty : macros/latex/contrib/pdftricks
pst-pdf.sty : macros/latex/contrib/pst-pdf
pstoedit: support/pstoedit
purifyeps: support/purifyeps

125 Imported graphics in dvipdfm


Dvipdfm translates direct from DVI to PDF (all other available routes produce
PostScript output using dvips and then convert that to PDF with ghostscript or Acrobat
Distiller).
Dvipdfm is a particularly flexible application. It will permit the inclusion of bitmap
and PDF graphics, as does PDFTeX, but is also capable of employing ghostscript “on
the fly” so as to be able to permit the inclusion of encapsulated PostScript (.eps) files
by translating them to PDF. In this way, dvipdfm combines the good qualities of dvips
and of PDFTeX as a means of processing illustrated documents.

Unfortunately, “ordinary” LaTeX can’t deduce the bounding box of a binary bitmap
file (such as JPEG or PNG), so you have to specify the bounding box. This may be
done explicitly, in the document:

\usepackage[dvipdfm]{graphicx}
...
\includegraphics[bb=0 0 540 405]{photo.jpg}

It’s usually not obvious what values to give the “bb” key, but the program ebb will
generate a file containing the information; the above numbers came from an ebb output
file photo.bb:

%%Title: /home/gsm10/photo.jpg
%%Creator: ebb Version 0.5.2
%%BoundingBox: 0 0 540 405
%%CreationDate: Mon Mar 8 15:17:47 2004
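The .bb file itself is produced by running ebb on the image, for example:

ebb photo.jpg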

However, if such a file is available, you may abbreviate the inclusion code, above, to
read:

\usepackage[dvipdfm]{graphicx}
...
\includegraphics{photo}

which makes the operation feel as simple as does including .eps images in a LaTeX
file for processing with dvips; the graphicx package knows to look for a .bb file if no
bounding box is provided in the \includegraphics command.
The one place where usage isn’t quite so simple is the need to quote dvipdfm ex-
plicitly, as an option when loading the graphicx package: if you are using dvips, you
don’t ordinarily need to specify the fact, since the default graphics configuration file
(of most distributions) “guesses” the dvips option if you’re using TeX.
dvipdfm : dviware/dvipdfm
ebb: Distributed as part of dviware/dvipdfm

126 Importing graphics from “somewhere else”


By default, graphics commands like \includegraphics look “wherever TeX files are
found” for the graphic file they’re being asked to use. This can reduce your flexibility if
you choose to hold your graphics files in a common directory, away from your (La)TeX
sources.
The simplest solution is to patch TeX’s path, by changing the default path. On
most systems, the default path is taken from the environment variable TEXINPUTS, if
it’s present; you can adapt that to take in the path it already has, by setting the variable
to

TEXINPUTS=.:<graphics path(s)>:

on a Unix system; on a Windows system the separator will be “;” rather than “:”. The
“.” is there to ensure that the current directory is searched first; the trailing “:” says
“patch in the value of TEXINPUTS from your configuration file, here”.
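For example, in a Bourne-style shell, and with a purely illustrative directory name, you might say:

TEXINPUTS=.:/home/me/graphics:
export TEXINPUTS

(On a Windows system, remember that the separator is “;” rather than “:”.)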
This method has the merit of efficiency ((La)TeX does all of the searches, which is
quick), but it’s always clumsy and may prove inconvenient to use in Windows setups
(at least).
The alternative is to use the graphics package command \graphicspath; this com-
mand is of course also available to users of the graphicx and the epsfig packages. The
syntax of \graphicspath’s one argument is slightly odd: it’s a sequence of paths (typ-
ically relative paths), each of which is enclosed in braces. A slightly odd sample, given
in the graphics bundle documentation, is:

\graphicspath{{eps/}{tiff/}}

however, if the security checks on your (La)TeX system allow, the path may be any-
thing you choose (rather than strictly relative, like those above); note that the trailing
“/” is required.

Be aware that \graphicspath does not affect the operations of graphics macros
other than those from the graphics bundle — in particular, those of the outdated epsf
and psfig packages are immune.
The disadvantage of the \graphicspath method is inefficiency: for each graphic requested, (La)TeX tries the entries of the path list in turn until the file is found, which itself slows things down. More seriously, TeX remembers the file name, thus effectively losing memory, every time it’s asked to look up a file, so a document that uses a huge number of graphical inputs could be embarrassed by lack of memory.
If your document is split into a variety of directories, and each directory has its as-
sociated graphics, the import package may well be the thing for you; see the discussion of “bits of document in other directories”.
graphics bundle: macros/latex/required/graphics
import.sty : macros/latex/contrib/misc/import.sty

127 Portable imported graphics


A regular need is for a document to be distributed in more than one format: commonly both PostScript and PDF. The following advice (a sketch appears after the list) is based on a post by someone with much experience of dealing with EPS graphics in this case.
• Don’t specify a driver when loading whichever version of the graphics
package you use. The scheme relies on the distribution’s ability to decide which
driver is going to be used: the choice is between dvips and PDFTeX, in this case.
Be sure to exclude options dvips, pdftex and dvipdfm (dvipdfm is not used in
this scheme, but the aspirant PDF-maker may be using it for his output, before
switching to the scheme).
• Use \includegraphics[...]{filename} without specifying the extension (i.e.,
neither .eps nor .pdf).
• For every .eps file you will be including, produce a .pdf version, as described
in Graphics in PDFLaTeX. Having done this, you will have two copies of each
graphic (a .eps and a .pdf file) in your directory.
• Use PDFLaTeX (rather than LaTeX–dvips–distillation or LaTeX–dvipdfm) to pro-
duce your PDF output.
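Putting those steps together, a minimal sketch (the graphic name diagram is purely illustrative; both diagram.eps and diagram.pdf are assumed to exist) is:

\documentclass{article}
% no driver option: the graphics configuration picks dvips or pdftex
\usepackage{graphicx}
\begin{document}
% no extension: LaTeX will find the .eps, PDFLaTeX the .pdf
\includegraphics[width=0.75\textwidth]{diagram}
\end{document}
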
Dvipdfm’s charms are less than attractive here: the document itself needs to be altered
from its default (dvips) state, before dvipdfm will process it.
128 Repeated graphics in a document
A logo or “watermark” image, or any other image that is repeated in your document,
has the potential to make the processed version of the document unmanageably large.
The problem is that the default mechanisms of graphics usage add the image at every point it’s to be used, and when processed, the image appears in the output file at each such point.
Huge PostScript files are embarrassing; explaining why such a file is huge is more embarrassing still.
The epslatex graphics tutorial describes a technique for avoiding the problem:
basically, one converts the image that’s to be repeated into a PostScript subroutine, and
loads that as a dvips prologue file. In place of the image, you load a file (with the same
bounding box as the image) containing no more than an invocation of the subroutine
defined in the prologue.
The epslatex technique is tricky, but does the job. Trickier still is the neat scheme
of converting the figure to a one-character Adobe Type 3 outline font. While this
technique is for the “real experts” only (the author of this answer has never even tried
it), it has potential for the same sort of space saving as the epslatex technique, with
greater flexibility in actual use.
More practical is Hendri Adriaens’ graphicx-psmin; you load this in place of
graphicx, so rather than:
\usepackage[<options>]{graphicx}

you will write:


\usepackage[<options>]{graphicx-psmin}

and at the start of your document, you write:


\loadgraphics[<bb>]{<list of graphics>}

and each of the graphics in the list is converted to an “object” for use within the re-
sulting PostScript output. (This is, in essence, an automated version of the epslatex
technique described above.)
Having loaded the package as above, whenever you use \includegraphics, the
command checks if the file you’ve asked for is one of the graphics in \loadgraphics’
list. If so, the operation is converted into a call to the “object” rather than a new copy
of the file; the resulting PostScript can of course be much smaller.
Note that the package requires a recent dvips, version 5.95b (this version isn’t —
yet — widely distributed).
If your PostScript is destined for conversion to PDF, either by a ghostscript-based
mechanism such as ps2pdf or by (for example) Acrobat Distiller, the issue isn’t so
pressing, since the distillation mechanism will amalgamate graphics objects whether or
not the PostScript has them amalgamated. PDFTeX does the same job with graphics,
automatically converting multiple uses into references to graphics objects.
graphicx-psmin.sty : macros/latex/contrib/graphicx-psmin

129 Limit the width of imported graphics


Suppose you have graphics which may or may not be able to fit within the width of the
page; if they will fit, you want to set them at their natural size, but otherwise you want
to scale the whole picture so that it fits within the page width.
You do this by delving into the innards of the graphics package (which of course
needs a little LaTeX internals programming):
\makeatletter
\def\maxwidth{%
\ifdim\Gin@nat@width>\linewidth
\linewidth
\else
\Gin@nat@width
\fi
}
\makeatother

This defines a “variable” width which has the properties you want. Replace \linewidth
if you have a different constraint on the width of the graphic.
Use the command as follows:
\includegraphics[width=\maxwidth]{figure}

130 Top-aligning imported graphics


When TeX sets a line of anything, it ensures that the base-line of each object in the
line is at the same level as the base-line of the final object. (Apart, of course, from
\raisebox commands. . . )
Most imported graphics have their base-line set at the bottom of the picture. When
using packages such as subfig, one often wants to align figures by their tops. The
following odd little bit of code does this:
\vtop{%
\vskip0pt
\hbox{%
\includegraphics{figure}%
}%
}

The \vtop primitive sets the base-line of the resulting object to that of the first “line”
in it; the \vskip creates the illusion of an empty line, so \vtop makes the very top of
the box into the base-line.
In cases where the graphics are to be aligned with text, there is a case for making
the base-line one ex-height below the top of the box, as in:
\vtop{%
\vskip-1ex
\hbox{%
\includegraphics{figure}%
}%
}

A more LaTeX-y way of doing the job (somewhat inefficiently) uses the calc package:
\usepackage{calc}
...
\raisebox{1ex-\height}{\includegraphics{figure}}

(this has the same effect as the text-align version, above).


The fact is, you may choose where the base-line ends up. This answer merely shows
you sensible choices you might make.
131 Displaying MetaPost output in ghostscript
MetaPost ordinarily expects its output to be included in some context where the ‘stan-
dard’ MetaFont fonts (that you’ve specified) are already defined — for example, as a
figure in a TeX document. If you’re debugging your MetaPost code, you may want to view it in ghostscript (or some other PostScript previewer). However, the PostScript ‘engine’ in ghostscript doesn’t ordinarily have the fonts loaded, and you’ll experience
an error such as
Error: /undefined in cmmi10

There is provision in MetaPost for avoiding this problem: issue the command
prologues := 2; at the start of the .mp file.
Unfortunately, the PostScript that MetaPost inserts in its output, following this
command, is incompatible with ordinary use of the PostScript in inclusions into
(La)TeX documents, so it’s best to make the prologues command optional. Further-
more, MetaPost takes a very simple-minded approach to font encoding: since TeX font
encodings regularly confuse sophisticated minds, this can prove troublesome. If you’re
suffering such problems (the symptom is that characters disappear, or are wrongly
presented) the only solution is to view the ‘original’ metapost output after processing
through LaTeX and dvips.
Conditional compilation may be done either by inputting MyFigure.mp indirectly
from a simple wrapper MyFigureDisplay.mp:
prologues := 2;
input MyFigure

or by issuing a shell command such as


mp ’\prologues:=2; input MyFigure’

(which will work without the quote marks if you’re not using a Unix shell).
A suitable LaTeX route would involve processing MyFigure.tex, which contains:
\documentclass{article}
\usepackage{graphicx}
\begin{document}
\thispagestyle{empty}
\includegraphics{MyFigure.1}
\end{document}

Processing the resulting DVI file with the dvips command


dvips -E -o MyFigure.eps MyFigure

would then give a satisfactory Encapsulated PostScript file. This procedure may be
automated using the Perl script mps2eps, thus saving a certain amount of tedium.
The Plain TeX user may use an adaptation of a jiffy of Knuth’s, by Dan Luecking.
Dan’s version mpsproof.tex will work under TeX to produce a DVI file for use with
dvips, or under PDFTeX to produce a PDF file, direct. The output is set up to look like
a proof sheet.
A script application, mptopdf , is available in recent (La)TeX distributions: it seems
fairly reliably to produce PDF from MetaPost, so may reasonably be considered an
answer to the question. . .
mps2eps: support/mps2eps
mpsproof.tex : graphics/metapost/contrib/misc/mpsproof.tex

132 Drawing with TeX


There are many packages to do pictures in (La)TeX itself (rather than importing graphics created externally), ranging from simple use of LaTeX’s picture environment, through enhancements like epic, to sophisticated (but slow) drawing with PiCTeX.
Depending on your type of drawing, and setup, here are a few systems you may
consider:

• pict2e; this was advertised in the LaTeX manual, but didn’t appear for nearly ten
years after publication of the book! It removes all the petty niggles that surround
the use of the picture environment. It therefore suffers only from the rather ec-
centric drawing language of the environment, and is a far more useful tool than the
original environment has ever been. (Note that pict2e supersedes David Carlisle’s
stop-gap pspicture.)
• pstricks; this gives you access to all the power of PostScript from TeX itself, by
sophisticated use of \special commands. Since PostScript is itself a pretty pow-
erful programming language, this means there are many astounding things that can
in principle be achieved. pstricks’ \specials are by default specific to dvips, but
VTeX (both in its commercial and in its free versions) understands them. PDFTeX
users may use pdftricks, which (like epstopdf — see PDFLaTeX graphics) gen-
erates PDF files on the fly from pstricks commands. The documentation is good
(you may browse it via the pstricks page on the TUG web site). There is also a mailing list, which you may join, or you may just browse the list archives.
• pgf : while pstricks is very powerful and convenient, using it with PDFLaTeX is
an awful fidget: if you simply want the graphical capabilities, pgf , together with
its rather pleasing “user-oriented” interface tikz, may be a good bet for you. While
PDF has (in essence) the same graphical capabilities as PostScript, it isn’t pro-
grammable; pgf provides common LaTeX commands that will utilise the graphical
capabilities of both PostScript and PDF equally. (A small tikz sketch appears after this list.)
• MetaPost; you liked MetaFont, but never got to grips with font files? Try Meta-
Post — all the power of MetaFont, but it generates PostScript figures; MetaPost
is nowadays part of most serious (La)TeX distributions. Knuth uses it for all his
work. . .
• Mfpic; you liked MetaFont, but can’t understand the language? The package
makes up MetaFont or MetaPost code for you, using familiar-looking TeX
macros. Not quite the full power of MetaPost, but a friendlier interface; of course,
with MetaPost output, the results can be used equally well in either LaTeX or
PDFLaTeX.
• You liked PiCTeX but don’t have enough memory or time? Look at Eitan Gurari’s
dratex, which is as powerful as most other TeX drawing packages, but is an entirely
new implementation, which is not as hard on memory, is much more readable (and
is fully documented).
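
As promised above, here is a minimal pgf/tikz sketch (the drawing itself is purely illustrative; the same source runs unchanged under LaTeX or PDFLaTeX):

\documentclass{article}
\usepackage{tikz}
\begin{document}
\begin{tikzpicture}
  \draw (0,0) -- (3,2) -- (3,0) -- cycle;  % a triangle
  \draw (1.5,1) circle (0.5cm);            % a circle of radius 0.5cm
  \draw[->] (0,0) -- (0,2.5);              % an arrow
\end{tikzpicture}
\end{document}
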

dratex.sty : graphics/dratex
mfpic: graphics/mfpic
pdftricks.sty : macros/latex/contrib/pdftricks
pspicture.sty : Distributed as part of macros/latex/contrib/carlisle
pgf.sty : graphics/pgf
pict2e.sty : macros/latex/contrib/pict2e
pstricks: graphics/pstricks
tikz.sty : Distributed with graphics/pgf

133 Drawing Feynman diagrams in LaTeX


Michael Levine’s feynman bundle for drawing the diagrams in LaTeX 2.09 is still avail-
able.
Thorsten Ohl’s feynmf is designed for use with current LaTeX, and works in com-
bination with MetaFont (or, in its feynmp incarnation, with MetaPost). The feynmf or
feynmp package reads a description of the diagram written in TeX, and writes out code.
MetaFont (or MetaPost) can then produce a font (or PostScript file) for use in a subse-
quent LaTeX run. For new users, who have access to MetaPost, the PostScript version
is probably the better route, for document portability and other reasons.
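As a hedged sketch of the MetaPost route (the command names are those of the feynmf bundle; the diagram itself is merely illustrative):

\documentclass{article}
\usepackage{feynmp}            % the MetaPost incarnation of feynmf
\begin{document}
\begin{fmffile}{fmfpics}       % diagram code is written to fmfpics.mp
  \begin{fmfgraph}(40,25)      % overall width and height of the diagram
    \fmfleft{i1,i2}            % incoming external points
    \fmfright{o1,o2}           % outgoing external points
    \fmf{fermion}{i1,v1,i2}
    \fmf{fermion}{o1,v2,o2}
    \fmf{photon}{v1,v2}
  \end{fmfgraph}
\end{fmffile}
\end{document}

After the first LaTeX run, process the generated file with MetaPost (mpost fmfpics) and run LaTeX again to include the result.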
Jos Vermaseren’s axodraw is mentioned as an alternative in the documentation of
feynmf , but it is written entirely in terms of dvips \special commands, and is thus
rather imperfectly portable.
An alternative approach is implemented by Norman Gray’s feyn package. Rather
than creating complete diagrams as postscript images, feyn provides a font (in a variety
of sizes) containing fragments, which you can compose to produce complete diagrams.
It offers fairly simple diagrams which look good in equations, rather than complicated
ones more suitable for display in figures.
axodraw : graphics/axodraw
feyn font bundle: fonts/feyn
feynman bundle: macros/latex209/contrib/feynman
feynmf/feynmp bundle: macros/latex/contrib/feynmf

134 Labelling graphics


“Technical” graphics (such as graphs and diagrams) are often labelled with quite com-
plex mathematical expressions: there are few drawing or graphing tools that can do
such things (the honourable exception being MetaPost, which allows you to program
the labels, in (La)TeX, in the middle of specifying your graphic).
Labelling graphics produced by all those other tools is where the psfrag package can help. Place a unique piece of text in your graphic, using the normal text features of your tool, and you can ask psfrag to replace that text with arbitrary (La)TeX material.
Psfrag’s “operative” command is \psfrag{PS text}{Repl text}, which instructs
the system to replace the original (“PS”) text with the TeX-typeset replacement text.
Optional arguments permit adjustment of position, scale and rotation; full details may
be found in pfgguide in the distribution. (Unfortunately, psfrag can’t be used with
PDFLaTeX, though one might hope that it would be susceptible to the same sort of
treatment as is used in the pdftricks package. On the other hand, VTeX’s GeX proces-
sor explicitly deals with psfrag, both in its free and commercial instances.)
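For example, if your drawing tool has placed the literal text xlabel in the figure (the tag and the file name here are purely illustrative), the replacement might be arranged as:

\usepackage{graphicx,psfrag}
...
\begin{figure}
  \psfrag{xlabel}{$\lambda/\mathrm{nm}$}% replace the tag with typeset maths
  \includegraphics{spectrum.eps}
  \caption{A spectrum}
\end{figure}

(As noted above, this only works on the LaTeX–dvips route.)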
The psfragx package goes one step further than psfrag: it provides a means whereby
you can put the psfrag commands into the preamble of your EPS file itself. Psfrag
has such a command itself, but deprecates it; psfragx has cleaned up the facility, and
provides a script laprint for use with Matlab to produce appropriately tagged output.
(In principle, other graphics applications could provide a similar facility, but apparently
none does.)
Emacs users may find the embedded editor iTe a useful tool for placing labels:
it’s a (La)TeX-oriented graphical editor written in Emacs Lisp. You create iteblock
environments containing graphics and text, and may then invoke iTe to arrange the
elements relative to one another.
Another useful approach is overpic, which overlays a picture environment on a
graphic included by use of \includegraphics. This treatment lends itself to ready
placement of texts and the like on top of a graphic. The package can draw a grid for
planning your “attack”; the distribution comes with simple examples.
Pstricks can of course do everything that overpic can, with all the flexibility of
PostScript programming that it offers. The pstricks web site has a page with several
examples of labelling which will get you started; if pstricks is an option for you, this
route is worth a try.
The confident user may, of course, do the whole job in a picture environment which
itself includes the graphic. I would recommend overpic or the pstricks approach, but
such things are plainly little more than a convenience over what is achievable with the
do-it-yourself approach.
iTe: support/ite
laprint: Distributed with macros/latex/contrib/psfragx
overpic.sty : macros/latex/contrib/overpic
psfrag.sty : macros/latex/contrib/psfrag
psfragx.sty : macros/latex/contrib/psfragx
pstricks.sty : graphics/pstricks

O Bibliographies and citations


O.1 Creating bibliographies
135 Creating a BibTeX bibliography file
A BibTeX bibliography file may reasonably be compared to a small database, the en-
tries in which are references to literature that may be called up by citations in a docu-
ment.
Each entry in the bibliography has a type and a unique key. The bibliography is
read, by BibTeX, using the details specified in a bibliography style. From the style,
BibTeX finds what entry types are permissible, what fields each entry type has, and
how to format the whole entry.
The type specifies the type of document you’re making reference to; it may run
all the way from things like “Book” and “Proceedings” (which may even contain
other citations of type “InBook” or “InProceedings”) through dissertation styles like
“PhdThesis” to otherwise-uncategorisable things such as “Misc”. The unique key is
something you choose yourself: it’s what you use when you want to cite an entry in the
file. People commonly create a key that combines the (primary) author’s name and the
year of publication, possibly with a marker to distinguish publications in the same year.
So, for example, the Dyson, Eddington, Davidson paper about deflection of starlight
appears in my experimental .bib file as Dyson20.1.
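For illustration, a complete entry of type “Book”, with key Milne:1926, might look like this (the field values are merely indicative, and the set of fields required depends on the bibliography style):

@book{Milne:1926,
  author    = {A. A. Milne},
  title     = {Winnie-the-Pooh},
  publisher = {Methuen},
  year      = 1926
}
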
So, noting the rules of the style, you have ‘simply’ to write a bibliography database.
Fortunately, there are several tools to help in this endeavour:

• Most of the better (La)TeX-oriented editors have “BibTeX modes”.


• If you have an existing thebibliography environment, the Perl script tex2bib
will probably help.
• There are a number of BibTeX bibliography management systems available, some
of which permit a graphical user interface to the task. Sadly, none seems to be
available with the ordinary TeX distributions.
Tools such as Xbibfile (a graphical user interface), ebib (a database application
written to run ‘inside’ emacs) and btOOL (a set of perl tools for building BibTeX
database handlers) are available from CTAN.
Other systems, such as RefDB, BibORB, BibDesk, pybliographer and the Java-
based Bibkeeper and JabRef (which claims to supersede Bibkeeper) are only avail-
able from their development sites.
• Some commercial citation-management systems will export in BibTeX format; an
example is EndNote.
• Data from on-line citation databases may often be translated to BibTeX format
by utilities to be found on CTAN. For example, the Perl script isi2bibtex will
translate citations from ISI “Web of knowledge” (a subscription service, available
to UK academics via BIDS). UK academics may translate BIDS downloads using bids.to.bibtex.

bids.to.bibtex : biblio/bibtex/utils/bids/bids.to.bibtex
btOOL: biblio/bibtex/utils/btOOL
ebib: biblio/bibtex/utils/ebib
isi2bibtex : biblio/bibtex/utils/isi2bibtex
tex2bib: biblio/bibtex/utils/tex2bib/tex2bib
tex2bib.readme: biblio/bibtex/utils/tex2bib/README
xbibfile: biblio/bibtex/utils/xbibfile

136 Creating a bibliography style


It is possible to write your own: the standard bibliography styles are distributed in a
commented form, and there is a description of the language (see BibTeX documen-
tation). However, it must be admitted that the language in which BibTeX styles are

written is pretty obscure, and one would not recommend anyone who’s not a confi-
dent programmer to write their own, though minor changes to an existing style may be
within the grasp of many.
If your style isn’t too ‘far out’, you can probably generate it by using the facilities
of the custom-bib bundle. This contains a file makebst.tex, which runs you through a
text menu to produce a file of instructions, with which you can generate your own .bst
file. This technique doesn’t offer entirely new styles of document, but the system’s
“master BibTeX styles” already offer significantly more than the BibTeX standard set.
BibTeX documentation: biblio/bibtex/distribs/doc
makebst.tex : Distributed with macros/latex/contrib/custom-bib

137 Capitalisation in BibTeX


The standard BibTeX bibliography styles impose fixed ideas about the capitalisation of
titles of things in the bibliography. While this is not unreasonable by BibTeX’s lights
(the rules come from the Chicago Manual of Style) it can be troublesome, since BibTeX
fails to recognise special uses (such as acronyms, chemical formulae, etc.).
The solution is to enclose the letter or letters, whose capitalisation BibTeX should
not touch, in braces, as:

title = {The {THE} operating system},

Sometimes you find BibTeX changing the case of a single letter inappropriately. No
matter: the technique can be applied to single letters, as in:

title = {Te{X}niques and tips},

If your document design specification requires a different style of capitalisation, you


should acquire a bibliography style that doesn’t enforce BibTeX’s default rules. It is
definitely not a good idea to enclose an entire title in braces, as in

title = {{TeXniques and tips}},

though that does ensure that the capitalisation is not changed. Your BibTeX database
should be a general-purpose thing, not something tuned to the requirements of a par-
ticular document, or to the way you are thinking today.
There’s more on the subject in the BibTeX documentation.
138 Accents in bibliographies
BibTeX not only has a tendency (by default) to mess about with the case of letters in your bibliography, it also makes a hash of accent commands: “ma\~nana” comes out as
“ma nana” (!). The solution is similar: enclose the troublesome sequence in braces, as
“{\~n}”, in this example.
139 ‘String too long’ in BibTeX
The BibTeX diagnostic “Warning–you’ve exceeded 1000, the global-string-size,
for entry foo” usually arises from a very large abstract or annotation included in
the database. The diagnostic usually arises because of an infelicity in the coding of
abstract.bst, or styles derived from it. (One doesn’t ordinarily output annotations
in other styles.)
The solution is to make a copy of the style file (or get a clean copy from
CTAN — biblio/bibtex/utils/bibtools/abstract.bst), and rename it (e.g.,
on a long file-name system, to abstract-long.bst). Now edit it: find function
output.nonnull and

• change its first line (line 60 in the version on CTAN) from

{ 's :=

to

{ swap$

• delete the function’s last line, which just says “s” (line 84 in the version on CTAN).

Finally, change your \bibliographystyle command to refer to the name of the new
file.
This technique applies equally to any bibliography style: the same change can be
made to any similar output.nonnull function.
If you’re reluctant to make this sort of change, the only way forward is to take the
entry out of the database, so that you don’t encounter BibTeX’s limit, but you may need
to retain the entry because it will be included in the typeset document. In such cases,
put the body of the entry in a separate file:
@article{long.boring,
author = "Fred Verbose",
...
abstract = "{\input{abstracts/long.tex}}"
}

In this way, you arrange that all BibTeX has to deal with is the file name, though it will
tell TeX (when appropriate) to include all the long text.
140 BibTeX doesn’t understand lists of names
BibTeX has a strict syntax for lists of authors’ (or editors’) names in the BibTeX data
file; if you write the list of names in a “natural”-seeming way, the chances are you will
confuse BibTeX, and the output produced will be quite different from what you had
hoped.
Names should be expressed in one of the forms

First Last
Last, First
Last, Suffix, First

and lists of names should be separated with “and”. For example:

AUTHOR = {Fred Q. Bloggs, John P. Doe \&


Another Idiot}

falls foul of two of the above rules: a syntactically significant comma appears in an
incorrect place, and ‘\&’ is being used as a name separator. The output of the above
might be something like:

John P. Doe \& Another Idiot Fred Q. Bloggs

because “John P. Doe & Another Idiot” has become the ‘first name’, while “Fred Q.
Bloggs” has become the ‘last name’ of a single person. The example should have been
written:

AUTHOR = {Fred Q. Bloggs and John P. Doe and


Another Idiot}

Some bibliography styles implement clever acrobatics with very long author lists. You
can force truncation by using the pseudo-name “others”, which will usually translate
to something like “et al” in the typeset output. So, if Mr. Bloggs wanted to distract
attention from his co-authors, he would write:

AUTHOR = {Fred Q. Bloggs and others}

141 URLs in BibTeX bibliographies


There is no citation type for URLs, per se, in the standard BibTeX styles, though Oren
Patashnik (the author of BibTeX) is believed to be considering developing one such for
use with the long-awaited BibTeX version 1.0.
The actual information that need be available in a citation of an URL is discussed
at some length in the publicly available on-line extracts of ISO 690–2; the techniques
below do not satisfy all the requirements of ISO 690–2, but they offer a solution that is
at least available to users of today’s tools.
Until the new version of BibTeX arrives, the simplest technique is to use the
howpublished field of the standard styles’ @misc function. Of course, the strictures
about typesetting URLs still apply, so the entry will look like:

@misc{...,
...,
howpublished = "\url{http://...}"
}
A possible alternative approach is to use BibTeX styles other than the standard ones,
that already have URL entry types. Pre-eminent are the natbib styles (plainnat,
unsrtnat and abbrvnat). These styles are extensions of the standard styles, principally
for use with natbib itself, but they’ve acquired URLs and other “modern” entries along
the way. The same author’s custom-bib is also capable of generating styles that honour
URL entries.
Another candidate is the harvard package (if its citation styles are otherwise satis-
factory for you). Harvard bibliography styles all include a “url” field in their specifi-
cation; however, the typesetting offered is somewhat feeble (though it does recognise
and use LaTeX2HTML macros if they are available, to create hyperlinks).
You can also acquire new BibTeX styles by use of Norman Gray’s urlbst system,
which is based on a Perl script that edits an existing BibTeX style file to produce a
new style. The new style thus generated has a webpage entry type, and also offers
support for url and lastchecked fields in the other entry types. The Perl script comes
with a set of converted versions of the standard bibliography styles. Documentation is
distributed as LaTeX source.
Another possibility is that some conventionally-published paper, technical report
(or even book) is also available on the Web. In such cases, a useful technique is some-
thing like:
@techreport{...,
...,
note = "Also available as \url{http://...}"
}
There is good reason to use the url or hyperref packages in this context: BibTeX has a
habit of splitting lines it considers excessively long, and if there are no space characters
for it to use as ‘natural’ breakpoints, BibTeX will insert a comment (‘%’) character . . .
which is an acceptable character in an URL. Any current version of the url or hyperref
package detects this “%–end-of-line” structure in its argument, and removes it.
custom-bib bundle: macros/latex/contrib/custom-bib
harvard.sty : macros/latex/contrib/harvard
hyperref.sty : macros/latex/contrib/hyperref
natbib styles: macros/latex/contrib/natbib
url.sty : macros/latex/contrib/misc/url.sty
urlbst: biblio/bibtex/contrib/urlbst
142 Using BibTeX with Plain TeX
The file btxmac.tex (which is part of the Eplain system) contains macros and docu-
mentation for using BibTeX with Plain TeX, either directly or with Eplain. See the use
of BibTeX for more information about BibTeX itself.
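A minimal sketch of Plain TeX usage (the macro names are those defined by btxmac.tex; the citation key and database name are illustrative only):

\input btxmac
Pooh is heroic~\cite{Milne:1926}.
\bibliographystyle{plain}
\bibliography{mybooks}
\bye

Processing then follows the same TeX–BibTeX–TeX–TeX cycle described for LaTeX below.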
btxmac.tex : macros/eplain/btxmac.tex
eplain system: macros/eplain
143 Reconstructing .bib files
Perhaps you’ve lost the .bib file you generated your document from, or have been sent
a document without one. Or even, you’ve realised the error of building a substantial
document without the benefit of BibTeX. . .
The Perl script, tex2bib makes a reasonable job of regenerating .bib files from
thebibliography environments, provided that the original (whether automatically or
manually generated) doesn’t deviate too far from the “standard” styles.
You are well-advised to check the output of the script. While it will not usually
destroy information, it can quite reasonably mislabel it.
Documentation of the script is to be found in the file tex2bib.readme
tex2bib: biblio/bibtex/utils/tex2bib/tex2bib
tex2bib.readme: biblio/bibtex/utils/tex2bib/README
144 BibTeX sorting and name prefixes
BibTeX recognises a bewildering array of name prefixes (mostly those deriving from
European language names); it ignores the prefixes when sorting the bibliography —
you want “Ludwig van Beethoven” sorted under “Beethoven”, not under “van”. (Lam-
port made a witty deliberate mistake with Beethoven’s name, in the first edition of his
LaTeX manual.)
However, a recurring issue is the desire to quote Lord Rayleigh’s publications
(“Lord” isn’t an acceptable prefix), or names from languages that weren’t considered
when BibTeX was designed such as “al-Wakil” (transcribed from the Arabic). What’s
needed is a separate “sort key”, but BibTeX only allows such a thing in citations of
items that have no author or editor.
The solution is to embed the sort key in the author’s name, but to prevent it from be-
ing typeset. Patashnik recommends a command \noopsort (no-output-sortkey), which
is defined and used as follows:
@PREAMBLE{ {\providecommand{\noopsort}[1]{}} }
...
@ARTICLE{Rayleigh1,
AUTHOR = "{\noopsort{Rayleigh}}{Lord Rayleigh}",
...

145 Transcribed initials in BibTeX


If your bibliographic style uses initials + surname, you may encounter a problem with
some transcribed names (for example, Russian ones). Consider the following example
from the real world:
@article{epifanov1997,
author = {Epifanov, S. Yu. and Vigasin, A. A.},
title = ...
}

Note that the “Yu” is the initial, not a complete name. However, BibTeX’s algorithms
will leave you with a citation — slightly depending on the bibliographic style — that
reads “S. Y. Epifanov and A. A. Vigasin, . . . ” instead of the intended “S. Yu. Epifanov and A. A. Vigasin, . . . ”.
One solution is to replace each affected initial by a command that prints the correct
combination. To keep your bibliography portable, you need to add that command to
your bibliography with the @preamble directive:
@preamble{ {\providecommand{\BIBYu}{Yu} } }

@article{epifanov1997,
author = {Epifanov, S. {\BIBYu}. and Vigasin, A. A.},
title = ...
}

If you have many such commands, you may want to put them in a separate file and
\input that LaTeX file in a @preamble directive.
An alternative is to make the transcription look like an accent, from BibTeX’s point
of view. For this we need a control sequence that does nothing:
@article{epifanov1997,
author = {Epifanov, S. {\relax Yu}. and Vigasin, A. A.},
title = ...
}

Like the solution by generating extra commands, this involves tedious extra typing;
which of the two techniques is preferable for a given bibliography will be determined
by the names in it.

O.2 Creating citations


146 “Normal” use of BibTeX from LaTeX
To create a bibliography for your document, you need to perform a sequence of steps,
some of which seem a bit odd. If you choose to use BibTeX, the sequence is:
First: you need a BibTeX bibliography file (a .bib file) — see “creating a BibTeX file”, above.
Second: you must write your LaTeX document to include a declaration of the ‘style’ of bibliography, citations, and a reference to the bibliography file mentioned in step 1. So we may have a LaTeX file containing:

\bibliographystyle{plain}
...
Pooh is heroic~\cite{Milne:1926}.
...
Alice struggles~\cite{Carroll:1865}.
...
\bibliography{mybooks}

Note: we have bibliography style plain, above, which is nearly the simplest of the lot: a
sample text, showing the sorts of style choices available, can be found on Ken Turner’s
web site: http://www.cs.stir.ac.uk/~kjt/software/latex/showbst.html
Third: you must process the file.

latex myfile

As LaTeX processes the file, the \bibliographystyle command writes a note of the
style to the .aux file; each \cite command writes a note of the citation to the .aux
file, and the \bibliography command writes a note of which .bib file is to be used,
to the .aux file.
Note that, at this stage, LaTeX isn’t “resolving” any of the citations: at every \cite
command, LaTeX will warn you of the undefined citation, and when the document
finishes, there will be a further warning of undefined references.
Fourth: you must run BibTeX:

bibtex myfile

Don’t try to tell BibTeX anything but the file name: if you say bibtex myfile.aux (because you know it’s going to read the .aux file), BibTeX will blindly attempt to process myfile.aux.aux.
BibTeX will scan the .aux file; it will find which bibliography style it needs to use,
and will “compile” that style; it will note the citations; it will find which bibliography
files it needs, and will run through them matching citations to entries in the bibliogra-
phy; and finally it will sort the entries that have been cited (if the bibliography style
specifies that they should be sorted), and output the resulting details to a .bbl file.
Fifth: you run LaTeX again. It warns, again, that each citation is (still) undefined,
but when it gets to the \bibliography command, it finds a .bbl file, and reads it. As
it encounters each \bibitem command in the file, it notes a definition of the citation.
Sixth: you run LaTeX yet again. This time, it finds values for all the citations, in
its .aux file. Other things being equal, you’re done. . . until you change the file.
If, while editing, you change any of the citations, or add new ones, you need to
go through the process above from steps 3 (first run of LaTeX) to 6, again, before the
document is once again stable. These four mandatory runs of LaTeX make processing
a document with a bibliography even more tiresome than the normal two runs required
to resolve labels.
To summarise: processing to resolve citations requires: LaTeX; BibTeX; LaTeX;
LaTeX.
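With the example file above, the complete sequence is therefore:

latex myfile
bibtex myfile
latex myfile
latex myfile
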
147 Choosing a bibliography style
A large proportion of people are satisfied with one of Patashnik’s original “standard”
styles, plain, unsrt, abbrv and alpha. However, no style in that set supports the “author-date” citation style that is popular in many fields, though there are very many contributed styles available that do support the format.
(Note that author-date styles arose because the simple and clear citation style that
plain produces is so awkward in a traditional manuscript preparation scenario. How-
ever, TeX-based document production does away with all those difficulties, leaving us
free once again to use the simple option.)
Fortunately, help is at hand, on the Web, with this problem:

• a sample text, showing the sorts of style choices available, can be found on Ken
Turner’s web site;
• an excellent survey, that lists a huge variety of styles, sorted into their nominal
topics as well as providing a good range of examples, is the Reed College “Bibli-
ographies in LaTeX”.
Of course, these pages don’t cover everything; the problem the inquisitive user
faces, in fact, is to find what the various available styles actually do. This is best
achieved (if the links above don’t help) by using xampl.bib from the BibTeX documen-
tation distribution: one can get a pretty good feel for any style one has to hand using
this “standard” bibliography. For style my-style.bst, the simple LaTeX document:

\documentclass{article}
\begin{document}
\bibliographystyle{my-style}
\nocite{*}
\bibliography{xampl}
\end{document}

will produce a representative sample of the citations the style will produce. (Because
xampl.bib is so extreme in some of its “examples”, the BibTeX run will also give you
an interesting selection of BibTeX’s error messages. . . )
xampl.bib: biblio/bibtex/distribs/doc/xampl.bib

148 Separate bibliographies per chapter?


A separate bibliography for each ‘chapter’ of a document can be provided with the
package chapterbib (which comes with a bunch of other good bibliographic things).
The package allows you a different bibliography for each \included file (i.e., de-
spite the package’s name, the availability of bibliographies is related to the component
source files of the document rather than to the chapters that logically structure the doc-
ument).
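A hedged sketch of the chapterbib arrangement (the file and database names are illustrative; details are in the package documentation):

% master.tex
\documentclass{report}
\usepackage{chapterbib}
\begin{document}
\include{chap1}
\include{chap2}
\end{document}

% each included file (chap1.tex, chap2.tex) ends with its own
%   \bibliographystyle{plain}
%   \bibliography{mybooks}
% and BibTeX is then run once per included file:
%   bibtex chap1
%   bibtex chap2
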
The package bibunits ties bibliographies to logical units within the document: the
package will deal with chapters and sections (as defined by LaTeX itself) and also
defines a bibunit environment so that users can select their own structuring.
chapterbib.sty : macros/latex/contrib/cite
bibunits.sty : macros/latex/contrib/bibunits

149 Multiple bibliographies?


If you’re thinking of multiple bibliographies tied to some part of your document (such
as the chapters within the document), please see bibliographies per chapter.
For more than one bibliography, there are three options.
The multibbl package offers a very simple interface: you use a command \newbibliography
to define a bibliography “tag”. The package redefines the other bibliography commands
so that each time you use any one of them, you give it the tag for the bibliography
where you want the citations to appear. The \bibliography command itself also
takes a further extra argument that says what title to use for the resulting section or
chapter (i.e., it patches \refname and \bibname in a babel-safe way). So one might write:

\usepackage{multibbl}
\newbibliography{bk}
\bibliographystyle{bk}{alpha}
\newbibliography{art}
\bibliographystyle{art}{plain}
...
\cite[pp.~23--25]{bk}{milne:pooh-corner}
...
\cite{art}{einstein:1905}
...
\bibliography{bk}{book-bib}{References to books}
\bibliography{art}{art-bib}{References to articles}

(Note that the optional argument of \cite appears before the new tag argument, and
that the \bibliography commands may list more than one .bib file — indeed all
\bibliography commands may list the same set of files.)
The \bibliography data goes into files whose names are <tag-name>.aux, so you
will need to run
bibtex bk
bibtex art

after the first run of LaTeX, to get the citations in the correct place.
The multibib package allows you to define a series of “additional topics”, each of
which comes with its own series of bibliography commands. So one might write:
\usepackage{multibib}
\newcites{bk,art}%
{References from books,%
References from articles}
\bibliographystylebk{alpha}
\bibliographystyleart{plain}
...
\citebk[pp.~23--25]{milne:pooh-corner}
...
\citeart{einstein:1905}
...
\bibliographybk{book-bib}
\bibliographyart{art-bib}

Again, as for multibbl, any \bibliography... command may scan any list of .bib
files.
BibTeX processing with multibib is much like that with multibbl; with the above
example, one needs:
bibtex bk
bibtex art

Note that, unlike multibbl, multibib allows a simple, unmodified bibliography (as well
as the “topic” ones).
The bibtopic package allows you separately to cite several different bibliographies.
At the appropriate place in your document, you put a sequence of btSect environ-
ments (each of which specifies a bibliography database to scan) to typeset the separate
bibliographies. Thus, one might have a file diss.tex containing:
\usepackage{bibtopic}
\bibliographystyle{alpha}
...
\cite[pp.~23--25]{milne:pooh-corner}
...
\cite{einstein:1905}
...
\begin{btSect}{book-bib}
\section{References from books}
\btPrintCited
\end{btSect}
\begin{btSect}[plain]{art-bib}
\section{References from articles}
\btPrintCited
\end{btSect}

Note the different way of specifying a bibliography style: if you want a different style
for a particular bibliography, you may give it as an optional argument to the btSect
environment.
Processing with BibTeX, in this case, uses .aux files whose names are derived from
the name of the base document. So in this example you need to say:
bibtex diss1
bibtex diss2
There is also a command \btPrintNotCited, which gives the rest of the content of the database (if nothing has been cited from the database, this is equivalent to LaTeX’s standard \nocite{*}).
However, the real difference from multibbl and multibib is that selection of what appears in each bibliography section is determined in bibtopic by what’s in the .bib files.
An entirely different approach is taken by the splitbib package. You provide a
category environment, in the preamble of your document, for each category you want
a separate citation list for. In each environment, you list the \cite keys that you
want listed in each category. The \bibliography command (or, more precisely, the
thebibliography environment it uses) will sort the keys as requested. (Keys not men-
tioned in a category appear in a “misc” category created in the sorting process.) A
code example appears in the package documentation (a PDF file in the CTAN directory,
see the file list, below).
bibtopic.sty : macros/latex/contrib/bibtopic
multibbl.sty : macros/latex/contrib/multibbl
multibib.sty : macros/latex/contrib/multibib
splitbib.sty : macros/latex/contrib/splitbib

150 Putting bibliography entries in text


This is a common requirement for journals and other publications in the humanities.
Sometimes the requirement is for the entry to appear in the running text of the docu-
ment, while other styles require that the entry appear in a footnote.
Options for entries in running text are

• The package bibentry, which puts slight restrictions on the format of entry that
your .bst file generates, but is otherwise undemanding of the bibliography style.
• The package inlinebib, which requires that you use its inlinebib.bst. Inlinebib
was actually designed for footnote citations: its expected use is that you place a
citation inline as the argument of a \footnote command.
• The package jurabib, which was originally designed for German law documents,
and has comprehensive facilities for the manipulation of citations. The package
comes with four bibliography styles that you may use: jurabib.bst, jhuman.bst
and two Chicago-like ones.

Options for entries in footnotes are

• The package footbib, and


• Packages jurabib and inlinebib, again.

Note that jurabib does the job using LaTeX’s standard footnotes, whereas footbib cre-
ates its own sequence of footnotes. Therefore, in a document which has other footnotes,
it may be advisable to use jurabib (or of course inlinebib), to avoid confusion of foot-
notes and foot-citations.
bibentry.sty : Distributed with macros/latex/contrib/natbib
footbib.sty : macros/latex/contrib/footbib
inlinebib.sty : biblio/bibtex/contrib/inlinebib
jurabib.sty : macros/latex/contrib/jurabib

151 Sorting and compressing citations


If you give LaTeX \cite{fred,joe,harry,min}, its default commands could give
something like “[2,6,4,3]”; this looks awful. One can of course get the things in order
by rearranging the keys in the \cite command, but who wants to do that sort of thing
for no more improvement than “[2,3,4,6]”?
The cite package sorts the numbers and detects consecutive sequences, so creating
“[2–4,6]”. The natbib package, with the numbers and sort&compress options, will
do the same when working with its own numeric bibliography styles (plainnat.bst
and unsrtnat.bst).
If you might need to make hyperreferences to your citations, cite isn’t adequate. If
you add the hypernat package:

\usepackage[...]{hyperref}
\usepackage[numbers,sort&compress]{natbib}
\usepackage{hypernat}
...
\bibliographystyle{plainnat}
the natbib and hyperref packages will interwork.
cite.sty : macros/latex/contrib/cite
hypernat.sty : macros/latex/contrib/misc/hypernat.sty
hyperref.sty : macros/latex/contrib/hyperref
plainnat.bst: Distributed with macros/latex/contrib/natbib
unsrtnat.bst: Distributed with macros/latex/contrib/natbib

152 Multiple citations


A convention sometimes used in physics journals is to “collapse” a group of related
citations into a single entry in the bibliography. BibTeX, by default, can’t cope with
this arrangement, but the mcite package deals with the problem.
The package overloads the \cite command to recognise a “*” at the start of a key,
so that citations of the form

\cite{paper1,*paper2}

appear in the document as a single citation, and appear arranged appropriately in the
bibliography itself. You’re not limited to collapsing just two references. You can mix
“collapsed” references with “ordinary” ones, as in

\cite{paper0,paper1,*paper2,paper3}

which will appear in the document as 3 citations “[4,7,11]” (say) — citation ‘4’ will
refer to paper 0, ‘7’ will refer to a combined entry for paper 1 and paper 2, and ‘11’
will refer to paper 3.
You need to make a small change to the bibliography style (.bst) file you use; the
mcite package documentation tells you how to do that.
Unfortunately, the revtex system doesn’t play with mcite. As a result, for that (pri-
marily physics-targeted system) you need to play silly games like:

\cite{paper0,paper1,paper3}
\nocite{paper2}

and then edit the .bbl file to merge the two citations, to achieve the effects of mcite.
mcite.sty : macros/latex/contrib/mcite
revtex bundle: macros/latex/contrib/revtex

153 References from the bibliography to the citation


A link (or at least a page reference), from the bibliography to the citing command, is
often useful in large documents.
Two packages support this requirement, backref and citeref . Backref is part of
the hyperref bundle, and supports hyperlinks back to the citing command. Citeref
is the older, and seems to rely on rather simpler (and therefore possibly more stable)
code. Neither collapses lists of pages (“5, 6, 7” comes out as such, rather than as
“5-7”), but neither package repeats the reference to a page that holds multiple citations.
(The failure to collapse lists is of course forgiveable in the case of the hyperref -related
backref , since the concept of multiple hyperlinks from the same anchor is less than
appealing.)
backref.sty : Distributed with macros/latex/contrib/hyperref
citeref.sty : macros/latex/contrib/citeref

154 Sorting lists of citations
BibTeX has a sorting function, and most BibTeX styles sort the citation list they pro-
duce; most people find this desirable.
However, it is perfectly possible to write a thebibliography environment that
looks as if it came from BibTeX, and many people do so (in order to save time in the
short term).
The problem arises when thebibliography-writers decide their citations need to
be sorted. A common misapprehension is to insert \bibliographystyle{alpha}
(or similar) and expect the typeset output to be sorted in some magical way. BibTeX
doesn’t work that way! — if you write thebibliography, you get to sort its contents.
BibTeX will only sort the contents of a thebibliography environment when it creates
it, to be inserted from a .bbl file by a \bibliography command.
155 Reducing spacing in the bibliography
Bibliographies are, in fact, implemented as lists, so all the confusion about reducing
list item spacing also applies to bibliographies.
If the natbib package ‘works’ for you (it may not if you are using some special-
purpose bibliography style), the solution is relatively simple — add
\usepackage{natbib}
\setlength{\bibsep}{0.0pt}

to the preamble of your document.


Otherwise, one is into unseemly hacking of something or other. The mdwlist pack-
age actually does the job, but it doesn’t work here, because it makes a different-named
list, while the name “thebibliography” is built into LaTeX and BibTeX. Therefore,
we need to patch the underlying macro:
\let\oldbibliography\thebibliography
\renewcommand{\thebibliography}[1]{%
\oldbibliography{#1}%
\setlength{\itemsep}{0pt}%
}

The savetrees package performs such a patch, among a plethora of space-saving mea-
sures: you can, in principle, suppress all its other actions, and have it provide you a
compressed bibliography only.
mdwlist.sty : Distributed as part of macros/latex/contrib/mdwtools
natbib.sty : macros/latex/contrib/natbib
savetrees.sty : macros/latex/contrib/savetrees

156 Table of contents rearranges “unsrt” ordering


If you’re using the unsrt bibliography style, you’re expecting that your bibliography
will not be sorted, but that the entries will appear in the order that they first appeared
in your document.
However, if you’re unfortunate enough to need a citation in a section title, and you
also have a table of contents, the citations that now appear in the table of contents will
upset the “natural” ordering produced by the unsrt style. Similarly, if you have citations
in captions, and have a list of figures (or tables).
There’s a pretty simple “manual” method for dealing with the problem — when
you have the document stable:
1. Delete the .aux file, and any of .toc, .lof, .lot files.
2. Run LaTeX.
3. Run BibTeX for the last time.
4. Run LaTeX often enough that things are stable again.
This is indeed simple, but it’s going to get tedious if you find errors in your “stable” version often enough.
The package notoccite avoids the kerfuffle, and suppresses citations while in the
table of contents, or lists of figures, tables (or other floating things: the code is quite
general).
notoccite.sty : macros/latex/contrib/misc/notoccite.sty
157 Non-english bibliographies
Like so much of early (La)TeX software, BibTeX’s assumptions were firmly rooted in
what its author knew well, viz., academic papers in English (particularly those with a
mathematical bent). BibTeX’s standard styles all address exactly that problem, leaving
the user who writes in another language (or who deals with citations in the style of other
disciplines than maths) to strike out into contributed software.
For the user whose language is not English, there are several alternatives. The
simplest is to provide translations of BibTeX styles into the required language: the
solitary finplain.bst does that for Finnish; others one can find are the Danish (dk-bib), French (bib-fr), German (bibgerm), Norwegian (norbib) and Swedish (swebib) bundles (of which the bib-fr set is the most extensive). The spain style implements a traditional
Spanish citation style.
These static approaches solve the problem, for the languages that have been covered
by them. Unfortunately, with such an approach, a lot of work is needed for every
language involved. Two routes to a solution of the “general” problem are available —
that offered by babelbib, and the custom-bib mechanism for generating styles.
Babelbib (which is a development of the ideas of the bibgerm package) co-operates
with babel to control the language of presentation of citations (potentially at the level
of individual items). The package has a built-in set of languages it ‘knows about’, but
the documentation includes instructions on defining commands for other languages.
Babelbib comes with its own set of bibliography styles, which could be a restriction if
there wasn’t also a link from custom-bib.
The makebst menu of custom-bib allows you to choose a language for the BibTeX
style you’re generating (there are 14 languages to choose from; it looks as if spain.bst, mentioned above, was generated this way). If, however, you opt not to specify a language, you are asked whether you want the style to interact with babelbib; if you do so, you’re getting the best of both worlds — formatting freedom from custom-bib and linguistic freedom via the extensibility of babelbib.
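A hedged sketch of the babelbib route (babplain is one of the bibliography styles distributed with babelbib; its documentation describes the full range of options):

\usepackage[german,english]{babel}
\usepackage{babelbib}
...
\bibliographystyle{babplain}
\bibliography{mybooks}
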
babelbib.sty : biblio/bibtex/contrib/babelbib
bib-fr bundle: biblio/bibtex/contrib/bib-fr
bibgerm bundle: biblio/bibtex/contrib/germbib
custom-bib bundle: macros/latex/contrib/custom-bib
finplain.bst: biblio/bibtex/contrib/misc/finplain.bst
norbib bundle: biblio/bibtex/contrib/norbib
spain: biblio/bibtex/contrib/spain
swebib bundle: biblio/bibtex/contrib/swebib

158 Format of numbers in the bibliography


By default, LaTeX makes entries in the bibliography look like:
[1] Doe, Joe et al. Some journal. 2004.
[2] Doe, Jane et al. Some journal. 2003.

while many documents need something like:

1. Doe, Joe et al. Some journal. 2004.


2. Doe, Jane et al. Some journal. 2003.
This sort of change may be achieved by many of the “general” citation packages;
for example, in natbib, it’s as simple as:

\renewcommand{\bibnumfmt}[1]{#1.}

but if you’re not using such a package, the following internal LaTeX commands, in the
preamble of your document, will do the job:

\makeatletter
\renewcommand*{\@biblabel}[1]{\hfill#1.}
\makeatother

natbib.sty : macros/latex/contrib/natbib

O.3 Manipulating whole bibliographies
159 Listing all your BibTeX entries
LaTeX and BibTeX co-operate to offer special treatment of this requirement. The
command \nocite{*} is specially treated, and causes BibTeX to generate bibliogra-
phy entries for every entry in each .bib file listed in your \bibliography statement,
so that after a LaTeX–BibTeX–LaTeX sequence, you have a document with the whole
thing listed.
Note that LaTeX doesn’t produce “Citation ... undefined” or “There were
undefined references” warnings in respect of \nocite{*}. This isn’t a problem
if you’re running LaTeX “by hand” (you know exactly how many times you have to
run things), but the lack might confuse automatic processors that scan the log file to
determine whether another run is necessary.
160 Making HTML of your Bibliography
A neat solution is offered by the noTeX bibliography style. This style produces a .bbl
file which is in fact a series of HTML ‘P’ elements of class noTeX, and which may
therefore be included in an HTML file. Provision is made for customising your bibli-
ography so that its content when processed by noTeX is different from that presented
when it is processed in the ordinary way by (La)TeX.
A thorough solution is offered by bib2xhtml; using it, you make use of one of its
modified versions of many common BibTeX styles, and post-process the output so
produced using a perl script.
A more conventional translator is the awk script bbl2html, which translates the
.bbl file you’ve generated: a sample of the script’s output may be viewed on the web,
at http://rikblok.cjb.net/lib/refs.html
bbl2html.awk : biblio/bibtex/utils/misc/bbl2html.awk
bib2xhtml: biblio/bibtex/utils/bib2xhtml
noTeX.bst: biblio/bibtex/utils/misc/noTeX.bst

P Adjusting the typesetting


P.1 Alternative document classes
161 Replacing the standard classes
People are forever concocting classes that replace the standard ones: in the 1980s the present author produced a ukart class, which used the sober package and a few British-specific things (such as appear in the babel package’s British-english specialisation); it is still occasionally used.
Similar public efforts were available well back in the days of LaTeX 2.09: a notable
example, whose pleasing designs seem not to have changed much over all that time, is
the ntgclass bundle. Each of the standard classes is replaced by a selection of classes,
named in Dutch, sometimes with a single numeric digit attached. So we have classes
artikel2, rapport1, boek3 and brief . These classes are moderately well documented in
English.
The KOMA-script classes (their names all start scr...) are a strong current contender.
They are actively supported, are comprehensive in their coverage of significant type-
setting issues, produce good-looking output and are well documented in both English
and German (scrguien in the distribution for English, scrguide for German).
The other comparable class is memoir. This aims to replace book and report classes
directly, and (like KOMA-script) is comprehensive in its coverage of small issues.
Memoir’s documentation (memman) is very highly spoken of, and its lengthy intro-
ductory section is regularly recommended as a tutorial on typesetting.
KOMA-script bundle: macros/latex/contrib/koma-script
memoir.cls: macros/latex/contrib/memoir
NTGclass bundle: macros/latex/contrib/ntgclass
sober.sty : macros/latex209/contrib/misc/sober.sty

162 Producing slides
Lamport’s original LaTeX had a separate program (SliTeX) for producing slides; it
dates from the age when colour effects were produced by printing separate slides in
different-coloured inks, and overlaying them, and was just about acceptable back then.
When LaTeX 2ε came along, the reason SliTeX had to be a separate program went
away, and its functionality was supplied by the slides class. While this makes life a
little easier for system administrators, it does nothing for the inferior functionality of
the class: no-one “who knows” uses slides nowadays.
The ‘classic’ alternatives have been seminar and foils (originally known as Foil-
TeX). Both were originally designed to produce output on acetate foils, though subse-
quent work has provided environments in which they can be used with screen projectors
(see below).
The advent of Microsoft PowerPoint (feeble though early versions of it were) has
created a demand for “dynamic” slides — images that develop their content in a more
elaborate fashion than by merely replacing one foil with the next in the way that was
the norm when slides, foils and seminar were designed.
The prosper class builds on seminar to provide dynamic effects and the like; it
retains the ability to provide PDF for a projected presentation, or to print foils for a
foil-based presentation. The add-on package ppr-prv adds “preview” facilities (what
is commonly called “hand-out printing”). The HA-prosper package, which you
load with prosper, mends a few bugs, and adds several facilities and slide design styles.
The (relatively new) powerdot class is designed as a replacement for prosper and HA-
prosper, co-authored by the author of HA-prosper.
Beamer is a relatively easy-to-learn, yet powerful, class that (as its name implies)
was designed for use with projection displays. It needs the pgf package (for graphics
support), which in turn requires xcolor; while this adds to the tedium of installing
beamer “from scratch”, both are good additions to a modern LaTeX installation.
Beamer has reasonable facilities for producing printed copies of slides.
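By way of illustration, a minimal beamer document might look like the following
(a sketch only; the overlay specifications <1-> and <2-> produce the step-by-step
build-up of content mentioned above):

\documentclass{beamer}
\begin{document}
\begin{frame}{A first slide}
  \begin{itemize}
    \item<1-> a point visible from the first step
    \item<2-> a point revealed at the second step
  \end{itemize}
\end{frame}
\end{document}
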
Talk is another highly functional, yet easy-to-learn class which claims to differ from
the systems mentioned above, such as beamer, in that it doesn’t impose a slide style
on you. You get to specify a bunch of slide styles, and you can switch from one to
the other between slides, as you need. (The class itself provides just the one style, in
the package greybars: the author hopes users will contribute their own styles, based on
greybars.)
Ppower4 (commonly known as pp4) is a Java-based support program that will
postprocess PDF, to ‘animate’ the file at places you’ve marked with commands from
one of the pp4 packages. The commands don’t work on PDF that has come from
dvips output; they work with PDF generated by PDFLaTeX, VTeX LaTeX, or dvipdfm
running on LaTeX output.
Pdfscreen and texpower are add-on packages that permit dynamic effects in
documents formatted in “more modest” classes; pdfscreen will even allow you to plug
“presentation effects” into an article-class document.
A more detailed examination of the alternatives (including examples of code using
many of them) may be found at Michael Wiedmann’s fine http://www.miwie.org/
presentations/presentations.html
beamer.cls: Download all of macros/latex/contrib/beamer
foils.cls: nonfree/macros/latex/contrib/foiltex
greybars.sty : distributed with macros/latex/contrib/talk
HA-prosper.sty : macros/latex/contrib/ha-prosper
seminar.cls: macros/latex/contrib/seminar
pgf.sty : graphics/pgf
powerdot.cls: macros/latex/contrib/powerdot
pp4: support/ppower4
ppr-prv.sty : macros/latex/contrib/ppr-prv
prosper.cls: macros/latex/contrib/prosper
talk.cls: macros/latex/contrib/talk
texpower : macros/latex/contrib/texpower

xcolor.sty : macros/latex/contrib/xcolor

163 Creating posters with LaTeX


There is no complete “canned solution” to creating a poster (as, for example, classes
like seminar, powerdot and beamer serve for creating presentations in a variety of
styles).
The nearest approach to the complete solution is the sciposter class, which provides
the means to produce really rather good posters according to the author’s required style.
A complete worked example is provided with the distribution.
Otherwise, there is a range of tools, most of which are based on the a0poster class,
which sets up an appropriately-sized piece of paper, sets font sizes appropriately, and
leaves you to your own devices.
Having used a0poster, you can of course slog it out, and write all your poster as
an unadorned LaTeX document (presumably in multiple columns, using the multicol
package), but it’s not really necessary: the (straightforward) textpos package provides
a simple way of positioning chunks of text, or tables or figures, on the poster page.
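For example, a fragment along the following lines (a sketch only: the sizes and
coordinates are merely illustrative) places a block of text at a fixed position on an
A0 page:

\documentclass[a0,portrait]{a0poster}
\usepackage[absolute]{textpos}
\setlength{\TPHorizModule}{1cm} % textblock positions measured in cm
\setlength{\TPVertModule}{1cm}
\begin{document}
\begin{textblock}{30}(5,10) % a block 30 modules wide, at (5,10)
  \Huge Some poster text
\end{textblock}
\end{document}
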
More sophisticated is the flowfram package, whose basic aim in life is flowing text
from one box on the page to the next. One of the package’s design aims seems to
have been the production of posters, and a worked example is provided. The author of
flowfram has an experimental tool called JpgfDraw, which allows you to construct the
outline of frames for use with flowfram.
Despite the relative shortage of tools, there are a fair few web pages that explain
the process (mostly in terms of the a0poster route):

• from Norman Gray, http://purl.org/nxg/note/posters;


• from “awf ” and “capes” http://www.robots.ox.ac.uk/~awf/latex-posters/;
• from Brian Wolven, http://fuse.pha.jhu.edu/~wolven/posters.html (this
page also provides macros and other support suggestions); and
• from “pjh”, http://www.phys.ufl.edu/~pjh/posters/poster_howto_UF.
html, which covers the specific issue of dealing with University of Florida styled
poster, but has hints which are generally useful.
a0poster.cls: macros/latex/contrib/a0poster
flowfram.sty : macros/latex/contrib/flowfram
multicol.sty : Distributed as part of macros/latex/required/tools
sciposter.cls: macros/latex/contrib/sciposter
textpos.sty : macros/latex/contrib/textpos

164 Formatting a thesis in LaTeX


Thesis styles are usually very specific to your University, so it’s usually not profitable
to ask around for a package outside your own University. Since many Universities
(in their eccentric way) still require double-spacing, you may care to refer to the rel-
evant question. If you want to write your own, a good place to start is the University
of California style, but it’s not worth going to a lot of trouble. (If officials won’t al-
low standard typographic conventions, you won’t be able to produce an æsthetically
pleasing document anyway!)
UC thesis style: macros/latex/contrib/ucthesis

165 Setting papers for journals


Publishers of journals have a wide range of requirements for the presentation of papers,
and while many publishers do accept electronic submissions in (La)TeX, they don’t
often submit recommended macros to public archives.
Nevertheless, there are considerable numbers of macros of one sort or another avail-
able on CTAN; searching for your journal name in the CTAN catalogue (see searching
CTAN) may well turn up what you’re seeking.
Failing that, you may be well advised to contact the prospective publisher of your
paper; many publishers have macros on their own web sites, or otherwise available
only upon application.
Check that the publisher is offering you macros suitable to an environment you
can use: a few still have no macros for current LaTeX, for example, claiming that
LaTeX 2.09 is good enough. . .
Some publishers rekey anything sent them anyway, so that it doesn’t really matter
what macros you use. Others merely encourage you to use as few extensions of a
standard package as possible, so that they will find it easy to transform your paper to
their own internal form.
166 A ‘report’ from lots of ‘article’s
This is a requirement, for example, if one is preparing the proceedings of a conference
whose papers were submitted in LaTeX.
The nearest things to canned solutions are Peter Wilson’s combine and Federico
Garcia’s subfiles classes.
Combine defines the means to ‘\import’ entire documents, and provides means of
specifying significant features of the layout of the document, as well as a global table
of contents, and so on. An auxiliary package, combinet, allows use of the \titles and
\authors (etc.) of the \imported documents to appear in the global table of contents.
The subfiles class is used in the component files of a multi-file project, and the
corresponding subfiles package is used in the master file; arrangements may be made
so that the component files will be typeset using different page format, etc., parameters
than those used when they are typeset as a part of the main file.
A more ‘raw’ toolkit is offered by Matt Swift’s includex and newclude packages,
both part of the frankenstein bundle. Note that Matt believes includex is obsolete
(though it continues to work for this author); furthermore, its replacement, newclude
remains “in development”, as it has been since 1999.
Both includex and newclude enable you to ‘\includedoc’ complete articles (in the
way that you ‘\include’ chapter files in an ordinary report). The preamble (everything
up to \begin{document}), and everything after \end{document}, is ignored by both
packages. Thus the packages don’t “do the whole job” for you, though: you need to
analyse the package use of the individual papers, and ensure that a consistent set is
loaded in the preamble of the main report. (Both packages require moredefs, which is
also part of the bundle.)
A completely different approach is to use the pdfpages package, and to include
articles submitted in PDF format into a PDF document produced by PDFLaTeX. The
package defines an \includepdf command, which takes arguments similar to those
of the \includegraphics command. With keywords in the optional argument of the
command, you can specify which pages you want to be included from the file named,
and various details of the layout of the included pages.
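A sketch of the approach (the file names are, of course, hypothetical) might be:

\documentclass{article}
\usepackage{pdfpages}
\begin{document}
\includepdf[pages=-]{paper1.pdf} % all pages of the first paper
\includepdf[pages=1-4]{paper2.pdf} % a page range from the second
\end{document}
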
combine.cls: macros/latex/contrib/combine
combinet.sty : macros/latex/contrib/combine
includex.sty : Distributed in the “unsupported” part of macros/latex/contrib/
frankenstein
moredefs.sty : Distributed as part of macros/latex/contrib/frankenstein
newclude.sty : Distributed as part of macros/latex/contrib/frankenstein
pdfpages.sty : macros/latex/contrib/pdfpages
subfiles.cls, etc.: macros/latex/contrib/subfiles

167 Curriculum Vitae (Résumé)


Andrej Brodnik’s class, vita, offers a framework for producing a curriculum vitae. The
class may be customised both for subject (example class option files support both com-
puter scientists and singers), and for language (both the options provided are available
for both English and Slovene). Extensions may be written by creating new class option
files, or by using macros defined in the class to define new entry types, etc.
Didier Verna’s class, curve, is based on a model in which the CV is made of a set
of rubrics (each one dealing with a major item that you want to discuss, such as ‘ed-
ucation’, ‘work experience’, etc). The class’s documentation is supported by a couple
of example files, and an emacs mode is provided.
Xavier Danaux offers a class moderncv which supports typesetting modern curric-
ula vitarum, both in a classic and in a casual style. It is fairly customisable, allowing
you to define your own style by changing the colours, the fonts, etc.
The European Commission has recommended a format for curricula vitarum
within Europe, and Nicola Vitacolonna has developed a class europecv to produce
it. While (by his own admission) the class doesn’t solve all problems, it seems
well-thought out and supports all current official EU languages (together with a few
non-official languages, such as Catalan, Galician and Serbian).
The alternative to using a separate class is to impose a package on one of the stan-
dard classes. An example, Axel Reichert’s currvita package, has been recommended
to the FAQ team. Its output certainly looks good.
There is also a LaTeX 2.09 package resume, which comes with little but advice
against trying to use it.
currvita.sty : macros/latex/contrib/currvita
curve.cls: macros/latex/contrib/curve
europecv.cls: macros/latex/contrib/europecv
moderncv.cls: macros/latex/contrib/moderncv
resume.sty : obsolete/macros/latex209/contrib/resume/resume.sty
vita.cls: macros/latex/contrib/vita

168 Letters and the like


LaTeX itself provides a letter document class, which is widely disliked; the present
author long since gave up trying with it. If you nevertheless want to try it, but are
irritated by its way of vertically-shifting a single-page letter, try the following hack:
\makeatletter
\let\@texttop\relax
\makeatother
in the preamble of your file.
Doing-it-yourself is a common strategy; Knuth (for use with plain TeX, in the TeX-
book), and Kopka and Daly (in their Guide to LaTeX) offer worked examples.
Nevertheless, there are contributed alternatives — in fact there are an awfully large
number of them: the following list, of necessity, makes but a small selection.
The largest, most comprehensive, class is newlfm; the lfm part of the name implies
that the class can create letters, faxes and memoranda. The documentation is volumi-
nous, and the package seems very flexible.
Axel Kielhorn’s akletter class is the only other one recommended for inclusion in
this FAQ whose documentation is available in English.
The dinbrief class, while recommended, is only documented in German.
There are letter classes in each of the excellent KOMA-script (scrlttr2: documenta-
tion is available in English) and ntgclass (brief : documentation in Dutch only) bundles.
While these are probably good (since the bundles themselves inspire trust) they’ve not
been specifically recommended by any users.
akletter.cls: macros/latex/contrib/akletter
brief.cls: Distributed as part of macros/latex/contrib/ntgclass
dinbrief.cls: macros/latex/contrib/dinbrief
newlfm.cls: macros/latex/contrib/newlfm
scrlttr2.cls: Distributed as part of macros/latex/contrib/koma-script

169 Other “document font” sizes?


The LaTeX standard classes have a concept of a (base) “document font” size; this size
is the basis on which other font sizes (those from \tiny to \Huge) are determined. The
classes are designed on the assumption that they won’t be used with sizes other than the
set that LaTeX offers by default (10–12pt), but people regularly find they need other
sizes. The proper response to such a requirement is to produce a new design for the
document, but many people don’t fancy doing that.
A simple solution is to use the extsizes bundle. This bundle offers “extended”
versions of the article, report, book and letter classes, at sizes of 8, 9, 14, 17 and 20pt
as well as the standard 10–12pt. Since little has been done to these classes other than
to adjust font sizes and things directly related to them, they may not be optimal — but
they’re certainly practical.
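So, for example (assuming the bundle is installed), a 14pt article would start:

\documentclass[14pt]{extarticle}

and a 9pt report would use extreport in the same way.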
More satisfactory are the KOMA-script classes, which are designed to work prop-
erly with the class option files that come with extsizes, and the memoir class that has
its own options for document font sizes 9pt, 14pt and 17pt.
extsizes bundle: macros/latex/contrib/extsizes
KOMA script bundle: macros/latex/contrib/koma-script
memoir.cls: macros/latex/contrib/memoir

P.2 Document structure


170 The style of document titles
The titling package provides a number of facilities that permit manipulation of the ap-
pearance of a \maketitle command, the \thanks commands within it, and so on.
The package also defines a titlingpage environment, that offers something in be-
tween the standard classes’ titlepage option and the titlepage environment, and is
itself somewhat configurable.
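For example, a sketch (the font choices are merely illustrative) that centres the title
in a larger bold font might read:

\usepackage{titling}
\pretitle{\begin{center}\LARGE\bfseries}
\posttitle{\par\end{center}\vskip 0.5em}
\preauthor{\begin{center}\large}
\postauthor{\par\end{center}}
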
The memoir class includes all the functionality of the titling package, while the
KOMA-script classes have their own range of different titling styles.
Finally, the indefatigable Vincent Zoonekynd supplies examples of how to program
alternative title styles. The web page is not useful to users unless they are willing to do
their own LaTeX programming.
KOMA script bundle: macros/latex/contrib/koma-script
memoir.cls: macros/latex/contrib/memoir
titling.sty : macros/latex/contrib/titling

171 The style of section headings


Suppose that the editor of your favourite journal has specified that section headings
must be centred, in small capitals, and subsection headings ragged right in italic, but
that you don’t want to get involved in the sort of programming described in section 2.2
of The LaTeX Companion (see TeX-related books; the programming itself is discussed
elsewhere in this FAQ). The following hack will probably satisfy your editor. Define
yourself new commands

\newcommand{\ssection}[1]{%
\section[#1]{\centering\normalfont\scshape #1}}
\newcommand{\ssubsection}[1]{%
\subsection[#1]{\raggedright\normalfont\itshape #1}}

and then use \ssection and \ssubsection in place of \section and \subsection.
This isn’t perfect: section numbers remain in bold, and starred forms need a separate
redefinition.
The package sectsty provides an easy-to-use set of tools to do this job, while the
package titlesec has a structured approach based on redefinition of the sectioning and
chapter commands themselves. Titlesec’s approach allows it to offer far more radical
adjustment: its options provide (in effect) a toolbox for designing your own sectioning
commands’ output.
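For example, the editor’s requirements above might (in outline, and assuming the
defaults suit you otherwise) be met with sectsty by:

\usepackage{sectsty}
\sectionfont{\centering\normalfont\scshape}
\subsectionfont{\raggedright\normalfont\itshape}
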
The fncychap package provides a nice collection of customised chapter heading
designs. The anonchap package provides a simple means of typesetting chapter head-
ings “like section headings” (i.e., without the “Chapter” part of the heading); the
tocbibind package provides the same commands, in pursuit of another end. Unfor-
tunately, fncychap is not attuned to the existence of front- and backmatter in book class
documents.
The memoir class includes facilities that match sectsty and titlesec, as well as a bun-
dle of chapter heading styles (including an anonchap-equivalent). The KOMA-script
classes also have sets of tools that provide equivalent functionality, notably format-
ting specifications \partformat, \chapterformat, \sectionformat, . . . , as well as
several useful overall formatting specifications defined in class options.
Finally, the indefatigable Vincent Zoonekynd supplies examples of how to program
alternative chapter heading styles and section heading styles. The web pages are not
useful to users unless they are willing to do their own LaTeX programming.
The documentation of fncychap is distributed as a separate PostScript file.
anonchap.sty : macros/latex/contrib/misc/anonchap.sty
fncychap.sty : macros/latex/contrib/fncychap
KOMA script bundle: macros/latex/contrib/koma-script

memoir.cls: macros/latex/contrib/memoir
sectsty.sty : macros/latex/contrib/sectsty
titlesec.sty : macros/latex/contrib/titlesec
tocbibind.sty : macros/latex/contrib/tocbibind

172 Appendixes
LaTeX provides an exceedingly simple mechanism for appendixes: the command
\appendix switches the document from generating sections (in article class) or chap-
ters (in report or book classes) to producing appendixes. Section or chapter numbering
is restarted and the representation of the counter switches to alphabetic. So:
\section{My inspiration}
...

\section{Developing the inspiration}
...

\appendix
\section{How I became inspired}
...

would be typeset (in an article document) something like:


1 My inspiration
...
2 Developing the inspiration
...
A How I became inspired
...
which is quite enough for many ordinary purposes. Note that, once you’ve switched to
typesetting appendixes, LaTeX provides you with no way back — once you’ve had an
appendix, you can no longer have an “ordinary” \section or \chapter.
The appendix provides several ways of elaborating on this simple setup. Straight-
forward use of the package allows you to have a separate heading, both in the body of
the document and the table of contents; this would be achieved by
\usepackage{appendix}
...
\appendix
\appendixpage
\addappheadtotoc

The \appendixpage command adds a separate title “Appendices” above the first ap-
pendix, and \addappheadtotoc adds a similar title to the table of contents. These
simple modifications cover many people’s needs about appendixes.
The package also provides an appendices environment, which provides for fancier
use. The environment is best controlled by package options; the above example would
be achieved by
\usepackage[toc,page]{appendix}
...
\begin{appendices}
...
\end{appendices}

The great thing that the appendices environment gives you, is that once the environ-
ment ends, you can carry on with sections or chapters as before — numbering isn’t
affected by the intervening appendixes.
The package provides another alternative way of setting appendixes, as inferior di-
visions in the document. The subappendices environment allows you to put separate
appendixes for a particular section, coded as \subsections, or for a particular chapter,
coded as \sections. So one might write:
\usepackage{appendix}
...
\section{My inspiration}
...
\begin{subappendices}
\subsection{How I became inspired}
...
\end{subappendices}

\section{Developing the inspiration}
...

Which will produce output something like:

1 My inspiration
...
1.A How I became inspired
...
2 Developing the inspiration
...

There are many other merry things one may do with the package; the user is referred
to the package documentation for further details.
The memoir class includes the facilities of the appendix package. The KOMA-
script classes offer a \appendixprefix command for manipulating the appearance of
appendixes.
appendix.sty : macros/latex/contrib/appendix
KOMA script bundle: macros/latex/contrib/koma-script
memoir.cls: macros/latex/contrib/memoir

173 Indent after section headings


LaTeX implements a style that doesn’t indent the first paragraph after a section heading.
There are coherent reasons for this, but not everyone likes it. The indentfirst package
suppresses the mechanism, so that the first paragraph is indented.
indentfirst.sty : Distributed as part of macros/latex/required/tools

174 How to create a \subsubsubsection


LaTeX’s set of “sections” stops at the level of \subsubsection. This reflects a design
decision by Lamport — for, after all, who can reasonably want a section with such
huge strings of numbers in front of it?
In fact, LaTeX standard classes do define “sectioning” levels lower than \subsubsection,
but they don’t format them like sections (they’re not numbered, and the text is run-in
after the heading). These deeply inferior section commands are \paragraph and
\subparagraph; you can (if you must) arrange that these two commands produce
numbered headings, so that you can use them as \subsubsubsections and lower.
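The numbering part of the job is just a matter of counters; a sketch (the headings
will remain run-in unless you also reformat them, for example with the titlesec
package mentioned below) is:

\setcounter{secnumdepth}{4} % number down to \paragraph level
\setcounter{tocdepth}{4}    % and list them in the table of contents
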
The titlesec package allows you to adjust the definitions of the sectioning macros, and it
may be used to transform a \paragraph’s typesetting so that it looks like that of a
\section.
If you want to program the change yourself, you’ll find that the commands
(\section all the way down to \subparagraph) are defined in terms of the inter-
nal \@startsection command, which takes 6 arguments. Before attempting this sort
of work, you are well advised to read the LaTeX sources (ltsect.dtx in the LaTeX
distribution) and the source of the standard packages (classes.dtx). The LaTeX
Companion discusses use of \@startsection for this sort of thing.
LaTeX source: macros/latex/base
titlesec.sty : macros/latex/contrib/titlesec

175 The style of captions
Changes to the style of captions may be made by redefining the commands that produce
the caption. So, for example, \fnum@figure (which produces the float number for
figure floats) may be redefined, in a package of your own, or between \makeatletter–
\makeatother:

\renewcommand{\fnum@figure}{\textbf{Fig.~\thefigure}}

which will cause the number to be typeset in bold face. (Note that the original
definition used \figurename.) More elaborate changes can be made
by patching the \caption command, but since there are packages to do the job, such
changes (which can get rather tricky) aren’t recommended for ordinary users.
The float package provides some control of the appearance of captions, though it’s
principally designed for the creation of non-standard floats. The caption and ccaption
(note the double “c”) packages provide a range of different formatting options.
ccaption also provides ‘continuation’ captions and captions that can be placed out-
side of float environments. The (very simple) capt-of package also allows captions
outside a float environment. Note that care is needed when doing things that assume
the sequence of floats (as in continuation captions), or potentially mix non-floating
captions with floating ones.
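For instance, a call such as the following (a sketch; the options shown are merely
examples of what the caption package offers) produces bold labels, a full stop after
the label, and a smaller caption font:

\usepackage[labelfont=bf,labelsep=period,font=small]{caption}
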
The memoir class includes the facilities of the ccaption package; the KOMA-script
classes also provide a wide range of caption-formatting commands.
The documentation of caption is available by processing a file manual.tex, which
is created when you unpack caption.dtx.
Note that the previously-recommended package caption2 has now been overtaken
again by caption; however, caption2 remains available for use in older documents.
caption.sty : macros/latex/contrib/caption
capt-of.sty : macros/latex/contrib/misc/capt-of.sty
ccaption.sty : macros/latex/contrib/ccaption
float.sty : macros/latex/contrib/float
KOMA script bundle: macros/latex/contrib/koma-script
memoir.cls: macros/latex/contrib/memoir

176 Alternative head- and footlines in LaTeX


The standard LaTeX document classes define a small set of ‘page styles’ which specify
head- and footlines for your document (though they can be used for other purposes,
too). The standard set is very limited, but LaTeX is capable of much more. The internal
LaTeX coding needed to change page styles is not particularly challenging, but there’s
no need — there are packages that provide useful abstractions that match the way we
typically think about these things.
The fancyhdr package provides simple mechanisms for defining pretty much every
head- or footline variation you could want; the directory also contains some documen-
tation and one or two smaller packages. Fancyhdr also deals with the tedious behaviour
of the standard styles with initial pages, by enabling you to define different page styles
for initial and for body pages.
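A typical use (a sketch only) clears the default fields and then places the current
section title and the page number in the head:

\usepackage{fancyhdr}
\pagestyle{fancy}
\fancyhf{} % clear all header and footer fields
\fancyhead[L]{\nouppercase{\leftmark}}
\fancyhead[R]{\thepage}
\renewcommand{\headrulewidth}{0.4pt}
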
While fancyhdr will work with KOMA-script classes, an alternative package,
scrpage2, eases integration with the classes. Scrpage2 may also be used as a fancyhdr
replacement, providing similar facilities. The KOMA-script classes themselves permit
some modest redefinition of head- and footlines, without the use of the extra package.
Memoir also contains the functionality of fancyhdr, and has several predefined
styles.
Documentation of fancyhdr is distributed with the package, in a separate file; doc-
umentation of scrpage2 is integrated with the scrgui* documentation files that are
distributed with the KOMA-script classes.
fancyhdr.sty : macros/latex/contrib/fancyhdr
KOMA script bundle: macros/latex/contrib/koma-script
memoir.cls: macros/latex/contrib/memoir

177 Wide figures in two-column documents
Floating figures and tables ordinarily come out the same width as the page, but in two-
column documents they’re restricted to the width of the column. This is sometimes
not good enough; so there are alternative versions of the float environments — in two-
column documents, figure* provides a floating page-wide figure (and table* a page-
wide table) which will do the necessary.
The “*”ed float environments can only appear at the top of a page, or on a whole
page — h or b float placement directives are simply ignored.
Unfortunately, page-wide equations can only be accommodated inside float en-
vironments. You should include them in figure environments, or use the float or
ccaption package to define a new float type.
ccaption.sty : macros/latex/contrib/ccaption
float.sty : macros/latex/contrib/float

178 1-column abstract in 2-column document


One often requires that the abstract of a paper should appear across the entire page,
even in a two-column paper. The required trick is:
\documentclass[twocolumn]{article}
...
\begin{document}
... % \author, etc
\twocolumn[
\begin{@twocolumnfalse}
\maketitle
\begin{abstract}
...
\end{abstract}
\end{@twocolumnfalse}
]
Unfortunately, with the above \thanks won’t work in the \author list. If you need
such specially-numbered footnotes, you can make them like this:
\title{Demonstration}
\author{Me, You\thanks{}}
\twocolumn[
... as above ...
]
{
\renewcommand{\thefootnote}%
{\fnsymbol{footnote}}
\footnotetext[1]{Thanks for nothing}
}
and so on.
As an alternative, among other facilities the abstract package provides a \saythanks
command and a onecolabstract environment which remove the need to fiddle with
the \thanks and footnoting. They can be used like this:
\twocolumn[
\maketitle % full width title
\begin{onecolabstract} % full width abstract
... text
\end{onecolabstract}
]
\saythanks % typeset any \thanks
The memoir class offers all the facilities of abstract.
abstract.sty : macros/latex/contrib/abstract
memoir.cls: macros/latex/contrib/memoir

179 Really blank pages between chapters


Book (by default) and report (with openright class option) ensure that each chapter
starts on a right-hand (recto) page; they do this by inserting a \cleardoublepage
command between chapters (rather than a mere \clearpage). The empty page thus
created gets to have a normal running header, which some people don’t like.
The (excellent) fancyhdr manual covers this issue, basically advising the creation
of a command \clearemptydoublepage:

\let\origdoublepage\cleardoublepage
\newcommand{\clearemptydoublepage}{%
\clearpage
{\pagestyle{empty}\origdoublepage}%
}

The “obvious” thing is then to use this command to replace \cleardoublepage in a
patched version of the \chapter command. (Make a package of your own containing
a copy of the command out of the class.) This isn’t particularly difficult, but you can
instead simply subvert \cleardoublepage (which isn’t often used elsewhere):

\let\cleardoublepage\clearemptydoublepage

Note: this command works because \clearemptydoublepage uses a copy of \cleardoublepage:
instructions on macro programming patching techniques explain the problem and why
this is a solution.
Note that the KOMA-Script replacements for the book and report classes (scrbook
and scrreprt) offer class options cleardoubleempty, cleardoubleplain and cleardoublestandard
(using the running page style, as normal) that control the appearance of these empty
pages. The classes also offer do-it-yourself commands \cleardoubleempty (etc.).
The memoir class (and the nextpage package) provide commands \cleartooddpage
and \cleartoevenpage, which both take an optional argument (the first, with no ar-
gument, being an equivalent of \cleardoublepage). One can achieve ‘special’ effects
by putting commands in the optional argument: the \clearemptydoublepage we’re
after would be achieved by \cleartooddpage[\thispagestyle{empty}]. The
commands will also serve if you want the surreal effect of “This page intentionally left
blank” in the centre of an otherwise empty page.
fancyhdr : macros/latex/contrib/fancyhdr
memoir.cls: macros/latex/contrib/memoir
nextpage.sty : macros/latex/contrib/misc/nextpage.sty
scrbook.cls, scrreprt.cls: Part of macros/latex/contrib/koma-script

180 Balancing columns at the end of a document


The twocolumn option of the standard classes causes LaTeX to set the text of a doc-
ument in two columns. However, the last page of the document typically ends up
with columns of different lengths — such columns are said to be “unbalanced”. Many
(most?) people don’t like unbalanced columns.
The simplest solution to the problem is to use the multicol package in place of the
twocolumn option, as multicol balances the columns on the final page by default. How-
ever, the use of multicol does come at a cost: its special output routine disallows the use
of in-column floats, though it does still permit full-width (e.g., figure* environment)
floats.
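A sketch of the multicol route:

\usepackage{multicol}
...
\begin{multicols}{2}
  text to be set in two columns, balanced on the final page
\end{multicols}
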
As a result, there is a constant push for a means of balancing columns at the end
of a twocolumn document. Of course, the job can be done manually: \pagebreak
inserted at the appropriate place on the last page can often produce the right effect, but
this seldom appeals, and if the last page is made up of automatically-generated text (for
example, bibliography or index) inserting the command will be difficult.
The flushend package offers a solution to this problem. It’s a somewhat dangerous
piece of macro code, which patches one of the most intricate parts of the LaTeX kernel
without deploying any of the safeguards discussed in patching commands. The package
only changes the behaviour at end document (its \flushend command is enabled by
default), and one other command permits adjustment of the final balance; other pack-
ages in the bundle provide means for insertion of full width material in two-column
documents.
The balance package also patches the output routine (somewhat more carefully
than flushend).

The user should be aware that any of these packages are liable to become confused
in the presence of floats: if problems arise, manual adjustment of the floats in the
document is likely to be necessary. It is this difficulty (what’s required in any instance
can’t really be expressed in current LaTeX) that led the author of multicol to suppress
single-column-wide floats.
balance.sty : Distributed as part of macros/latex/contrib/preprint
flushend.sty : Distributed as part of macros/latex/contrib/sttools
multicol.sty : Distributed as part of macros/latex/required/tools

181 My section title is too wide for the page header


By default, LaTeX sectioning commands make the chapter or section title available for
use by page headers and the like. Page headers operate in a rather constrained area, and
it’s common for titles to be too big to fit: the LaTeX sectioning commands therefore
take an optional argument:
\section[short title]{full title}

If the ⟨short title⟩ is present, it is used both for the table of contents and for the page
heading. The usual answer to people who complain that their title is too big for the
running head is to suggest that they use the optional argument.
However, using the same text for the table of contents as for the running head may
also be unsatisfactory: if your chapter titles are seriously long (like those of a Victorian
novel), a valid and rational scheme is to have a shortened table of contents entry, and a
really terse entry in the running head.
One of the problems is the tendency of page headings to be set in capitals (which
take up more space); so why not set headings as written for “ordinary” reading? It’s
not possible to do so with unmodified LaTeX, but the fancyhdr package provides a
command \nouppercase for use in its header (and footer) lines to suppress LaTeX’s
uppercasing tendencies. Classes in the KOMA-script bundle don’t uppercase in the first
place.
In fact, the sectioning commands use ‘mark’ commands to pass information to
the page headers. For example, \chapter uses \chaptermark, \section uses
\sectionmark, and so on. With this knowledge, one can achieve a three-layer struc-
ture for chapters:
\chapter[middling version]{verbose version}
\chaptermark{terse version}

which should supply the needs of every taste.


Chapters, however, have it easy: hardly any book design puts a page header on
a chapter start page. In the case of sections, one has typically to take account of the
nature of the \*mark commands: the thing that goes in the heading is the first mark
on the page (or, failing any mark, the last mark on any previous page). As a result the
recipe for sections is more tiresome:
\section[middling version]{verbose version%
\sectionmark{terse version}}
\sectionmark{terse version}

(the first \sectionmark deals with the header of the page the \section command falls
on, and the second deals with subsequent pages; note that here, you need the optional
argument to \section, even if “middling version” is in fact the same text as “verbose
version”.)
A similar arrangement is necessary even for chapters if the class you’re using is
odd enough that it puts a page header on a chapter’s opening page.
Note that the titlesec package manages the running heads in a completely different
fashion; users of that package should refer to the documentation.
The memoir class avoids all the silliness by providing an extra optional argument
for chapter and sectioning commands, for example:
\section[middling version][terse version]{verbose version}

As a result, it is always possible for users of memoir to tailor the header text to fit, with
very little trouble.
fancyhdr.sty : macros/latex/contrib/fancyhdr
KOMA script bundle: macros/latex/contrib/koma-script
memoir.cls: macros/latex/contrib/memoir
titlesec.sty : macros/latex/contrib/titlesec

182 Page numbering “⟨n⟩ of ⟨m⟩”


Finding the page number of the last page of a document, from within the document, is
somewhat tricky. The lastpage package is therefore supplied to make life easy for us
all; it defines a label LastPage whose number is right (after sufficiently many passes
through LaTeX, of course). The memoir class also defines a “last page” label.
The documentation of the fancyhdr package spells out exactly how one might ac-
tually use this information to produce page numbering as suggested in the question.
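A sketch of the technique, using fancyhdr for the footer (the details are spelt out in
the fancyhdr documentation):

\usepackage{lastpage}
\usepackage{fancyhdr}
\pagestyle{fancy}
\fancyhf{}
\cfoot{Page \thepage\ of \pageref{LastPage}}
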
fancyhdr documentation: macros/latex/contrib/fancyhdr
lastpage.sty : macros/latex/contrib/lastpage

183 Page numbering by chapter


When I was a young man, a common arrangement for loose bound technical manuals
was to number pages by chapter. (It’s quite a good scheme, in those situations: even if
your corrections add a whole page to the chapter, the most you have to redistribute is
that chapter.)
The problem, at first sight, seems pretty much the same as that in another answer
on running numbers within a chapter (running numbers within a chapter), and the basic
technique is indeed pretty similar.
However, tidying-up loose ends, making sure the page number gets reset to the
correct value at the start of each chapter, and so on, is slightly more challenging. This
is why the chappg package was written: it does the obvious things, and more.
Users have been known to ask for running page numbers within a section, but this
really doesn’t make sense: you need to run page numbers within document objects that
always start on a fresh page.
Documentation of chappg is to be found in the package file.
chappg.sty : macros/latex/contrib/chappg

P.3 Page layout


184 Printer paper sizes
Paper sizes can be a pain: they’re a forgotten backwater, because there’s no DVI com-
mand to specify the paper size of the document. One usually finds American “letter”
paper size being used, by default, in macro packages (such as plain and LaTeX); but
distributions provide configuration files for DVI drivers (and since most distributions
originate in Europe, the drivers usually default to ISO “A4” paper size).
This is (of course) pretty unsatisfactory. Users may change the paper size their
document is designed for, pretty easily (and once off), but they have to ensure that
every run of xdvi, dvips, or whatever, is given the correct override for using anything
non-‘standard’.
Of course, the default paper size for DVI drivers may be changed by a distribution
management command, but this still doesn’t provide for people using the “wrong” sort
of paper for some reason.
An interestingly different issue arises for users of PDFTeX — the PDF format
does have the means of expressing paper size, but much of the core software pre-
dates PDFTeX, so not even PDFLaTeX sets the correct values into \pdfpagewidth
and \pdfpageheight.
The DVI drivers dvips and dvipdfm define \special commands for the document
to specify its own paper size; so in those cases, as in the case of PDFTeX and VTeX,
the paper size can be programmed by the document. Users who wish to, may of course
consult the manuals of the various programs to write the necessary code.
The geometry package (whose main business is defining typeset page areas), also
takes notice of the paper size the document is going to print to, and can issue the
commands necessary to ensure the correct size is used. If geometry is used when a
document is being processed by either PDFLaTeX or VTeX, it will set the necessary
dimensions as a matter of course. If the document is being processed by LaTeX on a
TeX or e-TeX engine, there are two package options (dvipdfm and dvips) which in-
struct geometry which \special commands to use. (Note that the options are ignored
if you are using either PDFLaTeX or VTeX.)
So, the resolution of the problem is to add
\usepackage[dvixxx,...]{geometry}

(where dvixxx is your current favourite DVI driver), and the document will run cor-
rectly with any of LaTeX (whether or not run on VTeX) or PDFLaTeX.
Give the typearea package the pagesize option and it will do the same job, for PDFLaTeX
output and PostScript output from LaTeX via dvips.
geometry.sty : macros/latex/contrib/geometry
typearea.sty : Distributed as part of macros/latex/contrib/koma-script

185 Changing the margins in LaTeX


Changing the layout of a document’s text on the page involves several subtleties not
often realised by the beginner. There are interactions between fundamental TeX con-
straints, constraints related to the design of LaTeX, and good typesetting and design
practice, that mean that any change must be very carefully considered, both to ensure
that it “works” and to ensure that the result is pleasing to the eye.
LaTeX’s defaults sometimes seem excessively conservative, but there are sound
reasons behind how Lamport designed the layouts themselves, whatever one may feel
about his overall design. For example, the common request for “one-inch margins all
round on A4 paper” is fine for 10- or 12-pitch typewriters, but not for 10pt (or even
11pt or 12pt) type because readers find such wide, dense, lines difficult to read. There
should ideally be no more than 75 characters per line (though the constraints change
for two-column text).
So Lamport’s warning to beginners in his section on ‘Customizing the Style’ —
“don’t do it” — should not lightly be ignored.
This set of FAQs recommends that you use a package to establish consistent settings
of the parameters: the interrelationships are taken care of in the established packages,
without you needing to think about them.
The following answers deal with the ways one may choose to proceed:
• Choose which package to use.
• Find advice on setting up page layout by hand.
There is a related question — how to change the layout temporarily — and there’s an
answer that covers that, too:
• Change the margins on the fly.
186 Packages to set up page designs
The ‘ultimate’ tool for adjusting the dimensions and position of the printed material on
the page is the geometry package; a very wide range of adjustments of the layout may
be relatively straightforwardly programmed, and package documentation is good and
comprehensive.
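A typical invocation (the dimensions shown are merely illustrative) might be:

\usepackage[a4paper,left=30mm,right=30mm,top=25mm,bottom=25mm]{geometry}
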
As is usual, users of the memoir class have built-in facilities for this task, and users
of the KOMA-script classes are recommended to use an alternative package, typearea.
In either case it is difficult to argue that users should go for geometry: both alternatives
are good.
The documentation of geometry is a bit overwhelming, and learning all its capabili-
ties may be more than you ever need. Somewhat simpler to use is the vmargin package,
which has a canned set of paper sizes (a superset of that provided in LaTeX 2ε ), provi-
sion for custom paper, margin adjustments and provision for two-sided printing.
geometry.sty : macros/latex/contrib/geometry
KOMA script bundle: macros/latex/contrib/koma-script
layout.sty : Distributed as part of macros/latex/required/tools
memoir.cls: macros/latex/contrib/memoir
typearea.sty : Distributed as part of macros/latex/contrib/koma-script
vmargin.sty : macros/latex/contrib/vmargin
187 How to set up page layout “by hand”
So you’re eager to do it yourself, notwithstanding the cautions outlined in “changing
margins”.
It’s important that you first start by familiarising yourself with LaTeX’s page layout
parameters. For example, see section C.5.3 of the LaTeX manual (pp. 181-182), or
corresponding sections in many of the other good LaTeX manuals (see LaTeX books).
LaTeX controls the page layout with a number of parameters, which allow you to
change the distance from the edges of a page to the left and top edges of your typeset
text, the width and height of the text, and the placement of other text on the page. How-
ever, they are somewhat complex, and it is easy to get their interrelationships wrong
when redefining the page layout. The layout package defines a \layout command
which draws a diagram of your existing page layout, with the dimensions (but not their
interrelationships) shown.
Even changing the text height and width, \textheight and \textwidth, requires
more care than you might expect: the height should be set to fit a whole number of text
lines (in terms of multiples of \baselineskip), and the width should be constrained by
the number of characters per line, as mentioned in “changing margins”.
Margins are controlled by two parameters: \oddsidemargin and \evensidemargin,
whose names come from the convention that odd-numbered pages appear on the right-
hand side (‘recto’) of a two-page spread and even-numbered pages on the left-hand
side (‘verso’). Both parameters actually refer to the left-hand margin of the relevant
pages; in each case the right-hand margin is specified by implication, from the value of
\textwidth and the width of the paper. (In a one-sided document, which is the default
in many classes, including the standard article and report classes, \oddsidemargin
stands for both.)
The “origin” (the zero position) on the page is one inch from the top of the paper
and one inch from the left side; positive horizontal measurements extend right across
the page, and positive vertical measurements extend down the page. Thus, the parame-
ters \evensidemargin, \oddsidemargin and \topmargin should be set to be 1 inch
less than the true margin; for margins closer to the left and top edges of the page than
1 inch, the margin parameters must be set to negative values.
188 Changing margins “on the fly”
One of the surprises characteristic of TeX use is that you cannot change the width or
height of the text within the document, simply by modifying the text size parameters;
TeX can’t change the text width on the fly, and LaTeX only ever looks at text height
when starting a new page.
So the simple rule is that the parameters should only be changed in the preamble of
the document, i.e., before the \begin{document} statement (so before any typesetting
has happened).
To adjust text width within a document we define an environment:

\newenvironment{changemargin}[2]{%
\begin{list}{}{%
\setlength{\topsep}{0pt}%
\setlength{\leftmargin}{#1}%
\setlength{\rightmargin}{#2}%
\setlength{\listparindent}{\parindent}%
\setlength{\itemindent}{\parindent}%
\setlength{\parsep}{\parskip}%
}%
\item[]}{\end{list}}

The environment takes two arguments, and will indent the left and right margins, re-
spectively, by the parameters’ values. Negative values will cause the margins to be
narrowed, so \begin{changemargin}{-1cm}{-1cm} narrows the left and right mar-
gins by 1 centimetre.
Given that TeX can’t do this, how does it work? — well, the environment (which is
a close relation of the LaTeX quote environment) doesn’t change the text width as far
as TeX is concerned: it merely moves text around inside the width that TeX believes
in.

The chngpage package provides ready-built commands to do the above; it includes
provision for changing the shifts applied to your text according to whether you’re on
an odd or an even page of a two-sided document. The package’s documentation (in
the file itself) suggests a strategy for changing text dimensions between pages — as
mentioned above, changing the text dimensions within the body of a page may lead to
unpredictable results.
Changing the vertical dimensions of a page is clunkier still: the LaTeX command
\enlargethispage adjusts the size of the current page by the size of its argument.
Common uses are

\enlargethispage{\baselineskip}

to make the page one line longer, or

\enlargethispage{-\baselineskip}

to make the page one line shorter.


chngpage.sty : macros/latex/contrib/misc/chngpage.sty

189 How to get rid of page numbers


The package nopageno will suppress page numbers in a whole document.
To suppress page numbers from a single page, use \thispagestyle{empty}
somewhere within the text of the page. (Note that \maketitle and \chapter both use
\thispagestyle internally, so you need to call it after you’ve called them.)
To suppress page numbers from a sequence of pages, you may use \pagestyle
{empty} at the start of the sequence, and restore the original page style at the end. Un-
fortunately, you still have to use \thispagestyle after any \maketitle or \chapter
command.
In the memoir class, the troublesome commands (\maketitle, \chapter, etc.)
invoke their own page style (title, chapter, etc.), which you may redefine using the
class’s own techniques to be equivalent to “empty”. The KOMA-script classes have
commands that contain the page style to be used, so one might say:

\renewcommand*{\titlepagestyle}{empty}

An alternative (in all classes) is to use the rather delightful \pagenumbering{gobble};
this has the simple effect that any attempt to print a page number produces
nothing, so there’s no issue about preventing any part of LaTeX from printing the
number. However, the \pagenumbering command does have the side effect that it
resets the page number (to 1), which may be undesirable.
The scrpage2 package separates out the representation from the resetting; so one
can say

\renewcommand*{\pagemark}{}

to have the same effect as the gobble trick, without resetting the page number.
nopageno: macros/latex/contrib/carlisle/nopageno.sty
KOMA script bundle: macros/latex/contrib/koma-script
memoir.cls: macros/latex/contrib/memoir
scrpage2.sty : Distributed as part of macros/latex/contrib/koma-script

190 \pagestyle{empty} on first page in LaTeX


If you use \pagestyle{empty}, but the first page is numbered anyway, you are prob-
ably using the \maketitle command too. The behaviour is not a bug but a fea-
ture. The standard LaTeX classes are written so that initial pages (pages containing
a \maketitle, \part, or \chapter) have a different page style from the rest of the
document; to achieve this, the commands internally issue \thispagestyle{plain}.
This is usually not acceptable behaviour if the surrounding page style is ‘empty’.
Possible workarounds include:

• Put \thispagestyle{empty} immediately after the \maketitle command, with
no blank line between them.

• Use the fancyhdr or scrpage2 packages, which allow you to customise the style
for initial pages independently of that for body pages.
For example, use fancyhdr commands:
\fancypagestyle{plain}{%
\fancyhf{}%
\renewcommand{\headrulewidth}{0pt}%
\renewcommand{\footrulewidth}{0pt}%
}
and the “plain” page style (invoked by \chapter commands and title pages) will
have no header or footer.
• If you are using either the memoir class or a KOMA-script class, use the techniques
outlined for them in “no page numbers”.

fancyhdr.sty : macros/latex/contrib/fancyhdr
KOMA script bundle: macros/latex/contrib/koma-script
memoir.cls: macros/latex/contrib/memoir
nopageno.sty : macros/latex/contrib/carlisle/nopageno.sty
scrpage2.sty : Distributed as part of macros/latex/contrib/koma-script

191 How to create crop marks


If you’re printing something that’s eventually to be reproduced in significant quantities,
and bound, it’s conventional to print on paper larger than your target product, and to
place “crop marks” outside the printed area. These crop marks are available to the
production house, for lining up reproduction and trimming machines.
You can save yourself the (considerable) trouble of programming these marks for
yourself by using the package crop, which has facilities to satisfy any conceivable
production house. Users of the memoir class don’t need the package, since memoir has
its own facilities for programming crop marks.
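For example (a sketch only), to print an A4 document centred on A3 stock with
camera-alignment marks, one might say:

\usepackage[a3,center,cam]{crop}
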
crop.sty : macros/latex/contrib/crop
memoir.cls: macros/latex/contrib/memoir

192 ‘Watermarks’ on every page


It’s often useful to place some text (such as ‘DRAFT’) in the background of every page
of a document. For LaTeX users, this can be achieved with the draftcopy package.
This can deal with many types of DVI processors (in the same way that the graphics
package does) and knows translations for the word ‘DRAFT’ into a wide range of
languages (though you can choose your own word, too).
More elaborate watermarks may be achieved using the eso-pic package, which in
turn uses the package everyshi, part of Martin Schröder’s ms bundle. Eso-pic attaches
a picture environment to every page as it is shipped out; you can put things into
that environment. The package provides commands for placing things at certain useful
points (like “text upper left” or “text centre”) in the picture, but you’re at liberty to do
what you like.
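For example, a sketch (using the “text centre” placement command, and assuming
the graphicx package for the rotation) that prints a large “DRAFT” across the text
area of every page:

\usepackage{graphicx}
\usepackage{eso-pic}
\AddToShipoutPicture{%
  \AtTextCenter{%
    \makebox(0,0){\rotatebox{45}{\Huge\sffamily DRAFT}}}}
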
The wallpaper package builds, as above, on eso-pic. Apart from the single-image
backdrops described above (“wallpapers”, of course, to this package), the package pro-
vides facilities for tiling images. All its commands come in pairs: one for “general” use,
and one applying to the current page only.
draftcopy.sty : macros/latex/contrib/draftcopy
eso-pic.sty : macros/latex/contrib/eso-pic
everyshi.sty : Distributed as part of macros/latex/contrib/ms
wallpaper.sty : macros/latex/contrib/wallpaper

193 Typesetting things in landscape orientation


It’s often necessary to typeset part of a document in landscape orientation; to achieve
this, one needs not only to change the page dimensions, but also to instruct the output
device to print the strange page differently.
There are two “ordinary” mechanisms for doing two slight variations of landscape
typesetting:

• If you have a single floating object that is wider than it is deep, and will only
fit on the page in landscape orientation, use the rotating package; this defines
sidewaysfigure and sidewaystable environments which create floats that oc-
cupy a whole page.
Note that rotating has problems in a document that also loads the float package,
which is recommended in other answers in these FAQs, for example that on float
placement. The rotfloat package loads rotating for you, and smooths the interac-
tion with float.
• If you have a long sequence of things that need to be typeset in landscape (per-
haps a code listing, a wide tabbing environment, or a huge table typeset using
longtable or supertabular), use the lscape package (or pdflscape if you’re gener-
ating PDF output, whether using PDFLaTeX or dvips and generating PDF from
that). Both packages define an environment landscape, which clears the current
page and restarts typesetting in landscape orientation (and clears the page at the
end of the environment before returning to portrait orientation).

No currently available package makes direct provision for typesetting in both portrait
and landscape orientation on the same page (it’s not the sort of thing that TeX is well
set-up to do). If such behaviour was an absolute necessity, one might use the techniques
described in "flowing text around figures", and would rotate the landscape portion using
the rotation facilities of the graphics package. (Returning from landscape to portrait
orientation would be somewhat easier: the portrait part of the page would be a bottom
float at the end of the landscape section, with its content rotated.)
To set an entire document in landscape orientation, one might use lscape around the
whole document. A better option is the landscape option of the geometry package; if
you also give it the dvips or pdftex option, geometry also emits the rotation instructions
to cause the output to be properly oriented. The memoir class has the same facilities,
in this respect, as does geometry.
A word of warning: most current TeX previewers do not honour rotation requests
in DVI files. Your best bet is to convert your output to PostScript or to PDF, and to
view these ‘final’ forms with an appropriate viewer.
geometry.sty : macros/latex/contrib/geometry
graphics.sty : Distributed as part of macros/latex/required/graphics
longtable.sty : Distributed as part of macros/latex/required/tools
lscape.sty : Distributed as part of macros/latex/required/graphics
memoir.cls: macros/latex/contrib/memoir
pdflscape.sty : Distributed with Heiko Oberdiek’s packages macros/latex/
contrib/oberdiek
rotating.sty : macros/latex/contrib/rotating
rotfloat.sty : macros/latex/contrib/rotfloat
supertabular.sty : macros/latex/contrib/supertabular

194 Putting things at fixed positions on the page


TeX’s model of the world is (broadly speaking) that the author writes text, and TeX
and its macros decide how it all fits on the page. This is not good news for the author
who has, from whatever source, a requirement that certain things go in exactly the right
place on the page.
There are places on the page, from which things may be hung, and two LaTeX
packages allow you to position things relative to such points, thus providing a means of
absolute positioning.
The textpos package aids the construction of pages from “blobs”, dotted around
over the page (as in a poster); you give it the location, and it places your typeset box
accordingly.
The eso-pic package defines a “shipout picture” that covers the page. The user may add
picture-mode commands to this picture, which of course can include box placements
as well as the other rather stilted commands of picture-mode. (Eso-pic requires the
services of everyshi, which must therefore also be available.)
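A short sketch of textpos in use (the module sizes and coordinates here are arbitrary;
the package documentation describes the full range of options):
\usepackage[absolute]{textpos}
\setlength{\TPHorizModule}{10mm}
\setlength{\TPVertModule}{10mm}
...
\begin{textblock}{5}(4,2)
  % a block 5 modules wide, placed (by default, by its top left
  % corner) 4 modules across and 2 modules down the page
  This text is nailed to the page.
\end{textblock}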
eso-pic.sty : macros/latex/contrib/eso-pic

everyshi.sty : Distributed as part of macros/latex/contrib/ms
textpos.sty : macros/latex/contrib/textpos

195 Preventing page breaks between lines


One commonly requires that a block of typeset material be kept on the same page; it
turns out to be surprisingly tricky to arrange this.
LaTeX provides a samepage environment which claims it does this sort of thing for
you. It proceeds by setting infinite penalties for all sorts of page-break situations; but
in many situations where you want to prevent a page break, samepage doesn’t help. If
you’re trying to keep running text together, you need to end the paragraph inside the
environment (see preserving paragraph parameters). Also, if the things you are trying
to keep together insert their own pagebreak hints, samepage has no power over them: a
good example is list items — they suggest page breaks between them. Even if samepage
does work, it’s likely to leave stuff jutting out at the bottom of the page.
A convenient trick is to set all the relevant stuff in a \parbox (or a minipage if
it contains things like verbatim text that may not be in the argument of a \parbox).
The resulting box certainly won’t break between pages, but that’s not to say that it will
actually do what you want it to do: again, the box may be left jutting out at the bottom
of the page.
Why do neither of these obvious things work? Because TeX can’t really distin-
guish between infinitely awful things. Samepage will make any possible break point
“infinitely bad” and boxes don’t even offer the option of breaks, but if the alternative
is to leave an infinitely bad few centimetres of blank paper at the bottom of the page,
TeX will take the line of least resistance and do nothing.
This problem still arises even if you have \raggedbottom in effect: TeX doesn’t
notice the value of that until it starts actually shipping a page out. One approach is to
set:
\raggedbottom
\addtolength{\topskip}{0pt plus 10pt}

The 10pt offers a hint to the output routine that the column is stretchable; this will
cause TeX to be more tolerant of the need to stretch while building the page. If you’re
doing this as a temporary measure, cancel the change to \topskip by:
\addtolength{\topskip}{0pt plus-10pt}

as well as resetting \flushbottom. (Note that the 10pt never actually shows up, be-
cause it is overwhelmed when the page is shipped out by the stretchability introduced
by \raggedbottom; however, it could well have an effect if \flushbottom was in
effect.)
An alternative (which derives from a suggestion by Knuth in the TeXbook) is the
package needspace or the memoir class, which both define a command \needspace
whose argument tells it what space is needed. If the space isn’t available, the current
page is cleared, and the matter that needs to be kept together will be inserted on the
new page. For example, if 4 lines of text need to be kept together, one might use the sequence:
\par
\needspace{4\baselineskip}
% the stuff that must stay together
<text generating lines 1-4>
% now stuff we don’t mind about

Yet another trick by Knuth is useful if you have a sequence of small blocks of text that
need, individually, to be kept on their own page. Insert the command \filbreak before
each small block, and the effect is achieved. The technique can be used in the case of
sequences of LaTeX-style sections, by incorporating \filbreak into the definition of
a command (as in patching commands). A simple and effective patch would be:
\let\oldsubsubsection=\subsubsection
\renewcommand{\subsubsection}{%
\filbreak
\oldsubsubsection
}
While the trick works for consecutive sequences of blocks, it’s slightly tricky to get
out of such sequences unless the sequence is interrupted by a forced page break (such
as \clearpage, which may be introduced by a \chapter command, or the end of the
document). If the sequence is not interrupted, the last block is likely to be forced onto
a new page, regardless of whether it actually needs it.
If one is willing to accept that not everything can be accomplished totally automat-
ically, the way to go is to typeset the document and to check for things that have the po-
tential to give trouble. In such a scenario (which has Knuth’s authority behind it, if one
is to believe the rather few words he says on the subject in the TeXbook) one can decide,
case by case, how to deal with problems at the last proof-reading stage. The alterna-
tives are to insert \clearpage commands as necessary, or to use \enlargethispage.
Supposing you have a line or two that stray: issue the command \enlargethispage
{2\baselineskip} and two lines are added to the page you’re typesetting. It depends
on the document whether this looks impossibly awful or entirely acceptable, but the
command remains a useful item in the armoury.
memoir.cls: macros/latex/contrib/memoir
needspace.sty : macros/latex/contrib/misc/needspace.sty

196 Parallel setting of text


It’s commonly necessary to present text in two languages ‘together’ on a page, or on a
two-page spread. For this to be satisfactory, one usually needs some sort of alignment
between the two texts.
The parallel package satisfies the need, permitting typesetting in two columns (not
necessarily of the same width) on one page, or on the two opposing pages of a two-page
spread. Use can be as simple as

\usepackage{parallel}
...
\begin{Parallel}{<left-width>}{<right-width>}
\ParallelLText{left-text}
\ParallelRText{right-text}
\ParallelPar
...
\end{Parallel}

The parcolumns package can (in principle) deal with any number of columns: the
documentation shows its use with three columns. Usage is rather similar to that of
parallel, though there is of course a “number of columns” to specify:

\usepackage{parcolumns}
...
\begin{parcolumns}[<options>]{3}
\colchunk{<Column 1 text>}
\colchunk{<Column 2 text>}
\colchunk{<Column 3 text>}
\colplacechunks
...
\end{parcolumns}

The <options> can specify the widths of the columns, whether to place rules between
the columns, whether to set the columns sloppy, etc.
The ledpar package is distributed with (and integrated with) the ledmac package.
It provides parallel setting carefully integrated with the needs of a scholarly text, per-
mitting translation, or notes, or both, to be set in parallel with the ‘base’ text of the
document.
ledpar.sty : Distributed with macros/latex/contrib/ledmac
parallel.sty : macros/latex/contrib/parallel
parcolumns.sty : Distributed as part of macros/latex/contrib/sauerj

197 Typesetting epigraphs
Epigraphs are those neat quotations that authors put at the start of chapters (or even at
the end of chapters: Knuth puts things at the ends of chapters of the TeXbook).
Typesetting them is a bit of a fiddle, but not impossible to do for yourself. Fortu-
nately, there are two packages that do the job, to some extent; there are also facilities
in the two “big” classes (memoir and koma-script).
The epigraph package defines an \epigraph command, for creating a single epi-
graph (as at the top of a chapter):

\chapter{The Social Life of Rabbits}


\epigraph{Oh! My ears and whiskers!}%
{Lewis Carroll}

and an epigraphs environment, for entering more than one epigraph consecutively, in a
sort of list introduced by \qitem commands:

\begin{epigraphs}
\qitem{What I tell you three times is true}%
{Lewis Carroll}
\qitem{Oh listen do, I’m telling you!}%
{A.A. Milne}
\end{epigraphs}

The \epigraphhead command enables you to place your epigraph above a chapter
header:

\setlength{\unitlength}{1pt}
...
\chapter{The Social Life of Rabbits}
\epigraphhead[<distance>]{%
\epigraph{Oh! My ears and whiskers!}%
{Lewis Carroll}%
}

The <distance> says how far above the chapter heading the epigraph is to go; it’s ex-
pressed in terms of the \unitlength that’s used in the picture environment; the
package’s author recommends 70pt.
The package also offers various tricks for adjusting the layout of chapter header
(necessary if you’ve found a hugely long quotation for an \epigraphhead), for patch-
ing the bibliography, for patching \part pages, and so on. (Some of these suggested
patches lead you through writing your own package. . . )
The quotchap package redefines chapter headings (in a moderately striking way),
and provides an environment savequotes in which you can provide one (or more)
quotations to use as epigraphs. The facilities seem not as flexible as those of epigraph,
but it’s probably easier to use.
The memoir class offers all the facilities of the epigraph package. The Koma-
script classes have commands \setchapterpreamble and \dictum to provide these
facilities.
epigraph.sty : macros/latex/contrib/epigraph
KOMA script bundle: macros/latex/contrib/koma-script
memoir.cls: macros/latex/contrib/memoir
quotchap.sty : macros/latex/contrib/quotchap

198 (La)TeX PDF output prints at wrong size


Having got everything else right, you should be aware that the problem may have noth-
ing to do with (La)TeX and everything to do with the program you use for printing. A
regular cause for such problems lies with Acrobat Reader, which by default enables its
option to scale pages to fit on the printable area of the paper. Since a printer can rarely
print right to the edge, this means that pdf-files will be shrunk by some (small) factor
(even if the pdf-file is formatted for A4, and your paper size is set to A4 as well).
Correcting this silliness is not very hard, but the exact details depend on the version
of Acrobat Reader (or "Adobe Reader" from version 6.0 onwards) you have installed:
• Mac OS X, Adobe Reader 6:
in the print dialogue, on the “copies & pages” pane, you’ll find a popup menu titled
“Page Scaling”. Make sure that the menu reads “None”.
• Windows, Adobe Reader 6:
in the print dialogue, select “None” from the drop-down list “Page Scaling”.
• Windows, Linux Acrobat (Reader) 5.0:
In the print dialog, make sure the “Shrink oversized pages to fit” checkbox is
unchecked. It may also be useful to uncheck the “Expand small pages to fit paper
size” checkbox as well.

P.4 Spacing of characters and lines


199 Double-spaced documents in LaTeX
A quick and easy way of getting inter-line space for copy-editing is to change
\baselinestretch — \linespread{1.2} (or, equivalently \renewcommand{\baselinestretch}
{1.2}) may be adequate. Note that \baselinestretch changes don’t take effect until
you select a new font, so make the change in the preamble before any font is selected.
Don’t try changing \baselineskip: its value is reset at any size-changing command
so that results will be inconsistent.
For preference (and certainly for a production document, such as a dissertation or
an article submission), use a line-spacing package. The only one currently supported is
setspace (do not be tempted by doublespace — its performance under current LaTeX
is at best problematical). Setspace switches off double-spacing at places where even
the most die-hard official would doubt its utility (footnotes, figure captions, and so on);
it’s very difficult to do this consistently if you’re manipulating \baselinestretch
yourself.
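Usage of setspace is straightforward; a minimal sketch:
\usepackage{setspace}
\doublespacing              % or \onehalfspacing
...
\begin{singlespace}
  Material (a long quotation, say) that is to remain
  single-spaced within the double-spaced document.
\end{singlespace}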
Of course, the real solution (other than for private copy editing) is not to use double-
spacing at all. Universities, in particular, have no excuse for specifying double-spacing
in submitted dissertations: LaTeX is a typesetting system, not a typewriter-substitute,
and can (properly used) make single-spaced text even more easily readable than double-
spaced typewritten text. If you have any influence on your university’s system (for
example, through your dissertation supervisor), it may be worth attempting to get the
rules changed (at least to permit a “well-designed book” format).
Double-spaced submissions are also commonly required when submitting papers
to conferences or journals. Fortunately (judging by the questions from users in this
author’s department), this demand is becoming less common.
Documentation of setspace appears as TeX comments in the package file itself.
setspace.sty : macros/latex/contrib/setspace/setspace.sty

200 Changing the space between letters


A common technique in advertising copy (and other text whose actual content need
not actually be read) is to alter the space between the letters (otherwise known as the
tracking). As a general rule, this is a very bad idea: it detracts from legibility, which
is contrary to the principles of typesetting (any respectable font you might be using
should already have optimum tracking built into it).
The great type designer, Eric Gill, is credited with saying “he who would letterspace
lower-case text, would steal sheep”. (The attribution is probably apocryphal: others are
also credited with the remark. Stealing sheep was, in the 19th century, a capital offence
in Britain.) As the remark suggests, though, letterspacing of upper-case text is less
awful a crime; the technique used also to be used for emphasis of text set in Fraktur (or
similar) fonts.
Straightforward macros (usable, in principle, with any TeX macro package) may
be found in letterspacing (which is the name of the .tex file; it also appears as the
letterspace package in some distributions).
A more comprehensive solution is to be found in the soul package (which is opti-
mised for use with LaTeX, but also works with Plain TeX). Soul also permits hyphen-
ation of letterspaced text; Gill’s view of such an activity is not (even apocryphally)
recorded. (Spacing-out forms part of the name of soul; the other half is described in
another question.)
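A brief sketch of soul in use (remembering Gill’s stricture about lower-case text):
\usepackage{soul}
...
\so{spaced out text}      % letterspaces (‘spaces out’) its argument
\caps{SPACED CAPITALS}    % tracking tuned for capitals and small caps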
letterspacing.tex : macros/generic/misc/letterspacing.tex
soul.sty : macros/latex/contrib/soul

201 Setting text ragged right
The trick with typesetting ragged right is to be sure you’ve told the TeX engine “make
this paragraph ragged, but never too ragged”. The LaTeX \raggedright command
(and the corresponding flushleft environment) has a tendency to miss the “never”
part, and will often create ridiculously short lines, for some minor benefit later in the
paragraph. The Plain TeX version of the command doesn’t suffer this failing, but is
rather conservative: it is loath to create too large a gap at the end of the line, but in some
circumstances (such as where hyphenation is suppressed)
painfully large gaps may sometimes be required.
Martin Schröder’s ragged2e package offers the best of both worlds: it provides
raggedness which is built on the Plain TeX model, but which is easily configurable.
It defines easily-remembered command (e.g., \RaggedRight) and environment (e.g.,
FlushLeft) names that are simply capitalised transformations of the LaTeX kernel
originals. The documentation discusses the issues and explains the significance of the
various parameters of ragged2e’s operation.
ragged2e.sty : Distributed as part of macros/latex/contrib/ms

202 Cancelling \ragged commands


LaTeX provides commands \raggedright and \raggedleft, but none to cancel their
effect. The \centering command is implemented in the same way as the \ragged*
commands, and suffers in the same way.
The following code (to be inserted in a package of your own, or as internal LaTeX
code) defines a command that restores flush justification at both
margins:
\def\flushboth{%
\let\\\@normalcr
\@rightskip\z@skip \rightskip\@rightskip
\leftskip\z@skip
\parindent 1.5em\relax}
There’s a problem with the setting of \parindent in the code: it’s necessary because
both the \ragged commands set \parindent to zero, but the setting isn’t a constant
of nature: documents using a standard LaTeX class with twocolumn option will have
1.0em by default, and there’s no knowing what you (or some other class) will have
done.
If you are using Martin Schröder’s ragged2e package, it is worth updating to the
latest release (January 2003), which has a \justifying command to match its versions
of the LaTeX ‘ragged’ commands. The package also provides a justify environment,
which permits areas of justified text in a larger area which is ragged.
ragged2e.sty : Distributed as part of macros/latex/contrib/ms

P.5 Typesetting specialities


203 Including a file verbatim in LaTeX
A good way is to use Rainer Schöpf’s verbatim package, which provides a command
\verbatiminput that takes a file name as argument:

\usepackage{verbatim}
...
\verbatiminput{verb.txt}

Another way is to use the alltt environment, which requires the alltt package. The
environment interprets its contents ‘mostly’ verbatim, but executes any (La)TeX com-
mands it finds:
\usepackage{alltt}
...
\begin{alltt}
\input{verb.txt}
\end{alltt}

of course, this is of little use for inputting (La)TeX source code. . .


The moreverb package extends the verbatim package, providing a listing envi-
ronment and a \listinginput command, which line-number the text of the file. The
package also has a \verbatimtabinput command, that honours TAB characters in
the input (the package’s listing environment and the \listinginput command also
both honour TAB).
The sverb package provides verbatim input (without recourse to the facilities of the
verbatim package):
\usepackage{sverb}
...
\verbinput{verb.txt}

The fancyvrb package offers configurable implementations of everything verbatim,
sverb and moreverb have, and more besides. It is nowadays the package of choice for
the discerning typesetter of verbatim text, but its wealth of facilities makes it a complex
beast and study of the documentation is strongly advised.
The memoir class includes the relevant functionality of the verbatim and moreverb
packages.
alltt.sty : Part of the LaTeX distribution.
fancyvrb.sty : macros/latex/contrib/fancyvrb
memoir.cls: macros/latex/contrib/memoir
moreverb.sty : macros/latex/contrib/moreverb
sverb.sty : Distributed as part of macros/latex/contrib/mdwtools
verbatim.sty : Distributed as part of macros/latex/required/tools

204 Including line numbers in typeset output


For general numbering of lines, there are two packages for use with LaTeX, lineno
(which permits labels attached to individual lines of typeset output) and numline.
Both of these packages play fast and loose with the LaTeX output routine, which
can cause problems: the user should beware. . .
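A minimal sketch of lineno in use:
\usepackage{lineno}
\modulolinenumbers[5]   % print only every fifth number
...
\linenumbers            % start numbering lines here
...
\nolinenumbers          % and stop again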
If the requirement is for numbering verbatim text, moreverb, memoir or fancyvrb
(see including files verbatim) may be used.
One common use of line numbers is in critical editions of texts, and for this the
edmac package offers comprehensive support; ledmac is a LaTeX port of edmac.
The vruler package sidesteps many of the problems associated with line-numbering,
by offering (as its name suggests) a rule that numbers positions on the page. The effect
is good, applied to even-looking text, but is poor in texts that involve breaks such as
interpolated mathematics or figures. Documentation of the package, in the package
itself, is pretty tough going, though there is an example (also inside the package file).
edmac: macros/plain/contrib/edmac
fancyvrb.sty : macros/latex/contrib/fancyvrb
ledmac.sty : macros/latex/contrib/ledmac
lineno.sty : macros/latex/contrib/lineno
memoir.cls: macros/latex/contrib/memoir
moreverb.sty : macros/latex/contrib/moreverb
numline.sty : macros/latex/contrib/numline/numline.sty
vruler.sty : macros/latex/contrib/misc/vruler.sty

205 Code listings in LaTeX


‘Pretty’ code listings are sometimes considered worthwhile by the “ordinary” program-
mer, but they have a serious place in the typesetting of dissertations by computer sci-
ence and other students who are expected to write programs. Simple verbatim listings
of programs are commonly useful, as well.
Verbatim listings are dealt with elsewhere, as is the problem of typesetting algo-
rithm specifications.
The listings package is widely regarded as the best bet for formatted output (it
is capable of parsing program source, within the package itself), but there are sev-
eral well-established packages that rely on a pre-compiler of some sort. You may use
listings to typeset snippets that you include within your source:

\usepackage{listings}
\lstset{language=C}
...
\begin{document}
\begin{lstlisting}
#include <stdio.h>

int main(int argc, char ** argv)


{
printf("Hello world!\n");
return 0;
}
\end{lstlisting}
\end{document}

or you can have it typeset whole files:


\usepackage{listings}
\lstset{language=C}
...
\begin{document}
\lstinputlisting{main.c}
\end{document}

These very simple examples may be decorated in a huge variety of ways, and of course
there are other languages in the package’s vocabulary than just C. . .
Most people, advising others on (La)TeX lists, seem to regard listings as the be-all
and end-all on this topic. But there are alternatives, which may be worth considering,
in some situations.
Highlight is attractive if you need more than one output format for your program:
as well as (La)TeX output, highlight will produce (X)HTML, RTF and XSL-FO repre-
sentations of your program listing. Documentation is available on the highlight project
site.
The lgrind system is another well-established pre-compiler, with all the facilities
one might need and a wide repertoire of languages; it is derived from the very long-
established tgrind, whose output is based on Plain TeX.
The tiny_c2l system is more recent: users are encouraged to generate their own
driver files for languages it doesn’t already deal with.
The C++2LaTeX system comes with strong recommendations for use with C and
C++.
An extremely simple system is c2latex, for which you write LaTeX source in your C
program comments. The program then converts your program into a LaTeX document
for processing. The program (implicitly) claims to be “self-documenting”.
c2latex : support/c2latex
C++2LaTeX : support/C++2LaTeX-1_1pl1
highlight: support/highlight
lgrind : nonfree/support/lgrind
listings.sty : macros/latex/contrib/listings
tgrind : support/tgrind
tiny_c2l: support/tiny_c2l

206 Typesetting pseudocode in LaTeX


There is no consensus on the ‘right’ way to typeset pseudocode. Consequently, there
are a variety of LaTeX packages to choose from for producing æsthetically pleasing
pseudocode listings.
Pseudocode differs from actual program listings in that it lacks strict syntax and se-
mantics. Also, because pseudocode is supposed to be a clear expression of an algorithm
it may need to incorporate mathematical notation, figures, tables, and other LaTeX fea-
tures that do not appear in conventional programming languages. Typesetting program
listings is described elsewhere.

You can certainly create your own environment for typesetting pseudocode using,
for example, the tabbing or list environments — it’s not difficult, but it may prove
boring. So it’s worth trying the following packages, all designed specifically for type-
setting pseudocode.
The algorithms bundle (which contains packages algorithm and algorithmic, both
of which are needed for ordinary use) has a simple interface and produces fairly nice
output. It provides primitives for statements, which can contain arbitrary LaTeX com-
mands, comments, and a set of iterative and conditional constructs. These primitives
can easily be redefined to produce different text in the output. However, there is no
support for adding new primitives. Typesetting the pseudocode itself is performed in
algorithmic; the algorithms package uses the facilities of the float package to number
algorithms sequentially, enable algorithms to float like figures or tables, and support
including a List of Algorithms in a document’s front matter.
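By way of illustration (a sketch only; the algorithm itself is arbitrary), the
algorithm/algorithmic pair might be used as:
\usepackage{algorithm}
\usepackage{algorithmic}
...
\begin{algorithm}
  \caption{Euclid's algorithm}
  \begin{algorithmic}[1]          % the [1] numbers every line
    \REQUIRE integers $a$, $b$
    \WHILE{$b \neq 0$}
      \STATE $(a,b) \gets (b, a \bmod b)$
    \ENDWHILE
    \RETURN $a$
  \end{algorithmic}
\end{algorithm}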
Packages in the algorithmicx bundle are similar both in concept and output form
to algorithmic but additionally provide support for adding new keywords and altering
the formatting. It provides the algpseudocode package which is (almost) a drop-in
replacement for algorithmic. Another package in the bundle, algpascal, uses Pascal-
like keywords, indents differently from algpseudocode, and puts command arguments
in maths mode instead of text mode. There is no floating environment but algorithmicx,
like algorithmic, is compatible with the algorithm package.
The alg package, like algorithms, offers a floating algorithm environment with all
of the ensuing niceties. alg, however, can caption its floats in a variety of (natural)
languages. In addition, alg unlike algorithms, makes it easy to add new constructs.
The newalg package has a somewhat similar interface to algorithms, but its output
is designed to mimic the rather pleasant typesetting used in the book “Introduction to
Algorithms” by Cormen, Leiserson, Rivest and Stein. Unfortunately, newalg does not
support a floating environment or any customisation of the output.
“Bona fide” use of the style of “Introduction to Algorithms” may be achieved with
Cormen’s own clrscode: this is the package as used in the second edition of the book.
Similarly, the style of “Combinatorial Algorithms: Generation, Enumeration and
Search” is supported by the pseudocode package, written by the authors of the book. It
has the common ‘Pascal-like’ style, and has some interesting constructs for what one
thinks of as Pascal blocks.
The algorithm2e package is of very long standing, and is widely used and recommended. It
loads the float package to provide the option of floating algorithm descriptions, but you
can always use the “H” option of float to have the algorithm appear “where you write
it”.
The usage of the program package is a little different from that of the other pack-
ages. It typesets programs in maths mode instead of text mode; and linebreaks are
significant. program lacks a floating environment but does number algorithms like alg
and algorithms. Customisation and extension are not supported. Documentation of the
program package (such as it is) appears in a file program.msg in the distribution.
None of the above are perfect. The factors that should influence your choice of
package include the output style you prefer, how much you need to extend or modify
the set of keywords, and whether you require algorithms to float like figures and tables.
algorithm2e.sty : macros/latex/contrib/algorithm2e
algorithmicx bundle: macros/latex/contrib/algorithmicx
algorithms bundle: macros/latex/contrib/algorithms
alg.sty : macros/latex/contrib/alg
clrscode.sty : macros/latex/contrib/clrscode
float.sty : macros/latex/contrib/float
newalg.sty : macros/latex/contrib/newalg
program.sty : macros/latex/contrib/program
pseudocode.sty : macros/latex/contrib/pseudocode

207 Generating an index in (La)TeX


Making an index is not trivial; what to index, and how to index it, is difficult to decide,
and uniform implementation is difficult to achieve. You will need to mark all items to
be indexed in your text (typically with \index commands).
It is not practical to sort a large index within TeX, so a post-processing program is
used to sort the output of one TeX run, to be included into the document at the next
run.
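The basic round trip, using LaTeX’s standard makeidx package and the makeindex
program, looks like this (a sketch):
\usepackage{makeidx}
\makeindex                      % in the preamble
...
Some remarks about fonts\index{fonts} ...
...
\printindex                     % where the index is to appear
After a first LaTeX run has written the raw .idx file, run the makeindex program on the
job name (for example, makeindex myfile), and then run LaTeX again to pick up the
sorted .ind file.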
The following programs are available:
makeindex Comes with most distributions — a good workhorse, but is not well-
arranged to deal with other sort orders than the canonical ASCII ordering.
The makeindex documentation is a good source of information on how to create
your own index. Makeindex can be used with some TeX macro packages other
than LaTeX, such as Eplain, and TeXsis (whose macros can be used inde-
pendently with Plain TeX).
idxtex for LaTeX under VMS, which comes with a glossary-maker called glotex.
texindex A witty little shell/sed-script-based utility for LaTeX under Unix.
The Texinfo system also uses a program texindex, whose source is to be found
in the texinfo distribution. The ltxindex package provides macros to allow LaTeX
users to use this texindex.
xindy arose from frustration at the difficulty of making a multi-language version of
makeindex. It is designed to be a successor to makeindex, by a team that in-
cluded the then-current maintainer of makeindex. It successfully addresses many
of makeindex’s shortcomings, including difficulties with collation order in differ-
ent languages, and it is highly flexible. Sadly, its take-up is proving rather slow.
idxtex : indexing/glo+idxtex
ltxindex.sty : macros/latex/contrib/ltxindex
makeindex : indexing/makeindex
makeindex (Macintosh): systems/mac/macmakeindex2.12.sea.hqx
texindex (the script): indexing/texindex
texindex (the program): Distributed with macros/texinfo/texinfo
texsis (system): macros/texsis
texsis (makeindex support): macros/texsis/index/index.tex
xindy : support/xindy

208 Typesetting URLs


URLs tend to be very long, and contain characters that would naturally prevent them
being hyphenated even if they weren’t typically set in \ttfamily, verbatim. Therefore,
without special treatment, they often produce wildly overfull \hboxes, and their typeset
representation is awful.
There are three packages that help solve this problem:
• The path package, which defines a \path command. The command defines each
potential break character as a \discretionary, and offers the user the opportunity
of specifying a personal list of potential break characters. Its chief disadvantage
is fragility in LaTeX moving arguments. The Eplain macros — define a similar
\path command.
Path, though it works in simple situations, makes no attempt to work with LaTeX
(it is irremediably fragile). Despite its long and honourable history, it is no longer
recommended for LaTeX use.
• The url package, which defines an \url command (among others, including its
own \path command). The command gives each potential break character a
maths-mode ‘personality’, and then sets the URL itself (in the user’s choice of
font) in maths mode. It can produce (LaTeX-style) ‘robust’ commands (use of
\protect) for use within moving arguments. Note that, because the operation
is conducted in maths mode, spaces within the URL argument are ignored unless
special steps are taken.
It is possible to use the url package in Plain TeX, with the assistance of the miniltx
package (which was originally developed for using the LaTeX graphics package
in Plain TeX). A small patch is also necessary: the required sequence is therefore:
\input miniltx
\expandafter\def\expandafter\+\expandafter{\+}
\input url.sty
• The hyperref package, which uses the typesetting code of url, in a context where
the typeset text forms the anchor of a link.
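A minimal sketch of the url package in use (the URL here is an arbitrary example):
\usepackage{url}
\urlstyle{rm}      % set URLs in roman rather than the default \ttfamily
...
Details are to be found at
\url{http://www.example.org/some/very/long/path/to/a/page.html}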

The author of this answer prefers the (rather newer) url package (directly or indi-
rectly); both path and url work well with Plain TeX (though of course, the fancy LaTeX
facilities of url don’t have much place there). (hyperref isn’t available in a version for
use with Plain TeX.)
Documentation of both path and url is in the package files.
hyperref.sty : macros/latex/contrib/hyperref
miniltx.tex : Distributed as part of macros/plain/graphics
path.sty : macros/latex/contrib/misc/path.sty
url.sty : macros/latex/contrib/misc/url.sty

209 Typesetting music in TeX


In the early days, a simple music package called MuTeX was written by Angelika
Schofer and Andrea Steinbach, which demonstrated that music typesetting was possi-
ble; the package was very limited, and is hardly ever used nowadays. Daniel Taupin
took up the baton, and developed MusicTeX, which allows the typesetting of poly-
phonic and other multiple-stave music; MusicTeX remains available, but is most defi-
nitely no longer recommended.
MusicTeX has been superseded by its successor MusiXTeX, which is a three-pass
system (with a processor program that computes values for the element spacing in the
music), and achieves finer control than is possible in the unmodified TeX-based mech-
anism that MusicTeX uses. Daniel Taupin’s is the only version of MusiXTeX currently
being developed (the original author, Andreas Egler, had an alternative version, but he
is now working on a different package altogether).
Input to MusixTeX is extremely tricky stuff, and Don Simons’ preprocessor pmx
is the preferred method of creating input for Taupin’s version. Pmx greatly eases use
of MusixTeX, but it doesn’t support the full range of MusixTeX’s facilities directly;
however, it does allow in-line MusixTeX code in pmx sources.
Dirk Laurie’s M-Tx allows preparation of music with lyrics; it operates “on top of”
pmx.
Another simple notation is supported by abc2mtex; this is a package designed to
notate tunes stored in an ASCII format (abc notation). It was designed primarily for
folk and traditional tunes of Western European origin (such as Irish, English and Scot-
tish) which can be written on one stave in standard classical notation, and creates input
intended for MusicTeX. However, it should be extendable to many other types of mu-
sic.
Digital music fans can typeset notation for their efforts by using midi2tex, which
translates MIDI data files into MusicTeX source code.
There is a mailing list ([email protected]) for discus-
sion of typesetting music in TeX; it mostly covers MusixTeX and related systems.
To subscribe, use http://icking-music-archive.org/mailman/listinfo/tex-music/
An alternative (free) means of embedding music examples into (La)TeX documents
is Lilypond. Lilypond is (at heart) a batch music typesetting system with plain text
input that does most of its work without TeX. Lilypond’s input syntax is far less cryptic
than is MusixTeX’s, and it handles much more stuff automatically, yielding the same
or better quality with less effort. Lilypond can also produce basic MIDI output.
abc2mtex : support/abc2mtex
M-Tx : support/mtx
midi2tex : support/midi2tex
musictex : macros/musictex
musixtex (Taupin’s version): macros/musixtex/taupin
musixtex (Egler’s version): macros/musixtex/egler
mutex : macros/mtex
pmx : support/pmx

210 Zero paragraph indent
The conventional way of typesetting running text has no separation between para-
graphs, and the first line of each paragraph in a block of text indented.
In contrast, one common convention for typewritten text was to have no inden-
tation of paragraphs; such a style is often required for “brutalist” publications such
as technical manuals, and in styles that hanker after typewritten manuscripts, such as
officially-specified dissertation formats.
Anyone can see, after no more than a moment’s thought, that if the paragraph indent
is zero, the paragraphs must be separated by blank space: otherwise it is sometimes
going to be impossible to see the breaks between paragraphs.
The simple-minded approach to zero paragraph indentation is thus:
\setlength{\parindent}{0pt}
\setlength{\parskip}{\baselineskip}
and in the very simplest text, it’s a fine solution.
However, the non-zero \parskip interferes with lists and the like, and the result
looks pretty awful. The parskip package patches things up to look reasonable; it’s not
perfect, but it deals with most problems.
The Netherlands Users’ Group’s set of classes includes an article equivalent
(artikel3) and a report equivalent (rapport3) whose design incorporates zero para-
graph indent and non-zero paragraph skip.
NTG classes: macros/latex/contrib/ntgclass
parskip.sty : macros/latex/contrib/misc/parskip.sty

211 Big letters at the start of a paragraph


A common style of typesetting, now seldom seen except in newspapers, is to start a
paragraph (in books, usually the first of a chapter) with its first letter set large enough
to span several lines.
This style is known as “dropped capitals”, or (in French) “lettrines”, and TeX’s
primitive facilities for hanging indentation make its (simple) implementation pretty
straightforward.
The dropping package does the job simply, but has a curious attitude to the cal-
culation of the size of the font to be used for the big letters. Examples appear in the
package documentation, so before you process the .dtx, the package itself must al-
ready be installed. Unfortunately, dropping has an intimate relation to the set of device
drivers available in an early version of the LaTeX graphics package, and it cannot be
trusted to work with recent offerings like PDFTeX, VTeX or DVIpdfm.
On such occasions, the more recent lettrine package is more likely to succeed. It
has a well-constructed array of options, and the examples (a pretty impressive set)
come as a separate file in the distribution (also available in PostScript, so that they can
be viewed without installing the package itself).
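A simple lettrine example (the number of lines spanned is a matter of choice):
\usepackage{lettrine}
...
\lettrine[lines=3]{O}{nce} upon a time, the rest of the first
paragraph continues in the ordinary body font\ldots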
dropping : macros/latex/contrib/dropping
lettrine: macros/latex/contrib/lettrine

212 The comma as a decimal separator


If you use a comma in maths mode, you get a small space after it; this space is inap-
propriate if the comma is being used as a decimal separator. An easy solution to this
problem, in maths mode, is to type 3{,}14 instead of typing 3,14. However, if your
language’s typographic rules require the comma as a decimal separator, such usage can
rapidly become extremely tiresome. There are two packages that can help relieve the
tedium: icomma and ziffer.
Icomma ensures that there will be no extra space after a comma, unless you type a
space after it (as in f(x, y), for instance), in which case the usual small space after
the comma appears. Ziffer is specifically targeted at the needs of those typesetting
German, but covers the present need, as well as providing the double-minus sign used
in German (and other languages) for the empty ‘cents’ part of an amount of currency.
icomma.sty : Distributed as part of macros/latex/contrib/was
ziffer.sty : macros/latex/contrib/misc/ziffer.sty

213 Breaking boxes of text
(La)TeX boxes may not be broken, in ordinary usage: once you’ve typeset something
into a box, it will stay there, and the box will jut out beyond the side or the bottom of
the page if it doesn’t fit in the typeset area.
If you want a substantial portion of your text to be framed (or coloured), the restric-
tion starts to seem a real imposition. Fortunately, there are ways around the problem.
The framed package provides framed and shaded environments; both put their
content into something which looks like a framed (or coloured) box, but which breaks
as necessary at page end. The environments “lose” footnotes, marginpars and head-
line entries, and will not work with multicol or other column-balancing macros. The
memoir class includes the functionality of the framed package.
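For example (a sketch; the grey level is an arbitrary choice), a page-breakable shaded
block might be coded:
\usepackage{framed}
\usepackage{color}
\definecolor{shadecolor}{gray}{0.9}  % the colour the shaded environment uses
...
\begin{shaded}
  A longish passage with a grey background, free to break
  over a page boundary\ldots
\end{shaded}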
The boites package provides a breakbox environment; examples of its use may be
found in the distribution, and the package’s README file contains terse documentation.
The environments may be nested, and may appear inside multicols environments;
however, floats, footnotes and marginpars will be lost.
For Plain TeX users, the facilities of the backgrnd package may be useful; this pack-
age subverts the output routine to provide vertical bars to mark text, and the macros are
clearly marked to show where coloured backgrounds may be introduced (this requires
shade, which is distributed as tex macros and device-independent MetaFont for the
shading). The author of backgrnd claims that the package works with LaTeX 2.09, but
there are reasons to suspect that it may be unstable working with current LaTeX.
backgrnd.tex : macros/generic/misc/backgrnd.tex
boites.sty : macros/latex/contrib/boites
framed.sty : macros/latex/contrib/misc/framed.sty
memoir.cls: macros/latex/contrib/memoir
shade.tex : macros/generic/shade

214 Overstriking characters


This may be used, for example, to indicate text deleted in the course of editing. Both
the ulem and the soul packages provide the necessary facilities.
Overstriking for cancellation in maths expressions is achieved by a different mech-
anism.
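For example, with ulem (the normalem option stops the package redefining \emph to
underline):
\usepackage[normalem]{ulem}
...
The committee \sout{rejected} accepted the proposal.
The corresponding soul command is \st.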
Documentation of ulem is in the package file.
soul.sty : macros/latex/contrib/soul
ulem.sty : macros/latex/contrib/misc/ulem.sty

215 Realistic quotes for verbatim listings


The cmtt font has “curly” quotes (‘thus’), which are pleasing on the eye, but don’t
really tally with what one sees on a modern xterm (which look like `this').
The appearance of these quotes is critical in program listings, particularly in
those of Unix-like shell scripts. The upquote package modifies the behaviour of the
verbatim environment so that the output is a clearer representation of what the user
must type.
upquote.sty : macros/latex/contrib/upquote/upquote.sty

216 Printing the time


TeX has a primitive register that contains “the number of minutes since midnight”; with
this knowledge it’s a moderately simple programming job to print the time (one that no
self-respecting Plain TeX user would bother with anyone else’s code for).
However, LaTeX provides no primitive for “time”, so the non-programming LaTeX
user needs help.
Two packages are available, both providing ranges of ways of printing the date, as
well as of the time: this question will concentrate on the time-printing capabilities, and
interested users can investigate the documentation for details about dates.
The datetime package defines three time-printing functions: \xxivtime (for 24-hour
time), \ampmtime (for 12-hour time) and \oclock (for time-as-words, albeit a slightly
eccentric set of words).
The scrtime package (part of the compendious KOMA-Script bundle) takes a
package option (12h or 24h) to specify how times are to be printed. The command
\thistime then prints the time appropriately (though there’s no am or pm in 12h
mode). The \thistime command also takes an optional argument, the character to
separate the hours and minutes: the default is of course :.
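Minimal sketches of the two (one would normally load only one of them):
\usepackage{datetime}
...
Compiled at \xxivtime\ on \today.

\usepackage[12h]{scrtime}
...
Compiled at \thistime\ today.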
datetime.sty : macros/latex/contrib/datetime
scrtime.sty : Distributed as part of macros/latex/contrib/koma-script

217 Redefining counters’ \the-commands


Whenever you request a new LaTeX counter, LaTeX creates a bunch of behind-the-
scenes commands, as well as defining the counter itself.
Among other things, \newcounter{fred} creates a command \thefred , which
expands to “the value of fred ” when you’re typesetting.
The definition of \thefred should express the value of the counter: it is almost
always a mistake to use the command to produce anything else. The value may
reasonably be expressed as an arabic, a roman or a greek number, as an alphabetic
expression, or even as a sequence (or pattern of) symbols. If you need a decision
process on whether to re-define \thefred , consider what might happen when you do
so.
So, for example, if you want your section numbers to be terminated by a period,
you could make \thesection expand with a terminating period. However, such a
change to \thesection makes the definition of \thesubsection look distinctly odd:
you are going to find yourself redefining things left, right and centre. Rather, use the
standard techniques for adjusting the presentation of section numbers.
Or, suppose you want the page number to appear at the bottom of each page sur-
rounded by dashes (“--~nnn~--”). Would you want to achieve this by redefining
\thepage, given the likely appearance of the table of contents with the dashes at-
tached to every page number, or of the modified \pageref references? In this case, the
modification is best done by redefining the page style itself, perhaps using the fancyhdr package.
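To illustrate the distinction (a sketch, not part of the original answer):
% reasonable: change the representation of the counter's value
\renewcommand{\thechapter}{\Roman{chapter}}
% questionable: adding punctuation here spreads to \thesection,
% cross-references, the table of contents, and so on
% \renewcommand{\thechapter}{\arabic{chapter}.}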

P.6 Tables of contents and indexes


218 The format of the Table of Contents, etc.
The formats of entries in the table of contents (TOC) are controlled by a number of
internal commands (discussed in section 2.3 of The LaTeX Companion). The com-
mands \@pnumwidth, \@tocrmarg and \@dotsep control the space for page numbers,
the indentation of the right-hand margin, and the separation of the dots in the dotted
leaders, respectively. The series of commands named \l@xxx , where xxx is the name
of a sectional heading (such as chapter or section, . . . ) control the layout of the
corresponding heading, including the space for section numbers. All these internal
commands may be individually redefined to give the effect that you want.
Alternatively, the package tocloft provides a set of user-level commands that may
be used to change the TOC formatting. Since exactly the same mechanisms are used
for the List of Figures and List of Tables, the layout of these sections may be controlled
in the same way.
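A small sketch of tocloft adjustments (the particular choices are arbitrary):
\usepackage{tocloft}
\renewcommand{\cftsecfont}{\bfseries}       % section titles in bold
\renewcommand{\cftsecpagefont}{\bfseries}   % and matching page numbers
\setlength{\cftbeforesecskip}{2pt}          % tighten the vertical spacing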
The KOMA-Script classes provide an optional variant structure for the table of
contents, and calculates the space needed for the numbers automatically. The memoir
class includes the functionality of tocloft.
KOMA script bundle: macros/latex/contrib/koma-script
memoir.cls: macros/latex/contrib/memoir
tocloft.sty : macros/latex/contrib/tocloft

219 Unnumbered sections in the Table of Contents


The way the relevant parts of sectioning commands work is exemplified by the way the
\chapter command uses the counter secnumdepth (described in Appendix C of the
LaTeX manual):
1. put something in the .aux file, which will appear in the .toc;
2. if secnumdepth ≥ 0, increase the counter for the chapter and write it out.
3. write the chapter title.
Other sectioning commands are similar, but with other values used in the test.
So a simple way to get headings of funny ‘sections’ such as prefaces in the table of
contents is to use the counter:
\setcounter{secnumdepth}{-1}
\chapter{Preface}

Unfortunately, you have to set secnumdepth back to its usual value (which is 2 in the
standard styles) before you do any ‘section’ which you want to be numbered.
Similar settings are made, automatically, in the LaTeX book class by the \frontmatter
and \backmatter commands.
The value of the counter tocdepth controls which headings will be finally printed
in the table of contents. This normally has to be set in the preamble and is a constant for
the document. The tocvsec2 package provides a convenient interface to allow
you to change the secnumdepth and/or the tocdepth counter values at any point in the
body of the document; this provides convenient independent controls over the sectional
numbering and the table of contents.
The package abstract (see one-column abstracts) includes an option to add the
abstract to the table of contents, while the package tocbibind has options to include
the table of contents itself, the bibliography, index, etc., to the table of contents.
The KOMA-Script classes have commands \addchap and \addsec, which work
like \chapter and \section but aren’t numbered. The memoir class incorporates the
facilities of all three of the abstract, tocbibind and tocvsec2 packages.
abstract.sty : macros/latex/contrib/abstract
KOMA script bundle: macros/latex/contrib/koma-script
memoir.cls: macros/latex/contrib/memoir
tocbibind.sty : macros/latex/contrib/tocbibind
tocvsec2.sty : macros/latex/contrib/tocvsec2

220 Bibliography, index, etc., in TOC


The standard LaTeX classes (and many others) use \section* or \chapter* for auto-
generated parts of the document (the tables of contents, lists of figures and tables, the
bibliography and the index). As a result, these items aren’t numbered (which most
people don’t mind), and (more importantly) they don’t appear in the table of contents.
The correct solution (as always) is to have a class of your own that formats your
document according to your requirements. The macro to do the job (\addcontentsline)
is fairly simple, but there is always an issue of ensuring that the contents entry quotes
the correct page. Supposing that the document is chapter-based (class report or
book, for example), the text:

\bibliography{frooble}
\addcontentsline{toc}{chapter}{Bibliography}

will produce the wrong answer if the bibliography is more than one page long. Instead,
one should say:

\cleardoublepage
\addcontentsline{toc}{chapter}{Bibliography}
\bibliography{frooble}

(Note that \cleardoublepage does the right thing, even if your document is single-
sided — in that case, it’s a synonym for \clearpage). Ensuring that the entry refers to
the right place is trickier still in a \section-based class.
The common solution, therefore, is to use the tocbibind package, which provides
many facilities to control the way these entries appear in the table of contents.
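Typical usage is just a package load with options (a sketch; the documentation lists the
full set):
\usepackage[nottoc,notlof,notlot]{tocbibind}
% adds the bibliography and index to the TOC, but not the TOC
% itself, nor the lists of figures and tables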
Classes of the KOMA-script bundle provide this functionality as a set of class op-
tions; the memoir class includes tocbibind itself.
KOMA script bundle: macros/latex/contrib/koma-script
memoir.cls: macros/latex/contrib/memoir
tocbibind.sty : macros/latex/contrib/tocbibind

221 Table of contents, etc., per chapter
The common style, of a “small” table of contents for each part, chapter, or even section,
is supported by the minitoc package. The package also supports mini-lists of tables
and figures; but, as the documentation observes, mini-bibliographies are a different
problem — see bibliographies per chapter.
The package’s basic scheme is to generate a little .aux file for each chapter, and to
process that within the chapter. Simple usage would be:
\usepackage{minitoc}
...
\begin{document}
...
\dominitoc \tableofcontents
\dominilof \listoffigures
...
\chapter{blah blah}
\minitoc \mtcskip \minilof
...

though a lot of elaborations are possible (for example, you don’t need a \minitoc for
every chapter).
Babel doesn’t know about minitoc, but minitoc makes provision for other document
languages than English — a wide variety is available. Fortunately, the current version
of the hyperref package does know about minitoc and treats \minitoc tables in the
same way as “real” tables of contents.
babel.sty : macros/latex/required/babel
hyperref.sty : macros/latex/contrib/hyperref
minitoc.sty : macros/latex/contrib/minitoc

222 Multiple indexes


LaTeX’s standard indexing capabilities (those provided by the makeidx package) only
provide for one index in your document; even quite modest documents can be improved
by indexes for separate topics.
The multind package provides simple and straightforward multiple indexing. You
tag each \makeindex, \index and \printindex command with a file name, and in-
dexing commands are written to (or read from) the name with the appropriate (.idx or
.ind) extension appended. The \printindex command is modified from the LaTeX
standard so that it doesn’t create its own chapter or section heading; you therefore de-
cide what names (or sectioning level, even) to use for the indexes, and \indexname is
completely ignored.
To create a “general” and an “authors” index, one might write:
\usepackage{multind}
\makeindex{general}
\makeindex{authors}
...
\index{authors}{Another Idiot}
...
\index{general}{FAQs}
...
\printindex{general}{General index}
\printindex{authors}{Author index}

To complete the job, run LaTeX on your file enough times that labels, etc., are stable,
and then execute the commands
makeindex general
makeindex authors

before running LaTeX again. Note that the names of the index files to process are not
necessarily related to the name of the LaTeX file you’re processing, at all. (There’s no
documentation that comes with the package: what you see above is as good as you will
get. . . )
The index package provides a comprehensive set of indexing facilities, including
a \newindex command that allows the definition of new styles of index. \newindex
takes a ‘tag’ (for use in indexing commands), replacements for the .idx and .ind file
extensions, and a title for the index when it’s finally printed; it can also change the item
that’s being indexed against (for example, one might have an index of artists referenced
by the figure number where their work is shown).
Using index, to create an author index together with a “normal” index, one would
start with preamble commands:
\usepackage{index}
\makeindex
\newindex{aut}{adx}{and}{Name Index}

which load the package, define a “main” (original-style) index, and then define an
author index. Then, in the body of the document, we might find commands like:
\index[aut]{Another Idiot}
...
\index{FAQs}

which place an entry in the author index, and then one in the main index. At the end
of the document, we have two commands:
\printindex
\printindex[aut]

which will print the main index and then the author index. Supposing this lot to be
in myfile.tex, after enough runs through LaTeX that labels are stable, execute the
following commands (Unix-style shell commands shown here, but the principle is the
same whatever system you’re using):
makeindex myfile
makeindex myfile.adx -o myfile.and

and rerun LaTeX. The makeindex commands process myfile.idx to myfile.ind (the
default action), and then myfile.adx to myfile.and, the two files needed as input by
the two \printindex commands in myfile.tex.
The splitidx package can operate in the same way as the others: load the package
with the split option, and declare each index with a \newindex command:
\newindex[<index name>]{<shortcut>}

and splitidx will generate a file \jobname.<shortcut> to receive index entries gener-
ated by commands like \sindex[<shortcut>]{<item>}. As with the other packages,
this method is limited by TeX’s total number of output files. However, splitindex also
comes with a small executable splitindex (available for a variety of operating systems);
if you use this auxiliary program (and don’t use split), there’s no limit to the number
of indexes. Apart from this trick, splitidx supports the same sorts of things as does
index. An example of use appears in the documentation.
The memoir class has its own multiple-index functionality (as well as index layout
options, which other packages delegate to the index style used by makeindex).
index.sty : macros/latex/contrib/index
makeidx.sty : Part of the LaTeX distribution
memoir.cls: macros/latex/contrib/memoir
multind.sty : macros/latex209/contrib/misc/multind.sty
splitidx.sty and splitindex : macros/latex/contrib/splitindex

P.7 Labels and references


223 Referring to things by their name
LaTeX’s labelling mechanism is designed for the impersonal world of the academic
publication, in which everything has a number: an extension is necessary if we are
to record the name of things we’ve labelled. The two packages available extend the
LaTeX sectioning commands to provide reference by the name of the section.
The titleref package is a simple extension which provides the command \titleref;
it is a stand-alone package — don’t use it in a document in which you also need to use
hyperref .
The byname package is part of the smartref bundle and works well with smartref ,
and works (to an extent) with hyperref , but the links it defines are not hyperlinks.
The memoir class incorporates the functionality of titleref , but doesn’t work with
byname (though a search of comp.text.tex on groups.google.com will find a patch
to byname to remedy the problem).
The hyperref bundle includes a package nameref , which will work standing alone
(i.e., without hyperref : of course, in this mode its references are not hyperlinked).
If you load hyperref itself, nameref is automatically loaded. Memoir requires the
memhfixc package when running with hyperref; however, following the sequence

\documentclass[...]{memoir}
...
\usepackage[...]{hyperref}
\usepackage{memhfixc}

nameref commands may be used in a memoir document.


Each of the name-reference packages defines a reference command with the same
name as the package: \titleref, \byname and \nameref. The byname package also
defines a command \byshortnameref, which uses the optional ‘short’ title argument
to the chapter and section commands.
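For example, a minimal sketch of nameref in use (the label name is invented):

\usepackage{nameref}
...
\section{Conclusions}
\label{sec-conc}
...
as we saw in ``\nameref{sec-conc}''...

Here \nameref{sec-conc} prints the section title, “Conclusions”, rather than its number.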
byname.sty : Distributed with macros/latex/contrib/smartref
hyperref.sty : macros/latex/contrib/hyperref
memoir.cls: macros/latex/contrib/memoir
nameref.sty : Distributed with macros/latex/contrib/hyperref
smartref.sty : macros/latex/contrib/smartref
titleref.sty : macros/latex/contrib/misc/titleref.sty

224 Referring to labels in other documents


When producing a set of inter-related documents, you’ll often want to refer to labels in
another document of the set; but LaTeX, of its own accord, doesn’t permit this.
So the package xr was written: the commands

\usepackage{xr}
\externaldocument{volume1}

will load all the references from volume1 into your present document.
But what if the documents both have a section labelled “introduction” (likely
enough, after all)? The package provides a means to transform all the imported labels,
so you don’t have to change label names in either document. For example:

\usepackage{xr}
\externaldocument[V1-]{volume1}

loads the references from volume1, but prefixes every one with the string V1-. So you
would refer to the introduction to volume 1 as:

\usepackage{xr}
\externaldocument[V1-]{volume1}
...
... the introduction to volume1 (\ref{V1-introduction})...

To have the facilities of xr working with hyperref , you need xr-hyper. For simple
hyper-cross-referencing (i.e., to a local PDF file you’ve just compiled), write:

\usepackage{xr-hyper}
\usepackage{hyperref}
\externaldocument[V1-]{volume1}
...
... the \nameref{V1-introduction}...

and the name reference will appear as an active link to the “introduction” chapter of
volume1.pdf.
To link to a PDF document on the Web, for which you happen to have the .aux file,
write:

\usepackage{xr-hyper}
\usepackage{hyperref}
\externaldocument[V1-]{volume1}[http://mybook.com/volume1.pdf]
...
... the \nameref{V1-introduction}...

xr.sty : Distributed as part of macros/latex/required/tools


xr-hyper.sty : Distributed with macros/latex/contrib/hyperref

Q How do I do. . . ?
Q.1 Mathematics
225 Proof environment
It has long been thought impossible to make a proof environment which automatically
includes an ‘end-of-proof’ symbol. Some proofs end in displayed maths; others do not.
If the input file contains ...\] \end{proof} then LaTeX finishes off the displayed
maths and gets ready for a new line before it reads any instructions connected with
ending the proof, so the code is very tricky. You can insert the symbol by hand, but the
ntheorem package now solves the problem for LaTeX users: it does indeed provide an
automatic way of signalling the end of a proof.
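A sketch of such an ntheorem setup (using the package’s thmmarks option; the style, fonts and end mark chosen here are merely illustrative) might be:

\usepackage{amssymb}
\usepackage[thmmarks]{ntheorem}
\theoremstyle{nonumberplain}
\theoremheaderfont{\itshape}
\theorembodyfont{\upshape}
\theoremsymbol{\ensuremath{\square}}
\newtheorem{proof}{Proof}

after which each proof environment has its end mark placed automatically.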
The AMSLaTeX package amsthm also provides a proof environment that does the
job; though you need to insert a \qedhere command if the proof ends with a displayed
equation:

\begin{proof}
text...
\begin{equation*}
maths... \tag*{\qedhere}
\end{equation*}
\end{proof}

The \tag*{\qedhere} construction may be used in any of AMSLaTeX’s numbering
environments.
amsthm.sty : Distributed as part of the AMSLaTeX bundle macros/latex/
required/amslatex
ntheorem : macros/latex/contrib/ntheorem

226 Roman theorems


If you want to take advantage of the powerful \newtheorem command without the
constraint that the contents of the theorem is in a sloped font (for example, to use
it to create remarks, examples, proofs, . . . ) then you can use the AMSLaTeX amsthm
package (which now supersedes the theorem package previously recommended in these
answers). Alternatively, the following sets up an environment remark whose content
is in roman.
\newtheorem{preremark}{Remark}
\newenvironment{remark}%
{\begin{preremark}\upshape}{\end{preremark}}
The ntheorem package provides roman theorems directly.
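For example, a minimal ntheorem version of the remark environment above might be (a sketch; \theorembodyfont applies to theorems defined after it):

\usepackage{ntheorem}
\theorembodyfont{\upshape}
\newtheorem{remark}{Remark}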
amsthm.sty : Distributed as part of macros/latex/required/amslatex
ntheorem.sty : macros/latex/contrib/ntheorem
theorem.sty : Distributed as part of macros/latex/required/tools

227 Defining a new log-like function in LaTeX
Use the \mathop command, as in:

\newcommand{\diag}{\mathop{\mathrm{diag}}}

Subscripts and superscripts on \diag will be placed below and above the function
name, as they are on \lim. If you want your subscripts and superscripts always placed
to the right, do:

\newcommand{\diag}{\mathop{\mathrm{diag}}\nolimits}

AMSLaTeX (in its amsopn package, which is automatically loaded by amsmath)
provides a command \DeclareMathOperator that does the same job as the first
definition above. To create our original \diag command, one would say:

\DeclareMathOperator{\diag}{diag}

\DeclareMathOperator* declares the operator always to have its sub- and super-
scripts in the “\limits position”.
The amsopn command \operatorname allows you to introduce ad hoc opera-
tors into your mathematics; as with \DeclareMathOperator there’s a starred version
\operatorname* for sub- and superscripts in the limits position.
(It should be noted that “log-like” was reportedly a joke on Lamport’s part; it is of
course clear what was meant.)
amsopn.sty : In the AMSLaTeX distribution macros/latex/required/amslatex

228 Set specifications and Dirac brackets


One of the few glaring omissions from TeX’s mathematical typesetting capabilities is
a means of setting separators in the middle of mathematical expressions. TeX provides
primitives called \left and \right, which can be used to modify brackets (of what-
ever sort) around a mathematical expression, as in: \left( <expression> \right) —
the size of the parentheses is matched to the vertical extent of the expression.
However, in all sorts of mathematical enterprises one may find oneself needing a
\middle command, to be used in expressions like

\left\{ x \in \mathbb{N} \middle| x \mbox{ even} \right\}

to specify the set of even natural numbers. The e-TeX system defines just such a com-
mand, but users of Knuth’s original need some support. Donald Arseneau’s braket
package provides commands for set specifications (as above) and for Dirac brackets
(and bras and kets). The package uses the e-TeX built-in command if it finds itself
running under e-TeX.
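For example, a rough sketch of braket in use (inside the arguments of \Set and \Braket, “|” produces a vertical bar matched to the height of the expression):

\usepackage{amssymb}% for \mathbb
\usepackage{braket}
...
$\Set{ x \in \mathbb{N} | x \mbox{ even} }$
...
$\Braket{ \phi | \frac{\partial^2}{\partial t^2} | \psi }$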
braket.sty : macros/latex/contrib/misc/braket.sty

229 Cancelling terms in maths expressions


A technique used when explaining the behaviour of expressions or equations (often for
pedagogical purposes). The cancel package provides several variants of cancellation
marks (“\”, “/” and “X”), and a means of cancelling ‘to’ a particular value.
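For example, a small sketch of the package’s commands (which of the marks each command draws is described in the package documentation):

\usepackage{cancel}
...
\[ \cancel{abc} \quad
   \bcancel{abc} \quad
   \xcancel{abc} \quad
   \cancelto{0}{abc} \]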
Documentation of cancel is in the package file.
cancel.sty : macros/latex/contrib/misc/cancel.sty

230 Adjusting maths font sizes


In Plain TeX, when you introduce a new font size you must also declare what size
fonts are to be used in mathematics with it. This is done by declaring \textfont,
\scriptfont and \scriptscriptfont for the maths families you’re using; all such
things are described in chapter 17 of the TeXbook and in other books and tutorials that
discuss Plain TeX in sufficient detail.
In LaTeX, of course, all this stuff is automated: there is a scheme that, for each
(text) font size, determines what maths font sizes are to be used. The scheme first
checks a set of “known” text sizes, for each of which maths sizes are declared in ad-
vance. If the text size isn’t “known”, the script- and scriptscriptfont sizes are calculated
as fixed ratios of the text font size. (The values used are \defaultscriptratio=0.7,
and \defaultscriptscriptratio=0.5.)

The fixed-ratio formula is capable of producing inconvenient results (particularly
if you are using fonts which LaTeX believes are only available in a fixed set of sizes).
You may also want to replace LaTeX’s ideas altogether, for example by setting maths
noticeably larger or smaller than its surrounding text. For this purpose, the LaTeX
command \DeclareMathSizes{⟨tfs⟩}{⟨ts⟩}{⟨ss⟩}{⟨sss⟩} may be used (this is the
same command that LaTeX itself uses to define its own set of sizes). This establishes
(or re-establishes) the maths font sizes to be used when the surrounding text font size
is ⟨tfs⟩ (⟨ts⟩ being the size used for \textfont, ⟨ss⟩ for \scriptfont and ⟨sss⟩
for \scriptscriptfont).
For example, you might want to use a font with a smaller body height than Com-
puter Modern, but still prefer CM math to any of the alternatives. In this case, you
might use:

\DeclareMathSizes{10}{9}{7}{5}

to get 9pt maths when the surrounding body text is (nominal) 10pt.
\DeclareMathSizes may only be used in the preamble of the document: only one
association is available for each text font size for the whole document. The default
settings are specified in fontdef.dtx in the latex distribution, and are compiled into
fontmath.ltx; the arguments to the command are just numbers (‘pt’ is assumed), but
some of them are written using LaTeX abbreviations for standard font sizes. Beware
simply copying (parts of) the LaTeX definitions — since they contain those internal
abbreviations, they need to be treated as internal commands.
fontdef.dtx : macros/latex/base/fontdef.dtx
fontmath.ltx : macros/latex/unpacked/fontmath.ltx

231 Ellipses
Ellipses are commonly required, and LaTeX natively supplies a fair range (\dots,
\cdots, \vdots and \ddots). By using the graphics package, one can change the
slope of the \ddots command, as in
$ ... \reflectbox{$\ddots$} ... $
While this works, it is not a recommended way of achieving the desired result (see
below). Moreover, LaTeX’s range is not adequate to everyone’s requirements, and at
least three packages provide extensions to the set.
The AMSLaTeX bundle provides a range of “semantically-named” ellipses, for use
in different situations: \dotsb for use between pairs of binary operators, \dotsc for
use between pairs of commas, and so on.
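For example, with amsmath loaded, one writes:

$F(x_1, x_2, \dotsc, x_n)$ and $x_1 + x_2 + \dotsb + x_n$

the first giving dots on the baseline (as after a comma) and the second centred dots (as between binary operators).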
The yhmath package defines an \adots command, which is the analogue of
\ddots, sloping forwards rather than backwards. The yhmath package comes with a
rather interesting font that extends the standard cmex; details are in the documentation.
The disadvantage of this setup is that, although \adots is merely a macro, the package
tries to load its own font, and produces a “missing font” substitution warning message
if you haven’t installed the font.
The mathdots package (besides fixing up the behaviour of (La)TeX \ddots and
\vdots when the font size changes) provides an “inverse diagonal” ellipsis \iddots
(doing the same job as yhmath’s \adots, but better).
Documentation of yhmath appears, processed, in the distribution (thus saving you
the bother of installing the package before being able to read the documentation). Doc-
umentation of mathdots appears at the end of the package file itself.
amslatex : macros/latex/required/amslatex
graphics.sty : Part of the macros/latex/required/graphics bundle
mathdots.sty : macros/generic/mathdots
yhmath (fonts): fonts/yhmath
yhmath (macros): macros/latex/contrib/yhmath

232 Sub- and superscript positioning for operators


The commonest hand-written style for expressions is to place the limit expressions on
operators such as \sum and \int physically above and below the operator. In (La)TeX,
we write these limit expressions using sub- and superscripts applied to the operator, but
they don’t always appear in the “handwritten” way in TeX’s output.
The reason is, that when an expression appears in non-display maths, in running
text (and is therefore in TeX \textstyle), placing the limits thus could lead to ragged
line spacing (and hence difficult-to-read text). It is therefore common (in \textstyle)
to place the limits as one would sub- and superscripts of variables.
This is not universally satisfactory, so the primitive \limits is provided:
$\sum\limits_{n=1}^{m} ...$

which will place the limits right above and below the symbol (and be blowed to the
typography. . . ).
Contrariwise, you may wish to change the arrangement of the limits when in
\displaystyle. For this purpose, there’s a corresponding \nolimits:

\[\sum\nolimits_{n=1}^{m} ...\]

which will place the limits as they would be in \textstyle.


Alternatively, one can manipulate the \textstyle/\displaystyle state of the
mathematics. To get “\limits placement” in inline maths,

$\displaystyle\sum_{n=1}^{m} ...$

and for “\nolimits placement” in display maths:

\[\textstyle\sum_{n=1}^{m} ...\]

will serve. Either of these forms may have effects other than on the operator you’re
considering, but there are still those who prefer this formulation.
(Note that the macro \int normally has \nolimits built in to its definition. There
is an example in the TeXbook to show how odd \int\limits looks when typeset.)
233 Text inside maths
When we type maths in (La)TeX, the letters from which we make up ordinary text
assume a special significance: they all become single-letter variable names. The letters
appear in italics, but it’s not the same sort of italics that you see when you’re typing
ordinary text: a run of maths letters (for example “here”) looks oddly “lumpy” when
compared with the word written in italic text. The difference is that the italic text is
kerned to make the letters fit well together, whereas the maths is set to look as if you’re
multiplying h by e by r by e. The other way things are odd in TeX maths typing is that
spaces are ignored: at best we can write single words in this oddly lumpy font.
So, if we’re going to have good-looking text in amongst maths we’re writing, we
have to take special precautions. If you’re using LaTeX, the following should help.
The simplest is to use \mbox or \textrm:

$e = mc^2 \mbox{here we go again}$

The problem is that, with either, the size of the text remains firmly at the surrounding
text size, so that

$z = a_{\mbox{other end}}$

looks quite painfully wrong.


The other simple technique, \textrm, is more promising:
$z = a_{\textrm{other end}}$

is definitely right. However, the surrounding text may not be in your roman font; if you
care about matching text, you need to choose between \textrm, \textsf, and so on.
(The maths-mode instance of your roman font (\mathrm) gets the size right, but
since it’s intended for use in maths, its spaces get ignored — use \mathrm for upright
roman alphabetic variable names, but not otherwise.)
You can correct these problems with size selectors in the text, as:

$z = a_{\mbox{\scriptsize other end}}$

which works if your surrounding text is at default document size, but gives you the
wrong size otherwise.
These short cuts are (just about) OK for the “occasional” mathematician, but se-
rious mathematics calls for a technique that relieves the typist of the sort of thought
required. As usual, the AMSLaTeX system provides what’s necessary — the \text
command. The command is actually provided by the amstext package, but the “global”
amsmath package loads it, so anyone using AMSLaTeX proper has the command avail-
able, so even joke mathematicians can write:
\usepackage{amsmath}
...
$z = a_{\text{other end}}$

and the text will be at the right size, and in the same font as surrounding text.
AMSLaTeX also makes provision for interpolated comments in the middle of one
of its multi-line display structures, through the \intertext command. For example:
\begin{align}
A_1&=N_0(\lambda;\Omega')-\phi(\lambda;\Omega'),\\
A_2&=\phi(\lambda;\Omega')-\phi(\lambda;\Omega),\\
\intertext{and} A_3&=\mathcal{N}(\lambda;\omega).
\end{align}

places the text “and” on a separate line before the last line of the display. If the in-
terjected text is short, or the equations themselves are light-weight, you may find that
\intertext leaves too much space. Slightly more modest is the \shortintertext
command from the mathtools package:
\begin{align}
a =& b
\shortintertext{or}
c =& b
\end{align}

To have the text on the same line as the second equation, one can use the flalign
environment (from amsmath) with lots of dummy equations (represented by the double
& signs):

\begin{flalign}
&& a =& b && \\
\text{or} && c =& b &&
\end{flalign}

Comprehensive documentation of AMSLaTeX is to be found in amsldoc, in the
distribution; it is also available on the web at ftp://ftp.ams.org/pub/tex/doc/
amsmath/amsldoc.pdf
amsldoc.tex : Distributed as part of AMSLaTeX macros/latex/required/
amslatex
amsmath.sty : Distributed as part of AMSLaTeX macros/latex/required/
amslatex
amstext.sty : Distributed as part of AMSLaTeX macros/latex/required/
amslatex
mathtools.sty : Distributed as part of the mh bundle macros/latex/contrib/mh

234 Re-using an equation


To repeat an existing equation, one wants not only to have the same mathematics in it,
one also wants to re-use the original label it had. The amsmath package comes to our
help, here:
\usepackage{amsmath}
...
\begin{equation}
a=b
\label{eq1}
\end{equation}
...
Remember that
\begin{equation}
a=b
\tag{\ref{eq1}}
\end{equation}

Here, the second instance of a = b will be typeset with a copy, made by the \tag
command, of the label of the first instance.
Comprehensive documentation of AMSLaTeX is to be found in amsldoc, in the
distribution; it is also available on the web at ftp://ftp.ams.org/pub/tex/doc/
amsmath/amsldoc.pdf
amsldoc.tex : Distributed as part of AMSLaTeX macros/latex/required/
amslatex
amsmath.sty : Distributed as part of AMSLaTeX macros/latex/required/
amslatex

235 Line-breaking in in-line maths


TeX, by default, allows you to split a mathematical expression at the end of the line; it
allows breaks at relational operators (like “=”, “<”, etc.) and at binary operators (like
“+”, “-”, etc.). In the case of large expressions, this can sometimes be a life-saver.
However, in the case of simple expressions like a = b + c, a break can be really
disturbing to the reader, and one would like to avoid it.
Fortunately, these breaks are controllable: there are “penalties” associated with
each type of operator: the penalty says how undesirable a break at each point is. Default
values are:

\relpenalty = 500
\binoppenalty = 700

You make the break progressively less attractive by increasing these values. You can
actually forbid all breaks, everywhere, by:

\relpenalty = 10000
\binoppenalty = 10000

If you want just to prevent breaks in a single expression, write:

{%
\relpenalty = 10000
\binoppenalty = 10000
$a=b+c$
}

and the original values will remain undisturbed outside the braces. This is tedious:
there is often value in an alternative approach, in which you say which parts of the
expression may not break whatever happens, and fortunately this is surprisingly easy.
Suppose we want to defer a break until after the equality, we could write:

${a+b+c+d} = z+y+x+w$

The braces say “treat this subformula as one atom” and (in TeX at least) atoms don’t
get split: not a \binoppenalty change in sight.

Q.2 Lists
236 Fancy enumeration lists
The enumerate package allows you to control the display of the enumeration counter.
The package adds an optional parameter to the enumerate environment, which is used
to specify the layout of the labels. The layout parameter contains an enumeration type
(‘1’ for arabic numerals, ‘a’ or ‘A’ for alphabetic enumeration, and ‘i’ or ‘I’ for Roman
numerals), and things to act as decoration of the enumeration. So, for example

\usepackage{enumerate}
...
\begin{enumerate}[(a)]
\item ... ...
\end{enumerate}

starts a list whose labels run (a), (b), (c), . . . ; while

\usepackage{enumerate}
...
\begin{enumerate}[I/]
\item ... ...
\end{enumerate}

starts a list whose labels run I/, II/, III/, . . .


The paralist package, whose primary purpose is compaction of lists, provides the
same facilities for its enumerate-like environments.
If you need non-stereotyped designs, the enumitem package gives you most of the
flexibility you might want to design your own. The silly roman example above could
be achieved by:

\usepackage{enumitem}
...
\begin{enumerate}[label=\Roman{*}/]
\item ... ...
\end{enumerate}

Note that the ‘*’ in the key value stands for the list counter at this level. You can also
manipulate the format of references to list item labels:

\usepackage{enumitem}
...
\begin{enumerate}[label=\Roman{*}/, ref=(\roman{*})]
\item ... ...
\end{enumerate}

to make references to the list items format appear as (i), (ii), (iii), etc.
The memoir class includes functions that match those in the enumerate package,
and has similar functionality for itemize lists.
enumerate.sty : Distributed as part of macros/latex/required/tools
enumitem.sty : macros/latex/contrib/enumitem
memoir.cls: macros/latex/contrib/memoir
paralist.sty : macros/latex/contrib/paralist

237 How to reduce list spacing


Lamport’s book lists various parameters for the layout of list (things like \topsep,
\itemsep and \parsep), but fails to mention that they’re set automatically within the
list itself. This happens because each list executes a command \@list⟨depth⟩ (the
depth appearing as a lower-case roman numeral); what’s more, the top-level \@listi
is usually reset when the font size is changed. As a result, it’s rather tricky for the user
to control list spacing. Of course, the real answer is to use a document class designed
with more modest list spacing, but we all know such things are hard to come by. The
memoir class wasn’t designed for more compact lists per se, but offers the user control
over the list spacing.
There are packages that provide some control of list spacing, but they seldom ad-
dress the separation from surrounding text (defined by \topsep). The expdlist package,
among its many controls of the appearance of description lists, offers a compaction
parameter (see the documentation); the mdwlist package offers a \makecompactlist
command for users’ own list definitions, and uses it to define compact lists itemize*,
enumerate* and description*. In fact, you can write lists such as these commands
define pretty straightforwardly — for example:

\newenvironment{itemize*}%
{\begin{itemize}%
\setlength{\itemsep}{0pt}%
\setlength{\parskip}{0pt}}%
{\end{itemize}}

The paralist package provides several approaches to list compaction:


• its asparaenum environment formats each item as if it were a paragraph introduced
by the enumeration label (which saves space if the item texts are long);
• its compactenum environment is the same sort of compact list as is provided in
expdlist and mdwlist; and
• its inparaenum environment produces a list “in the paragraph”, i.e., with no line
break between items, which is a great space-saver if the list item texts are short.
The package will manipulate its enumerate environment labels just like the enumerate
package does.
Paralist also provides itemize equivalents (asparaitem, etc.), and description
equivalents (asparadesc, etc.).
The multenum package offers a more regular form of paralist’s inparaenum; you
define a notional grid on which list entries are to appear, and list items will always
appear at positions on that grid. The effect is somewhat like that of the ‘tab’ keys
on traditional typewriters; the package was designed for example sheets, or lists of
answers in the appendices of a book.
The ultimate in compaction (of every sort) is offered by the savetrees package;
compaction of lists is included. The package’s prime purpose is to save space at every
touch and turn: don’t use it if you’re under any design constraint whatever!
The expdlist, mdwlist and paralist packages all offer other facilities for list config-
uration: you should probably not try the “do-it-yourself” approaches outlined below if
you need one of the packages for some other list configuration purpose.
For ultimate flexibility (including manipulation of \topsep), the enumitem pack-
age permits adjustment of list parameters using a “key=⟨value⟩” format; so for exam-
ple, one might write
\usepackage{enumitem}
...
\begin{enumerate}[topsep=0pt, partopsep=0pt]
\item ... ...
\end{enumerate}

to suppress all spacing above and below your list. Enumitem also permits manipulation
of the label format in a more “basic” manner than the enumerate package does.
enumerate.sty : Distributed as part of macros/latex/required/tools
enumitem.sty : macros/latex/contrib/enumitem
expdlist.sty : macros/latex/contrib/expdlist
memoir.cls: macros/latex/contrib/memoir
mdwlist.sty : Distributed as part of macros/latex/contrib/mdwtools
multenum.sty : macros/latex/contrib/multenum
paralist.sty : macros/latex/contrib/paralist
savetrees.sty : macros/latex/contrib/savetrees

238 Interrupting enumerated lists


It’s often convenient to have commentary text, ‘outside’ the list, between successive
entries of a list. In the case of itemize lists this is no problem, since there’s never
anything to distinguish successive items, while in the case of description lists, the
item labels are under the user’s control so there’s no automatic issue of continuity.
For enumerate lists, the labels are generated automatically, and are context-
sensitive, so the context (in this case, the state of the enumeration counter) needs to be
preserved.
The belt-and-braces approach is to remember the state of the enumeration in your
own counter variable, and then restore it when restarting enumerate:
\newcounter{saveenum}
...
\begin{enumerate}
...
\setcounter{saveenum}{\value{enumi}}
\end{enumerate}
<Commentary text>
\begin{enumerate}
\setcounter{enumi}{\value{saveenum}}
...
\end{enumerate}

This is reasonable, in small doses. . . Problems (apart from sheer verbosity) are
getting the level right (“should I use counter enumi, enumii, . . . ”) and remembering
not to nest the interruptions (i.e., not to have a separate list, that is itself interrupted) in
the “commentary text”.
The mdwlist package defines commands \suspend and \resume that simplify the
process:

\begin{enumerate}
...
\suspend{enumerate}
<Commentary text>
\resume{enumerate}
...
\end{enumerate}

The package allows an optional name (as in \suspend[id]{enumerate}), so that you
can identify a particular suspension, and hence have a handle for manipulating nested
suspensions.
If you’re suspending a fancy-enumeration list, you need to re-supply the optional
“item label layout” parameters required by the enumerate package when resuming the
list, whether by the belt-and-braces approach, or by the mdwlist \resume{enumerate}
technique. The task is a little tedious in the mdwlist case, since the optional argument
has to be encapsulated, whole, inside an optional argument to \resume, which requires
use of extra braces:

\begin{enumerate}[\textbf{Item} i]
...
\suspend{enumerate}
<comment>
\resume{enumerate}[{[\textbf{Item} i]}]
...
\end{enumerate}

The enumitem package, in its most recent release, will allow you to resume lists, at one
level only:
\begin{enumerate}
...
\end{enumerate}
<comment>
\begin{enumerate}[resume]
...
\end{enumerate}

which feels just as “natural” as the mdwtools facility, and has the advantage of playing
well with the other excellent facilities of enumitem.
enumerate.sty : Distributed as part of macros/latex/required/tools
enumitem.sty : macros/latex/contrib/enumitem
mdwlist.sty : Distributed as part of macros/latex/contrib/mdwtools

Q.3 Tables, figures and diagrams
239 The design of tables
In recent years, several authors have argued that the examples, set out by Lamport in
his LaTeX manual, have cramped authors’ style and have led to extremely poor table
design. It is in fact difficult even to work out what many of the examples in Lamport’s
book “mean”.
The criticism focuses on the excessive use of rules (both horizontal and vertical)
and on the poor vertical spacing that Lamport’s macros offer.
The problem of vertical spacing is plain for all to see, and is addressed in several
packages — see “spacing of lines in tables”.
The argument about rules is presented in the excellent essay that prefaces the doc-
umentation of Simon Fear’s booktabs package, which (of course) implements Fear’s
scheme for ‘comfortable’ rules. (The same rule commands are implemented in the
memoir class.)
Lamport’s LaTeX was also inflexibly wrong in “insisting” that captions should
come at the bottom of a table. Since a table may extend over several pages, traditional
typography places the caption at the top of a table float. The \caption command will
get its position wrong (by 10pt) if you simply write:

\begin{table}
\caption{Example table}
\begin{tabular}{...}
...
\end{tabular}
\end{table}

The topcapt package solves this problem:

\usepackage{topcapt}
...
\begin{table}
\topcaption{Example table}
\begin{tabular}{...}
...
\end{tabular}
\end{table}

The KOMA-script classes provide a similar command \captionabove; they also
have a class option tablecaptionabove which arranges that \caption means
\captionabove in table environments. The caption package (from the release of
23 January 2004) may be loaded with an option that has the same effect:

\usepackage[tableposition=top]{caption}

Doing the job yourself is pretty easy: topcapt switches the values of the LaTeX 2ε
parameters \abovecaptionskip (default value 10pt) and \belowcaptionskip (de-
fault value 0pt), so:

\begin{table}
\setlength{\abovecaptionskip}{0pt}
\setlength{\belowcaptionskip}{10pt}
\caption{Example table}
\begin{tabular}{...}
...
\end{tabular}
\end{table}

does the job. (The package itself is very slightly more elaborate. . . )
booktabs.sty : macros/latex/contrib/booktabs
KOMA script bundle: macros/latex/contrib/koma-script
memoir.cls: macros/latex/contrib/memoir
topcapt.sty : macros/latex/contrib/misc/topcapt.sty

240 Fixed-width tables
There are two basic techniques for making fixed-width tables in LaTeX: you can make
the gaps between the columns stretch, or you can stretch particular cells in the table.
Basic LaTeX can make the gaps stretch: the tabular* environment takes an extra
argument (before the clpr layout one) which takes a length specification: you can say
things like “15cm” or “\columnwidth” here. You must also have an \extracolsep
command in the clpr layout argument, inside an @{} directive. So, for example, one
might have
\begin{tabular*}{\columnwidth}{@{\extracolsep{\fill}}lllr}
The \extracolsep applies to all inter-column gaps to its right as well; if you don’t
want all gaps stretched, add \extracolsep{0pt} to cancel the original.
The tabularx package defines an extra clpr column specification, X; X columns
behave as p columns which expand to fill the space available. If there’s more than one
X column in a table, the spare space is distributed between them.
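A typical (sketch) use of tabularx might therefore be:

\usepackage{tabularx}
...
\begin{tabularx}{\columnwidth}{lXX}
  label &
  a long entry that wraps within its share of the space &
  another long entry that shares the remaining space \\
\end{tabularx}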
The tabulary package (by the same author) provides a way of “balancing” the space
taken by the columns of a table. The package defines column specifications C, L, R and
J, giving, respectively, centred, left, right and fully-justified versions of space-sharing
columns. The package examines how long each column would be “naturally” (i.e., on
a piece of paper of unlimited width), and allocates space to each column accordingly.
There are “sanity checks” so that really large entries don’t cause everything else to
collapse into nothingness (there’s a “maximum width” that any column can exert), and
so that tiny entries can’t get smaller than a specified minimum. Of course, all this
work means that the package has to typeset each row several times, so things that leave
“side-effects” (for example, a counter used to produce a row-number somewhere) are
inevitably unreliable, and should not even be tried.
The ltxtable package combines the features of the longtable and tabularx packages: it’s
important to read the documentation, since usage is distinctly odd.
ltxtable.sty : Distributed as part of macros/latex/contrib/carlisle
tabularx.sty : Distributed as part of macros/latex/required/tools
tabulary.sty : macros/latex/contrib/tabulary

241 Variable-width columns in tables


This is a slightly different take on the problem addressed in “fixed-width tables” —
here we have a column whose size we can’t absolutely predict when we design the
document.
While the basic techniques (the tabularx, tabulary and ltxtable packages) are the
same for this problem as for the fixed-width table problem, there’s one extra tool that
we can call to our aid, which may be preferable in some situations.
Suppose we have data in one column which we read from an external source, and
the source itself isn’t entirely predictable. The data in the column may end up pretty
narrow in every row of the table, or it may be wide enough that the table would run over
the edge of the page; however, we don’t want to make the column as wide as possible
“just in case”, by defining a fixed size for the table. We would like the column to be as
small as possible, but have the possibility to spread to a maximum width and (if even
that width is exceeded) turn into a p-style column.
The varwidth package, discussed in “automatic sizing of minipages”, provides a
solution. If you load it together with the LaTeX “required” array package, i.e.:
\usepackage{array}
\usepackage{varwidth}

varwidth defines a new column-type “V”, which you can use as follows:
\begin{tabular}{l V{3.5cm} r}
foo & blah & bar \\
foo & blah blah & bar \\
\end{tabular}

when the second column ends up less than 3.5cm wide; or you can use it as follows:
\begin{tabular}{l V{3.5cm} r}
foo & blah & bar \\
foo & blah blah & bar \\
foo & blah blah blah blah blah blah
& bar \\
\end{tabular}

where the second column will end up noticeably wider, and will wrap to a second line
in the third row.
array.sty : Distributed as part of macros/latex/required/tools
varwidth.sty : macros/latex/contrib/misc/varwidth.sty

242 Spacing lines in tables


(La)TeX mechanisms for maintaining the space between lines (the “leading”) rely on
TeX’s paragraph builder, which compares the shape of consecutive lines and adjusts
the space between them.
These mechanisms can’t work in exactly the same way when (La)TeX is building
a table, because the paragraph builder doesn’t get to see the lines themselves. As a re-
sult, tables sometimes typeset with lines uncomfortably close together (or occasionally
ridiculously far apart).
Traditional (moving metal type) typographers would adjust the spacing between
lines of a table by use of a “strut” (a metal spacer). A TeX user can do exactly the
same thing: most macro packages define a \strut command, that defines a space
appropriate to the current size of the text; placing a \strut command at the end of a
troublesome row is the simplest solution to the problem — if it works. Other solutions
below are LaTeX-specific, but some may be simply translated to Plain TeX commands.
If your table exhibits a systematic problem (i.e., every row is wrong by the same
amount) use \extrarowheight, which is defined by the array package:
\usepackage{array}% in the preamble
...
\setlength{\extrarowheight}{length}
\begin{tabular}{....}

To correct a single row whose maladjustment isn’t corrected by a \strut command,
you can define your own, using \rule{0pt}{length} — which is a near approximation
to the command that goes inside a \strut. The bigstrut package defines a
strut command that you can use for this purpose: \bigstrut on its own opens up both
above and below the current line; \bigstrut[t] opens up above the line, \bigstrut
[b] opens up below the line.
General solutions are available, however. The tabls package automatically gener-
ates an appropriately-sized strut at the end of each row. Its disadvantages are that it’s
really rather slow in operation (since it gets in the way of everything within tables) and
its (lack of) compatibility with other packages.
The cellspace package does a (possibly inferior) job by defining a new table/array column
type “S”, which you apply to each column specification. So, for example,
\begin{tabular}{l l l p{3cm}}

would become
\begin{tabular}{Sl Sl Sl Sp{3cm}}

and so on. This technique shows promise of not interfering so much with other pack-
ages, but this author has heard of no reports from the field.
The booktabs package comes with a thought-provoking essay about how tables
should be designed. Since table row-spacing problems most often appear in collisions
with rules, the author’s thesis, that LaTeX users tend too often to rule their tables, is
interesting. The package provides rule commands to support the author’s scheme, but
deals with inter-row spacing too. The most recent release of booktabs sports compati-
bility with packages such as longtable.
Documentation of both bigstrut and tabls may be found as comments in the package
files themselves.
array.sty : Distributed as part of macros/latex/required/tools
bigstrut.sty : Distributed as part of macros/latex/contrib/multirow
booktabs.sty : macros/latex/contrib/booktabs
cellspace.sty : macros/latex/contrib/cellspace
tabls.sty : macros/latex/contrib/misc/tabls.sty

243 Tables longer than a single page


Tables are, by default, set entirely in boxes of their own: as a result, they won’t split
over a page boundary. Sadly, the world keeps turning up tables longer than a single
page that we need to typeset.
For simple tables (whose shape is highly regular), the simplest solution may well be
to use the tabbing environment, which is slightly tedious to set up, but which doesn’t
force the whole alignment onto a single page.
The longtable package builds the whole table (in chunks), in a first pass, and then
uses information it has written to the .aux file during later passes to get the setting
“right” (the package ordinarily manages to set tables in just two passes). Since the
package has an overview of the whole table at the time it’s doing “final” setting, the table
is set “uniformly” over its entire length, with columns matching on consecutive pages.
longtable has a reputation for failing to interwork with other packages, but it does
work with colortbl, and its author has provided the ltxtable package to provide (most
of) the facilities of tabularx (see fixed-width tables) for long tables: beware of its rather
curious usage constraints — each long table should be in a file of its own, and included
by \LTXtable{width}{file}. Since longtable’s multiple-page tables can’t possibly
live inside floats, the package provides for captions within the longtable environment
itself.
A seeming alternative to ltxtable is ltablex; but it is outdated and not fully functional.
Its worst problem is its strictly limited memory capacity (longtable is not so limited,
at the cost of much complication in its code), so ltablex can only deal with relatively
small tables; nor does it seem likely that support is available. However, its user interface
is much simpler than ltxtable’s, so if its restrictions aren’t a problem for you, it may be
worth a try.
The supertabular package starts and stops a tabular environment for each page
of the table. As a result, each ‘page worth’ of the table is compiled independently, and
the widths of corresponding columns may differ on successive pages. However, if the
correspondence doesn’t matter, or if your columns are fixed-width, supertabular has
the great advantage of doing its job in a single run.
Both longtable and supertabular allow definition of head- and footlines for the
table; longtable allows distinction of the first and last head and foot.
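A skeleton of such a longtable (the content is invented), showing the head and foot declarations, might be:

\usepackage{longtable}
...
\begin{longtable}{lr}
  \caption{A very long table}\\
  \hline
  Item & Value \\
  \hline
\endfirsthead
  \hline
  Item & Value (continued) \\
  \hline
\endhead
  \hline
\endfoot
  \hline
\endlastfoot
  ... many rows ...
\end{longtable}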
The xtab package fixes some infelicities of supertabular, and also provides a “last
head” facility (though this, of course, destroys supertabular’s advantage of operating
in a single run).
The stabular package provides a simple-to-use “extension to tabular” that allows
it to typeset tables that run over the end of a page; it also has usability extensions, but
doesn’t have the head- and footline capabilities of the major packages.
Documentation of ltablex is to be found in the package file.
longtable.sty : Distributed as part of macros/latex/required/tools
ltablex.sty : macros/latex/contrib/ltablex/ltablex.sty
ltxtable.sty : Generate by running macros/latex/contrib/carlisle/
ltxtable.tex
stabular.sty : Distributed as part of macros/latex/contrib/sttools
supertabular.sty : macros/latex/contrib/supertabular
xtab.sty : macros/latex/contrib/xtab

244 How to alter the alignment of tabular cells


One often needs to alter the alignment of a tabular p (‘paragraph’) cell, but problems
at the end of a table row are common. With a p cell that looks like:

... & \centering blah ... \\

one is liable to encounter errors that complain about a “misplaced \noalign” or “extra
alignment tab”, or the like. The problem is that the command \\ means different things
in different circumstances: the tabular environment switches the meaning to a value
for use in the table, and \centering, \raggedright and \raggedleft all change
the meaning to something incompatible. Note that the problem only arises in the last
cell of a row: since each cell is set into a box, its settings are lost at the & (or \\) that
terminates it.
The simple (old) solution is to preserve the meaning of \\:
\newcommand\PBS[1]{\let\temp=\\%
#1%
\let\\=\temp
}

which one uses as:

... & \PBS\centering blah ... \\

(for example).
The technique using \PBS was developed in the days of LaTeX 2.09 because the
actual value of \\ that the tabular environment uses was only available as an internal
command. Nowadays, the value is a public command, and you can in principle use it
explicitly:
... & \centering blah ... \tabularnewline

which may be incorporated into a simple macro as:

\newcommand{\RBS}{\let\\=\tabularnewline}

and used as

... & \centering\RBS blah ... \\

(note, you Preserve backslash with \PBS before the command that changes it, and Re-
store it with \RBS after the command; in fact, \RBS is marginally preferable, but the
old trick lingers on).
The \PBS and \RBS tricks also serve well in array package “field format” preamble
specifications:

\begin{tabular}{... >{\centering\RBS}p{50mm}}
...

or

\begin{tabular}{... >{\PBS\centering}p{50mm}}
...

In the tabularx and tabulary packages, there’s a command \arraybackslash that has
the same effect as \RBS (above); so in those packages, one might say:

\begin{tabular}{... >{\centering\arraybackslash}p{50mm}}
...

in place of the example above; in fact, the very latest (2003/12/01) release of array.sty
also provides a \tabularnewline command, that has the “basic tabular/array” mean-
ing of ‘\\’. The command does rather lack brevity, but at least you don’t have to define
it for yourself.
array.sty : Distributed as part of macros/latex/required/tools
tabularx.sty : Distributed as part of macros/latex/required/tools
tabulary.sty : macros/latex/contrib/tabulary

245 The thickness of rules in LaTeX tables
The rules in a LaTeX table are by default 0.4pt thick; this is in fact a default built in at
the lowest level, and applies to all rules (including those separating blocks of running
text).
Sometimes, however, we look at a table and find we want the rules to stand out —
perhaps to separate the text from the rest of the body text, or to make the sections of
the table stand out from one another. However, a quick review of any LaTeX manual
will reveal no technique for making any one rule stand out, and a little experimentation
shows that it is indeed pretty difficult to prevent a change “bleeding” out to affect other
rules in the same table.
If you look at what we have to say on the design of tables, elsewhere among these
FAQs, you may sense that the design of LaTeX simply skipped the issues surround-
ing table design: that’s presumably why there are no facilities to help you.
Specifically, the length \arrayrulewidth affects the thickness of the rules (both
horizontal and vertical) within both tabular and array environments. If you change
from the default (see above) only as far as

\setlength{\arrayrulewidth}{1pt}

the change is remarkably striking. However, really quite subtle user level programming
proves incapable of changing just one rule: it’s necessary to delve into the (rather
tricky) code of \hline and \cline themselves.
Fortunately, this job has already been done for the community: the booktabs pack-
age defines three different classes of rule (\toprule, \midrule and \bottomrule),
and the package documentation offers hints on how to use them. You are strongly
advised to read the documentation pretty carefully.
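By way of example, a simple table using the booktabs rules (the content is invented) might be coded:

\usepackage{booktabs}
...
\begin{tabular}{lr}
  \toprule
  Item   & Quantity \\
  \midrule
  Widget & 42 \\
  Gadget &  7 \\
  \bottomrule
\end{tabular}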
The memoir class includes the booktabs package, and repeats the documentation in
its compendious manual.
Note that none of the above mentions the issue of the weight of vertical rules
(except in passing). For the reasons, see the documentation of the booktabs pack-
age (again); vertical rules in tables are in any case even more trickily coded than are
horizontal rules, and if their lack of configurability makes them still less attractive, so
much the better for the design of your document.
booktabs.sty : macros/latex/contrib/booktabs
memoir.cls: macros/latex/contrib/memoir

246 Flowing text around figures in LaTeX


There are several LaTeX packages that purport to do this, but they all have their limita-
tions because the TeX machine isn’t really designed to solve this sort of problem. Piet
van Oostrum has conducted a survey of the available packages; he recommends:

floatflt floatflt is an improved version (for LaTeX 2ε) of floatfig.sty, and its syntax is:
\begin{floatingfigure}[options]{width of figure}
figure contents
\end{floatingfigure}
There is a (more or less similar) floatingtable environment.
The tables or figures can be set left or right, or alternating on even/odd pages in a
double-sided document.
The package works with the multicol package, but doesn’t work well in the
neighbourhood of list environments (unless you change your LaTeX document).
wrapfig wrapfig has syntax:

\begin{wrapfigure}[height of figure in lines]{l|r,...}[overhang]{width}


figure, caption, etc.
\end{wrapfigure}

The syntax of the wraptable environment is similar.


Height can be omitted, in which case it will be calculated by the package; the
package will use the greater of the specified and the actual width. The {l,r,etc.}
parameter can also be specified as i(nside) or o(utside) for two-sided documents,
and uppercase can be used to indicate that the picture should float. The overhang

allows the figure to be moved into the margin. The figure or table will be entered into
the list of figures or tables if you use the \caption command.
The environments do not work within list environments that end before the figure
or table has finished, but can be used in a parbox or minipage, and in twocolumn
format.
picins Picins is part of a large bundle that allows inclusion of pictures (e.g., with
shadow boxes, various MS-DOS formats, etc.). The command is:
\parpic(width,height)(x-off,y-off)[Options][Position]
{Picture}
Paragraph text
All parameters except the Picture are optional. The picture can be positioned left
or right, boxed with a rectangle, oval, shadowbox, dashed box, and a caption can
be given which will be included in the list of figures.
Unfortunately (for those of us whose understanding of German is not good), the
documentation is in German. Piet van Oostrum has written an English summary.

floatflt.sty : macros/latex/contrib/floatflt
picins.sty : systems/msdos/picins/picins.zip
picins documentation summary: macros/latex209/contrib/picins/picins.
txt
wrapfig.sty : macros/latex/contrib/wrapfig

247 Diagonal separation in corner cells of tables


You want to label both the top or bottom row and the left- or rightmost column, some-
where at the corner of the table where the row and column meet. A simple way to
achieve the result is to construct the table with an arrangement of rules (and possibly
\multicolumn entries), to look like:

---------------------------
|  x  |         y         |
|     |-------------------|
|     |  1   2   3   4   5|
---------------------------
|  1  |
|  2  |
|  3  |
|  4  |
|  5  |
---------------------------

However, this doesn’t satisfy everyone: many want the labelling in a single cell at
the top left of the table. It sounds a simple enough requirement, yet it calls for some
slightly tricky LaTeX coding. The slashbox package does the job for you: it defines
commands \slashbox and \backslashbox, each taking the two labels as arguments.
It draws a picture with the two labels on either side of a slanting line; the command
(and hence the picture) may be placed in the corner cell, where the labelled row and
column meet.
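A sketch of its use (the labels and content are invented; see the remarks below about loading pict2e):

\usepackage{pict2e}
\usepackage{slashbox}
...
\begin{tabular}{|l|ccc|}
  \hline
  \backslashbox{row}{column} & a & b & c \\
  \hline
  1 & ... & ... & ... \\
  2 & ... & ... & ... \\
  \hline
\end{tabular}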
The package isn’t the world’s neatest: it uses LaTeX picture mode to draw its line,
and picture mode has many tedious restrictions (and doesn’t, in all honesty, produce
particularly good pictures). Load slashbox with the pict2e package, and at least the
picture quality will be improved.
Documentation of slashbox is less than satisfactory: a LaTeX source file of rather
startling starkness accompanies the package file in the distribution. It does, however,
process to a DVI file that gives some idea of how the \slashbox may be expected to
look. (The third example in the file shows the effect of picture mode’s restrictions:
the dividing line doesn’t go from corner to corner in the box: to correct this requires
revision of slashbox — pict2e alone doesn’t help in this regard.)
slashbox.sty : macros/latex/contrib/slashbox

248 How to change a whole row of a table


Each cell of a table is set in a box, so that a change of font style (or whatever) only
lasts to the end of the cell. If one has a many-celled table, or a long one which needs
lots of rows emphasising, putting a font style change command in every cell will be
impossibly tedious.
With the array package, you can define column modifiers which will change the
font style for a whole column. However, with a bit of subtlety, one can make such
modifiers affect rows rather than columns. So, we set things up by:
\usepackage{array}
\newcolumntype{$}{>{\global\let\currentrowstyle\relax}}
\newcolumntype{^}{>{\currentrowstyle}}
\newcommand{\rowstyle}[1]{\gdef\currentrowstyle{#1}%
#1\ignorespaces
}

Now, we put ‘$’ before the first column specifier; and we put ‘^’ before the modifiers of
subsequent ones. We then use \rowstyle at the start of each row we want to modify:

\begin{tabular}{|$l|^l|^l|} \hline
\rowstyle{\bfseries}
Heading & Big and & Bold \\ \hline
Meek & mild & entry \\
Meek & mild & entry \\
\rowstyle{\itshape}
Strange & and & italic \\
Meek & mild & entry \\ \hline
\end{tabular}

The array package works with several other tabular-like environments from other
packages (for example longtable), but unfortunately this trick won’t always work.
array.sty : Distributed as part of macros/latex/required/tools

249 Merging cells in a column of a table


It’s easy to come up with a table design that requires a cell that spans several rows.
An example is something where the left-most column labels the rest of the table; this
can be done (in simple cases) by using diagonal separation in corner cells, but that
technique rather strictly limits what can be used as the content of the cell.
The multirow package enables you to construct such multi-row cells, in a very
simple manner. For the simplest possible use, one might write:

\begin{tabular}{|c|c|}
\hline
\multirow{4}{*}{Common g text}
& Column g2a\\
& Column g2b \\
& Column g2c \\
& Column g2d \\
\hline
\end{tabular}

and multirow will position “Common g text” at the vertical centre of the space defined
by the other rows. Note that the rows that don’t contain the “multi-row” specification
must have empty cells where the multi-row is going to appear.
The “*” may be replaced by a column width specification. In this case, the argu-
ment may contain forced line-breaks:

\begin{tabular}{|c|c|}
\hline
\multirow{4}{25mm}{Common\\g text}
& Column g2a\\
& Column g2b \\
& Column g2c \\
& Column g2d \\
\hline
\end{tabular}

A similar effect (with the possibility of a little more sophistication) may be achieved
by putting a smaller table that lines up the text into a *-declared \multirow.
The \multirow command may also be used to write labels vertically down one or
other side of a table (with the help of the graphics or graphicx package, which provide
the \rotatebox command):
\begin{tabular}{|l|l|}
\hline
\multirow{4}{*}{\rotatebox{90}{hi there}}
& Column g2a\\
& Column g2b \\
& Column g2c \\
& Column g2d \\
\hline
\end{tabular}

(which gives text going upwards; use angle -90 for text going downwards, of course).
Multirow is set up to interact with the bigstrut package (which is also discussed in
the answer to spacing lines in tables). You use an optional argument to the \multirow
command to say how many of the rows in the multi-row have been opened up with
\bigstrut.
The documentation of both multirow and bigstrut is to be found, as comments, in
the package files themselves.
bigstrut.sty : Distributed as part of macros/latex/contrib/multirow
multirow.sty : macros/latex/contrib/multirow

Q.4 Floating tables, figures, etc.


250 Floats on their own on float pages
It’s sometimes necessary to force a float to live on a page by itself. (It’s sometimes even
necessary for every float to live on a page by itself.) When the float fails to ‘set’, and
waits for the end of a chapter or of the document, the natural thing to do is to declare
the float as
\begin{figure}[p!]
but the overriding ! modifier has no effect on float page floats; so you have to make
the float satisfy the parameters. Moving tables and figures offers some suggestions, but
doesn’t solve the one-float-per-page question.
The ‘obvious’ solution, using the counter totalnumber (“total number of floats
per page”) doesn’t work: totalnumber only applies to floats on ‘text’ pages (pages
containing text as well as one or more float). So, to allow any size float to take a whole
page, set \floatpagefraction really small, and to ensure that no more than one float
occupies a page, make the separation between floats really big:
\renewcommand\floatpagefraction{.001}
\makeatletter
\setlength\@fpsep{\textheight}
\makeatother

251 Extra vertical space in floats


A common complaint is that extra vertical space has crept into figure or table float-
ing environments. More common still are users who post code that introduces this extra
space, and haven’t noticed the problem!
The trouble arises from the fact that the center environment (and its siblings
flushleft and flushright) are actually based on LaTeX’s list-handling code; and
lists always separate themselves from the material around them. Meanwhile, there
are parameters provided to adjust the spacing between floating environments and their
surroundings; so if we have:
\begin{figure}
\begin{center}
\includegraphics{...}
\caption{...}
\end{center}
\end{figure}
or worse still:
\begin{figure}
\begin{center}
\includegraphics{...}
\end{center}
\caption{...}
\end{figure}
unwarranted vertical space is going to appear.
The solution is to let the float and the objects in it position themselves, and to use
“generic” layout commands rather than their list-based encapsulations.
\begin{figure}
\centering \includegraphics{...}
\caption{...}
\end{figure}
(which even involves less typing).
This alternative code will work with any LaTeX package. It will not work with
obsolete (pre-LaTeX 2ε ) packages such as psfig or epsf — see graphics inclusion for
discussion of the genesis of \includegraphics.
252 Placing two-column floats at bottom of page
You specified placement ‘[htbp]’ for your full-width figure or table, but they always
get placed at the top of the page. . . Well, it is what the documentation says: LaTeX,
unadorned, only allows full-width floats at the top of a page, or occupying (part of) a
float page.
The stfloats package ameliorates the situation somewhat, and makes LaTeX hon-
our ‘[b]’ placement as well; the dblfloatfix package combines a tidied version of the
changes made in stfloats with the float ordering corrections defined in fixltx2e.
A particular problem with stfloats and dblfloatfix is that the float will appear, at its
earliest, on the page after it is specified. This has two undesirable side-effects: first,
there may be no bottom float on the first page of a document, and second, float numbers
may become “entangled” (particularly if you’re using dblfloatfix that ensures that the
early-specified bottom float is set before any single column floats).
(The FAQ team doesn’t know of any package that will make LaTeX honour ‘[h]’
placement of double-column floats, but the midfloat package can be pressed into service
to provide something approximating the effect it would have.)
dblfloatfix.sty : macros/latex/contrib/misc/dblfloatfix.sty
stfloats.sty, midfloat.sty : Distributed as part of macros/latex/contrib/
sttools

253 Floats in multicolumn setting


If you use
\begin{figure}
...
\end{figure}
in a multicols environment, the figure won’t appear. If instead you use
\begin{figure*}
...
\end{figure*}
the figure will stretch right across the page (just the same as a figure* in standard
LaTeX’s twocolumn option).
It’s possible to have single-column figures and tables with captions, using the ‘[H]’
placement option introduced by the float package but you might have to fiddle with
the placement because they won’t ‘float’, and exhibit other strange behaviours (such as
silently running off the end of the column at the end of the multicols environment).
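A minimal sketch of such an arrangement (the graphic file name is invented; the ‘[H]’
option requires the float package, as noted above):
\usepackage{multicol}
\usepackage{float}
...
\begin{multicols}{2}
  ...
  \begin{figure}[H]
    \centering
    \includegraphics[width=\linewidth]{diagram}% hypothetical graphic
    \caption{A single-column, non-floating figure}
  \end{figure}
  ...
\end{multicols}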
float.sty : macros/latex/contrib/float
multicol.sty : Distributed as part of macros/latex/required/tools

254 Facing floats on 2-page spread
If a pair of floats are to be forced to form a 2-page spread (in a book, or whatever),
the first must lie on the left side of the spread, on an even-numbered page. The dpfloat
package provides for this: the construction to use is:

\begin{figure}[p]
\begin{leftfullpage}
<left side figure>
\end{leftfullpage}
\end{figure}
\begin{figure}[p]
\begin{fullpage}
<right side figure>
\end{fullpage}
\end{figure}

The construction has no effect unless the standard class option twoside is in effect.
Full documentation of the package (such as it is) is to be found in README.dpfloat.
dpfloat.sty : macros/latex/contrib/dpfloat

255 Vertical layout of float pages


By default, LaTeX vertically centres the floats on a float page; the present author is not
alone in not liking this arrangement. Unfortunately, the control of the positioning is
“buried” in LaTeX-internal commands, so some care is needed to change the layout.
Float pages use three LaTeX lengths (i.e., TeX skips) to define their layout:

\@fptop defines the distance from the top of the page to the top of the first float,
\@fpsep defines the separation between floats, and
\@fpbot defines the distance from the bottom of the last float on the page to the bottom
of the page.

(In fact, the output routine places a skip of \@fpsep above each float, so the \@fptop
skip is always followed by a correction for that.)
The LaTeX defaults are:
\@fptop = 0pt + 1fil
\@fpsep = 8pt + 2fil
\@fpbot = 0pt + 1fil
so that the gaps expand to fill the space not occupied by floats, but if there is more than
one float on the page, the gap between them will expand to twice the space at top and
bottom.
Those who understand this stuff will be able to play elaborate games, but the com-
monest requirement, that the floats start at the top of the page, is a simple thing to
do:

\makeatletter
\setlength{\@fptop}{0pt}
\makeatother

Surprisingly, you may find this setting leaves your floats too high on the page. One can
justify a value of 5pt (in place of 0pt) — it’s roughly the difference between \topskip
and the height of normal (10pt) text.
Note that this is a “global” setting (best established in a class file, or at worst in the
document preamble); making the change for a single float page is likely (at the least)
to be rather tricky.

Q.5 Footnotes
256 Footnotes in tables
The standard LaTeX \footnote command doesn’t work in tables; the table traps the
footnotes and they can’t escape to the bottom of the page.
If your table is floating, your best bet is (unfortunately) to put the table in a
minipage environment and to put the notes underneath the table, or to use Donald
Arseneau’s package threeparttable (which implements “table notes” proper).

The ctable package extends the model of threeparttable, and also uses the ideas
of the booktabs package. The \ctable command does the complete job of setting
the table, placing the caption, and defining the notes. The “table” may consist of dia-
grams, and a parameter in \ctable’s optional argument makes the float that is created
a “figure” rather than a “table”.
Otherwise, if your table is not floating (it’s just a ‘tabular’ in the middle of some
text), there are several things you can do to fix the problem.

• Use \footnotemark to position the little marker appropriately, and then put in
\footnotetext commands to fill in the text once you’ve closed the tabular en-
vironment (a minimal sketch appears after this list). This is described in Lamport’s
book, but it gets messy if there’s more than one footnote.
• Stick the table in a minipage anyway. This provides all the ugliness of footnotes
in a minipage with no extra effort.
• Use threeparttable anyway; the package is intended for floating tables, and the
result might look odd if the table is not floating, but it will be reasonable.
• Use tabularx or longtable from the LaTeX tools distribution; they’re noticeably
less efficient than the standard tabular environment, but they do allow footnotes.
• Grab hold of footnote, and put your tabular environment inside a savenotes
environment. Alternatively, say \makesavenoteenv{tabular} in the preamble
of your document, and tables will all handle footnotes correctly.
• Use mdwtab from the same bundle; it will handle footnotes properly, and has other
facilities to increase the beauty of your tables. It may also cause other table-related
packages (not the standard ‘tools’ ones, though) to become very unhappy and stop
working.
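As a minimal sketch of the first suggestion above (the table content is, of course,
invented):
\begin{tabular}{ll}
  quantity & 42\footnotemark \\
  status   & unknown
\end{tabular}
\footnotetext{A remark about the value, typeset at the foot of the page.}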

The documentation of threeparttable appears in the package file itself; that of ctable
is distributed as a PDF file (for convenience’s sake).
ctable.sty : macros/latex/contrib/ctable
footnote.sty : Distributed as part of macros/latex/contrib/mdwtools
longtable.sty : Distributed as part of macros/latex/required/tools
mdwtab.sty : Distributed as part of macros/latex/contrib/mdwtools
threeparttable.sty : macros/latex/contrib/misc/threeparttable.sty
tabularx.sty : Distributed as part of macros/latex/required/tools

257 Footnotes in LaTeX section headings


The \footnote command is fragile, so that simply placing the command in \section’s
arguments isn’t satisfactory. Using \protect\footnote isn’t a good idea either: the
arguments of a section command are used in the table of contents and (more dan-
gerously) potentially also in page headings. Unfortunately, there’s no mechanism to
suppress the footnote in the heading while allowing it in the table of contents, though
having footnotes in the table of contents is probably unsatisfactory anyway.
To suppress the footnote in headings and table of contents:

• Take advantage of the fact that the mandatory argument doesn’t ‘move’ if the op-
tional argument is present: \section[title]{title\footnote{title ftnt}}
• Use the footmisc package, with package option stable — this modifies footnotes
so that they softly and silently vanish away if used in a moving argument (see the
sketch after this list).
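A minimal sketch of the footmisc route (the heading text is invented):
\usepackage[stable]{footmisc}
...
\section{Results\footnote{This note appears on the page, but
    not in the table of contents or the page headings.}}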

footmisc.sty : macros/latex/contrib/footmisc

258 Footnotes in captions


Footnotes in captions are especially tricky: they present problems of their own, on top
of the problems one experiences with footnotes in section titles and with footnotes in
tables. Fortunately, the requirement for footnotes in captions is extremely rare: if you
are experiencing problems, it is worth reviewing what you are trying to say by placing
this footnote. Note that the threeparttable scheme (see, again, footnotes in tables) also
applies to notes in captions, and may very well be preferable to whatever you were
thinking of.
If you are going to proceed:

• use an optional argument in your \caption command, that doesn’t have the foot-
note in it; this prevents the footnote appearing in the “List of . . . ”, and
• put your whole float in a minipage so as to keep the footnotes with the float.
so we have:
\begin{figure}
\begin{minipage}{\textwidth}
...
\caption[Caption for LOF]%
{Real caption\footnote{blah}}
\end{minipage}
\end{figure}

However, as well as all of the above, one also has to deal with the tendency of the
\caption command to produce the footnote’s text twice. For this last problem, there
is no tidy solution this author is aware of.
If you’re suffering the problem, a well-constructed \caption command in a
minipage environment within a float (as in the example above) can produce two
copies of the footnote body “blah”. (In fact, the effect only occurs with captions that
are long enough to require two lines to be typeset, and so wouldn’t appear with such a
short caption.)
The documentation of the ccaption package describes a really rather awful work-
around.
ccaption.sty : macros/latex/contrib/ccaption
threeparttable.sty : macros/latex/contrib/misc/threeparttable.sty

259 Footnotes whose texts are identical


If the same footnote turns up at several places within a document, it’s often inappropri-
ate to repeat the footnote in its entirety over and over again. We can avoid repetition by
semi-automatic means, or by simply labelling footnotes that we know we’re going to
repeat and then referencing the result. There is no completely automatic solution (that
detects and suppresses repeats) available.
If you know you only have one footnote, which you want to repeat, the solution is
simple: merely use the optional argument of \footnotemark to signify the repeats:
...\footnote{Repeating note}
...
...\footnotemark[1]

. . . which is very easy, since we know there will only ever be a footnote number 1. A
similar technique can be used once the footnotes are stable, reusing the number that
LaTeX has allocated. This can be tiresome, though, as any change of typesetting could
change the relationships of footnote and repeat: labelling is inevitably better.
Simple hand-labelling of footnotes is possible, using a counter dedicated to the job:
\newcounter{fnnumber}
...
...\footnote{Text to repeat}%
\setcounter{fnnumber}{\thefootnote}%
...
...\footnotemark[\thefnnumber]

but this is somewhat tedious. LaTeX’s labelling mechanism can be summoned to our
aid, but there are ugly error messages before the \ref is resolved on a second run
through LaTeX:
...\footnote{Text to repeat\label{fn:repeat}}
...
...\footnotemark[\ref{fn:repeat}]

Alternatively, one may use the \footref command, which has the advantage of work-
ing even when the footnote mark isn’t expressed as a number. The command is defined
in the footmisc package and in the memoir class (at least); \footref reduces the above
example to:
...\footnote{Text to repeat\label{fn:repeat}}
...
...\footref{fn:repeat}

This is the cleanest simple way of doing the job. Note that the \label command must
be inside the argument of \footnote.
The fixfoot package takes away some of the pain of the matter: you declare foot-
notes you’re going to reuse, typically in the preamble of your document, using a
\DeclareFixedFootnote command, and then use the command you’ve ‘declared’ in the
body of the document:
\DeclareFixedFootnote{\rep}{Text to repeat}
...
...\rep{}
...\rep{}

The package ensures that the repeated text appears at most once per page: it will usually
take more than one run of LaTeX to get rid of the repeats.
fixfoot.sty : macros/latex/contrib/fixfoot
footmisc.sty : macros/latex/contrib/footmisc
memoir.cls: macros/latex/contrib/memoir

260 More than one sequence of footnotes


The need for more than one series of footnotes is common in critical editions (and other
literary criticism), but occasionally arises in other areas.
Of course, the canonical critical edition macros, edmac, offer the facility, as does
its LaTeX port, the ledmac package.
Multiple ranges of footnotes are offered to LaTeX users by the manyfoot package.
The package provides a fair array of presentation options, as well. The (rather new)
critical editions ednotes package is built upon a basis that includes manyfoot, as its
mechanism for multiple sets of footnotes.
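The following sketch shows the sort of thing manyfoot makes possible; the
\DeclareNewFootnote command, and the \footnoteB command it creates, are quoted
from memory, so check the package documentation before relying on the details:
\usepackage{manyfoot}
\DeclareNewFootnote{B}% creates \footnoteB, \footnotetextB, ...
...
Some text\footnote{A note in the ordinary series.}
and more\footnoteB{A note in the second, independent series.}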
edmac: macros/plain/contrib/edmac
ednotes: macros/latex/contrib/ednotes
ledmac: macros/latex/contrib/ledmac
manyfoot.sty : Distributed as part of the macros/latex/contrib/ncctools
bundle
261 Footnotes numbered “per page”
The obvious solution is to make the footnote number reset whenever the page number is
stepped, using the LaTeX internal mechanism. Sadly, the place in the document where
the page number is stepped is unpredictable, not (“tidily”) at the end of the printed
page; so the link only ever works by luck.
As a result, resetting footnotes is inevitably a two-pass process, using labels of
some sort. It’s nevertheless important, given the common requirement for footnotes
marked by symbols (with painfully small symbol sets). There are three packages that
manage it, one way or another.
The footnpag package does per-page footnotes and nothing else.
The perpage package provides a general mechanism for resetting counters per page,
so can obviously be used for this task. The interface is pretty simple: \MakePerPage
{footnote} will do the job. If you want to restart the counter at something other
than 1 (for example to avoid something in the LaTeX footnote symbol list), you can
use: \MakePerPage[2]{footnote}.
The footmisc package provides a variety of means of controlling footnote appear-
ance, among them a package option perpage that adjusts the numbering per page.
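For example, a minimal sketch using perpage together with LaTeX’s standard footnote
symbols (the footmisc alternative is shown as a comment):
\usepackage{perpage}
\MakePerPage{footnote}
\renewcommand\thefootnote{\fnsymbol{footnote}}
%% or, instead of the three lines above:
%\usepackage[perpage,symbol]{footmisc}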
Documentation of footnpag comes as a DVI file footnpag-user in the distribu-
tion. Documentation of perpage appears in the package file, only: however, it amounts
to no more than appears above. . .
footmisc.sty : macros/latex/contrib/footmisc
footnpag.sty : macros/latex/contrib/footnpag
perpage.sty : Distributed as part of macros/latex/contrib/bigfoot

Q.6 Document management
262 What’s the name of this file
One might want this so as to automatically generate a page header or footer recording
what file is being processed. It’s not easy. . .
TeX retains what it considers the name of the job, only, in the primitive \jobname;
this is the name of the file first handed to TeX, stripped of its directory name and of any
extension (such as .tex). If no file was passed (i.e., you’re using TeX interactively),
\jobname has the value texput (the name that’s given to .log files in this case).
This is fine, for the case of a small document, held in a single file; most significant
documents will be held in a bunch of files, and TeX makes no attempt to keep track
of files input to the job. So the user has to keep track, himself — the only way is to
patch the input commands and cause them to retain details of the file name. This is
particularly difficult in the case of Plain TeX, since the syntax of the \input command
is so peculiar.
In the case of LaTeX, the input commands have pretty regular syntax, and the
simplest patching techniques can be used on them.
If you’re not inputting things from a file that’s already been input, the job is almost
trivial:

\def\ThisFile{\jobname}
\let\OldInput\input
\renewcommand{\input}[1]{%
\renewcommand{\ThisFile}{#1}%
\OldInput{#1}%
}

With that, the macro \ThisFile always contains the last thing input: it starts pointing
at the base file of your document (\jobname), and thereafter changes every time you
use \input{file}.
Most ordinary users will quickly become irritated with the simplicity of the
\ThisFile mechanism above. The following code is more cunning: it maintains
details of the files you’ve ‘come through’ to get to where you are, and it restores
\ThisFile to what the previous file was before returning.

\def\ThisFile{\jobname}
\newcounter{FileStack}
\let\OrigInput\input
\renewcommand{\input}[1]{%
\stackinput{#1}{\OrigInput}%
}
\newcommand{\stackinput}[2]{%
\stepcounter{FileStack}%
\expandafter\let
\csname NameStack\theFileStack\endcsname
\ThisFile
\def\ThisFile{#1}%
#2{#1}%
\expandafter\let\expandafter
\ThisFile
\csname NameStack\theFileStack\endcsname
\addtocounter{FileStack}{-1}%
}

To do the same for \include, we need the simple addition:

\let\OrigInclude\include
\renewcommand{\include}[1]{%
\stackinput{#1}{\OrigInclude}%
}

Both examples of patching \input assume you always use LaTeX syntax, i.e., always
use braces around the argument.

The FiNK (“File Name Keeper”) package provides a regular means of keeping track
of the current file name (with its extension), in a macro \finkfile. If you need the
unadorned file name (without its ‘.tex’), use the commands:

\def\striptexext#1.tex{#1}
...
\edef\ThisFile{\expandafter\striptexext\finkfile}

The FiNK bundle includes a fink.el that provides support under emacs with AUC-
TeX.
fink.sty : macros/latex/contrib/fink

263 All the files used by this document


When you’re sharing a document with someone else (perhaps as part of a co-
development cycle) it’s as well to arrange that both correspondents have the same
set of auxiliary files, as well as the document in question. Your correspondent obvi-
ously needs the same set of files (if you use the url package, she has to have url too,
for example). But suppose you have a bug-free version of the shinynew package but
her copy is still the unstable original; until you both realise what is happening, such a
situation can be very confusing.
The simplest solution is the LaTeX \listfiles command. This places a list of the
files used and their version numbers in the log file. If you extract that list and transmit
it with your file, it can be used as a check-list in case that problems arise.
Note that \listfiles only registers things that are input by the “standard” LaTeX
mechanisms (\documentclass, \usepackage, \input, \include, \includegraphics
and so on). But if you use TeX primitive syntax, as in
\input mymacros
mymacros.tex won’t be listed by \listfiles, since you’ve bypassed the mechanism
that records its use.
The snapshot package helps the owner of a LaTeX document obtain a list of the
external dependencies of the document, in a form that can be embedded at the top of
the document. The intended use of the package is the creation of archival copies of
documents, but it has application in document exchange situations too.
The bundledoc system uses the snapshot to produce an archive (e.g., .tar.gz or
.zip) of the files needed by your document; it comes with configuration files for use
with teTeX and MiKTeX. It’s plainly useful when you’re sending the first copy of a
document.
bundledoc: support/bundledoc
snapshot.sty : macros/latex/contrib/snapshot

264 Marking changed parts of your document


One often needs clear indications of how a document has changed, but the commonest
technique, “change bars” (also known as “revision bars”), requires surprisingly much
trickery of the programmer (the problem being that TeX ‘proper’ doesn’t provide the
programmer with any information about the “current position” from which a putative
start- or end-point of a bar might be calculated; PDFTeX does provide the information,
but we’re not aware yet of any programmer taking advantage of the fact to write a
PDFTeX-based changebar package).
The simplest package that offers change bars is Peter Schmitt’s backgrnd.tex; this
was written as a Plain TeX application that patches the output routine, but it appears to
work at least on simple LaTeX documents. Wise LaTeX users will be alerted by the in-
formation that backgrnd patches their output routine, and will watch its behaviour very
carefully (patching the LaTeX output routine is not something to undertake lightly. . . ).
The longest-established solution is the changebar package, which uses \special
commands supplied by the driver you’re using. You need therefore to tell the pack-
age which driver to generate \specials for (in the same way that you need to tell
the graphics package); the list of available drivers is pretty wide, but does not in-
clude dvipdfm. The package comes with a shell script chbar.sh (for use on Unix
machines) that will compare two documents and generate a third which is marked-
up with changebar macros to highlight changes. The shareware WinEDT editor has
a macro that will generate changebar (or other) macros to show differences from
an earlier version of your file, stored in an RCS-controlled repository — see
http://www.winedt.org/Macros/LaTeX/RCSdiff.php
The vertbars package uses the techniques of the lineno package (which it loads, so
the lineno itself must be installed); it’s thus the smallest of the packages for change bar
marking, since it leaves all the trickery to another package. Vertbars defines a vertbar
environment to create changebars.
The framed package is another that provides bars as a side-effect of other desirable
functionality: its leftbar environment is simply a stripped-down frame (note, though,
that the environment makes a separate paragraph of its contents, so it is best used when
the convention is to mark a whole changed paragraph).
Finally, the memoir class allows marginal editorial comments, which you can ob-
viously use to delimit areas of changed text.
Another way to keep track of changes is employed by some word-processors —
to produce a document that embodies both “old” and “new” versions. The Perl script
latexdiff does this for LaTeX documents; you feed it the two documents, and it pro-
duces a new LaTeX document in which the changes are very visible. An example of
the output is embedded in the documentation, latexdiff-man.pdf (part of the distribu-
tion). A rudimentary revision facility is provided by another Perl script, latexrevise,
which accepts or rejects all changes. Manual editing of the difference file can be used
to accept or reject selected changes only.
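For example, a minimal changebar sketch (the dvips option is merely an example;
name whichever driver you actually use):
\usepackage[dvips]{changebar}
...
\cbstart
This paragraph has changed since the last version,
and gets a bar in the margin.
\cbend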
backgrnd.tex : macros/generic/misc/backgrnd.tex
changebar.sty : macros/latex/contrib/changebar
framed.sty : macros/latex/contrib/misc/framed.sty
latexdiff, latexrevise: support/latexdiff
lineno.sty : macros/latex/contrib/lineno
memoir.cls: macros/latex/contrib/memoir
vertbars.sty : macros/latex/contrib/misc/vertbars.sty
265 Conditional compilation and “comments”
While LaTeX (or any other TeX-derived package) isn’t really like a compiler, people
regularly want to do compiler-like things using it. Common requirements are condi-
tional ‘compilation’ and ‘block comments’, and several LaTeX-specific means to this
end are available.
The simple \newcommand{\gobble}[1]{} and \iffalse ... \fi aren’t really
satisfactory (as a general solution) for comments, since the matter being skipped is
nevertheless scanned by TeX, not always as you would expect. The scanning imposes
restrictions on what you’re allowed to skip; this may not be a problem in today’s job,
but could return to bite you tomorrow. For an example of surprises that may come to
bite you, consider the following example (derived from real user experience):
\iffalse % ignoring this bit
consider what happens if we
use \verb|\iftrue| -- a surprise
\fi

The \iftrue is spotted by TeX as it scans, ignoring the \verb command; so the
\iffalse isn’t terminated by the following \fi. Also, \gobble is pretty inefficient
at consuming anything non-trivial, since all the matter to be skipped is copied to the
argument stack before being ignored.
If your requirement is for a document from which whole chapters (or the like) are
missing, consider the LaTeX \include/\includeonly system. If you ‘\include’
your files (rather than \input them — see What’s going on in my \include com-
mands?), LaTeX writes macro traces of what’s going on at the end of each chapter to
the .aux file; by using \includeonly, you can give LaTeX an exhaustive list of the
files that are needed. Files that don’t get \included are skipped entirely, but the docu-
ment processing continues as if they were there, and page, footnote, and other numbers
are not disturbed. Note that you can choose which sections you want included interac-
tively, using the askinclude package.
The inverse can be done using the excludeonly package: this allows you to ex-
clude a (list of) \included files from your document, by means of an \excludeonly
command.
If you want to select particular pages of your document, use Heiko Oberdiek’s
pagesel or the selectp packages. You can do something similar with an existing PDF
document (which you may have compiled using pdflatex in the first place), using the
pdfpages package. The job is then done with a document looking like:
\documentclass{article}
\usepackage[final]{pdfpages}
\begin{document}
\includepdf[pages=30-40]{yoursource.pdf}
\end{document}

(To include all of the document, you write


\includepdf[pages=-]{yoursource.pdf}

omitting the start and end pages in the optional argument.)


If you want flexible facilities for including or excluding small portions of a file,
consider the comment, version or optional packages.
The comment package allows you to declare areas of a document to be included
or excluded; you make these declarations in the preamble of your file. The command
\includecomment{version-name} declares an environment version-name whose
content will be included in your document, while \excludecomment{version-name}
defines an environment whose content will be excluded from the document. The pack-
age uses a method for exclusion that is pretty robust, and can cope with ill-formed
bunches of text (e.g., with unbalanced braces or \if commands).
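A minimal sketch of the comment package in use (the environment names are
invented):
\usepackage{comment}
\includecomment{finalversion}% these environments are typeset
\excludecomment{draftnote}   % these environments are discarded
...
\begin{draftnote}
  This remark disappears from the formatted document.
\end{draftnote}
\begin{finalversion}
  This text is typeset as normal.
\end{finalversion}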
These FAQs employ the comment package to alter layout between the printed (two-
column) version and the PDF version for browsing; there are narrowversion and
wideversion for the two versions of the file.
version offers similar facilities to comment.sty (i.e., \includeversion and
\excludeversion commands); it’s far “lighter weight”, but is less robust (and in
particular, cannot deal with very large areas of text being included/excluded).
A significant development of version is confusingly called versions (i.e., merely a
plural of the old package name); versions adds a command \markversion{version-name}
which defines an environment that prints the included text, with a clear printed mark
around it.
optional defines a command \opt; its first argument is an ‘inclusion flag’, and its
second is text to be included or excluded. Text to be included or excluded must be
well-formed (nothing mismatched), and should not be too big — if a large body of text
is needed, \input should be used in the argument. The documentation (in the package
file itself) tells you how to declare which sections are to be included: this can be done
in the document preamble, but the documentation also suggests ways in which it can
be done on the command line that invokes LaTeX, or interactively.
And, not least of this style of conditional compilation, verbatim (which should
be available in any distribution) defines a comment environment, which enables the
dedicated user of the source text editor to suppress bits of a LaTeX source file. The
memoir class offers the same environment.
An interesting variation is the xcomment package. This defines an environment
whose body is all excluded, apart from environments named in its argument. So, for
example:
\begin{xcomment}{figure,table}
This text is not included
\begin{figure}
This figure is included
\end{figure}
This is not included, either
\begin{table}
This table also included
\end{table}
...
\end{xcomment}

A further valuable twist is offered by the extract package. This allows you to pro-
duce a “partial copy” of an existing document: the package was developed to permit
production of a “book of examples” from a set of lecture notes. The package documen-
tation shows the following usage:
\usepackage[
active,
generate=foobar,
extract-env={figure,table},
extract-cmd={chapter,section}
]{extract}

which will cause the package to produce a file foobar.tex containing all the figure
and table environments, and the \chapter and \section commands, from the docu-
ment being processed. The new file foobar.tex is generated in the course of an other-
wise ordinary run on the ‘master’ document. The package provides a good number of
other facilities, including (numeric or labelled) ranges of environments to extract, and
an extract environment which you can use to create complete ready-to-run LaTeX
documents with stuff you’ve extracted.
askinclude.sty : macros/latex/contrib/misc/askinclude.sty
comment.sty : macros/latex/contrib/comment
excludeonly.sty : macros/latex/contrib/misc/excludeonly.sty
extract.sty : macros/latex/contrib/extract
memoir.cls: macros/latex/contrib/memoir
optional.sty : macros/latex/contrib/misc/optional.sty
pagesel.sty : Distributed with Heiko Oberdiek’s packages macros/latex/
contrib/oberdiek
pdfpages.sty : macros/latex/contrib/pdfpages
selectp.sty : macros/latex/contrib/misc/selectp.sty
verbatim.sty : Distributed as part of macros/latex/required/tools
version.sty : macros/latex/contrib/misc/version.sty
versions.sty : macros/latex/contrib/versions/versions.sty
xcomment.sty : Distributed as part of macros/latex/contrib/seminar

266 Bits of document from other directories


A common way of constructing a large document is to break it into a set of files (for
example, one per chapter) and to keep everything related to each of these subsidiary
files in a subdirectory.
Unfortunately, TeX doesn’t have a changeable “current directory”, so that all files
you refer to have to be specified relative to the same directory as the main file. Most
people find this counter-intuitive.
It may be appropriate to use the “path extension” technique used in temporary in-
stallations to deal with this problem. However, if there several files with the same name
in your document, such as chapter1/fig1.eps and chapter2/fig1.eps, you’re not
giving TeX any hint as to which you’re referring to when in the main chapter file you
say \input{sect1}; while this is readily soluble in the case of human-prepared files
(just don’t name them all the same), automatically produced files have a way of having
repetitious names, and changing them is a procedure prone to error.
The import package comes to your help here: it defines an \import command that
accepts a full path name and the name of a file in that directory, and arranges things to
“work properly”. So, for example, if /home/friend/results.tex contains
Graph: \includegraphics{picture}
\input{explanation}

then \import{/home/friend/}{results} will include both graph and explanation


as one might hope. A \subimport command does the same sort of thing for a sub-
directory (a relative path rather than an absolute one), and there are corresponding
\includefrom and \subincludefrom commands.
The chapterfolder package provides commands to deal with its (fixed) model of file
inclusion in a document. It provides commands \cfpart, \cfchapter, \cfsection
and \cfsubsection, each of which takes directory and file arguments, e.g.:
\cfpart[pt 1]{Part One}{part1}{part}

which command will issue a ‘normal’ command \part[pt 1]{Part One} and then
input the file part1/part.tex, remembering that part1/ is now the “current folder”.
There are also commands of the form \cfpartstar (which corresponds to a \part*
command).
Once you’re “in” a chapterfolder-included document, you may use \cfinput
to input something relative to the “current folder”, or you may use \input, using
\cfcurrentfolder to provide a path to the file. (There are also \cfcurrentfolderfigure
for a figure/ subdirectory and \cfcurrentfolderlistings for a listings/ sub-
directory.)
Documentation of chapterfolder is in French, but the README in the directory is in
English.
chapterfolder.sty : macros/latex/contrib/chapterfolder
import.sty : macros/latex/contrib/misc/import.sty

267 Version control using RCS, CVS or Subversion


If you use RCS, CVS or Subversion to maintain your (La)TeX documents under ver-
sion control, you may need some mechanism for including the version details in your
document, in such a way that they can be typeset (that is, rather than just hiding them
inside a comment).
The most complete solution for RCS and CVS is to use the (LaTeX) package rcs,
which allows you to parse and display the contents of RCS keyword fields in an ex-
tremely flexible way. The package rcsinfo is simpler, but does most of what you want,
and some people prefer it; it is explicitly compatible with LaTeX2HTML.
If, however, you need a solution which works without using external packages, or
which will work in Plain TeX, then you can use the following minimal solution:

\def\RCS$#1: #2 ${\expandafter\def\csname RCS#1\endcsname{#2}}


\RCS$Revision: 1.409 $ % or any RCS keyword
\RCS$Date: 2006/10/05 07:24:34 $
...
\date{Revision \RCSRevision, \RCSDate}

If you’ve entered the brave new world of subversion, the package svn may be for
you. It has explicit cleverness about dealing with dates:

\documentclass{<foo>}
...
\usepackage{svn}
\SVNdate $Date$
\author{...}
\title{...}
...
\begin{document}
\maketitle
...
\end{document}

will (once subversion has committed a copy of the document) cause \maketitle to use
the date that has been written into the $Date$ keyword.
The alternative is the svninfo package, which has much the same mechanisms as
does svn but with a rather different focus. Svninfo does the date trick that svn per-
forms (controlled by a package option), and can set up page foot-lines using package
fancyhdr. There isn’t much to choose between the two packages: you should read the
packages’ documentation to see which you find best.
rcs.sty : macros/latex/contrib/rcs
rcsinfo.sty : macros/latex/contrib/rcsinfo
svn.sty : macros/latex/contrib/svn
svninfo.sty : macros/latex/contrib/svninfo

268 Makefiles for LaTeX documents
LaTeX is a tricky beast for running make on: the need to instruct LaTeX to run several
times for essentially different reasons (for example, “get the table of contents stable”,
“get the labels stable”, “add the bibliography”, “add the index”) is actually rather diffi-
cult to express in the ‘ordinary’ sort of dependency graph that one constructs for make.
For this reason, the only make-like package on CTAN (for a long time) was latexmk,
which is a Perl script that analyses your LaTeX source for its dependencies, runs Bib-
TeX or makeindex as and when it notices that those programs’ input (parts of the .aux
file, or the .idx file, respectively) has changed, and so on. Latexmk is a fine solution
(and was used in generating printable versions of these FAQs for a long time); it has
recently been upgraded and has many bells and whistles that allow it to operate as if it
were a poor man’s WYSIWYG system.
The texinfo system comes with a utility called texi2dvi, which is capable of “con-
verting” either LaTeX or texinfo files into DVI (or into PDF, using PDFTeX).
A later contribution is the bundle latexmake, which offers a set of make rules that
invoke texi2dvi as necessary.
The curious may examine the rules employed to run the present FAQ through La-
TeX: we don’t present them as a complete solution, but some of the tricks employed
are surely re-usable.
FAQ distribution: help/uk-tex-faq
latexmake: support/latexmake
latexmk : support/latexmk
texi2dvi: Distributed as part of macros/texinfo/texinfo

269 How many pages are there in my document?


Simple documents (those that start at page 1, and don’t have any breaks in their page
numbering until their last page) present no problem to the seeker after this truth. The
number of pages is reported by the lastpage package in its LastPage label.
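For example, a minimal sketch of a “page m of n” footer (fancyhdr is used here
merely for illustration):
\usepackage{lastpage}
\usepackage{fancyhdr}
\pagestyle{fancy}
\cfoot{Page \thepage\ of \pageref{LastPage}}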
For more complicated documents (most obviously, books with frontmatter in a dif-
ferent series of page numbers) this simple approach will not do.
The count1to package defines a label TotalPages; this is the value of its copy of
\count1 (a reserved TeX count register) at the end of the document.
Package totpages defines a label TotPages, but it also makes the register it
uses available as a LaTeX counter, TotPages, which you can also reference via
\theTotPages. Of course, the counter TotPages is asynchronous in the same way
that page numbers are, but snapshots may safely be taken in the output routine.
The memoir class defines two counters lastpage and lastsheet, which are set
(after the first run of a document) to the equivalent of the LastPage label and the
TotalPages labels.
Both count1to and totpages need the support of the everyshi package.
count1to.sty and everyshi.sty : Distributed in macros/latex/contrib/ms
lastpage.sty : macros/latex/contrib/lastpage
memoir.cls: macros/latex/contrib/memoir
totpages.sty : macros/latex/contrib/totpages

270 Including Plain TeX files in LaTeX


LaTeX, though originally based on Plain TeX, does not contain all
of Plain TeX’s commands. Worse, some Plain TeX command names appear in LaTeX,
with different semantics. As a result, special measures need to be taken to allow general
Plain TeX documents (or parts of documents) to be typeset within LaTeX.
The truly reliable way is to translate the Plain TeX commands, to produce an equiv-
alent of the original’s semantics. However, this is not practical in many circumstances,
and for those occasions, the plain package will often come to your aid. The package
defines a plain environment, in which a Plain TeX document may be processed:

\begin{plain}
\input{plain-doc}
\end{plain}

The package is known to fail, for example, with documents that use AMSTeX; no doubt
it would also fail if asked to load Eplain. All these things can be overcome (although
it’s not often easy), but the environment saves a lot of work on many occasions.
plain.sty : Distributed as part of macros/latex/contrib/carlisle

Q.7 Hyphenation
271 My words aren’t being hyphenated
Let’s assume you’ve selected the right TeX ‘language’ — as explained in “how hyphen-
ation works”, you’re not likely to get the correct results typesetting one language using
the hyphenation rules of another. (Select the proper language, using babel if you’re a
LaTeX user. This may reveal that you need another set of hyphenation patterns; see
“using a new language” for advice on how to install it.)
So what else can go wrong?
• Since TeX version 3.0, the limits on how near to either end of a word hyphenation
may take place have been programmable (see “weird hyphenation”), and for some
reason the values in question may have been corrupted in some macros you are
using. TeX won’t hyphenate less than \lefthyphenmin characters after the start
of a word, nor less than \righthyphenmin before the end of a word; thus it won’t
hyphenate a word shorter than the sum of the two minima, at all. For example,
since the minima are 2 and 3 for English, TeX won’t hyphenate a word shorter
than 5 letters long, if it believes the word to be English.
• TeX won’t hyphenate a word that’s already been hyphenated. For example, the
(caricature) English surname Smyth-Postlethwaite wouldn’t hyphenate, which
could be troublesome. This is correct English typesetting style (it may not be
correct for other languages), but if needs must, you can replace the hyphen in the
name with a \hyph command, defined
\def\hyph{-\penalty0\hskip0pt\relax}
This is not the sort of thing this FAQ would ordinarily recommend. . . The
hyphenat package defines a bundle of such commands (for introducing hyphen-
ation points at various punctuation characters).
• There may be accents in the word. The causes of and remedies for this effect are
discussed in accents and hyphens.
• The hyphenation may simply not have been spotted; while TeX’s algorithm is
good, it’s not infallible, and it does miss perfectly good hyphenations in some
languages. When this happens, you need to give TeX explicit instructions on how
to hyphenate.
The \hyphenation command allows you to give explicit instructions. Provided that
the word will hyphenate at all (that is, it is not prevented from hyphenating by any of
the other restrictions above), the command will override anything the hyphenation pat-
terns might dictate. The command takes one or more hyphenated words as argument —
\hyphenation{ana-lysis pot-able}; note that (as here, for analysis) you can use
the command to overrule TeX’s choice of hyphenation (ana-lysis is the British etymo-
logical hyphenation; some feel the American hyphenation is ‘unfortunate’. . . ).
hyphenat.sty : macros/latex/contrib/hyphenat

272 Weird hyphenation of words


If your words are being h-yphenated, like this, with jus-t single letters at the beginning
or the end of the word, you may have a version mismatch problem. TeX’s hyphen-
ation system changed between version 2.9 and 3.0, and macros written for use with
version 2.9 can have this effect with a version 3.0 system. If you are using Plain TeX,
make sure your plain.tex file has a version number which is at least 3.0, and rebuild
your format. If you are using LaTeX 2.09 your best plan is to upgrade to LaTeX 2ε . If
for some reason you can’t, the last version of LaTeX 2.09 (released on 25 March 1992)
is still available (for the time being at least) and ought to solve this problem.
If you’re using LaTeX 2ε , the problem probably arises from your hyphen.cfg file,
which has to be created if you’re using a multi-lingual version.
A further source of oddity can derive from the 1995 release of Cork-encoded fonts,
which introduced an alternative hyphen character. The LaTeX 2ε configuration files
in the font release specified use of the alternative hyphen, and this could produce odd
effects with words containing an explicit hyphen. The font configuration files in the
December 1995 release of LaTeX 2ε do not use the alternative hyphen character, and
therefore removed this source of problems; the solution, again, is to upgrade your
LaTeX.
LaTeX 2.09: obsolete/macros/latex209/distribs/latex209.tar.gz
plain.tex : macros/plain/base

273 (Merely) peculiar hyphenation


You may have found that TeX’s famed automatic word-division does not produce the
break-points recommended by your dictionary. This may be because TeX is set up
for American English, whose rules for word division (as specified, for example, in
Webster’s Dictionary) are completely different from the British ones (as specified, for
example, in the Oxford Dictionaries). This problem is being addressed by the UK TeX
User community (see Baskerville, issue 4.4) but an entirely satisfactory solution will
take time; the current status is to be found on CTAN (see “using a new language” for
instructions on adding this new “language”).
UK patterns: language/hyphenation/ukhyphen.tex

274 Accented words aren’t hyphenated


TeX’s algorithm for hyphenation gives up when it encounters an \accent command;
there are good reasons for this, but it means that quality typesetting in non-English
languages can be difficult.
For TeX macro packages, you can avoid the effect by using an appropriately
encoded font (for example, a Cork-encoded font — see the EC fonts) which contains
accented letters as single glyphs. LaTeX users can achieve this end simply by adding
the command
\usepackage[T1]{fontenc}
to the preamble of their document. Other encodings (notably LY1, once promoted by
Y&Y inc) may be used in place of T1. Indeed, most current 8-bit TeX font encodings
will ‘work’ with the relevant sets of hyphenation patterns.
One might hope that, with the many aspirant successors to TeX such as Omega,
LuaTeX and ExTeX, all of which base their operations on Unicode, the whole
basis of encodings will change.
275 Using a new language with Babel
Babel is capable of working with a large range of languages, and a new user often wants
to use a language that her TeX installation is not set up to employ. Simply asking Babel
to use the language, with the command
\usepackage[catalan]{babel}
provokes the warning message
Package babel Warning: No hyphenation patterns were loaded for
(babel) the language ‘Catalan’
(babel) I will use the patterns loaded for \language=0 instead.
The problem is that your TeX system doesn’t know how to hyphenate Catalan text:
you need to tell it how before Babel can do its work properly. To do this, for LaTeX
installations, one needs to change language.dat (which is part of the Babel installa-
tion); it will contain a line
%catalan cahyphen.tex
which, if you remove the comment marker, is supposed to instruct LaTeX to load Cata-
lan hyphenation patterns when you tell it to build a new format.
Unfortunately, in many Babel distributions, the line just isn’t right — you need to
check the name of the file containing the patterns you’re going to use. As you can see,
in the author’s system, the name is supposed to be cahyphen.tex; however the file
actually present on the system is cahyph.tex — fortunately, the error should prove
little more than an inconvenience (most of the files are in better distributions anyway,
but an elusive one may be found on CTAN; if you have to retrieve a new file, ensure
that it’s correctly installed, for which see installing a new package).
Finally, you need to regenerate the formats used (in fact, most users of Babel are
using it in their LaTeX documents, so regenerating the LaTeX-related formats will
ordinarily be enough; however, the author always generates the lot, regardless).
teTeX It’s possible to do the whole operation in one go, by using the texconfig com-
mand:
texconfig hyphen latex
which first enters an editor for you to edit language.dat, and then regenerates
the format you specify (latex in this case).
Otherwise, to regenerate all formats, do:
fmtutil --all
If you’re willing to think through what you’re doing (this is not for the faint-
hearted), you can select a sequence of formats and for each one, run:
fmtutil --byfmt <formatname>
where formatname is something like ‘latex’, or:
fmtutil --byhyphen <hyphenfile>
where hyphenfile is the file specifying hyphenation to the format — usually
language.dat
MiKTeX On a MiKTeX distribution earlier than v2.0, do:
Start→Programs→MiKTeX→Maintenance→Create all format files
or get a DOS window and run:
initexmf --dump
On a MiKTeX distribution v2.0 or later, the whole procedure can be done via the
GUI. To select the new language, do:
Start→Programs→MiKTeX 2→MiKTeX Options, and select the Languages
tab. Select your language from the list, press the Apply button, and then the OK
button. Then select the General tab and press the Update Now button.
Otherwise, edit the language.dat file (as outlined above), and then run:
initexmf --dump
just as for a pre-v2.0 system.
Caveat: It is (just) possible that your TeX system may run out of “pattern memory”
while generating the new format. Most TeX implementations have fixed-size arrays for
storing the details of hyphenation patterns, but although their size is adjustable in most
modern distributions, actually changing the size is a fiddle. If you do find you’ve run
out of memory, it may be worth scanning the list of languages in your language.dat
to see whether any could reasonably be removed.
babel: macros/latex/required/babel
hyphenation patterns: language/hyphenation

276 Stopping all hyphenation


It may seem an odd thing to want to do (after all, one of TeX’s great advertised virtues
is the quality of its hyphenation) but it’s sometimes necessary. The real problem is,
that the quality of TeX’s output is by default largely dependent on the presence of
hyphenation; if you want to abandon hyphenation, something has to give.
TeX (slightly confusingly) offers four possible mechanisms for suppressing hy-
phenation (there were only two prior to the extensions that arrived with TeX version 3).
First, one can set the hyphenation penalties \hyphenpenalty and \exhyphenpenalty
to an ‘infinite’ value (that is to say, 10000). This means that all hyphenations will suf-
ficiently penalise the line that would contain them, that the hyphenation won’t happen.
The disadvantage of this method is that TeX will re-evaluate any paragraph for which
hyphenations might help, which will slow TeX down.
Second, one can select a language for which no hyphenation patterns exist. Some
distributions create a language nohyphenation, and the hyphenat package uses this
technique for its \nohyphens command which sets its argument without any hyphen-
ation.
Third, one can set \left- and/or \righthyphenmin to a sufficiently large value
that no hyphenation could possibly succeed, since the minimum is larger than the length
of the longest word TeX is willing to hyphenate (the appropriate value is 62).
Fourth, one can suppress hyphenation for all text using the current font by the
command
\hyphenchar\font=-1

This isn’t a particularly practical way for users to suppress hyphenation — the com-
mand has to be issued for every font the document uses — but it’s how LaTeX itself
suppresses hyphenation in tt and other fixed-width fonts.
Which of the techniques you should use depends on what you actually want to do.
If the text whose hyphenation is to be suppressed runs for less than a paragraph, your
only choice is the no-hyphens language: the language value is preserved along with
the text (in the same way that the current font is); the values for penalties and hyphen
minima active at the end of a paragraph are used when hyphenation is calculated.
Contrariwise, if you are writing a multilanguage document using the babel package,
you cannot suppress hyphenation throughout using either the no-hyphens language or
the hyphen minima: all those values get changed at a babel language switch: use the
penalties instead.
If you simply switch off hyphenation for a good bit of text, the output will have a
jagged edge (with many lines seriously overfull), and your (La)TeX run will bombard
you with warnings about overfull and underfull lines. To avoid this you have two
options. You may use \sloppy (or its environment version sloppypar), and have
TeX stretch what would otherwise be underfull lines to fill the space offered, and wrap
other lines, while prematurely wrapping overfull lines and stretching the remainder.
Alternatively, you may set the text ragged right, and at least get rid of the overfull lines;
this technique is ‘traditional’ (in the sense that typists do it) and may be expected to
appeal to the specifiers of eccentric document layouts (such as those for dissertations),
but for once their sense conforms with typographic style. (Or at least, style constrained
in this curious way.)
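Two minimal sketches of the advice above:
% a short passage: the hyphenat package's \nohyphens
\usepackage{hyphenat}% in the preamble
...
\nohyphens{a phrase that must not be hyphenated}
% a longer stretch: infinite penalties, and set the text ragged right
{\raggedright
  \hyphenpenalty=10000
  \exhyphenpenalty=10000
  ... the unhyphenated text ...
\par}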
hyphenat.sty : macros/latex/contrib/hyphenat

277 Preventing hyphenation of a particular word


It’s quite possible for (any) hyphenation of a particular word to seem “completely
wrong”, so that you want to prevent it being hyphenated.
If the word occurs in just one place, put it in a box:
\mbox{oddword}

(Plain TeX users should use \hbox, and take care at the start of paragraphs.) However,
boxing the word is not really advisable unless you are sure it only occurs once.
If the word occurs commonly, the best choice is to assert a non-hyphenation for it:
\hyphenation{oddword}

This hyphenation exception (with no break points) will be used in preference to what
TeX’s hyphenation algorithm may come up with.
In a multilingual document, repeat the exception specification for each language
the word may appear in. So:
\usepackage[french,english]{babel}
\selectlanguage{english}
\hyphenation{oddword}
\selectlanguage{french}
\hyphenation{oddword}

(note that babel will select the default language for the document — English, in this
case — at \begin{document}.)
278 Hyphenation exceptions
While TeX’s hyphenation rules are good, they’re not infallible: you will occasionally
find words TeX just gets wrong. So for example, TeX’s default hyphenation rules (for
American English) don’t know the word “manuscript”, and since it’s a long word you
may find you need to hyphenate it. You can “write the hyphenation out” each time you
use the word:
... man\-u\-script ...

Here, each of the \- commands is converted to a hyphenated break, if (and only if )


necessary.
That technique can rapidly become tedious: you’ll probably only accept it if there
are no more than one or two wrongly-hyphenated words in your document. The alterna-
tive is to set up hyphenations in the document preamble. To do that, for the hyphenation
above, you would write:
\hyphenation{man-u-script}

and the hyphenation would be set for the whole document. Barbara Beeton publishes
articles containing lists of these “hyphenation exceptions”, in TUGboat; the hyphen-
ation ‘man-u-script’ comes from one of those articles.
What if you have more than one language in your document? Simple: select the
appropriate language, and do the same as above:
\usepackage[french]{babel}
\selectlanguage{french}
\hyphenation{re-cher-cher}

(nothing clever here: this is the “correct” hyphenation of the word, in the current ta-
bles). However, there’s a problem here: just as words with accents in them won’t break,
so \hyphenation commands with accents in them produce an error:

\usepackage[french]{babel}
\selectlanguage{french}
\hyphenation{r\’e-f\’e-rence}

tells us that the hyphenation is “improper”, and that it will be “flushed”. But, just as
hyphenation of words is enabled by selecting an 8-bit font encoding, so \hyphenation
commands are rendered proper again by selecting that same 8-bit font encoding. For
the hyphenation patterns provided in the usual distributions, the encoding is Cork, so
the complete sequence is:

\usepackage[T1]{fontenc}
\usepackage[french]{babel}
\selectlanguage{french}
\hyphenation{r\’e-f\’e-rence}

The same sort of performance goes for any language for which 8-bit fonts and
corresponding hyphenation patterns are available. Since you have to select both the
language and the font encoding to have your document typeset correctly, it should not
be a great imposition to do the selections before setting up hyphenation exceptions.

Q.8 Odds and ends


279 Typesetting all those TeX-related logos
Knuth was making a particular point about the capabilities of TeX when he defined the
logo. Unfortunately, many believe, he thereby opened floodgates to give the world a
whole range of rather silly ‘bumpy road’ logos such as AMSTeX, PiCTeX, BibTeX, and
so on, produced in a flurry of different fonts, sizes, and baselines — indeed, everything
one might hope to cause them to obstruct the reading process. In particular, Lamport
invented LaTeX (silly enough in itself) and marketing input from Addison-Wesley led
to the even stranger current logo LaTeX 2ε .
Sensible users don’t have to follow this stuff wherever it goes, but, for those who
insist, a large collection of logos is defined in the texnames package (but note that this
set of macros isn’t entirely reliable in LaTeX 2ε ). The MetaFont and MetaPost logos
can be set in fonts that LaTeX 2ε knows about (so that they scale with the surrounding
text) using the mflogo package; but be aware that booby-traps surround the use of
the Knuthian font for MetaPost (you might get META O T). You needn’t despair,
however — the author himself uses just ‘MetaPost’.
For those who don’t wish to acquire the ‘proper’ logos, the canonical thing to do
is to say AMS-\TeX{} (AMS-TeX) for AMSTeX, Pic\TeX{} (PicTeX) for PiCTeX,
Bib\TeX{} (BibTeX) for BibTeX, and so on.
mflogo.sty : macros/latex/contrib/mflogo
texnames.sty : info/biblio/texnames.sty

280 How to do bold-tt or bold-sc


LaTeX, as delivered, offers no means of handling bold “teletype” or small-caps fonts.
There’s a practical reason for this (Knuth never designed such fonts), but there are
typographical considerations too (the “medium weight” cmtt font is already pretty

bold (by comparison with other fixed-width fonts), and bold small-caps is not popular
with many professional typographers).
There’s a set of “extra” MetaFont files on CTAN that provide bold versions of
both cmtt and cmcsc (the small caps font). With modern TeX distributions, one may
bring these fonts into use simply by placing them in an appropriate place in the texmf
tree (these are (La)TeX-specific files, so the “public” supplier would be an appropriate
place). Once you’ve rebuilt the file indexes as necessary, TeX (and friends) will auto-
matically build whatever font files they need when you first make reference to them.
There’s a jiffy package bold-extra that builds the necessary font data structures so that
you can use the fonts within LaTeX.
Another alternative is to use the EC fonts, which come with bold variants of the
small-caps fonts.
If you need to use Type 1 fonts, you can’t proceed with Knuth-style fonts, since
there are no Type 1 versions of the mf-extra set. There are, however, Type 1 distribu-
tions of the EC fonts, so you can switch to EC and use them; alternatives are discussed
in 8-bit Type 1 fonts.
Of course, commercial fixed-width fonts (even the default Courier) almost always
come with a bold variant, so that’s not a problem. Furthermore PSNFSS will usually
provide “faked” small caps fonts, and has no compunctions about providing them in
a bold form. Courier is (as we all know, to our cost) freely available; a far more
presentable monospace font is LuxiMono, which is also freely available (monospace
text in the typeset version of this FAQ uses LuxiMono, with the metrics and LaTeX
support available on the archive).
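A minimal sketch of the bold-extra route (assuming the extra MetaFont sources,
mentioned above, are installed):
\usepackage{bold-extra}
...
\textbf{\texttt{bold typewriter text}} and
\textbf{\textsc{bold small caps}}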
bold-extra.sty : macros/latex/contrib/misc/bold-extra.sty
bold tt and small caps fonts: fonts/cm/mf-extra/bold
LuxiMono fonts: fonts/LuxiMono

281 Automatic sizing of minipage


The minipage environment requires you to specify the width of the “page” you’re
going to create. This is sometimes inconvenient: you would like to occupy less space,
if possible, but minipage sets a box that is exactly the width you specified.
The pbox package defines a \pbox whose width is exactly that of the longest en-
closed line, subject to a maximum width that you give it. So while \parbox{2cm}
{Hello\\world!} produces a box of width exactly 2cm, \pbox{2cm}{Hello\\world!}
produces one whose width is 1.79cm (if one’s using the default cmr font for the text, at
least). The package also provides a \settominwidth[min]{length}{text} (which
looks (almost) like the standard \settowidth command), and a \widthofpbox func-
tion analogous to the \widthof command for use with the calc package.
The eqparbox package extends pbox’s idea, by allowing you to set a series of boxes,
all with the same (minimised) width. (Note that it doesn’t accept a limiting maximum
width parameter.) The package documentation shows the following example drawn
from a joke curriculum vitae:
\noindent%
\eqparbox{place}{\textbf{Widgets, Inc.}} \hfill
\eqparbox{title}{\textbf{Senior Widget Designer}} \hfill
\eqparbox{dates}{\textbf{1/95--present}}

...

\noindent%
\eqparbox{place}{\textbf{Thingamabobs, Ltd.}} \hfill
\eqparbox{title}{\textbf{Lead Engineer}} \hfill
\eqparbox{dates}{\textbf{9/92--12/94}}

The code makes the three items on each of the heading lines have exactly the same
width, so that the lines as a whole produce a regular pattern down the page. A command
\eqboxwidth allows you to use the measured width of a group: the documentation
shows how the command may be used to produce sensible-looking columns that mix
c-, r- or l-rows, with the equivalent of a p{...} entry, by making the fixed-width rows
an eqparbox group, and making the last from a \parbox using the width that’s been
measured for the group.
The varwidth package defines a varwidth environment which sets the content of
the box to match a “narrower natural width” if it finds one. (You give it the same param-
eters as you would give minipage: in effect, it is a ‘drop-in’ replacement.) Varwidth
provides its own ragged text command: \narrowragged, which aims to make narrower
lines and to put more text in the last line of the paragraph (thus producing lines with
more nearly equal lengths than typically happens with \raggedright itself).
The documentation (in the package file) lists various restrictions and things still to
be done, but the package is already proving useful for a variety of jobs.
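A small example (a sketch; the \fbox is there only to show that the box shrinks to its natural width):

\usepackage{varwidth}
...
\fbox{%
  \begin{varwidth}{5cm}
    a couple of\\short lines
  \end{varwidth}%
}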
eqparbox.sty : macros/latex/contrib/eqparbox
pbox.sty : macros/latex/contrib/pbox
varwidth.sty : macros/latex/contrib/misc/varwidth.sty

R Symbols, etc.
282 Symbols for the number sets
It is a good idea to have commands such as \R for the real numbers and other standard
number sets. Traditionally these were typeset in bold. Because, in the ordinary course
of events, mathematicians do not have access to bold chalk, they invented the special
symbols that are now often used for \R, \C, etc. These symbols are known as “black-
board bold”. Before insisting on using them, consider whether going back to the old
system of ordinary bold might not be acceptable (it is certainly simpler).
A set of blackboard bold capitals is available in the AMS msbm fonts (msbm is
available at a range of design sizes, with names such as msbm10). The pair of font
families (the other is called msam) have a large number of mathematical symbols to
supplement the ones in the standard TeX distribution, and are available in Type 1 format
with most modern distributions. Support files for using the fonts, both under Plain TeX
and LaTeX (packages amssymb and amsfonts), are available. The font shape is a rather
austere sans, which many people don’t like (though it captures the essence of quickly-
chalked writing rather well).
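For example, a sketch using the amssymb support mentioned above:

\usepackage{amssymb}
\newcommand{\R}{\mathbb{R}}
\newcommand{\C}{\mathbb{C}}
...
$f\colon \R \to \C$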
The bbold family is set of blackboard bold fonts written in MetaFont. This set offers
blackboard bold forms of lower-case letters; the font source directory also contains
sources for a LaTeX package that enables use of the fonts. The fonts are not available
in Type 1 format.
The bbm family claims to provide ‘blackboard’ versions of most of the cm fonts . . .
including the bold and bold-extended series. Again, the fonts are designed in MetaFont
and are not available in Type 1 format. LaTeX macro support comes from a package
by Torsten Hilbrich.
The doublestroke family comes in just roman and sans shapes, at a single weight,
and is available both as MetaFont sources and as Type 1; the font covers the uppercase
Latin letters, lowercase ‘h’ and ‘k’, and the digit ‘1’.
An alternative source of Type 1 fonts with blackboard bold characters may be found
in the steadily increasing set of complete families, both commercial and free, that have
been prepared for use with (La)TeX (see choice of outline fonts). Of the free sets, the
txfonts and pxfonts families both come with replicas of msam and msbm (though, as
noted elsewhere, there are other reasons not to use these fonts). The mathpazo family
includes a “mathematically significant” choice of blackboard bold characters, and the
fourier fonts contain blackboard bold upper-case letters, the digit ‘1’, and lower-case
‘k’.
The “lazy person’s” blackboard bold macros:

\newcommand{\R}{{\sf R\hspace*{-0.9ex}%
\rule{0.15ex}{1.5ex}\hspace*{0.9ex}}}
\newcommand{\N}{{\sf N\hspace*{-1.0ex}%
\rule{0.15ex}{1.3ex}\hspace*{1.0ex}}}
\newcommand{\Q}{{\sf Q\hspace*{-1.1ex}%
\rule{0.15ex}{1.5ex}\hspace*{1.1ex}}}
\newcommand{\C}{{\sf C\hspace*{-0.9ex}%
\rule{0.15ex}{1.3ex}\hspace*{0.9ex}}}

are almost acceptable at normal size if the surrounding text is cmr10. However, they are
not part of a proper maths font, and so do not work in sub- and superscripts. Moreover,
the size and position of the vertical bar can be affected by the font of the surrounding
text. As we’ve seen, there are plenty of alternatives: don’t try the macros, or anything
similar using capital ‘I’ (which looks even worse!).
AMS support files (Plain): fonts/amsfonts/plaintex
AMS support files (LaTeX): fonts/amsfonts/latex
AMS symbol fonts: fonts/amsfonts/sources/symbols
AMS symbol fonts in Type 1 format: Browse fonts/amsfonts/ps-type1
bbm fonts: fonts/cm/bbm
bbm macros: macros/latex/contrib/bbm
bbold fonts: fonts/bbold
doublestroke fonts: fonts/doublestroke
fourier fonts: fonts/fourier-GUT
mathpazo fonts: fonts/mathpazo
pxfonts: fonts/pxfonts
txfonts: fonts/txfonts

283 Better script fonts for maths


The font selected by \mathcal is the only script font ‘built in’. However, there are
other useful calligraphic fonts included with modern TeX distributions.

Euler The eucal package (part of most sensible TeX distributions; the fonts are part
of the AMS font set) gives a slightly curlier font than the default. The package
changes the font that is selected by \mathcal.
Type 1 versions of the fonts are available in the AMS fonts distribution.
RSFS The mathrsfs package uses a really fancy script font (the name stands for “Ralph
Smith’s Formal Script”) which is already part of most modern TeX distributions.
The package creates a new command \mathscr.
Type 1 versions of the font have been made available by Taco Hoekwater.
Zapf Chancery is the standard PostScript calligraphic font. There is no package but
you can easily make it available by means of the command
\DeclareMathAlphabet{\mathscr}{OT1}{pzc}%
{m}{it}
in your preamble. You may find the font rather too big; if so, you can use a scaled
version of it like this:
\DeclareFontFamily{OT1}{pzc}{}
\DeclareFontShape{OT1}{pzc}{m}{it}%
{<-> s * [0.900] pzcmi7t}{}
\DeclareMathAlphabet{\mathscr}{OT1}{pzc}%
{m}{it}
Adobe Zapf Chancery (which the above examples use) is distributed in any but
the most basic PostScript printers. A substantially identical font (to the extent
that the same metrics may be used) is available from URW and is distributed with
ghostscript.
Examples of the available styles are available on CTAN.
eucal.sty : fonts/amsfonts/latex/eucal.sty
euler fonts: fonts/amsfonts/sources/euler
euler fonts, in Type 1 format: fonts/amsfonts/ps-type1
ghostscript: Browse nonfree/support/ghostscript
mathrsfs.sty : Distributed as part of macros/latex/contrib/jknappen
rsfs fonts: fonts/rsfs
rsfs fonts, in Type 1 format: fonts/rsfs/ps-type1/hoekwater
Script font examples: info/symbols/math/scriptfonts.pdf
284 Setting bold Greek letters in LaTeX
The issue here is complicated by the fact that \mathbf (the command for setting bold
text in TeX maths) affects a select few mathematical symbols (the uppercase Greek
letters); lower-case Greek letters behave differently (a consequence of Knuth’s esoteric
font encoding decisions). Moreover, \mathbf can’t be used even for upper-case Greek
letters in the AMSLaTeX amsmath package, which disables this font-switching; you
must then use one of the techniques outlined below.
The Plain TeX solution does work, in a limited way:
{\boldmath$\theta$}

but \boldmath may not be used in maths mode, so this ‘solution’ requires arcana such
as:
$... \mbox{\boldmath$\theta$} ...$

which then causes problems in superscripts, etc.


These problems may be addressed by using a bold mathematics package.
• The bm package, which is part of the LaTeX tools distribution, defines a command
\bm which may be used anywhere in maths mode.
• The amsbsy package (which is part of AMSLaTeX) defines a command \boldsymbol,
which (though slightly less comprehensive than \bm) covers almost all common
cases.
Both of these solutions cover all mathematical symbols, not merely Greek letters.
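For example (a sketch; \bm works anywhere in maths mode, while \boldsymbol needs amsbsy or amsmath):

\usepackage{bm}
...
$\bm{\theta} = (\theta_1, \dots, \theta_n)$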
bm.sty : Distributed as part of macros/latex/required/tools
amsbsy.sty : Distributed as part of AMSLaTeX macros/latex/required/
amslatex
amsmath.sty : Distributed as part of AMSLaTeX macros/latex/required/
amslatex

285 The Principal Value Integral symbol


This symbol (an integral sign, ‘crossed’) does not appear in any of the fonts ordinarily
available to (La)TeX users, but it can be created by use of the following macros:
\def\Xint#1{\mathchoice
{\XXint\displaystyle\textstyle{#1}}%
{\XXint\textstyle\scriptstyle{#1}}%
{\XXint\scriptstyle\scriptscriptstyle{#1}}%
{\XXint\scriptscriptstyle\scriptscriptstyle{#1}}%
\!\int}
\def\XXint#1#2#3{{\setbox0=\hbox{$#1{#2#3}{\int}$}
\vcenter{\hbox{$#2#3$}}\kern-.5\wd0}}
\def\ddashint{\Xint=}
\def\dashint{\Xint-}
\dashint gives a single-dashed integral sign, \ddashint a double-dashed one.
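With those definitions in place, limits attach just as they do for \int; for example:

$\dashint_0^1 f(x)\,dx \qquad \ddashint_\Gamma g(z)\,dz$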

286 How to use the underscore character


The underscore character _ is ordinarily used in TeX to indicate a subscript in maths
mode; if you type _ in the course of ordinary text, TeX will complain. If you’re writing
a document which will contain a large number of underscore characters, the prospect
of typing \_ (or, worse, \textunderscore) for every one of them will daunt most
ordinary people.
Moderately skilled macro programmers can readily generate a quick hack to per-
mit typing _ to mean ‘text underscore’. However, the code is somewhat tricky, and
more importantly there are significant points where it’s easy to get it wrong. There is
therefore a package underscore which provides a general solution to this requirement.
There is a problem, though: OT1 text fonts don’t contain an underscore character,
unless they’re in the typewriter version of the encoding (used by fixed-width fonts such
as cmtt). So either you must ensure that your underscore characters only occur in text
set in a typewriter font, or you must use a fuller encoding, such as T1, which has an
underscore character in every font.
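A sketch of the underscore-package route (the file name is hypothetical; the T1 encoding supplies the glyph outside typewriter fonts):

\usepackage[T1]{fontenc}
\usepackage{underscore}
...
the data live in stats_2006_final.csv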
If the requirement is only for occasional uses of underscores, it may be acceptable
to use the following construct:
\def\us{\char`\_}
...
\texttt{create\us process}

The construction isn’t in the least robust (in the normal English sense of the word), but
it is robust under expansion (i.e., the LaTeX sense of the word); so use it with care, but
don’t worry about section headings and the like.
underscore.sty : macros/latex/contrib/misc/underscore.sty

287 How to type an ‘@’ sign?


Long ago, some packages used to make the ‘@’ sign active, so that special measures
were needed to type it. While those packages are still in principle available, few people
use them, and those that do use them have ready access to rather good documentation.
Ordinary people (such as the author of this FAQ) need only type ‘@’.
288 Typesetting the Euro sign
The European currency “Euro” is represented by a symbol of somewhat dubious de-
sign, but it’s an important currency and (La)TeX users need to typeset it.
Note that the Commission of the European Community at first deemed that the
Euro symbol should always be set in a sans-serif font; fortunately, this eccentric ruling
has now been rescinded, and one may apply best typesetting efforts to making it appear
at least slightly “respectable” (typographically).
The TS1-encoded fonts provided as part of the EC font distribution provide Euro
glyphs. The fonts are called Text Companion (TC) fonts, and offer the same range
of faces as do the EC fonts themselves. The textcomp package provides a \texteuro
command for accessing the symbol, which selects a symbol to match the surrounding
text. The design of the symbol in the TC fonts is not universally loved. . . Nevertheless,
use the TC font version of the symbol if you are producing documents using Knuth’s
Computer Modern Fonts.
The latin9 input encoding defined by the inputenc package has a euro character
defined (character position 164, occupied in other ISO Latin character sets by the “cur-
rency symbol”). The encoding uses the command \texteuro for the character; at
present that command is only available from the textcomp package. There is a Mi-
crosoft code page position, too, but standardisation of such things proceeds via rather
different routes and the LaTeX project hasn’t yet been given details of the change.
Outline fonts which contain nothing but Euro symbols are available (free) from
Adobe — the file is packaged as a Windows self-extracting executable, but it may be
decoded as a .zip format archive on other operating systems. The euro bundle contains
metrics, dvips map files, and macros (for Plain TeX and LaTeX), for using these fonts in
documents. LaTeX users will find two packages in the bundle: eurosans only offers the
sans-serif version (to conform with the obsolete ruling about sans-serif-only symbols;
the package provides the command \euro), whereas europs matches the Euro symbol
with the surrounding text (providing the command \EUR). To use either package with
the latin9 encoding, you need to define \texteuro as an alias for the euro command
the package defines.
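For example, a sketch for the europs route (it assumes textcomp is not also loaded, since textcomp already defines \texteuro):

\usepackage[latin9]{inputenc}
\usepackage{europs}
\newcommand{\texteuro}{\EUR}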
The Adobe fonts are probably the best bet for use in non-Computer Modern en-
vironments. They are apparently designed to fit with Adobe Times, Helvetica and
Courier, but can probably fit with a wider range of modern fonts.
The eurofont package provides a compendious analysis of the “problem of the
euro symbol” in its documentation, and offers macros for configuring the source of
the glyphs to be used; however, it seems rather large for everyday use.
The euro-ce bundle is a rather pleasing MetaFont-only design providing Euro sym-
bols in a number of shapes. The file euro-ce.tex, in the distribution, offers hints as
to how a Plain TeX user might make use of the fonts.
Euro symbols are found in several other places, which we list here for complete-
ness.
The marvosym fonts contain a Euro symbol among many other good things.
Other MetaFont-based bundles containing Euro symbols are to be found in china2e
(whose primary aim is Chinese dates and suchlike matters) and the eurosym fonts.
china2e bundle: macros/latex/contrib/china2e
EC fonts: fonts/ec

euro fonts: fonts/euro
euro-ce fonts: fonts/euro-ce
eurofont.sty : macros/latex/contrib/eurofont
eurosym fonts: fonts/eurosym
marvosym fonts: fonts/psfonts/marvosym
textcomp.sty : Part of the LaTeX distribution.

289 How to get copyright, trademark, etc.


The “Comprehensive symbol list” lists the symbol com-
mands \textcopyright, \textregistered and \texttrademark, which are avail-
able in TS1-encoded fonts, and which are enabled using the textcomp package.
In fact, all three commands are enabled in default LaTeX, but the glyphs you get
aren’t terribly beautiful. In particular, \textregistered behaves oddly when included
in bold text (for example, in a section heading), since it is composed of a small-caps
letter, which typically degrades to a regular shape letter when asked to set in a bold
font. This means that the glyph becomes a circled “r”, whereas the proper symbol is a
circled “R”.
This effect is of course avoided by use of textcomp.
Another problem arises if you want \textregistered in a superscript position (to
look similar to \texttrademark). Using a maths-mode superscript to do this provokes
lots of pointless errors: you must use

\textsuperscript{\textregistered}

S Macro programming
S.1 “Generic” macros and techniques
290 Finding the width of a letter, word, or phrase
Put the word in a box, and measure the width of the box. For example,

\newdimen\stringwidth
\setbox0=\hbox{hi}
\stringwidth=\wd0

Note that if the quantity in the \hbox is a phrase, the actual measurement only approx-
imates the width that the phrase will occupy in running text, since the inter-word glue
can be adjusted in paragraph mode.
The same sort of thing is expressed in LaTeX by:

\newlength{\gnat}
\settowidth{\gnat}{\textbf{small}}

This sets the value of the length command \gnat to the width of “small” in bold-face
text.
291 Patching existing commands
In the general case (possibly sticking something in the middle of an existing command)
this is difficult. However, the common requirement, to add some code at the beginning
or the end of an existing command, is conceptually quite easy. Suppose we want to
define a version of a command that does some small extension of its original definition:
we would naturally write:

\renewcommand{\splat}{\mumble\splat}

However, this would not work: a call to \splat would execute \mumble, and the call
the redefined \splat again; this is an infinite recursive loop, that will quickly exhaust
TeX’s memory.
Fortunately, the TeX primitive \let command comes to our rescue; it allows us
to take a “snapshot” of the current state of a command, which we can then use in the
redefinition of the command. So:

\let\OldSmooth\smooth
\renewcommand{\smooth}{\mumble\OldSmooth}

effects the required patch, safely. Adding things at the end of a command works simi-
larly. If \smooth takes arguments, one must pass them on:

\renewcommand{\smooth}[2]{\mumble\OldSmooth{#1}{#2}}

The general case may be achieved in two ways. First, one can use the LaTeX
command \CheckCommand; this compares an existing command with the definition
you give it, and issues a warning if the two don’t match. Use is therefore:

\CheckCommand{\complex}{<original definition>}
\renewcommand{\complex}{<new definition>}

This technique is obviously somewhat laborious, but if the original command comes
from a source that’s liable to change under the control of someone else, it does at least
warn you that your patch is in danger of going wrong.
Otherwise, LaTeX users may use one of the patch or patchcmd systems.
Patch gives you an ingenious (and difficult to understand) mechanism, and comes
as an old-style LaTeX documented macro file. Sadly the old-style doc macros are no
longer available, but the file (patch.doc) may be input directly, and the documentation
may be read (un-typeset). Roughly speaking, one gives the command a set of instruc-
tions analogous to sed substitutions, and it regenerates the command thus amended.
The author of this FAQ has (slightly reluctantly) given up using patch. . .
The patchcmd package addresses a slightly simpler task, by restricting the set
of commands that you may patch; you mayn’t patch any command that has an
optional argument, though it does deal with the case of commands defined with
\DeclareRobustCommand. The package defines a \patchcommand command, that
takes three arguments: the command to patch, stuff to stick at the front of its def-
inition, and stuff to stick on the end of its definition. So, if \b contains “b”, then
\patchcommand\b{a}{c} will produce a new version of \b that contains “abc”.
patch.doc: macros/generic/misc/patch.doc
patchcommand.sty : macros/latex/contrib/patchcmd

292 Comparing the “job name”


The token \jobname amusingly produces a sequence of characters whose category
code is 12 (‘other’), regardless of what the characters actually are. Since one inevitably
has to compare a macro with the contents of another macro (using \ifx, somewhere)
one needs to create a macro whose expansion looks the same as the expansion of
\jobname. We find we can do this with \meaning, if we strip the “\show command”
prefix.
The full command looks like:

\def\StripPrefix#1>{}
\def\jobis#1{FF\fi
\def\predicate{#1}%
\edef\predicate{\expandafter\StripPrefix\meaning\predicate}%
\edef\job{\jobname}%
\ifx\job\predicate
}

And it’s used as:

\if\jobis{mainfile}%
\message{YES}%
\else
\message{NO}%
\fi

Note that the command \StripPrefix need not be defined if you’re using LaTeX —
there’s already an internal command \strip@prefix that you can use.

293 Is the argument a number?
TeX’s own lexical analysis doesn’t offer the macro programmer terribly much support:
while category codes will distinguish letters (or what TeX currently thinks of as letters)
from everything else, there’s no support for analysing numbers.
The simple-minded solution is to compare numeric characters with the characters
of the argument, one by one, by a sequence of direct tests, and to declare the argument
“not a number” if any character fails all comparisons:
\ifx1#1
\else\ifx2#1
...
\else\ifx9#1
\else\isanumfalse
\fi\fi...\fi

which one would then use in a tail-recursing macro to gobble an argument. One could
do slightly better by assuming (pretty safely) that the digits’ character codes are con-
secutive:
\ifnum`#1<`0 \isanumfalse
\else\ifnum`#1>`9 \isanumfalse
\fi
\fi

again used in tail-recursion. However, these forms aren’t very satisfactory: getting the
recursion “right” is troublesome (it has a tendency to gobble spaces in the argument),
and in any case TeX itself has mechanisms for reading numbers, and it would be nice
to use them.
Donald Arseneau’s cite package offers the following test for an argument being a
strictly positive integer:
\def\IsPositive#1{%
TT\fi
\ifcat_\ifnum0<0#1 _\else A\fi
}

which can be adapted to a test for a non-negative integer thus:


\def\IsNonNegative#1{%
TT\fi
\ifcat_\ifnum9<1#1 _\else A\fi
}

or a test for any integer:


\def\gobbleminus#1{\ifx-#1\else#1\fi}
\def\IsInteger#1{%
TT\fi
\ifcat_\ifnum9<1\gobbleminus#1 _\else A\fi
}

but this surely stretches the technique further than is reasonable.


If we don’t care about the sign, we can use TeX to remove the entire number (sign
and all) from the input stream, and then look at what’s left:
\def\testnum#1{\afterassignment\testresult\count255=#1 \end}
\def\testresult#1\end{\ifx\end#1\end\isanumtrue\else\isanumfalse\fi}

(which technique is due to David Kastrup); this can provoke errors. In a later thread on
the same topic, Michael Downes offered:
\def\IsInteger#1{%
TT\fi
\begingroup \lccode`\-=`\0 \lccode`\+=`\0
\lccode`\1=`\0 \lccode`\2=`\0 \lccode`\3=`\0
\lccode`\4=`\0 \lccode`\5=`\0 \lccode`\6=`\0
\lccode`\7=`\0 \lccode`\8=`\0 \lccode`\9=`\0
\lowercase{\endgroup
\expandafter\ifx\expandafter\delimiter
\romannumeral0\string#1}\delimiter
}

which relies on \romannumeral producing an empty result if its argument is zero.


Sadly, this technique has the unfortunate property that it accepts simple expressions
such as ‘1+2-3’; this could be solved by an initial \gobbleminus-like construction.
All the complete functions above are designed to be used in TeX conditionals writ-
ten “naturally” — for example:
\if\IsInteger{<subject of test>}%
<deal with integer>%
\else
<deal with non-integer>%
\fi

The LaTeX memoir class has an internal command of its own, \checkifinteger
{num}, that sets the conditional command \ifinteger according to whether the ar-
gument was an integer.
Of course, all this kerfuffle would be (essentially) void if there was a simple means
of “catching” TeX errors. Imagining an error-catching primitive \ifnoerror, one
might write:

\def\IsInteger#1{%
TT%
\ifnoerror
\tempcount=#1\relax
% carries on if no error
\expandafter\iftrue
\else
% here if there was an error
\expandafter\iffalse
\fi
}

thus using TeX’s own integer-parsing code to do the check. It’s a pity that such a
mechanism was never defined (it could be that it’s impossible to program within TeX!).
memoir.cls: macros/latex/contrib/memoir

294 Defining macros within macros


The way to think of this is that ## gets replaced by # in just the same way that #1 gets
replaced by ‘whatever is the first argument’.
So if you define a macro and use it as:

\def\a#1{+++#1+++#1+++#1+++} \a{b}

the macro expansion produces ‘+++b+++b+++b+++’, which people find normal. How-
ever, if we now replace part of the macro:

\def\a#1{+++#1+++\def\x #1{xxx#1}}

\a{b} will expand to ‘+++b+++\def\x b{xxxb}’. This defines \x to be a macro


delimited by b, and taking no arguments, which people may find strange, even though
it is just a specialisation of the example above. If you want \a to define \x to be a
macro with one argument, you need to write:

\def\a#1{+++#1+++\def\x ##1{xxx##1}}

and \a{b} will expand to ‘+++b+++\def\x #1{xxx#1}’, because #1 gets replaced by


‘b’ and ## gets replaced by #.
To nest a definition inside a definition inside a definition then you need ####1, as
at each stage ## is replaced by #. At the next level you need 8 #s each time, and so on.
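The same doubling applies when the inner definition is made with \newcommand; a sketch (the command names are hypothetical):

\newcommand{\definegreeting}[1]{%
  \newcommand{\greet}[1]{Hello ##1, from #1!}%
}
\definegreeting{Alice}
\greet{Bob}% gives "Hello Bob, from Alice!"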

295 Spaces in macros
It’s very easy to write macros that produce space in the typeset output where it’s neither
desired nor expected. Spaces introduced by macros are particularly insidious because
they don’t amalgamate with spaces around the macro (unlike consecutive spaces that
you type), so your output can have a single bloated space that proves to be made up
of two or even more spaces that haven’t amalgamated. And of course, your output can
also have a space where none was wanted at all.
Spaces are produced, inside a macro as elsewhere, by space or tab characters, or by
end-of-line characters. There are two basic rules to remember when writing a macro:
first, the rules for ignoring spaces when you’re typing macros are just the same as the
rules that apply when you’re typing ordinary text, and second, rules for ignoring spaces
do not apply to spaces produced while a macro is being obeyed (“expanded”).
Spaces are ignored in vertical mode (between paragraphs), at the beginning of a
line, and after a command name. Since sequences of spaces are collapsed into one, it
‘feels as if’ spaces are ignored if they follow another space. Space can have syntactic
meaning after certain sorts of non-braced arguments (e.g., count and dimen variable
assignments in Plain TeX) and after certain control words (e.g., in \hbox to), so again
we have instances where it ‘feels as if’ spaces are being ignored when they’re merely
working quietly for their living.
Consider the following macro, fairly faithfully adapted from one that appeared on
comp.text.tex:

\newcommand{\stline}[1]{ \bigskip \makebox[2cm]{ \textbf{#1} } }

The macro definition contains five spaces:

• after the opening { of the macro body; this space will be ignored, not because
“the macro appears at the start of a line”, but rather because the macro was
designed to operate between paragraphs
• after \bigskip; this space will be ignored (while the macro is being defined)
because it follows a command name
• after the { of the mandatory argument of \makebox; even though this space will
inevitably appear at the start of an output line, it will not be ignored
• after the } closing the argument of \textbf; this space will not be ignored, but
may be overlooked if the argument is well within the 2cm allowed for it
• after the } closing the mandatory argument of \makebox; this space will not be
ignored

The original author of the macro had been concerned that the starts of his lines with this
macro in them were not at the left margin, and that the text appearing after the macro
wasn’t always properly aligned. These problems arose from the space at the start of the
mandatory argument of \makebox and the space immediately after the same argument.
He had written his macro in that way to emphasise the meaning of its various parts;
unfortunately the meaning was rather lost in the problems the macro caused.
The principal technique for suppressing spaces is the use of % characters: everything
after a % is ignored, even the end of line itself (so that not even the end of line can
contribute an unwanted space). The secondary technique is to ensure that the end of
line is preceded by a command name (since the end of line behaves like a space, it will
be ignored following a command name). Thus the above command would be written
(by an experienced programmer with a similar eye to emphasising the structure):

\newcommand{\stline}[1]{%
\bigskip
\makebox[2cm]{%
\textbf{#1}\relax
}%
}

Care has been taken to ensure that every space in the revised definition is ignored, so
none appears in the output. The revised definition takes the “belt and braces” approach,
explicitly dealing with every line ending (although, as noted above, a space introduced
at the end of the first line of the macro would have been ignored in actual use of the
macro). This is the best technique, in fact — it’s easier to blindly suppress spaces than
to analyse at every point whether you actually need to. Three techniques were used to
suppress spaces:
• placing a % character at the end of a line (as in the 1st, 3rd and 5th lines),
• ending a line ‘naturally’ with a control sequence, as in line 2, and
• ending a line with an ‘artificial’ control sequence, as in line 4; the control sequence
in this case (\relax) is a no-op in many circumstances (as here), but this usage is
deprecated — a % character would have been better.

Beware of the (common) temptation to place a space before a % character: if you do


this you might as well omit the % altogether.
In “real life”, of course, the spaces that appear in macros are far more cryptic than
those in the example above. The most common spaces arise from unprotected line
ends, and this is an error that occasionally appears even in macros written by the most
accomplished programmers.
296 How to break the 9-argument limit
If you think about it, you will realise that Knuth’s command definition syntax:

\def\blah#1#2 ... #9{<macro body>}

is intrinsically limited to just 9 arguments. There’s no direct way round this: how
would you express a 10th argument? — and ensure that the syntax didn’t gobble some
other valid usage?
If you really must have more than 9 arguments, the way to go is:

\def\blah#1#2 ... #9{%


\def\ArgI{{#1}}%
\def\ArgII{{#2}}%
...
\def\ArgIX{{#9}}%
\BlahRelay
}
\def\BlahRelay#1#2#3{%
% arguments 1-9 are now in
% \ArgI-\ArgIX
% arguments 10-12 are in
% #1-#3
<macro body>%
}

This technique is easily extendible by concert pianists of the TeX keyboard, but is really
hard to recommend.
LaTeX users have the small convenience of merely giving a number of arguments in
the \newcommand that defines each part of the relaying mechanism: Knuth’s restriction
applies to \newcommand just as it does to \def. However, LaTeX users also have the
way out of such barbarous command syntax: the keyval package. With keyval, and a bit
of programming, one can write really quite sophisticated commands, whose invocation
might look like:

\flowerinstance{species=Primula veris,
family=Primulaceae,
location=Coldham's Common,
locationtype=Common grazing land,
date=1995/04/24,
numplants=50,
soiltype=alkaline
}

The merit of such verbosity is that it is self-explanatory: the typist doesn’t have to
remember that argument twelve is soiltype, and so on: the commands may be copied
from field notes quickly and accurately.
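A minimal sketch of how such a command might be built with keyval (it handles only three of the keys shown above; every key used in a call must be declared with its own \define@key line):

\usepackage{keyval}
\makeatletter
\define@key{flower}{species}{\def\flower@species{#1}}
\define@key{flower}{location}{\def\flower@location{#1}}
\define@key{flower}{date}{\def\flower@date{#1}}
\newcommand{\flowerinstance}[1]{%
  \setkeys{flower}{#1}%
  \flower@species\ (seen \flower@date\ at \flower@location)%
}
\makeatother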
keyval.sty : Distributed as part of macros/latex/required/graphics

297 Defining characters as macros
Single characters can act as macros (defined commands), and both Plain TeX and
LaTeX define the character “~” as a “non-breakable space”. A character is made
definable, or “active”, by setting its category code (catcode) to be \active (13):
\catcode`_=\active.
Any character could, in principle, be activated this way and defined as a macro
(\def_{\_} — the simple answer to using underscores), but you must be wary:
whereas people expect an active tilde, other active characters may be unexpected and
interact badly with other macros. Furthermore, by defining an active character, you
preclude the character’s use for other purposes, and there are few characters “free” to
be subverted in this way.
To define the character “z” as a command, one would say something like:

\catcode`\z=\active
\def z{Yawn, I'm tired}%

and each subsequent “z” in the text would become a yawn. This would be an astound-
ingly bad idea for most documents, but might have special applications. (Note that,
in “\def z”, “z” is no longer interpreted as a letter; the space is therefore not neces-
sary — “\defz” would do; we choose to retain the space, for what little clarity we can
manage.) Some LaTeX packages facilitate such definitions: for example, the shortvrb
package, with its \MakeShortVerb command.
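A sketch of shortvrb in use (choosing ‘|’ as the short-verb character):

\usepackage{shortvrb}
\MakeShortVerb{\|}
...
|\foo| now behaves like \verb|\foo|
\DeleteShortVerb{\|}% undo the definition when it gets in the way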
TeX uses category codes to interpret characters as they are read from the input.
Changing a catcode value will not affect characters that have already been read.
Therefore, it is best if characters have fixed category codes for the duration of a docu-
ment. If catcodes are changed for particular purposes (the \verb command does this),
then the altered characters will not be interpreted properly when they appear in the
argument to another command (as, for example, in \verb in command arguments).
An exemplary case is the doc package, which processes .dtx files using the shortvrb
package to define |...| as a shorthand for \verb|...|. But | is also used in the
preambles of tabular environments, so that tables in .dtx files can only have vertical
line separation between columns by employing special measures of some sort.
Another consequence is that catcode assignments made in macros often don’t work
as expected (Active characters in command arguments). For example, the definition

\def\mistake{%
\catcode`_=\active
\def_{\textunderscore\-}%
}

does not work because it attempts to define an ordinary _ character: when the macro
is used, the category change does not apply to the underscore character already in the
macro definition. Instead, one may use:

\begingroup
\catcode`_=\active
\gdef\works{% note the global \gdef
\catcode`_=\active
\def_{\textunderscore\-}%
}
\endgroup

The alternative (“tricksy”) way of creating such an isolated definition depends on the
curious properties of \lowercase, which changes characters without altering their cat-
codes. Since there is always one active character (“~”), we can fool \lowercase into
patching up a definition without ever explicitly changing a catcode:

\begingroup
\lccode`\~=`\_
\lowercase{\endgroup
\def~{\textunderscore\-}%
}%

The two definitions have the same overall effect (the character is defined as a com-
mand, but the character does not remain active), except that the first defines a \global
command.
For active characters to be used only in maths mode, it is much better to leave the
character having its ordinary catcode, but assign it a special active maths code, as with
\begingroup
\lccode`~=`x
\lowercase{\endgroup
\def~{\times}%
}%
\mathcode`x="8000

The special character does not need to be redefined whenever it is made active — the
definition of the command persists even if the character’s catcode reverts to its original
value; the definition becomes accessible again if the character once again becomes
active.
doc.sty : Distributed as part of the source of LaTeX, macros/latex/base
shortvrb.sty : Distributed as part of macros/latex/required/tools

298 Active characters in command arguments


Occasionally, it’s nice to make one or two characters active in the argument of a com-
mand, to make it easier for authors to code the arguments.
Active characters can be used safely in such situations; but care is needed.
An example arose while this answer was being considered: an aspirant macro writer
posted to comp.text.tex asking for help to make # and b produce musical sharp and
flat signs, respectively, in a macro for specifying chords.
The first problem is that both # and b have rather important uses elsewhere in TeX
(to say the least!), so that the characters can only be made active while the command is
executing.
Using the techniques discussed in characters as commands, we can define:

\begingroup
\catcode`\#=\active
\gdef#{$\sharp$}
\endgroup

and:
\begingroup
\lccode`\~=`\b
\lowercase{\endgroup
\def~{$\flat$}%
}

The second problem is one of timing: the command has to make each character ac-
tive before its arguments are read: this means that the command can’t actually “have”
arguments itself, but must be split in two. So we write:

\def\chord{%
\begingroup
\catcode`\#=\active
\catcode`\b=\active
\Xchord
}
\def\Xchord#1{%
\chordfont#1%
\endgroup
}

and we can use the command as \chord{F#} or \chord{Bb minor}.


Two features of the coding are important:

• \begingroup in \chord opens a group that is closed by \endgroup in \Xchord;
this group limits the change of category codes, which is the raison d’être of the
whole exercise.
• Although # is active while \Xchord is executed, it’s not active when it’s being
defined, so that the use of #1 doesn’t require any special attention.

Note that the technique used in such macros as \chord, here, is analogous to that
used in such commands as \verb; and, in just the same way as \verb (see \verb
doesn’t work in arguments), \chord won’t work inside the argument of another com-
mand (the error messages, if they appear at all, will probably be rather odd).
299 Defining a macro from an argument
It’s common to want a command to create another command: often one wants the new
command’s name to derive from an argument. LaTeX does this all the time: for exam-
ple, \newenvironment creates start- and end-environment commands whose names
are derived from the name of the environment command.
The (seemingly) obvious approach:

\def\relay#1#2{\def\#1{#2}}

doesn’t work (the TeX engine interprets it as a rather strange redefinition of \#). The
trick is to use \csname, which is a TeX primitive for generating command names from
random text, together with \expandafter. The definition above should read:

\def\relay#1#2{%
\expandafter\def\csname #1\endcsname{#2}%
}

With this definition, \relay{blah}{bleah} is equivalent to \def\blah{bleah}.


Note that the definition of \relay omits the braces round the ‘command name’
in the definition it executes. This is because they’re not necessary (in fact they
seldom are), and in this circumstance they make the macro code slightly more tedious.
The name created need not (of course) be just the argument:
\def\newrace#1#2#3{%
\expandafter\def\csname start#1\endcsname{%
#2%
}%
\expandafter\def\csname finish#1\endcsname{%
#3%
}%
}

With commands

\def\start#1{\csname start#1\endcsname}
\def\finish#1{\csname finish#1\endcsname}

these ‘races’ could behave a bit like LaTeX environments.
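For instance (a sketch, with a hypothetical ‘shout’ race built using the definitions above):

\newrace{shout}{\begingroup\bfseries}{\endgroup}
...
\start{shout}something loud\finish{shout}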


300 Transcribing LaTeX command definitions
At several places in this FAQ, questions are answered in terms of how to program a
LaTeX macro. Sometimes, these macros might also help users of Plain TeX or other
packages; this answer attempts to provide a rough-and-ready guide to transcribing such
macro definitions for use in other packages.
The reason LaTeX has commands that replace \def is that there’s a general
philosophy within LaTeX that the user should be protected from himself: the user
has different commands according to whether the command to be defined exists
(\renewcommand) or not (\newcommand), and if its status proves not as the user ex-
pected, an error is reported. A third definition command, \providecommand, only
defines if the target is not already defined; LaTeX has no direct equivalent of \def,
which ignores the present state of the command. The final command of this sort is
\DeclareRobustCommand, which creates a command which is “robust” (i.e., will not
expand if subjected to LaTeX “protected expansion”); from the Plain TeX user’s point

of view, \DeclareRobustCommand should be treated as a non-checking version of
\newcommand.
LaTeX commands are, by default, defined \long; an optional * between the
\newcommand and its (other) arguments specifies that the command is not to be defined
\long. The * is detected by a command \@ifstar which uses \futurelet to switch
between two branches, and gobbles the *: LaTeX users are encouraged to think of the
* as part of the command name.
LaTeX’s checks for an unknown command are done by \ifx comparison of a \csname
construction with \relax; since the command name argument is the desired control
sequence name, this proves a little long-winded. Since #1 is the requisite argument, we
have:
\expandafter\ifx
\csname\expandafter\@gobble\string#1\endcsname
\relax
...

(\@gobble simply throws away its argument).


The arguments of a LaTeX command are specified by two optional arguments to
the defining command: a count of arguments (0–9: if the count is 0, the optional count
argument may be omitted), and a default value for the first argument, if the defined
command’s first argument is to be optional. So:
\newcommand\foo{...}
\newcommand\foo[0]{...}
\newcommand\foo[1]{...#1...}
\newcommand\foo[2][boo]{...#1...#2...}

In the last case, \foo may be called as \foo{goodbye}, which is equivalent to \foo
[boo]{goodbye} (employing the default value given for the first argument), or as \foo
[hello]{goodbye} (with an explicit first argument).
Coding of commands with optional arguments is exemplified by the coding of the
last \foo above:
\def\foo{\futurelet\next\@r@foo}
\def\@r@foo{\ifx\next[%
\let\next\@x@foo
\else
\def\next{\@x@foo[boo]}%
\fi
\next
}
\def\@x@foo[#1]#2{...#1...#2...}

301 Detecting that something is empty


Suppose you need to know that the argument of your command is empty: that is, to
distinguish between \cmd{} and \cmd{blah}. This is pretty simple:
\def\cmd#1{%
\def\tempa{}%
\def\tempb{#1}%
\ifx\tempa\tempb
<empty case>
\else
<non-empty case>
\fi
}

The case where you want to ignore an argument that consists of nothing but spaces,
rather than something completely empty, is more tricky. It’s solved in the code frag-
ment ifmtarg, which defines commands \@ifmtarg and \@ifnotmtarg, which distin-
guish (in opposite directions) between a second and third argument. The package’s
code also appears in the LaTeX memoir class.
Ifmtarg makes challenging reading; there’s also a discussion of the issue in number
two of the “around the bend” articles by the late lamented Mike Downes.
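A sketch of ifmtarg in use (assuming its \@ifmtarg{arg}{if blank}{if not blank} interface):

\usepackage{ifmtarg}
\makeatletter
\newcommand{\cmd}[1]{%
  \@ifmtarg{#1}%
    {<empty, or only spaces>}%
    {<non-empty case>}%
}
\makeatother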
Around the bend series: info/aro-bend
ifmtarg.sty : macros/latex/contrib/misc/ifmtarg.sty
memoir.cls: macros/latex/contrib/memoir

302 Am I using PDFTeX?


It’s often useful to know whether your macros are operating within PDFTeX or within
(“normal”) TeX; getting the right answer is surprisingly tricky.
Suppose you need to test whether your output will be PDF or DVI. The natural
thing is to check whether you have access to some PDFTeX-only primitive; a good
one to try (not least because it was present in the very first releases of PDFTeX) is
\pdfoutput. So you try

\ifx\pdfoutput\undefined
<not running PDFTeX>
\else
<running PDFTeX>
\fi

Except that neither branch of this conditional is rock-solid. The first branch can be
misleading, since the “awkward” user could have written:
\let\pdfoutput\undefined

so that your test will falsely choose the first alternative. While this is a theoretical
problem, it is unlikely to be a major one.
More important is the user who loads a package that uses LaTeX-style testing for
the command name’s existence (for example, the LaTeX graphics package, which is
useful even to the Plain TeX user). Such a package may have gone ahead of you, so the
test may need to be elaborated:
\ifx\pdfoutput\undefined
<not running PDFTeX>
\else
\ifx\pdfoutput\relax
<not running PDFTeX>
\else
<running PDFTeX>
\fi
\fi

If you only want to know whether some PDFTeX extension (such as marginal kerning)
is present, you can stop at this point: you know as much as you need.
However, if you need to know whether you’re creating PDF output, you also need
to know about the value of \pdfoutput:
\ifx\pdfoutput\undefined
<not running PDFTeX>
\else
\ifx\pdfoutput\relax
<not running PDFTeX>
\else
<running PDFTeX, with...>
\ifnum\pdfoutput>0
<...PDF output>
\else
<...DVI output>
\fi
\fi
\fi

The above is, in essence, what Heiko Oberdiek’s ifpdf package does; the reasoning is
the FAQ’s interpretation of Heiko’s explanation.
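So, in a LaTeX document, the simple route is to let the package do the work:

\usepackage{ifpdf}
...
\ifpdf
  <producing PDF output>
\else
  <producing DVI output>
\fi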
ifpdf.sty : Distributed with Heiko Oberdiek’s packages macros/latex/
contrib/oberdiek
303 Subverting a token register
A common requirement is to “subvert” a token register that other macros may use.
The requirement arises when you want to add something to a system token register
(\output or \every*), but know that other macros use the token register, too. (A
common requirement is to work on \everypar, but LaTeX changes \everypar at
every touch and turn.)
The following technique, due to David Kastrup, does what you need, and allows an
independent package to play the exact same game:
\let\mypkg@@everypar\everypar
\newtoks\mypkg@everypar
\mypkg@everypar\expandafter{\the\everypar}
\mypkg@@everypar{\mypkgs@ownstuff\the\mypkg@everypar}
\def\mypkgs@ownstuff{%
<stuff to do at the start of the token register>%
}
\let\everypar\mypkg@everypar

As you can see, the package (mypkg)

• creates an alias for the existing “system” \everypar (which is frozen into any
surrounding environment, which will carry on using the original);
• creates a token register to subvert \everypar and initialises it with the current
contents of \everypar;
• sets the “old” \everypar to execute its own extra code, as well as the contents of
its own token register;
• defines the macro for the extra code; and
• points the token \everypar at the new token register.

and away we go.


The form \mypkg@... is (sort of) blessed for LaTeX package internal names,
which is why this example uses macros of that form.

S.2 LaTeX macro tools and techniques


304 Using Plain or primitive commands in LaTeX
It’s well-known that LaTeX commands tend to be more complex, and to run more
slowly, than any Plain (or primitive) command that they replace. There is therefore
great temptation not to use LaTeX commands when macro programming. Neverthe-
less, the general rule is that you should use LaTeX commands, if there are seeming
equivalents. The exception is when you are sure you know the differences between the
two commands and you know that you need the Plain version. So, for example, use
\mbox in place of \hbox unless you know that the extras that LaTeX provides in \mbox
would cause trouble in your application. Similarly, use \newcommand (or one of its
relatives) unless you need one of the constructs that cannot be achieved without the use
of \def (or friends).
As a general rule, any LaTeX text command will start a new paragraph if necessary;
this isn’t the case with Plain TeX commands, a fact which has a potential to confuse.
The commands \smallskip, \medskip and \bigskip exist both in Plain TeX and
LaTeX, but behave slightly differently: in Plain TeX they terminate the current para-
graph, but in LaTeX they don’t. The command \line is part of picture mode in LaTeX,
whereas it’s defined as “\hbox to \hsize” in Plain TeX. (There’s no user-level
equivalent of the Plain TeX command in LaTeX: its functionality appears only as the
internal command \@@line.)
Maths setting shows a case where the LaTeX version is essentially equivalent to
the TeX primitive commands: the LaTeX \( ... \) does essentially nothing different
from the TeX $ ... $ (except for checking that you’re not attempting to open a maths
environment when you’re already in one, or vice versa). However, \[ ... \] isn’t the
same as $$ ... $$: the TeX version, used in LaTeX, contrives to miss the effect of
the class option fleqn.
Font handling is, of course, wildly different in Plain TeX and LaTeX: even the La-
TeX equivalent of the Plain TeX font-loading command (\newfont) should be avoided

wherever possible: the possibilities of confusion with constructs that vary their be-
haviour according to the font size that LaTeX has recorded are (sadly) legion. A really
rather funny example is to be had by saying:
\documentclass{article}
\begin{document}
\font \myfont=cmr17 scaled 2000
\myfont
\LaTeX
\end{document}

(the reader is encouraged to try this). The “A” of LaTeX has pretty much disappeared:
LaTeX sets the “A” according to its idea of the font size (10pt), but “\myfont” is more
than three times that size.
Another “\myfont” example arises from an entirely different source. The mini-
document:
\documentclass{article}
\begin{document}
\font \myfont=ecrm1000
{\myfont par\‘a}
\end{document}

gives you “German low double quotes” in place of the grave accent. This problem
arises because ecrm1000 is in a different font encoding than LaTeX is expecting — if
you use LaTeX font commands, all the tiresome encoding issues are solved for you,
behind the scenes.
So, whether you’re dealing with a one-off font or a major new family, you are far
more likely to be satisfied with the LaTeX font selection system, so it’s worth investing
a small amount of effort to write declarations of the font’s family and how it should
be loaded. For details of the commands you need, see the LaTeX font usage guide,
fntguide: this may be viewed on the archive, but you should find a copy of the docu-
ment in your own system.
fntguide.pdf : macros/latex/doc/fntguide.pdf
fntguide.tex : Distributed with macros/latex/base

305 \@ and @ in macro names


Macro names containing @ are internal to LaTeX, and without special treatment just
don’t work in ordinary use. A nice example of the problems caused is discussed in
“\@ in vertical mode”.
The problems users see are caused by copying bits of a class (.cls file) or package
(.sty file) into a document, or by including a class or package file into a LaTeX doc-
ument by some means other than \documentclass or \usepackage. LaTeX defines
internal commands whose names contain the character @ to avoid clashes between its
internal names and names that we would normally use in our documents. In order that
these commands may work at all, \documentclass and \usepackage play around
with the meaning of @.
If you’ve included a file wrongly, you solve the problem by using the correct com-
mand.
If you’re using a fragment of a package or class, you may well feel confused: books
such as the first edition of The LaTeX Companion are full of fragments of packages
as examples for you to employ. This problem has been solved in the second edition,
and in addition, all the examples from the Companion are now available on-line. To
see the technique in practice, look at the example below, from file 2-2-7.ltx in the
Companion examples directory:
\makeatletter
\renewcommand\subsection{\@startsection
{subsection}{2}{0mm}%name, level, indent
{-\baselineskip}% beforeskip
{0.5\baselineskip}% afterskip
{\normalfont\normalsize\itshape}}% style
\makeatother
(That example appears on page 29 of The LaTeX Companion, second edition.)
The alternative is to treat all these fragments as a package proper, bundling them
up into a .sty file and including them with \usepackage; this way you hide your
LaTeX internal code somewhere that LaTeX internal code is expected, which often
looks ‘tidier’.
Examples from the Companion: info/examples/tlc2
306 What’s the reason for ‘protection’?
Sometimes LaTeX saves data it will reread later. These data are often the argument
of some command; they are the so-called moving arguments. (‘Moving’ because data
are moved around.) Places to look for are all arguments that may go into table of
contents, list of figures, etc.; namely, data that are written to an auxiliary file and read
in later. Other places are those data that might appear in head- or footlines. Section
headings and figure captions are the most prominent examples; there’s a complete list
in Lamport’s book (see TeX-related books).
What’s going on really, behind the scenes? The commands in the moving ar-
guments are already expanded to their internal structure during the process of sav-
ing. Sometimes this expansion results in invalid TeX code when processed again.
“\protect\cmd” tells LaTeX to save \cmd as \cmd, without expansion.
What is a ‘fragile command’? It’s a command that expands into illegal TeX code
during the save process.
What is a ‘robust command’? It’s a command that expands into legal TeX code
during the save process.
Again, commands are marked as ‘robust’ or ‘fragile’, as they’re defined in Lam-
port’s book. Sadly, some commands are robust in LaTeX itself, but are redefined by
some packages to be fragile; the \cite command commonly suffers this treatment.
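In such a case the standard remedy applies; for example (the citation key is hypothetical):

\section{Early results \protect\cite{smith02}}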
No-one (of course) likes this situation; the LaTeX3 team have removed the need for
protection of some things in the production of LaTeX 2ε , but the techniques available
to them within current LaTeX mean that this is an expensive exercise. It remains a
long-term aim of the team to remove all need for these things.
307 \edef does not work with \protect
Robust LaTeX commands are either “naturally robust” — meaning that they never
need \protect, or “self-protected” — meaning that they have \protect built in to
their definition in some way. Self-protected commands are robust only in a context
where the \protect mechanism is properly handled. The body of an \edef definition
doesn’t handle \protect properly, since \edef is a TeX primitive rather than a LaTeX
command.
This problem is resolved by a LaTeX internal command \protected@edef, which
does the job of \edef while keeping the \protect mechanism working. There’s a
corresponding \protected@xdef which does the job of \xdef.
Of course, these commands need to be tended carefully, since they’re internal: see
’@’ in control sequence names.
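A sketch of the sort of thing that works (\fragile here stands for any hypothetical self-protected or fragile command):

\makeatletter
\protected@edef\saved{Section \thesection: \protect\fragile}
\makeatother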
308 The definitions of LaTeX commands
There are several reasons to want to know the definitions of LaTeX commands: from
the simplest “idle curiosity”, to the pressing need to patch something to make it “work
the way you want it”. None of these are pure motives, but knowledge and expertise
seldom arrive through the purest of motives.
The simple answer is to try \show, in a run of LaTeX that is taking commands from
the terminal:

*\makeatletter
*\show\protected@edef
> \protected@edef=macro:
->\let \@@protect \protect
\let \protect \@unexpandable@protect
\afterassignment \restore@protect \edef .

(I’ve rearranged the output there, from the rather confused version TeX itself produces.)
We may perhaps, now, wonder about \@unexpandable@protect:

*\show\@unexpandable@protect
> \@unexpandable@protect=macro:
->\noexpand \protect \noexpand .

and we’re starting to see how one part of the \protection mechanism works (one can
probably fairly safely guess what \restore@protect does).
Many kernel commands are declared robust:
*\show\texttt
> \texttt=macro:
->\protect \texttt .

so that \show isn’t much help. Define a command \pshow as shown below, and use
that instead:
*\def\pshow#1{{\let\protect\show #1}}
*\pshow\texttt
> \texttt =\long macro:
#1->\ifmmode \nfss@text {\ttfamily #1}%
\else \hmode@bgroup \text@command {#1}%
\ttfamily \check@icl #1\check@icr
\expandafter \egroup \fi .

Note that the command name that is protected is the ‘base’ command, with a space
appended. This is cryptically visible, in a couple of places above. (Again, the output
has been sanitised.)
If one has a malleable text editor, the same investigation may more comfortably be
conducted by examining the file latex.ltx (which is usually to be found, in a TDS
system, in directory tex/latex/base).
In fact, latex.ltx is the product of a docstrip process on a large number of
.dtx files, and you can refer to those instead. The LaTeX distribution includes a file
source2e.tex, and most systems retain it, again in tex/latex/base. Source2e.tex
may be processed to provide a complete source listing of the LaTeX kernel (in fact the
process isn’t entirely straightforward, but the file produces messages advising you what
to do). The result is a huge document, with a line-number index of control sequences
for the entire kernel, and a separate index of changes recorded in each of the files since the
LaTeX team took over.
The printed kernel is a nice thing to have, but it’s unwieldy and sits on my shelves,
seldom used. One problem is that the comments are patchy: the different modules
range from well and lucidly documented, through modules documented only through
an automatic process that converted the documentation of the source of LaTeX 2.09, to
modules that hardly had any useful documentation even in the LaTeX 2.09 original.
In fact, each kernel module .dtx file will process separately through LaTeX, so
you don’t have to work with the whole of source2e. You can easily determine which
module defines the macro you’re interested in: use your “malleable text editor” to find
the definition in latex.ltx; then search backwards from that point for a line that starts
%%% From File: — that line tells you which .dtx file contains the definition you are
interested in. Doing this for \protected@edef, we find:
%%% From File: ltdefns.dtx

When we come to look at it, ltdefns.dtx proves to contain quite a dissertation on


the methods of handling \protection; it also contains some automatically-converted
LaTeX 2.09 documentation.
And of course, the kernel isn’t all of LaTeX: your command may be defined
in one of LaTeX’s class or package files. For example, we find a definition of
\thebibliography in article, but there’s no article.dtx. Some such files are
generated from parts of the kernel, some from other files in the distribution. You find
which by looking at the start of the file: in article.cls, we find:
%% This is file `article.cls',
%% generated with the docstrip utility.
%%
%% The original source files were:
%%
%% classes.dtx (with options: `article')
so we need to format classes.dtx to see the definition in context.
All these .dtx files are on CTAN as part of the main LaTeX distribution.
LaTeX distribution: macros/latex/base
309 Optional arguments like \section
Optional arguments, in macros defined using \newcommand, don’t quite work like the
optional argument to \section. The default value of \section’s optional argument is
the value of the mandatory argument, but \newcommand requires that you ‘know’ the
value of the default beforehand.
The requisite trick is to use a macro in the optional argument:
\documentclass{article}
\newcommand\thing[2][\DefaultOpt]{%
\def\DefaultOpt{#2}%
optional arg: #1, mandatory arg: #2%
}
\begin{document}
\thing{manda}% #1=#2

\thing[opti]{manda}% #1="opti"
\end{document}

310 More than one optional argument


If you’ve already read “breaking the 9-argument limit”, you can probably guess the
solution to this problem: command relaying.
LaTeX allows commands with a single optional argument thus:
\newcommand{\blah}[1][Default]{...}
You may legally call such a command either with its optional argument present, as
\blah[nonDefault] or as \blah; in the latter case, the code of \blah will have an
argument of Default.
To define a command with two optional arguments, we use the relaying technique,
as follows:
\newcommand{\blah}[1][Default1]{%
\def\ArgI{{#1}}%
\BlahRelay
}
\newcommand\BlahRelay[1][Default2]{%
% the first optional argument is now in
% \ArgI
% the second is in #1
...%
}
Of course, \BlahRelay may have as many mandatory arguments as are allowed, after
allowance for the one taken up with its own optional argument — that is, 8.
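Filled out into a complete (if artificial) document, the technique might look like this; the body text merely reports what was received, and all names and defaults are placeholders:

\documentclass{article}
\newcommand{\blah}[1][Default1]{%
  \def\ArgI{{#1}}%
  \BlahRelay
}
\newcommand\BlahRelay[1][Default2]{%
  % first optional argument is in \ArgI, second in #1
  (first: \ArgI, second: #1)%
}
\begin{document}
\blah             % (first: Default1, second: Default2)

\blah[one]        % (first: one, second: Default2)

\blah[one][two]   % (first: one, second: two)
\end{document}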
Variants of \newcommand (and friends), with names like \newcommandtwoopt, are
available in the twoopt package. However, if you can, it’s probably better to learn to
write the commands yourself, just to see why they’re not even a good idea from the
programming point of view.
A command with two optional arguments strains the limit of what’s sensible: ob-
viously you can extend the technique to provide as many optional arguments as your
fevered imagination can summon. However, see the comments on the use of the keyval
package, in “breaking the 9-argument limit”, which offers an alternative way forward.
If you must, however, consider the optparams package, which provides an \optparams
command that you use as an intermediate in defining commands with up to nine op-
tional arguments. The documentation shows examples of commands with four optional
arguments (and this from an author who has his own key-value package!).
An alternative approach is offered by Scott Pakin’s newcommand program, which
takes a command name and a definition of a set of command arguments (in a fairly
readily-understood language), and emits (La)TeX macros which enable the command
to be defined. The command requires that a Python system be installed on your com-
puter.

newcommand.py : support/newcommand
optparams.sty : Distributed as part of macros/latex/contrib/sauerj
twoopt.sty : Distributed as part of macros/latex/contrib/oberdiek

311 Commands defined with * options


LaTeX commands commonly have “versions” defined with an asterisk tagged onto
their name: for example \newcommand and \newcommand* (the former defines a \long
version of the command).
The simple-minded way for a user to write such a command involves use of the
ifthen package:
\newcommand{\mycommand}[1]{\ifthenelse{\equal{#1}{*}}%
{\mycommandStar}%
{\mycommandNoStar{#1}}%
}
\newcommand{\mycommandStar}{starred version}
\newcommand{\mycommandNoStar}[1]{normal version}

This does the trick, for sufficiently simple commands, but it has various tiresome failure
modes, and it requires \mycommandNoStar to take an argument.
Of course, the LaTeX kernel has something slicker than this:
\newcommand{\mycommand}{\@ifstar
\mycommandStar%
\mycommandNoStar%
}
\newcommand{\mycommandStar}[2]{starred version}
\newcommand{\mycommandNoStar}[1]{normal version}

(Note that arguments to \mycommandStar and \mycommandNoStar are independent —


either can have their own arguments, unconstrained by the technique we’re using, un-
like the trick described above.) The \@ifstar trick is all very well (it is fast and efficient), but it requires the definition to be \makeatletter protected.
A pleasing alternative is the suffix package. This elegant piece of code allows you
to define variants of your commands:
\newcommand\mycommand{normal version}
\WithSuffix\newcommand\mycommand*{starred version}

The package needs e-LaTeX, but any new enough distribution defines LaTeX as e-
LaTeX by default. Command arguments may be specified in the normal way, in both
command definitions (after the “*” in the \WithSuffix version). You can also use the
TeX primitive commands, creating a definition like:
\WithSuffix\gdef\mycommand*{starred version}

ifthen.sty : Part of the LaTeX distribution


suffix.sty : Distributed as part of macros/latex/contrib/bigfoot

312 LaTeX internal “abbreviations”, etc.


In the deeps of time, when TeX first happened, computers had extremely limited mem-
ory, and were (by today’s standards) painfully slow. When LaTeX came along, things
weren’t much better, and even when LaTeX 2ε appeared, there was a strong imperative
to save memory space and (to a lesser extent) CPU time.
From the very earliest days, Knuth used shortcut macros to speed things up. LaTeX,
over the years, has extended Knuth’s list by a substantial amount. An interesting feature
of the “abbreviations” is that on paper, they may look longer than the thing they stand
for; however, to (La)TeX they feel smaller. . .
The table at the end of this answer lists the commonest of these “abbreviations”. It
is not complete; as always, if the table doesn’t help, try the LaTeX source. The table
lists each abbreviation’s name and its value, which provide most of what a user needs
to know. The table also lists the abbreviation’s type, which is a trickier concept: if
you need to know, the only real confusion is that the abbreviations labelled ‘defn’ are
defined using an \xxxdef command.
Name Type Value
\m@ne count −1
\p@ dimen 1pt
\z@ dimen 0pt
\z@skip skip 0pt plus 0pt minus 0pt
\@ne defn 1
\tw@ defn 2
\thr@@ defn 3
\sixt@@n defn 16
\@cclv defn 255
\@cclvi defn 256
\@m defn 1000
\@M defn 10000
\@MM defn 20000
\@vpt macro 5
\@vipt macro 6
\@viipt macro 7
\@viiipt macro 8
\@ixpt macro 9
\@xpt macro 10
\@xipt macro 10.95
\@xiipt macro 12
\@xivpt macro 14.4
\@xviipt macro 17.28
\@xxpt macro 20.74
\@xxvpt macro 24.88
\@plus macro “plus”
\@minus macro “minus”
313 Defining LaTeX commands within other commands
LaTeX command definition is significantly different from the TeX primitive form dis-
cussed in an earlier question about definitions within macros.
In most ways, the LaTeX situation is simpler (at least in part because it imposes
more restrictions on the user); however, defining a command within a command still
requires some care.
The earlier question said you have to double the # signs in command definitions: in
fact, the same rule holds, except that LaTeX already takes care of some of the issues,
by generating argument lists for you.
The basic problem is that:

\newcommand{\abc}[1]{joy, oh #1!%
\newcommand{\ghi}[1]{gloom, oh #1!}%
}

followed by a call:

\abc{joy}

typesets “joy, oh joy!”, but defines a command \ghi that takes one parameter, which it
ignores; \ghi{gloom} will expand to “gloom, oh joy!”, which is presumably not what
was expected.
And (as you will probably guess, if you’ve read the earlier question) the definition:

\newcommand{\abc}[1]{joy, oh #1!%
\newcommand{\ghi}[1]{gloom, oh ##1!}%
}

does what is required, and \ghi{gloom} will expand to “gloom, oh gloom!”, whatever
the argument to \abc.
The doubling is needed whether or not the enclosing command has an argument,
so:

\newcommand{\abc}{joy, oh joy!%
\newcommand{\ghi}[1]{gloom, oh ##1!}%
}

is needed to produce a replica of the \ghi we defined earlier.

S.3 LaTeX macro programming


314 How to change LaTeX’s “fixed names”
LaTeX document classes define several typographic operations that need ‘canned text’
(text not supplied by the user). In the earliest days of LaTeX 2.09 these bits of text
were built in to the body of LaTeX’s macros and were rather difficult to change, but
“fixed name” macros were introduced for the benefit of those wishing to use La-
TeX in languages other than English. For example, the special section produced by
the \tableofcontents command is always called \contentsname (or rather, what
\contentsname is defined to mean). Changing the canned text is now one of the easi-
est customisations a user can do to LaTeX.
The canned text macros are all of the form \⟨thing⟩name, and changing them is
simplicity itself. Put:

\renewcommand{\⟨thing⟩name}{Res minor}

in the preamble of your document, and the job is done. (However, beware of the babel
package, which requires you to use a different mechanism: be sure to check changing
babel names if you’re using it.)
The names that are defined in the standard LaTeX classes (and the makeidx package) are listed below. Some of the names are only defined in a subset of the classes
(and the letter class has a set of names all of its own); the list shows the specialisation
of each name, where appropriate.
\abstractname Abstract
\alsoname see also (makeidx package)
\appendixname Appendix
\bibname Bibliography (report,book)
\ccname cc (letter)
\chaptername Chapter (report,book)
\contentsname Contents
\enclname encl (letter)
\figurename Figure (for captions)
\headtoname To (letter)
\indexname Index
\listfigurename List of Figures
\listtablename List of Tables
\pagename Page (letter)
\partname Part
\refname References (article)
\seename see (makeidx package)
\tablename Table (for captions)
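For example (the replacement wording is purely illustrative), a document whose abstract should be headed “Summary” and whose figures should be captioned “Fig.” would put in its preamble:

\renewcommand{\abstractname}{Summary}
\renewcommand{\figurename}{Fig.}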
315 Changing the words babel uses
LaTeX uses symbolic names for much of the automatically-generated text it produces (special-purpose section headings, captions, etc.). As noted in “LaTeX fixed names”
(which includes a list of the names themselves), this enables the user to change the
names used by the standard classes, which is particularly useful if the document is
being prepared in some language other than LaTeX’s default English. So, for example,
a Danish author may wish that her table of contents was called “Indholdsfortegnelse”,
and so would expect to place a command
\renewcommand{\contentsname}%
{Indholdsfortegnelse}
in the preamble of her document.
However, it’s natural for a user of a non-English language to use babel, because
it offers many conveniences and typesetting niceties for those preparing documents
in those languages. In particular, when babel is selecting a new language, it ensures
that LaTeX’s symbolic names are translated appropriately for the language in question.
Unfortunately, babel’s choice of names isn’t always to everyone’s taste, and there is still a need for a mechanism to replace the ‘standard’ names.
Whenever a new language is selected, babel resets all the names to the settings for
that language. In particular, babel selects the document’s main language when \begin
{document} is executed, which immediately destroys any changes to these symbolic names made in the preamble of a document that uses babel.
Therefore, babel defines a command to enable users to change the definitions of
the symbolic names, on a per-language basis: \addto\captions⟨language⟩ is the thing (⟨language⟩ being the language option you gave to babel in the first place). For
example:
\addto\captionsdanish{%
\renewcommand{\contentsname}%
{Indholdsfortegnelse}%
}

316 Running equation, figure and table numbering


Many LaTeX classes (including the standard book class) number things per chapter; so
figures in chapter 1 are numbered 1.1, 1.2, and so on. Sometimes this is not appropriate
for the user’s needs.
Short of rewriting the whole class, one may use one of the removefr and remreset
packages; both define a \@removefromreset command, and having included the pack-
age one writes something like:

\makeatletter
\@removefromreset{figure}{chapter}

and the automatic renumbering stops. You then need to redefine the way in which the
figure number (in this case) is printed:

\renewcommand{\thefigure}{\@arabic\c@figure}
\makeatother

(remember to do the whole job, for every counter you want to manipulate, within
\makeatletter . . . \makeatother).
The technique may also be used to change where in a multilevel structure a counter
is reset. Suppose your class numbers figures as ⟨chapter⟩.⟨section⟩.⟨figure⟩, and you want figures numbered per chapter; try:

\@removefromreset{figure}{section}
\@addtoreset{figure}{chapter}
\renewcommand{\thefigure}{\thechapter.\@arabic\c@figure}

(the command \@addtoreset is a part of LaTeX itself).


The chngcntr package provides a simple means to access the two sorts of changes
discussed, defining \counterwithin and \counterwithout commands; the memoir
class also provides these functions.
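With chngcntr loaded, the two sorts of change sketched above reduce to single commands; for example (the counter names here are merely illustrative):

\usepackage{chngcntr}
\counterwithout{figure}{chapter} % figures numbered 1, 2, 3, ... throughout
\counterwithin{table}{section}   % tables numbered per section, as <section>.<table>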
chngcntr.sty : macros/latex/contrib/misc/chngcntr.sty
memoir.cls: macros/latex/contrib/memoir
removefr.tex : macros/latex/contrib/fragments/removefr.tex (note, this
is constructed as a “fragment” for use within other packages: load by \input
{removefr})
remreset.sty : Distributed as part of macros/latex/contrib/carlisle

317 Making labels from a counter


Suppose we have a LaTeX counter, which we’ve defined with \newcounter{foo}. We
can increment the value of the counter by \addtocounter{foo}{1}, but that’s pretty
clunky for an operation that happens so often . . . so there’s a command \stepcounter
{foo} that does this special case of increasing-by-one.
There’s an internal LaTeX variable, the “current label”, that remembers the last
‘labellable’ thing that LaTeX has processed. You could (if you were to insist) set that
value by the relevant TeX command (having taken the necessary precautions to ensure
that the internal command worked) — but it’s not necessary. If, instead of either of

the stepping methods above, you say \refstepcounter{foo}, the internal variable is
set to the new value, and (until something else comes along), \label will refer to the
counter.
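A minimal sketch (the counter and command names are invented for the example): the command steps the counter with \refstepcounter, so that a following \label picks up its value:

\newcounter{exercise}
\newcommand{\exercise}{%
  \refstepcounter{exercise}%
  \par\noindent\textbf{Exercise \theexercise}\quad
}
...
\exercise\label{ex:first}  % \ref{ex:first} now yields this exercise's number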
318 Finding if you’re on an odd or an even page
Another question discusses the issue of getting \marginpar commands to put their
output in the correct margin of two-sided documents. This is an example of the general
problem of knowing where a particular bit of text lies: the output routine is asyn-
chronous, and (La)TeX will usually process quite a bit of the “next” page before de-
ciding to output any page. As a result, the page counter (known internally in LaTeX as
\c@page) is normally only reliable when you’re actually in the output routine.
The solution is to use some version of the \label mechanism to determine which
side of the page you’re on; the value of the page counter that appears in a \pageref
command has been inserted in the course of the output routine, and is therefore safe.
However, \pageref itself isn’t reliable: one might hope that
\ifthenelse{\isodd{\pageref{foo}}}{odd}{even}
would do the necessary, but both the babel and hyperref packages have been known to
interfere with the output of \pageref; be careful!
The chngpage package needs to provide this functionality for its own use, and
therefore provides a command \checkoddpage; this sets a private-use label, and the
page reference part of that label is then examined (in a hyperref -safe way) to set a con-
ditional \ifcpoddpage true if the command was issued on an odd page. The memoir
class has the same command setting a conditional \ifoddpage. Of course, the \label
contributes to LaTeX’s “Rerun to get cross-references right” error messages. . .
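Following the description above, a sketch of chngpage usage might be (the surrounding text is only illustrative):

\usepackage{chngpage}
...
\checkoddpage
\ifcpoddpage
  the odd-page version of the text%
\else
  the even-page version of the text%
\fi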
The Koma-Script classes have an addmargin* environment that also provides the sorts of facilities that chngpage offers. Koma-Script’s supporting command, \ifthispageodd{<true>}{<false>}, executes different things depending on the page number.
chngpage.sty : macros/latex/contrib/misc/chngpage.sty
KOMA script bundle: macros/latex/contrib/koma-script
memoir.cls: macros/latex/contrib/memoir

319 How to change the format of labels


By default, when a label is created, it takes on the appearance of the counter labelled, so the label appears as \the⟨counter⟩, which is what would be used if you asked to typeset the counter in your text. This isn’t always what you need: for example, if you have nested enumerated lists with the outer numbered and the inner labelled with letters, one might expect to want to refer to items in the inner list as “2(c)”. (Remember, you can change the structure of list items.) The change
is of course possible by explicit labelling of the parent and using that label to construct
the typeset result — something like
\ref{parent-item}(\ref{child-item})

which would be both tedious and error-prone. What’s more, it would be undesir-
able, since you would be constructing a visual representation which is inflexible (you
couldn’t change all the references to elements of a list at one fell swoop).
LaTeX in fact has a label-formatting command built into every label definition; by
default it’s null, but it’s available for the user to program. For any label ⟨counter⟩ there’s a LaTeX internal command \p@⟨counter⟩; for example, a label definition on
an inner list item is supposedly done using the command \p@enumii{\theenumii}.
Unfortunately, the internal workings of this aren’t quite right, and you need to patch
the \refstepcounter command:
\renewcommand*\refstepcounter[1]{\stepcounter{#1}%
\protected@edef\@currentlabel{%
\csname p@#1\expandafter\endcsname
\csname the#1\endcsname
}%
}

With the patch in place you can now, for example, change the labels on all inner lists
by adding the following code in your preamble:
\makeatletter
\renewcommand{\p@enumii}[1]{\theenumi(#1)}
\makeatother

This would make the labels for second-level enumerated lists appear as “1(a)” (and so
on). The analogous change works for any counter that gets used in a \label command.
In fact, the fncylab package does all the above (including the patch to LaTeX it-
self). With the package, the code above is (actually quite efficiently) rendered by the
command:
\labelformat{enumii}{\theenumi(#1)}

In fact, the above example, which we can do in several different ways, has been ren-
dered obsolete by the appearance of the enumitem package, which is discussed in the
answer about decorating enumeration lists.
enumitem.sty : macros/latex/contrib/enumitem
fncylab.sty : macros/latex/contrib/misc/fncylab.sty

320 Adjusting the presentation of section numbers


The general issues of adjusting the appearance of section headings are pretty complex,
and are covered in the question on the style of section headings.
However, people regularly want merely to change the way the section number ap-
pears in the heading, and some such people don’t mind writing out a few macros. This
answer is for them.
The section number is typeset using the LaTeX internal \@seccntformat com-
mand, which is given the “name” (section, subsection, . . . ) of the heading, as argu-
ment. Ordinarily, \@seccntformat merely outputs the section number, and then a
\quad of space. If you want to put a stop after every section (subsection, subsubsection, . . . ) number, a trivial change may be implemented by simple modification of the command:
\renewcommand*{\@seccntformat}[1]{%
\csname the#1\endcsname.\quad
}

Many people (for some reason) want a stop after a section number, but not after a
subsection number, or any of the others. To do this, one must make \@seccntformat
switch according to its argument. The following technique for doing the job is slightly
wasteful, but is efficient enough for a relatively rare operation:
\let\@@seccntformat\@seccntformat
\renewcommand*{\@seccntformat}[1]{%
\expandafter\ifx\csname @seccntformat@#1\endcsname\relax
\expandafter\@@seccntformat
\else
\expandafter
\csname @seccntformat@#1\expandafter\endcsname
\fi
{#1}%
}

which looks to see if a second-level command has been defined, and uses it if so;
otherwise it uses the original. The second-level command to define stops after section
numbers (only) has the same definition as the original “all levels alike” version:
\newcommand*{\@seccntformat@section}[1]{%
\csname the#1\endcsname.\quad
}

Note that all the command definitions of this answer are dealing in LaTeX internal
commands, so the above code should be in a package file, for preference.
The Koma-script classes have different commands for specifying changes to sec-
tion number presentation: \partformat, \chapterformat and \othersectionlevelsformat,
but otherwise their facilities are similar to those of “raw” LaTeX.
KOMA script bundle: macros/latex/contrib/koma-script
321 There’s a space added after my environment
You’ve written your own environment env, and it works except that a space appears
at the start of the first line of typeset text after \end{env}. This doesn’t happen with
similar LaTeX-supplied environments.
You could impose the restriction that your users always put a “%” sign after the
environment . . . but LaTeX environments don’t require that, either.
The LaTeX environments’ “secret” is an internal flag which causes the unwanted
spaces to be ignored. Fortunately, you don’t have to use the internal form: since 1996,
LaTeX has had a user command \ignorespacesafterend, which sets the internal
flag.
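So a user-defined environment (the name myquote and its body are invented here) that behaves like the standard ones might end with:

\newenvironment{myquote}
  {\begin{quote}\itshape}
  {\end{quote}\ignorespacesafterend}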
322 Finding if a label is undefined
People seem to want to know (at run time) if a label is undefined (I don’t actually
understand why, particularly: it’s a transient state, and LaTeX deals with it quite well).
A resolved label is simply a command: \r@⟨label-name⟩; determining if the label
is set is then simply a matter of detecting if the command exists. The usual LaTeX
internal way of doing this is to use the command \@ifundefined:

\@ifundefined{r@label-name}{undef-cmds}{def-cmds}

In which, ⟨label-name⟩ is exactly what you would use in a \label command, and the remaining two arguments are command sequences to be used if the label is undefined (⟨undef-cmds⟩) or if it is defined (⟨def-cmds⟩).
Note that any command that incorporates \@ifundefined is naturally fragile, so
remember to create it with \DeclareRobustCommand or to use it with \protect in a
moving argument.
If you’re into this game, you may well not care about LaTeX’s warning about un-
defined labels at the end of the document; however, if you are, include the command
\G@refundefinedtrue in ⟨undef-cmds⟩.
And of course, remember you’re dealing in internal commands, and pay attention
to the at-signs.
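A possible wrapper (the command name \iflabeldefined is hypothetical), made robust as recommended above, could be:

\makeatletter
\DeclareRobustCommand{\iflabeldefined}[3]{%
  % #1 = label name, #2 = used if defined, #3 = used if undefined
  \@ifundefined{r@#1}{#3}{#2}%
}
\makeatother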
All the above can be avoided by using the labelcas package: it provides commands
that enable you to switch according to the state of a single label, or the states of a list
of labels. The package’s definition is a bit complicated, but the package itself is pretty
powerful.
labelcas.sty : macros/latex/contrib/labelcas

323 Master and slave counters


It’s common to have things numbered “per chapter” (for example, in the standard book
and report classes, figures, tables and footnotes are all numbered thus). The process
of resetting is done automatically, when the “master” counter is stepped (when the
\chapter command that starts chapter ⟨n⟩ happens, the chapter counter is stepped,
and all the dependent counters are set to zero).
How would you do that for yourself? You might want to number algorithms per
section, or corollaries per theorem, for example. If you’re defining these things by
hand, you declare the relationship when you define the counter in the first place:

\newcounter{new-name}[master]

says that every time counter ⟨master⟩ is stepped, counter ⟨new-name⟩ will be reset.
But what if you have an uncooperative package, that defines the objects for you,
but doesn’t provide a programmer interface to make the counters behave as you want?
The \newcounter command uses a LaTeX internal command, and you can also
use it:

\@addtoreset{new-name}{master}

(but remember that it needs to be between \makeatletter and \makeatother, or in


a package of your own).
The chngcntr package encapsulates the \@addtoreset command into a command
\counterwithin. So:

\counterwithin*{corollary}{theorem}

will make the corollary counter slave to the theorem counter. The command without its asterisk:

\counterwithin{corollary}{theorem}

will do the same, and also redefine \thecorollary as ⟨theorem number⟩.⟨corollary number⟩, which is a good scheme if you ever want to refer to the corollaries: there are potentially many “corollary 1” in any document, so it’s as well to tie each one’s number to the counter of the theorem it belongs to. This is true of pretty much any such counter-within-another; if you’re not using chngcntr, refer to the answer on redefining counters’ \the-commands for the necessary techniques.
Note that the technique doesn’t work if the master counter is page, the number of
the current page. The page counter is stepped deep inside the output routine, which
usually gets called some time after the text for the new page has started to appear:
so special techniques are required to deal with that. One special case is dealt with
elsewhere: footnotes numbered per page. One of the techniques described there, using
package perpage, may be applied to any counter. The command:
\MakePerPage{counter}

will cause ⟨counter⟩ to be reset for each page. The package uses a label-like mech-
anism, and may require more than one run of LaTeX to stabilise counter values —
LaTeX will generate the usual warnings about labels changing.
chngcntr.sty : macros/latex/contrib/misc/chngcntr.sty
perpage.sty : Distributed as part of macros/latex/contrib/bigfoot

T Things are Going Wrong. . .


T.1 Getting things to fit
324 Enlarging TeX
The TeX error message ‘capacity exceeded’ covers a multitude of problems. What has
been exhausted is listed in brackets after the error message itself, as in:

! TeX capacity exceeded, sorry


... [main memory size=263001].

Most of the time this error can be fixed without enlarging TeX. The most common
causes are unmatched braces, extra-long lines, and poorly-written macros. Extra-long
lines are often introduced when files are transferred incorrectly between operating sys-
tems, and line-endings are not preserved properly (the tell-tale sign of an extra-long
line error is the complaint that the ‘buf_size’ has overflowed).
If you really need to extend your TeX’s capacity, the proper method depends on
your installation. There is no need (with modern TeX implementations) to change the
defaults in Knuth’s WEB source; but if you do need to do so, use a change file to modify
the values set in module 11, recompile your TeX and regenerate all format files.
Modern implementations allow the sizes of the various bits of TeX’s memory to be
changed semi-dynamically. Some (such as emTeX) allow the memory parameters to be
changed in command-line switches when TeX is started; most frequently, a configura-
tion file is read which specifies the size of the memory. On web2c-based systems, this
file is called texmf.cnf: see the documentation that comes with the distribution for
other implementations. Almost invariably, the format files need to be regenerated after changing the memory parameters.
325 Why can’t I load PiCTeX?
PiCTeX is a resource hog; fortunately, most modern TeX implementations offer gener-
ous amounts of space, and most modern computers are pretty fast, so users aren’t too
badly affected by its performance.
However, PiCTeX has the further unfortunate tendency to fill up TeX’s fixed-size
arrays — notably the array of 256 ‘dimension’ registers. This is a particular problem
when you’re using pictex.sty with LaTeX and some other packages that also need
dimension registers. When this happens, you will see the TeX error message:

! No room for a new \dimen.

There is nothing that can directly be done about this error: you can’t extend the number
of available \dimen registers without extending TeX itself. e-TeX and Omega both do
this, as does MicroPress Inc’s VTeX.
It’s actually quite practical (with most modern distributions) to use e-TeX’s ex-
tended register set: use package etex (which comes with e-TeX distributions) and the
allocation mechanism is altered to cope with the larger register set: PiCTeX will now
load.
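With a current e-TeX-based LaTeX, the fix is thus simply a matter of package order (a minimal sketch):

\usepackage{etex}   % switch to e-TeX's extended register allocation
\usepackage{pictex} % PiCTeX can now find enough \dimen registers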
If you’re in some situation where you can’t use e-TeX, you need to change PiCTeX;
unfortunately PiCTeX’s author is no longer active in the TeX world, so one must resort
to patching. There are two solutions available.
The ConTeXt module m-pictex.tex (for Plain TeX and variants) or the corre-
sponding LaTeX m-pictex package provide an ingenious solution to the problem based
on hacking the code of \newdimen itself.
Alternatively, Andreas Schell’s pictexwd and related packages replace PiCTeX with
a version that uses 33 fewer \dimen registers; so use pictexwd in place of pictex (either
as a LaTeX package, or as a file to read into Plain TeX).
And how does one use PiCTeX anyway, given that the manual is so hard to come
by? Fortunately for us all, the MathsPic system may be used to translate a somewhat
different language into PiCTeX commands; and the MathsPic manual is free (and part
of the distribution). MathsPic is available either as a Basic program for DOS, or as a
Perl program for other systems (including Windows, nowadays).
m-pictex.sty : Distributed as part of macros/context/current/cont-tmf.zip
m-pictex.tex : Distributed as part of macros/context/current/cont-tmf.zip
MathsPic: graphics/mathspic
pictexwd.sty : Distributed as part of graphics/pictex/addon

T.2 Making things stay where you want them


326 Moving tables and figures in LaTeX
Tables and figures have a tendency to surprise, by floating away from where they were
specified to appear. This is in fact perfectly ordinary document design; any professional
typesetting package will float figures and tables to where they’ll fit without violating
the certain typographic rules. Even if you use the placement specifier h for ‘here’, the
figure or table will not be printed ‘here’ if doing so would break the rules; the rules
themselves are pretty simple, and are given on page 198, section C.9 of the LaTeX
manual. In the worst case, LaTeX’s rules can cause the floating items to pile up to the
extent that you get an error message saying “Too many unprocessed floats” (see “Too
many unprocessed floats”). What follows is a simple checklist of things to do to solve
these problems (the checklist talks throughout about figures, but applies equally well
to tables, or to “non-standard” floats defined by the float or other packages).

• Do your figures need to float at all? If not, consider the [H] placement option
offered by the float package: figures with this placement are made up to look as
if they’re floating, but they don’t in fact float. Beware outstanding floats, though:
the \caption commands are numbered in the order they appear in the document,
and a [H] float can ‘overtake’ a float that hasn’t yet been placed, so that figure numbers get out of order.
• Are the placement parameters on your figures right? The default (tbp) is reason-
able, but you can reasonably change it (for example, to add an h). Whatever you
do, don’t omit the ‘p’: doing so could cause LaTeX to believe that if you can’t
have your figure here, you don’t want it anywhere. (LaTeX does try hard to avoid
being confused in this way. . . )
• LaTeX’s own float placement parameters could be preventing placements that
seem entirely “reasonable” to you — they’re notoriously rather conservative. To
encourage LaTeX not to move your figure, you need to loosen its demands. (The
most important ones are the ratio of text to float on a given page, but it’s sensible
to have a fixed set that changes the whole lot, to meet every eventuality.)
\renewcommand{\topfraction}{.85}
\renewcommand{\bottomfraction}{.7}

\renewcommand{\textfraction}{.15}
\renewcommand{\floatpagefraction}{.66}
\renewcommand{\dbltopfraction}{.66}
\renewcommand{\dblfloatpagefraction}{.66}
\setcounter{topnumber}{9}
\setcounter{bottomnumber}{9}
\setcounter{totalnumber}{20}
\setcounter{dbltopnumber}{9}
The meanings of these parameters are described on pages 199–200, section C.9 of
the LaTeX manual.
• Are there places in your document where you could ‘naturally’ put a \clearpage
command? If so, do: the backlog of floats is cleared after a \clearpage. (Note
that the \chapter command in the standard book and report classes implicitly
executes \clearpage, so you can’t float past the end of a chapter.)
• Try the placeins package: it defines a \FloatBarrier command beyond which
floats may not pass. A package option allows you to declare that floats may not
pass a \section command, but you can place \FloatBarriers wherever you
choose.
• If you are bothered by floats appearing at the top of the page (before they are spec-
ified in your text), try the flafter package, which avoids this problem by insisting
that floats should always appear after their definition.
• Have a look at the LaTeX 2ε afterpage package. Its documentation gives as an
example the idea of putting \clearpage after the current page (where it will clear
the backlog, but not cause an ugly gap in your text), but also admits that the pack-
age is somewhat fragile. Use it as a last resort if the other possibilities below don’t
help.
• If you would actually like great blocks of floats at the end of each of your chapters,
try the morefloats package; this ‘simply’ increases the number of floating inserts
that LaTeX can handle at one time (from 18 to 36).
• If you actually wanted all your figures to float to the end (e.g., for submitting a draft
copy of a paper), don’t rely on LaTeX’s mechanism: get the endfloat package to
do the job for you.

afterpage.sty : Distributed as part of macros/latex/required/tools


endfloat.sty : macros/latex/contrib/endfloat
flafter.sty : Part of the LaTeX distribution
float.sty : macros/latex/contrib/float
morefloats.sty : macros/latex/contrib/misc/morefloats.sty
placeins.sty : macros/latex/contrib/placeins

327 Underlined text won’t break


Knuth made no provision for underlining text: he took the view that underlining is not
a typesetting operation, but rather one that provides emphasis on typewriters, which
typically offer but one typeface. The corresponding technique in typeset text is to
switch from upright to italic text (or vice-versa): the LaTeX command \emph does just
that to its argument.
Nevertheless, typographically illiterate people (such as those that specify double-
spaced thesis styles — thesis styles) continue to require underlining of us, so LaTeX as
distributed defines an \underline command that applies the mathematical ‘underbar’
operation to text. This technique is not entirely satisfactory, however: the text gets
stuck into a box, and won’t break at line end.
Two packages are available that solve this problem. The ulem package redefines the
\emph command to underline its argument; the underlined text thus produced behaves
as ordinary emphasised text, and will break over the end of a line. (The package is
capable of other peculiar effects, too: read its documentation, contained within the file
itself.) The soul package defines an \ul command (after which the package is, in part,
named) that underlines running text.
Beware of ulem’s default behaviour, which is to convert the \emph command into
an underlining command; this can be avoided by loading the package with:
\usepackage[normalem]{ulem}
Documentation of ulem is in the package itself.
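Having loaded ulem that way, underlining is requested explicitly, for example with ulem’s \uline command (a small sketch):

\usepackage[normalem]{ulem}
...
This is \uline{underlined text that will happily break at line ends}.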
ulem.sty : macros/latex/contrib/misc/ulem.sty
soul.sty : macros/latex/contrib/soul

328 Controlling widows and orphans


Widows (the last line of a paragraph at the start of a page) and orphans (the first line of
a paragraph at the end of a page) interrupt the reader’s flow, and are generally considered
“bad form”; (La)TeX takes some precautions to avoid them, but completely automatic
prevention is often impossible. If you are typesetting your own text, consider whether
you can bring yourself to change the wording slightly so that the page break will fall
differently.
The page maker, when forming a page, takes account of \widowpenalty and
\clubpenalty (which relates to orphans!). These penalties are usually set to the mod-
erate value of 150; this offers mild discouragement of bad breaks. You can increase
the values by saying (for example) \widowpenalty=500; however, vertical lists (such
as pages are made of) typically have rather little stretchability or shrinkability, so if
the page maker has to balance the effect of stretching the unstretchable and being pe-
nalised, the penalty will seldom win. Therefore, for typical layouts, there are only two
sensible settings for the penalties: finite (150 or 500, it doesn’t matter which) to allow
widows and orphans, and infinite (10000 or greater) to forbid them.
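So, if you decide to forbid widows and orphans outright, the preamble settings are simply:

\widowpenalty=10000
\clubpenalty=10000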
The problem can be avoided by allowing the pagemaker to run pages short, by
using the \raggedbottom directive; however, many publishers insist on the default
\flushbottom; it is seldom acceptable to introduce stretchability into the vertical list,
except at points (such as section headings) where the document design explicitly per-
mits it.
Once you’ve exhausted the automatic measures, and have a final draft you want to
“polish”, you should proceed to manual measures. To get rid of an orphan is simple:
precede the paragraph with \clearpage and the paragraph can’t start in the wrong
place.
Getting rid of a widow can be more tricky. If the paragraph is a long one, it may
be possible to set it ‘tight’: say \looseness=-1 immediately after the last word of
the paragraph. If that doesn’t work, try adjusting the page size: \enlargethispage
{\baselineskip} may do the job, and get the whole paragraph on one page. Re-
ducing the size of the page by \enlargethispage{-\baselineskip} may produce
a (more-or-less) acceptable “two-line widow”. (Note: \looseness=1, increasing the length of the paragraph by one line, seldom seems to work — the looser paragraph typically has a one-
word final line, which doesn’t look much better than the straight widow.)

T.3 Things have “gone away”


329 Old LaTeX font references such as \tenrm
LaTeX 2.09 defined a large set of commands for access to the fonts that it had built
in to itself. For example, various flavours of cmr could be found as \fivrm, \sixrm,
\sevrm, \egtrm, \ninrm, \tenrm, \elvrm, \twlrm, \frtnrm, \svtnrm, \twtyrm and
\twfvrm. These commands were never documented, but certain packages nevertheless
used them to achieve effects they needed.
Since the commands weren’t public, they weren’t included in LaTeX 2ε ; to use
the unconverted LaTeX 2.09 packages under LaTeX 2ε , you need also to include the
rawfonts package (which is part of the LaTeX 2ε distribution).
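So a document that needs such an old package might start like this (the package name oldstyle is purely hypothetical, standing in for whatever unconverted LaTeX 2.09 package you need):

\documentclass{article}
\usepackage{rawfonts} % reinstates \tenrm and friends
\usepackage{oldstyle} % hypothetical unconverted LaTeX 2.09 package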
330 Missing symbol commands
You’re processing an old document, and some symbol commands such as \Box and
\lhd appear no longer to exist. These commands were present in the core of La-
TeX 2.09, but are not in current LaTeX. They are available in the latexsym package
(which is part of the LaTeX distribution), and in the amsfonts package (which is part
of the AMS distribution, and requires AMS symbol fonts).
amsfonts.sty : fonts/amsfonts/latex
AMS symbol fonts: fonts/amsfonts/sources/symbols

331 Where are the msx and msy fonts?
The msx and msy fonts were designed by the American Mathematical Society in the
very early days of TeX, for use in typesetting papers for mathematical journals. They
were designed using the ‘old’ MetaFont, which wasn’t portable and is no longer avail-
able; for a long time they were only available in 300dpi versions which only imperfectly
matched modern printers. The AMS has now redesigned the fonts, using the current
version of MetaFont, and the new versions are called the msa and msb families.
Nevertheless, msx and msy continue to turn up to plague us. There are, of course,
still sites that haven’t got around to upgrading; but, even if everyone upgraded, there
would still be the problem of old documents that specify them.
If you have a .tex source that requests msx and msy, the best technique is to edit
it so that it requests msa and msb (you only need to change the single letter in the font
names).
If you have a DVI file that requests the fonts, there is a package of virtual fonts to
map the old to the new series.
msa and msb fonts: fonts/amsfonts/sources/symbols
virtual font set: fonts/vf-files/msx2msa

332 Where are the am fonts?


One still occasionally comes across a request for the am series of fonts. The initials
stood for ‘Almost [Computer] Modern’, and they were the predecessors of the Com-
puter Modern fonts that we all know and love (or hate)4 . There’s not a lot one can do
with these fonts; they are (as their name implies) almost (but not quite) the same as the
cm series; if you’re faced with a document that requests them, all you can reasonably
do is to edit the document. The appearance of DVI files that request them is sufficiently
rare that no-one has undertaken the mammoth task of creating a translation of them by
means of virtual fonts; however, most drivers let you have a configuration file in which
you can specify font substitutions. If you specify that every am font should be replaced
by its corresponding cm font, the output should be almost correct.

U Why does it do that?


U.1 Common errors
333 LaTeX gets cross-references wrong
Sometimes, however many times you run LaTeX, the cross-references are just wrong.
Remember that the \label command must come after the \caption command, or be
part of it. For example,
either

\begin{figure}
\caption{A Figure}
\label{fig}
\end{figure}

or

\begin{figure}
\caption{A Figure%
  \label{fig}}
\end{figure}
You can, just as effectively, shield the \caption command from its associated
\label command, by enclosing the caption in an environment of its own. For example,
people commonly seek help after:
\begin{center}
\caption{A Figure}
\end{center}
\label{fig}

has failed to label correctly. If you really need this centring (and those in the know
commonly reject it), code it as:

\begin{center}
\caption{A Figure}
\label{fig}
\end{center}

4 The fonts acquired their label ‘Almost’ following the realisation that their first implementation in MetaFont79 still wasn’t quite right; Knuth’s original intention had been that they were the final answer.

334 Start of line goes awry
This answer concerns two sorts of problems: errors of the form

! Missing number, treated as zero.


<to be read again>
g
<*> [grump]

and those where a single asterisk at the start of a line mysteriously fails to appear in the
output.
Both problems arise because \\ takes optional arguments. The command \\*
means “break the line here, and inhibit page break following the line break”; the command \\[⟨dimen⟩] means “break the line here and add ⟨dimen⟩ extra vertical space afterwards”.
So why does \\ get confused by these things at the start of a line? It’s looking for
the first non-blank thing, and the test it uses ignores the end of the line in your input text.
The solution is to enclose the stuff at the start of the new line in braces:

{\ttfamily
/* C-language comment\\
{[grump]} I don’t like this format\\
{*}/
}

(The above text derives from an actual post to comp.text.tex; this particular bit of
typesetting could plainly also be done using the verbatim environment.)
The problem also appears in maths mode, in arrays and so on. In this case, large-
scale bracketing of things is not a good idea; the TeX primitive \relax (which does
nothing except to block searches of this nature) may be used. From another comp.
text.tex example:

\begin{eqnarray}
[a] &=& b \\
\relax[a] &=& b
\end{eqnarray}

which is a usage this FAQ would not recommend, anyway: refer to the reason not to
use eqnarray.
Note that the amsmath package modifies the behaviour of \\ in maths. With
amsmath, the eqnarray example doesn’t need any special measures.
335 Why doesn’t verbatim work within . . . ?
The LaTeX verbatim commands work by changing category codes. Knuth says of this
sort of thing “Some care is needed to get the timing right. . . ”, since once the cate-
gory code has been assigned to a character, it doesn’t change. So \verb and \begin
{verbatim} have to assume that they are getting the first look at the parameter text;
if they aren’t, TeX has already assigned category codes so that the verbatim command
doesn’t have a chance. For example:

\verb+\error+

will work (typesetting ‘\error’), but

\newcommand{\unbrace}[1]{#1}
\unbrace{\verb+\error+}

will not (it will attempt to execute \error). Other errors one may encounter are
‘\verb ended by end of line’, or even the rather more helpful ‘\verb illegal in com-
mand argument’. The same sorts of thing happen with \begin{verbatim} . . . \end
{verbatim}:

\ifthenelse{\boolean{foo}}{%
\begin{verbatim}
foobar

\end{verbatim}
}{%
\begin{verbatim}
barfoo
\end{verbatim}
}

provokes errors like ‘File ended while scanning use of \@xverbatim’, as \begin
{verbatim} fails to see its matching \end{verbatim}.
This is why the LaTeX book insists that verbatim commands must not appear in the
argument of any other command; they aren’t just fragile, they’re quite unusable in any
command parameter, regardless of \protection. (The \verb command tries hard to
detect if you’re misusing it; unfortunately, it can’t always do so, and the error message
is therefore not a reliable indication of problems.)
The first question to ask yourself is: “is \verb actually necessary?”.

• If \texttt{your text} produces the same result as \verb+your text+, then


there’s no need of \verb in the first place.
• If you’re using \verb to typeset a URL or email address or the like, then the \url
command from the url package will help: it doesn’t suffer from the problems of
\verb.
• If you’re putting \verb into the argument of a boxing command (such as \fbox),
consider using the lrbox environment:
\newsavebox{\mybox}
...
\begin{lrbox}{\mybox}
\verb!VerbatimStuff!
\end{lrbox}
\fbox{\usebox{\mybox}}

Otherwise, there are three partial solutions to the problem.

• Some packages have macros which are designed to be responsive to verbatim


text in their arguments. For example, the fancyvrb package defines a com-
mand \VerbatimFootnotes, which redefines the \footnotetext (and hence
the \footnote) commands in such a way that you can include \verb com-
mands in its argument. This approach could in principle be extended to the
arguments of other commands, but it can clash with other packages: for exam-
ple, \VerbatimFootnotes interacts poorly with the para option to the footmisc
package.
The memoir class defines its \footnote command so that it will accept verbatim
in its arguments, without any supporting package.
• The fancyvrb package defines a command \SaveVerb, with a corresponding
\UseVerb command, that allow you to save and then to reuse the content of its
argument; for details of this extremely powerful facility, see the package docu-
mentation.
Rather simpler is the verbdef package, which defines a (robust) command which
expands to the verbatim argument given.
• If you have a single character that is giving trouble (in its absence you could simply
use \texttt), consider using \string. \texttt{my\string_name} typesets the
same as \verb+my_name+, and will work in the argument of a command. It won’t,
however, work in a moving argument, and no amount of \protection will make
it work in such a case.
A robust alternative is:
\chardef\us=`\_
...
\section{... \texttt{my\us name}}
Such a definition is ‘naturally’ robust; the construction “⟨back-tick⟩\⟨char⟩” may
be used for any troublesome character (though it’s plainly not necessary for things
like percent signs for which (La)TeX already provides robust macros).
Documentation of both url and verbdef is in the package files.
fancyvrb.sty : macros/latex/contrib/fancyvrb
memoir.cls: macros/latex/contrib/memoir
url.sty : macros/latex/contrib/misc/url.sty
verbdef.sty : macros/latex/contrib/misc/verbdef.sty

336 No line here to end


The error

! LaTeX Error: There’s no line here to end.

See the LaTeX manual or LaTeX Companion for explanation.

comes in reaction to you giving LaTeX a \\ command at a time when it’s not expecting
it. The commonest case is where you’ve decided you want the label of a list item to be
on a line of its own, so you’ve written (for example):

\begin{description}
\item[Very long label] \\
Text...
\end{description}

\\ is actually a rather bad command to use in this case (even if it worked), since it
would force the ‘paragraph’ that’s made up of the text of the item to terminate a line
which has nothing on it but the label. This would lead to an “Underfull \hbox”
warning message (usually with ‘infinite’ badness of 10000); while this message doesn’t
do any actual harm other than slowing down your LaTeX run, any message that doesn’t
convey any information distracts for no useful purpose.
The proper solution to the problem is to write a new sort of description en-
vironment, that does just what you’re after. (The LaTeX Companion offers a rather wide selection of variants of these things.)
The quick-and-easy solution, which avoids the warning, is to write:
\begin{description}
\item[Very long label] \hspace*{\fill} \\
Text...
\end{description}

which fills out the under-full line before forcing its closure. The expdlist package
provides the same functionality with its \breaklabel command, and mdwlist provides
it via its \desclabelstyle command.
The other common occasion for the message is when you’re using the center (or
flushleft or flushright) environment, and have decided you need extra separation
between lines in the environment:

\begin{center}
First (heading) line\\
\\
body of the centred text...
\end{center}

The solution here is plain: use the \\ command in the way it’s supposed to be used, to
provide more than just a single line break space. \\ takes an optional argument, which
specifies how much extra space to add; the required effect in the text above can be had
by saying:

\begin{center}
First (heading) line\\[\baselineskip]
body of the centred text...
\end{center}

expdlist.sty : macros/latex/contrib/expdlist
mdwlist.sty : Distributed as part of macros/latex/contrib/mdwtools

337 Two-column float numbers out of order
When LaTeX can’t place a float immediately, it places it on one of several “defer” lists.
If another float of the same type comes along, and the “defer” list for that type still has
something in it, the later float has to wait for everything earlier in the list.
Now, standard LaTeX has different lists for single-column floats, and double-
column floats; this means that single-column figures can overtake double-column
figures (or vice-versa), and you observe later figures appear in the document before
early ones. The same is true, of course, for tables, or for any user-defined float.
The LaTeX team recognise the problem, and provides a package (fixltx2e) to deal
with it. Fixltx2e amalgamates the two defer lists, so that floats don’t get out of order.
For those who are still running an older LaTeX distribution, the package fix2col
should serve. This package (also by a member of the LaTeX team) was the basis of the
relevant part of fixltx2e. The functionality has also been included in dblfloatfix, which
also has code to place full-width floats at [b] placement.
Once you have loaded the package, no more remains to be done: the whole require-
ment is to patch the output routine; no extra commands are needed.
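That is, on a reasonably recent LaTeX the whole fix amounts to:

\usepackage{fixltx2e} % amalgamates the single- and double-column defer lists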
dblfloatfix.sty : macros/latex/contrib/misc/dblfloatfix.sty
fix2col.sty : Distributed as part of macros/latex/contrib/carlisle
fixltx2e.sty : Part of the LaTeX distribution.

338 Accents misbehave in tabbing


So you are constructing a tabbing environment, and you have the need of some dia-
criticised text — perhaps something as simple as \’{e} — and the accent disappears
because it has been interpreted as a tabbing command, and everything goes wrong.
This is really a rather ghastly feature of the tabbing environment; in order to type
accented characters you need to use the \a kludge: so \a'{e} inside tabbing for \'{e} outside, and similarly \a` for \` and \a= for \=. This whole procedure is of course
hideous and error-prone.
The simplest alternative is to type in an encoding that has the diacriticised charac-
ters in it, and to use an appropriate encoding definition file in the inputenc package. So
for example, type:
\usepackage[latin1]{inputenc}
...
\begin{tabbing}
...
... \> voilà \> ...

for:
... voilà ...
and the internal mechanisms of the inputenc package will put the right version of the
accent command in there.
A witty reversal of the rôles is introduced by the package Tabbing (note the capital
“T”): it provides a Tabbing environment which duplicates tabbing, but all the single-
character commands become complicated objects. So tabbing’s \> becomes \TAB>,
\= becomes \TAB=, and so on. The above trivial example would therefore become:

\usepackage{Tabbing}
...
\begin{Tabbing}
... ... \TAB> voil\`a \TAB> ...

Tabbing.sty : macros/latex/contrib/Tabbing

339 Package reports “command already defined”


You load a pair of packages, and the second reports that one of the commands it defines
is already present. For example, both txfonts and amsmath define a command \iint
(and \iiint and so on); so
...
\usepackage{txfonts}
\usepackage{amsmath}
produces a string of error messages of the form:

! LaTeX Error: Command \iint already defined.


Or name \end... illegal, see p.192 of the manual.

As a general rule, things that amsmath defines, it defines well; however, there is a good
case for using the txfonts version of \iint — the associated tx fonts have a double
integral symbol that doesn’t need to be “faked” in the way amsmath does. In the case
that you are loading several symbol packages, every one of which defines the same
symbol, you are likely to experience the problem in a big way (\euro is a common
victim).
There are similar cases where one package redefines another’s command, but no
error occurs because the redefining package doesn’t use \newcommand. Often, in such
a case, you only notice the change because you assume the definition given by the first
package. The amsmath–txfonts packages are just such a pair; txfonts doesn’t provoke
errors.
You may deal with the problem by saving and restoring the command. Macro
programmers may care to do this for themselves; for the rest of us, there’s the package
savesym. The sequence:

\usepackage{savesym}
\usepackage{amsmath}
\savesymbol{iint}
\usepackage{txfonts}
\restoresymbol{TXF}{iint}

does the job; restoring the amsmath version of the command, and making the txfonts
version of the command available as \TXFiint.
Documentation of savesym doesn’t amount to much: the only commands are
\savesymbol and \restoresymbol, as noted above.
amsmath.sty : Part of macros/latex/required/amslatex
savesym.sty : macros/latex/contrib/savesym/savesym.sty
txfonts.sty : Part of fonts/txfonts

340 Why are my sections numbered 0.1 . . . ?


This happens when your document is using the standard book or report class (or one
similar), and you’ve got a \section before your first \chapter.
What happens is, that the class numbers sections as “hchapter noi.hsection noi”,
and until the first chapter has appeared, the chapter number is 0.
If you’re doing this, it’s quite likely that the article class is for you; try it and see.
Otherwise, give your sections a ‘superior’ chapter, or do away with section numbering
by using \section* instead. An alternative way of avoiding numbering is discussed
in “unnumbered sections in the table of contents”.
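In skeleton form (the titles are only placeholders), either of the following avoids the
stray “0.1”:

\chapter{Some chapter}    % the sections now have a chapter to belong to
\section{Some section}    % numbered 1.1

or

\section*{Some section}   % unnumbered: no 0.1 appears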
341 Link text doesn’t break at end line
When using the hyperref package, you make a block of text “active” when you define
a hyper-link (when the user clicks on that text, the reader program will divert to the
target of the link).
The hyperref package uses a driver (in the same way as the graphics package does),
to determine how to implement all that hyper-stuff.
If you use the driver for dvips output (presumably you want to distill the resulting
PostScript), limitations in the way dvips deals with the \special commands mean that
hyperref must prevent link anchors from breaking at the end of lines. Other drivers
(notably those for PDFTeX and for dvipdfm) don’t suffer from this problem.
The problem may occur in a number of different circumstances. For a couple of
them, there are work-arounds:
First, if you have an URL which is active (so that clicking on it will activate your
web browser to “go to” the URL). In this case hyperref employs the url package to split
up the URL (as described in typesetting URLs), but the dvips driver then suppresses
the breaks. The way out is the breakurl package, which modifies the \url command to
produce several smaller pieces, between each of which a line break is permitted. Each
group of pieces, that ends up together in one line, is converted to a single clickable link.
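A minimal sketch of that workaround (note that breakurl is loaded after hyperref; the
URL is merely an example):

\usepackage[dvips]{hyperref}
\usepackage{breakurl}
...
\url{http://www.example.org/some/very/long/path/that/needs/breaking}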
Second, if you have a table of contents, list of figures or tables, or the like, hyperref
will ordinarily make the titles in the table of contents, or captions in the lists, active. If
the title or caption is long, it will need to break within the table, but the dvips driver
will prevent that. In this case, load hyperref with the option linktocpage, and only
the page number will be made active.
Otherwise, if you have a lengthy piece of text that you want active, you have at
present no simple solution: you have to rewrite your text, or to use a different PDF
generation mechanism.
breakurl.sty : macros/latex/contrib/breakurl

342 Page number is wrong at start of page


This is a long story, whose sources are deep inside the workings of TeX itself; it all
derives from the TeX’s striving to generate the best possible output.
The page number is conventionally stored in \count0; LaTeX users see this as the
counter page, and may typeset its value using \thepage.
The number (that is to say, \count0) is only updated when TeX actually outputs a
page. TeX only even tries to do this when it detects a hint that it may be a good thing
to do. From TeX’s point of view, the end of a paragraph is a good time to consider
outputting a page; it will output a page if it has more than a page’s worth of material to
output. (Ensuring it always has something in hand makes some optimisations possible.)
As a result, \count0 (\thepage) is almost always wrong in the first paragraph of a
page (the exception is where the page number has been “forcibly” changed, either by
changing its value directly, or by breaking the page where TeX wouldn’t necessarily
have chosen to break).
LaTeX provides a safe way of referring to the page number, by using label refer-
ences. So, rather than writing:
Here is page \thepage{}.

you should write:


Here is page \pageref{here}\label{here}.

(note: no space between the \pageref and the \label, since that could potentially
end up as a page-break space itself, which rather defeats the purpose of the exercise!).
343 My brackets don’t match
(La)TeX has a low-level mechanism for matching braces in document text. This means
you can type something like:
\section{All \emph{OK} now.}

and know that the first brace (for the argument of \section) will be matched with the
last brace, and the internal pair of braces (for the argument of \emph) will be matched
with each other. It’s all very simple.
However, LaTeX has a convention of enclosing optional arguments in brackets, as
in:
\section[OK]{All \emph{OK} now.}

These brackets are not matched by TeX mechanisms, despite the superficial similarity
of their use. As a result, straightforward-looking usages like:
\section[All [OK] now]{All \emph{OK} now.}

aren’t OK at all — the optional argument comes to consist of “All [OK”, and \section
takes the single character “n” (of the first “now”) as its argument.
Fortunately, TeX’s scanning mechanism helps us by accepting the syntax “{]}” to
‘hide’ the closing bracket from the scanning that LaTeX uses. In practice,
the commonest way to use this facility is:
\section[All {[OK]} now]{All \emph{OK} now.}

since bracing the bracket on its own “looks odd”.


LaTeX has another argument syntax, even less regular, where the argument is en-
closed in parentheses, as in:
\put(1,2){foo}

(a picture environment command).


This mechanism is also prone to problems with matching closing parentheses, but
the issue seldom arises since such arguments rarely contain text. If it were to arise, the
same solution (enclosing the confused characters in braces) would solve the problem.

U.2 Common misunderstandings


344 What’s going on in my \include commands?
The original LaTeX provided the \include command to address the problem of
long documents: with the relatively slow computers of the time, the companion
\includeonly facility was a boon. With the vast increase in computer speed,
\includeonly is less valuable (though it still has its place in some very large projects).
Nevertheless, the facility is retained in current LaTeX, and causes some confusion to
those who misunderstand it.
In order for \includeonly to work, \include makes a separate .aux file for each
included file, and makes a ‘checkpoint’ of important parameters (such as page, figure,
table and footnote numbers); as a direct result, it must clear the current page both
before and after the \include command. What’s more, this mechanism doesn’t work
if an \include command appears in a file that was \included itself: LaTeX diagnoses
this as an error.
So, we can now answer the two commonest questions about \include:

• Why does LaTeX throw a page before and after \include commands?
Answer: because it has to. If you don’t like it, replace the \include command
with \input — you won’t be able to use \includeonly any more, but you prob-
ably don’t need it anyway, so don’t worry.
• Why can’t I nest \included files? — I always used to be able to under LaTeX 2.09.
Answer: in fact, you couldn’t, even under LaTeX 2.09, but the failure wasn’t di-
agnosed. However, since you were happy with the behaviour under LaTeX 2.09,
replace the \include commands with \input commands (with \clearpage as
appropriate).
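By way of illustration, a skeleton of the mechanism (the file names are invented) is:

\includeonly{chap2}      % in the preamble: only chap2 is re-typeset
...
\begin{document}
\include{chap1}
\include{chap2}
\include{chap3}
\end{document}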

345 Why does it ignore paragraph parameters?


When TeX is laying out text, it doesn’t work from word to word, or from line to line;
the smallest complete unit it formats is the paragraph. The paragraph is laid down
in a buffer, as it appears, and isn’t touched further until the end-paragraph marker is
processed. It’s at this point that the paragraph parameters have effect; and it’s because
of this sequence that one often makes mistakes that lead to the paragraph parameters
not doing what one would have hoped (or expected).
Consider the following sequence of LaTeX:

{\raggedright % declaration for ragged text


Here’s text to be ranged left in our output,
but it’s the only such paragraph, so we now
end the group.}

Here’s more that needn’t be ragged...

TeX will open a group, and impose the ragged-setting parameters within that group;
it will then save a couple of sentences of text and close the group (thus restoring the
previous value of the parameters that \raggedright set). Then TeX encounters a
blank line, which it knows to treat as a \par token, so it typesets the two sentences;
but because the enclosing group has now been closed, the parameter settings have been
lost, and the paragraph will be typeset normally.
The solution is simple: close the paragraph inside the group, so that the setting
parameters remain in place. An appropriate way of doing that is to replace the last
three lines above with:

end the group.\par}


Here’s more that needn’t be ragged...

In this way, the paragraph is completed while \raggedright’s parameters are still in
force within the enclosing group.
Another alternative is to define an environment that does the appropriate job for
you. For the above example, LaTeX already defines an appropriate one:

\begin{flushleft}
Here’s text to be ranged left...
\end{flushleft}

In fact, there are a number of parameters for which TeX only maintains one value
per paragraph. A tiresome one is the set of upper case/lower case translations, which
(oddly enough) constrains hyphenation of multilingual texts. Another that regularly
creates confusion is \baselineskip.
346 Case-changing oddities
TeX provides two primitive commands \uppercase and \lowercase to change the
case of text; they’re not much used, but are capable of creating confusion.
The two commands do not expand the text that is their parameter — the result
of \uppercase{abc} is ‘ABC’, but \uppercase{\abc} is always ‘\abc’, whatever
the meaning of \abc. The commands are simply interpreting a table of equivalences
between upper- and lowercase characters. They have (for example) no mathematical
sense, and

\uppercase{About $y=f(x)$}

will produce

ABOUT $Y=F(X)$

which is probably not what is wanted.


In addition, \uppercase and \lowercase do not deal very well with non-
American characters, for example \uppercase{\ae} is the same as \ae.
LaTeX provides commands \MakeUppercase and \MakeLowercase which fix
the latter problem. These commands are used in the standard classes to produce upper
case running heads for chapters and sections.
Unfortunately \MakeUppercase and \MakeLowercase do not solve the other prob-
lems with \uppercase, so for example a section title containing \begin{tabular} . . .
\end{tabular} will produce a running head containing \begin{TABULAR}. The sim-
plest solution to this problem is using a user-defined command, for example:

\newcommand{\mytable}{\begin{tabular}...
\end{tabular}}
\section{A section title \protect\mytable{}
with a table}

Note that \mytable has to be protected, otherwise it will be expanded and made upper
case; you can achieve the same result by declaring it with \DeclareRobustCommand,
in which case the \protect won’t be necessary.
David Carlisle’s textcase package addresses many of these problems in a transpar-
ent way. It defines commands \MakeTextUppercase and \MakeTextLowercase
which do upper- or lowercase, with the fancier features of the LaTeX standard
\Make*-commands but without the problems mentioned above. Load the package with
\usepackage[overload]{textcase}, and it will redefine the LaTeX commands (not
the TeX primitive commands \uppercase and \lowercase), so that section headings
and the like don’t produce broken page headings.
textcase.sty : macros/latex/contrib/textcase

347 Why does LaTeX split footnotes across pages?


LaTeX splits footnotes when it can think of nothing better to do. Typically, when this
happens, the footnote mark is at the bottom of the page, and the complete footnote
would overfill the page. LaTeX could try to salvage this problem by making the page
short of both the footnote and the line with the footnote mark, but its priorities told it
that splitting the footnote would be preferable.

As always, the best solution is to change your text so that the problem doesn’t occur
in the first place. Consider whether the text that bears the footnote could move earlier
in the current page, or on to the next page.
If this isn’t possible, you might want to change LaTeX’s perception of its priorities:
they’re controlled by \interfootnotelinepenalty — the larger it is, the less willing
LaTeX is to split footnotes.
Setting

\interfootnotelinepenalty=10000

inhibits split footnotes altogether, which will cause ‘Underfull \vbox’ messages un-
less you also specify \raggedbottom. The default value of the penalty is 100, which
is rather mild.
An alternative technique is to juggle with the actual size of the pages. \enlargethispage
changes the size of the current page by its argument (for example, you might say
\enlargethispage{\baselineskip} to add a single line to the page, but you can
use any ordinary TeX length such as 15mm or -20pt as argument). Reducing the size
of the current page could force the offending text to the next page; increasing the size
of the page may allow the footnote to be included in its entirety. It may be necessary
to change the size of more than one page.
The fnbreak package detects (and generates warnings about) split footnotes.
fnbreak.sty : macros/latex/contrib/fnbreak

348 Getting \marginpar on the right side


In an ideal world, marginal notes would be in “analogous” places on every page: notes
on an even-side page would be in the left margin, while those on an odd-side page
would be in the right margin. A moment’s thought shows that a marginal note on
the left needs to be typeset differently from a marginal note on the right. The La-
TeX \marginpar command therefore takes two arguments in a twoside document:
\marginpar[left text]{right text}. LaTeX uses the “obvious” test to get the
\marginpars in the correct margin, but a booby-trap arises because TeX runs its page
maker asynchronously. If a \marginpar is processed while page n is being built, but
doesn’t get used until page n+1, then the \marginpar will turn up on the wrong side
of the page. This is an instance of a general problem: see “finding if you’re on an odd
or an even page”.
The solution to the problem is for LaTeX to ‘remember’ which side of the page
each \marginpar should be on. The mparhack package does this, using label-like
marks stored in the .aux file; the memoir class does likewise.
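Using the package requires nothing beyond loading it — a minimal sketch:

\usepackage{mparhack}
...
\marginpar[text set in the left margin]%
          {text set in the right margin}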
memoir.cls: macros/latex/contrib/memoir
mparhack.sty : macros/latex/contrib/mparhack

349 Where have my characters gone?


You’ve typed some apparently reasonable text and processed it, but the result contains
no sign of some of the characters you typed. A likely reason is that the font you selected
just doesn’t have a representation for the character in question.
For example, if I type “that will be £44.00” into an ordinary (La)TeX document,
or if I select the font rsfs10 (which contains uppercase letters only) and type pretty
much anything, the £ sign, or any lowercase letters or digits will not appear in the
output. There’s no actual error message, either: you have to read the log file, where
you’ll find cryptic little messages like

Missing character: There is no ^^a3 in font cmr10!


Missing character: There is no 3 in font rsfs10!

(the former demonstrating my TeX’s unwillingness to deal in characters which have the
eighth bit set, while the rsfs10 example shows that TeX will log the actual character
in error, if it thinks it’s possible).
Somewhat more understandable are the diagnostics you may get from dvips when
using the OT1 and T1 versions of fonts that were supplied in Adobe standard encoding:

dvips: Warning: missing glyph ‘Delta’

The process that generates the metrics for using the fonts generates an instruction to
dvips to produce these diagnostics, so that their non-appearance in the printed output is
less surprising than it might be. Quite a few glyphs provided in Knuth’s text encodings
and in the Cork encoding are not available in the Adobe fonts. In these cases, there is
a typeset sign of the character: dvips produces a black rectangle of whatever size the
concocted font file has specified.
350 “Rerun” messages won’t go away
The LaTeX message “Rerun to get crossreferences right” is supposed to warn the user
that the job needs to be processed again, since labels seem to have changed since
the previous run. (LaTeX compares the labels it has created this time round with
what it found from the previous run when it started; it does this comparison at \end
{document}.)
Sometimes, the message won’t go away: however often you reprocess your docu-
ment, LaTeX still tells you that “Label(s) may have changed”. This can sometimes be
caused by a broken package: both footmisc (with the perpage option) and hyperref
have been known to give trouble, in the past: if you are using either, check you have
the latest version, and upgrade if possible.
However, there is a rare occasion when this error can happen as a result of patholog-
ical structure of the document itself. Suppose you have pages numbered in roman, and
you add a reference to a label on page “ix” (9). The presence of the reference pushes
the thing referred to onto page “x” (10), but since that’s a shorter reference the label
moves back to page “ix” at the next run. Such a sequence can obviously not terminate.
The only solution to this problem is to make a small change to your document
(something as small as adding or deleting a comma will often be enough).
footmisc.sty : macros/latex/contrib/footmisc
hyperref.sty : macros/latex/contrib/hyperref

351 Commands gobble following space


People are forever surprised that simple commands gobble the space after them: this
is just the way it is. The effect arises from the way TeX works, and Lamport describes
a solution (place a pair of braces after a command’s invocation) in the description of
LaTeX syntax. Thus the requirement is in effect part of the definition of LaTeX.
This FAQ, for example, is written with definitions that require one to type \fred{}
for almost all macro invocations, regardless of whether the following space is required:
however, this FAQ is written by highly dedicated (and, some would say, eccentric)
people. Many users find all those braces become very tedious very quickly, and would
really rather not type them all.
An alternative structure, that doesn’t violate the design of LaTeX, is to say
\fred\ — the \ command is “self terminating” (like \\) and you don’t need braces
after it. Thus one can reduce to one the extra characters one needs to type.
If even that one character is too many, the package xspace defines a command
\xspace that guesses whether there should have been a space after it, and if so intro-
duces that space. So “fred\xspace jim” produces “fred jim”, while “fred\xspace.
jim” produces “fred. jim”. Which usage would of course be completely pointless; but
you can incorporate \xspace in your own macros:

\usepackage{xspace}
...
\newcommand{\restenergy}{\ensuremath{mc^2}\xspace}
...
and we find \restenergy available to us...

The \xspace command must be the last thing in your macro definition (as in the exam-
ple); it’s not completely foolproof, but it copes with most obvious situations in running
text.
The xspace package doesn’t save you anything if you only use a modified macro
once or twice within your document. In any case, be careful with usage of \xspace —
it changes your input syntax, which can be confusing, notably to a collaborating author
(particularly if you create some commands which use it and some which don’t). Of
course, no command built into LaTeX or into any “standard” class or package will use
\xspace.
xspace.sty : Distributed as part of macros/latex/required/tools

352 (La)TeX makes overfull lines


When TeX is building a paragraph, it can make several attempts to get the line-breaking
right; on each attempt it runs the same algorithm, but gives it different parameters. You
can affect the way TeX’s line breaking works by adjusting the parameters: this answer
deals with the “tolerance” and stretchability parameters. The other vital ‘parameter’ is
the set of hyphenations to be applied: see “my words aren’t being hyphenated” (and
the questions it references) for advice.
If you’re getting an undesired “overfull box”, what has happened is that TeX has
given up: the parameters you gave it don’t allow it to produce a result that doesn’t
overfill. In this circumstance, Knuth decided the best thing to do was to produce a
warning, and to allow the user to solve the problem. (The alternative, silently to go
beyond the envelope of “good taste” defined for this run of TeX, would be distasteful
to any discerning typographer.) The user can almost always address the problem by
rewriting the text that’s provoking the problem — but that’s not always possible, and in
some cases it’s impossible to solve the problem without adjusting the parameters. This
answer discusses the approaches one might take to resolution of the problem, on the
assumption that you’ve got the hyphenation correct.
The simplest case is where a ‘small’ word fails to break at the end of a line; pushing
the entire word to a new line isn’t going to make much difference, but it might make
things just bad enough that TeX won’t do it by default. In such a case one can try the
LaTeX \linebreak command: it may solve the problem, and if it does, it will save an
awful lot of fiddling. Otherwise, one needs to adjust parameters: to do that we need to
recap the details of TeX’s line breaking mechanisms.
TeX’s first attempt at breaking lines is performed without even trying hyphenation:
TeX sets its “tolerance” of line breaking oddities to the internal value \pretolerance,
and sees what happens. If it can’t get an acceptable break, TeX adds the hyphenation
points allowed by the current patterns, and tries again using the internal \tolerance
value. If this pass also fails, and the internal \emergencystretch value is positive,
TeX will try a pass that allows \emergencystretch worth of extra stretchability to the
spaces in each line.
In principle, therefore, there are three parameters (other than hyphenation) that
you can change: \pretolerance, \tolerance and \emergencystretch. Both the
tolerance values are simple numbers, and should be set by TeX primitive count as-
signment — for example

\pretolerance=150

For both, an “infinite” tolerance is represented by the value 10 000, but infinite toler-
ance is rarely appropriate, since it can lead to very bad line breaks indeed.
\emergencystretch is a TeX-internal ‘dimen’ register, and can be set as normal
for dimens in Plain TeX; in LaTeX, use \setlength — for example:

\setlength{\emergencystretch}{3em}

The choice of method has time implications — each of the passes takes time, so
adding a pass (by changing \emergencystretch) is less desirable than suppressing
one (by changing \pretolerance). However, it’s unusual nowadays to find a com-
puter that’s slow enough that the extra passes are really troublesome.
In practice, \pretolerance is rarely used other than to manipulate the use of hy-
phenation; Plain TeX and LaTeX both set its value to 100. To suppress the first scan of
paragraphs, set \pretolerance to -1.
\tolerance is often a good method for adjusting spacing; Plain TeX and LaTeX
both set its value to 200. LaTeX’s \sloppy command sets it to 9999, as does the
sloppypar environment. This value is the largest available, this side of infinity, and
can allow pretty poor-looking breaks (this author rarely uses \sloppy “bare”, though
he does occasionally use sloppypar — that way, the change of \tolerance is con-
fined to the environment). More satisfactory is to make small changes to \tolerance,
incrementally, and then to look to see how the change affects the result; very small
increases can often do what’s necessary. Remember that \tolerance is a paragraph
parameter, so you need to ensure it’s actually applied — see “ignoring paragraph pa-
rameters”. LaTeX users could use an environment like:
\newenvironment{tolerant}[1]{%
\par\tolerance=#1\relax
}{%
\par
}

enclosing entire paragraphs (or set of paragraphs) in it.


The value of \emergencystretch is added to the assumed stretchability of each
line of a paragraph, in a further run of the paragraph formatter in case that the
paragraph can’t be made to look right any other way. (The extra scan happens if
\emergencystretch>0pt — if it’s zero or negative, no gain could be had from re-
running the paragraph setter.) The example above set it to 3em; the Computer Modern
fonts ordinarily fit three space skips to the em, so the change would allow anything up
to the equivalent of nine extra spaces in each line. In a line with lots of spaces, this
could be reasonable, but with (say) only three spaces on the line, each could stretch to
four times its natural width. It is therefore clear that \emergencystretch needs to be
treated with a degree of caution.
More subtle (but more tricky to manage) are the microtypographic extensions pro-
vided by PDFTeX. Since PDFTeX is the default ‘engine’ for LaTeX and ConTeXt
work in all distributions, nowadays, the extensions are available to all. There are two
extensions, margin kerning and font expansion; margin kerning only affects the visual
effect of the typeset page, and has little effect on the ability of the paragraph setter
to “get things right”. Font expansion works like a subtler version of the trick that
\emergencystretch plays: PDFTeX ‘knows’ that your current font may be stretched
(or shrunk) to a certain extent, and will do that “on the fly” to optimise the setting of a
paragraph. This is a powerful tool in the armoury of the typesetter.
As mentioned above, the microtypographic extensions are tricky beasts to control;
however, the microtype package relieves the user of the tedious work of specifying how
to perform margin adjustments and how much to scale each font . . . for the fonts the
package knows about; it’s a good tool, and users who can take on the specification of
adjustments for yet more fonts are always welcome.
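For most users, simply loading the package is enough; a minimal sketch (the options
shown merely make explicit what the package enables by default under PDFTeX):

\usepackage[protrusion=true,expansion=true]{microtype}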
microtype.sty : macros/latex/contrib/microtype

353 Maths symbols don’t scale up


By default, the “large” maths symbols stay at the same size regardless of the font size
of the text of the document. There’s good reason for this: the cmex fonts aren’t really
designed to scale, so that TeX’s maths placement algorithms don’t perform as well as
they might when the fonts are scaled.
However, this behaviour confounds user expectations, and can lead to slightly odd-
looking documents. If you want the fonts to scale, despite the warning above, use the
exscale package — just loading it is enough.
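So, for example (a trivial sketch):

\documentclass[12pt]{article}
\usepackage{exscale}  % \sum, \int, big delimiters, etc., now scale with the text
...
\[ \sum_{i=1}^{n} \frac{1}{i^2} \]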
exscale.sty : Part of the LaTeX distribution.

354 Why doesn’t \linespread work?


The command \linespread{factor} is supposed to multiply the current \baselineskip
by ⟨factor⟩; but, to all appearances, it doesn’t.
In fact, the command is equivalent to \renewcommand{\baselinestretch}
{factor}: written that way, it somehow feels less surprising that the effect isn’t
immediate. The \baselinestretch factor is only used when a font is selected; a
mere change of \baselinestretch doesn’t change the font, any more than does the
command \fontsize{size}{baselineskip} — you have to follow either command
with \selectfont. So:

\fontsize{10}{12}%
\selectfont

or:

\linespread{1.2}%
\selectfont

Of course, a package such as setspace, whose job is to manage the baseline, will
deal with all this stuff — see “managing double-spaced documents”. If you want to
avoid setspace, beware the behaviour of linespread changes within a paragraph: read
“\baselineskip is a paragraph parameter”.
setspace.sty : macros/latex/contrib/setspace/setspace.sty

355 Only one \baselineskip per paragraph


The \baselineskip is not (as one might hope) a property of a line, but of a paragraph.
As a result, in a 10pt (nominal) document (with a default \baselineskip of 12pt), a
single character with a larger size, as:
{\Huge A}

will be squashed into the paragraph: TeX will make sure it doesn’t scrape up against
the line above, but won’t give it “room to breathe”, as it does the text at standard size;
that is, its size (24.88pt) is taken account of, but its \baselineskip (30pt) isn’t.
Similarly
Paragraph text ...
{\footnotesize Extended interjection ...
... into the paragraph.}
... paragraph continues ...

will look silly, since the 8pt interjection will end up set on the 12pt \baselineskip
of the paragraph, rather than its preferred 8.5pt. Finally, something like
Paragraph text ...
... paragraph body ends.
{\footnotesize Short comment on paragraph.}

Next paragraph starts...

will set the body of the first paragraph on the constricted \baselineskip of the
\footnotesize comment.
So, how to deal with these problems? The oversized (short) section is typically
corrected by a strut: this word comes from movable metal typography, and refers to a
spacer that held the boxes (that contained the metal character shapes) apart. Every time
you change font size, LaTeX redefines the command \strut to provide the equivalent
of a metal-type strut for the size chosen. So for the example above, we would type
Paragraph text ...
{\Huge A\strut}
... paragraph continues ...

However, more extended insertions (whether of larger or smaller text) are always going
to cause problems; while you can strut larger text, ensuring that you strut every line will
be tiresome, and there’s no such thing as a “negative strut” that pulls the lines together
for smaller text.
The only satisfactory way to deal with an extended insertion at a different size is
to set it off as a separate paragraph. A satisfactory route to achieving this is the quote
environment, which sets its text modestly inset from the enclosing paragraph:
Paragraph text ...
\begin{quote}
\footnotesize This is an inset account
of something relevant to the enclosing
paragraph...
\end{quote}
... paragraph continues ...

Such quote-bracketing also deals with the problem of a trailing comment on the para-
graph.
356 Numbers too large in table of contents, etc.
LaTeX constructs the table of contents, list of figures, tables, and similar tables, on the
basis of a layout specified in the class. As a result, they do not react to the sizes of
things in them, as they would if a tabular environment (or something similar) was
used.
This arrangement can provoke problems, most commonly with deep section nesting
or very large page numbers: the numbers in question just don’t fit in the space allowed
for them in the class.
A separate answer discusses re-designing the tables, and those techniques can be
employed to make the numbers fit.
357 Why is the inside margin so narrow?
If you give the standard classes the twoside option, the class sets the margins narrow
on the left of odd-numbered pages, and on the right of even-numbered pages. This is
often thought to look odd, but it is quite right.
The idea is that the typographic urge for symmetry should also apply to margins: if
you lay an even numbered page to the left of an odd-numbered one, you will see that
you’ve three equal chunks of un-printed paper: the left margin of the even page, the
right margin of the odd page, and the two abutting margins together.
This is all very fine in the abstract, but in practical book(let) production it only
works “sometimes”.
If your booklet is produced on double-width paper and stapled, the effect will be
good; if your book(let) is produced using a so-called “perfect” binding, the effect will
again be good.
However, almost any “quality” book-binder will need some of your paper to grab
hold of, and a book bound in such a way won’t exhibit the treasured symmetry unless
you’ve done something about the margin settings.
The packages recommended in “setting up margins” mostly have provision for
a “binding offset” or a “binding correction” — search for “binding” in the manuals
(vmargin doesn’t help, here).
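For example, with the geometry package (one of those recommended) the correction
can be stated directly — a sketch, where the 5mm is only illustrative:

\usepackage[twoside,bindingoffset=5mm]{geometry}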
If you’re doing the job by hand (see manual margin setup), the trick is to calculate
your page and margin dimensions as normal, and then:

• subtract the binding offset from \evensidemargin, and


• add the binding offset to \oddsidemargin.

which can be achieved by:


\addtolength{\evensidemargin}{-offset}
\addtolength{\oddsidemargin}{offset}

(substituting something sensible like “5mm” for “offset”, above).


The above may not be the best you can do: you may well choose to change the
\textwidth in the presence of the binding offset; but the changes do work for constant
\textwidth.

U.3 Why shouldn’t I?


358 Why use fontenc rather than t1enc?
In the very earliest days of LaTeX 2ε , the only way to use the T1 encoding was t1enc;
with the summer 1994 “production” release, the fontenc package appeared, and pro-
vided comprehensive support for use of the encoding.
Nevertheless, the t1enc package remains (as part of the LaTeX 2.09 compatibility
code), but it does very little: it merely selects font encoding T1, and leaves to the user
the business of generating the character codes required.
Generating such character codes could be a simple matter, if the T1 encoding
matched any widely-supported encoding standard, since in that case, one might ex-
pect one’s keyboard to generate the character codes. However, the T1 encoding is a
mix of several standard encodings, and includes code points in areas of the table which
standard encodings specifically exclude, so no T1 keyboards have been (or ever will
be) manufactured.
By contrast, the fontenc package generates the T1 code points from ordinary LaTeX
commands (e.g., it generates the é character codepoint from the command \'e). So,
unless you have program-generated T1 input, use \usepackage[T1]{fontenc} rather
than \usepackage{t1enc}.

359 Why bother with inputenc and fontenc?
The standard input encoding for Western Europe (pending the arrival of Unicode) is
ISO 8859–1 (commonly known by the standard’s subtitle ‘Latin-1’). Latin-1 is re-
markably close, in the codepoints it covers, to the (La)TeX T1 encoding.
In this circumstance, why should one bother with inputenc and fontenc? Since
they’re pretty exactly mirroring each other, one could do away with both, and use just
t1enc, despite its shortcomings.
One doesn’t do this for a variety of small reasons:
Confusion You’ve been happily working in this mode, and for some reason find you’re
to switch to writing in German: the effect of using “ß” is somewhat startling, since
T1 and Latin-1 treat the codepoint differently.
Compatibility You find yourself needing to work with a colleague in Eastern Europe:
their keyboard is likely to be set to produce Latin-2, so that the simple mapping
doesn’t work.
Traditional LaTeX You lapse and write something like \'e rather than typing é; only
fontenc has the means to convert this LaTeX sequence into the T1 character, so an
\accent primitive slips through into the output, and hyphenation is in danger.

The inputenc–fontenc combination seems slow and cumbersome, but it’s safe.
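The usual safe combination, then, is something like:

\usepackage[latin1]{inputenc}  % what your editor/keyboard produces
\usepackage[T1]{fontenc}       % what the fonts are to provide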
360 Why not use eqnarray?
The environment eqnarray is attractive for the occasional user of mathematics in La-
TeX documents: it seems to allow aligned systems of equations. Indeed it does supply
such things, but it makes a serious mess of spacing. In the system:
\begin{eqnarray}
a & = & b + c \\
x & = & y - z
\end{eqnarray}

the spacing around the “=” signs is not that defined in the metrics for the font from
which the glyph comes — it’s \arraycolsep, which may be set to some very odd
value for reasons associated with real arrays elsewhere in the document.
The user is far better served by the AMSLaTeX bundle, which provides an align
environment, which is designed with the needs of mathematicians in mind (as opposed
to the convenience of LaTeX programmers). For this simple case (align is capable of
far greater things), code as:
\begin{align}
a & = b + c \\
x & = y - z
\end{align}

AMSLaTeX : macros/latex/required/amslatex

361 Why use \[ . . . \] in place of $$ . . . $$?


LaTeX defines inline- and display-maths commands, apparently analogous to those
that derive from the TeX primitive maths sequences bracketing maths commands with
single dollar signs (or pairs of dollar signs).
As it turns out, LaTeX’s inline maths grouping, \( ... \), has precisely the
same effect as the TeX primitive version $... $. (Except that the LaTeX version
checks to ensure you don’t put \( and \) the wrong way round.)
In this circumstance, one often finds LaTeX users, who have some experience of
using Plain TeX, merely assuming that LaTeX’s display maths grouping \[ ... \]
may be replaced by the TeX primitive display maths $$... $$.
Unfortunately, they are wrong: if LaTeX code is going to patch display maths, it
can only do so by patching \[ and \]. The most obvious way this turns up, is that
the class option fleqn simply does not work for equations coded using $$... $$,
whether you’re using the standard classes alone, or using package amsmath. Also,
the \[ and \] have code for rationalising vertical spacing in some extreme cases; that
code is not available in $$... $$, so if you use the non-standard version, you may
occasionally observe inconsistent vertical spacing.
There are more subtle effects (especially with package amsmath), and the simple
rule is \[ ... \] whenever unadorned displayed maths is needed in LaTeX.
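So, for a simple displayed equation, write:

\[ E = mc^2 \]
% and not:  $$ E = mc^2 $$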
362 What’s wrong with \bf, \it, etc.?
The font-selection commands of LaTeX 2.09 were \rm, \sf, \tt, \it, \sl, \em and
\bf; they were modal commands, so you used them as:

{\bf Fred} was {\it here\/}.

with the font change enclosed in a group, so as to limit its effect; note the italic correc-
tion command \/ that was necessary at the end of a section in italics.
At the release of LaTeX 2ε in summer 1994, these simple commands were depre-
cated, but recognising that their use is deeply embedded in the brains of LaTeX users,
the commands themselves remain in LaTeX, with their LaTeX 2.09 semantics. Those
semantics were part of the reason they were deprecated: each \xx overrides any other
font settings, keeping only the size. So, for example,
{\bf\it Here we are again\/}

ignores \bf and produces text in italic, medium weight (and the italic correction has a
real effect), whereas
{\it\bf happy as can be\/}

ignore \it and produces upright text at bold weight (and the italic correction has noth-
ing to do). The same holds if you mix LaTeX 2ε font selections with the old style
commands:
\textbf{\tt all good friends}

ignores the \textbf that encloses the text, and produces typewriter text at medium
weight.
So why are these commands deprecated? — it is because of confusions such as that
in the last example. The alternative (LaTeX 2ε ) commands are discussed in the rest of
this answer.
LaTeX 2ε ’s font commands come in two forms: modal commands and text-block
commands. The default set of modal commands offers weights \mdseries and
\bfseries, shapes \upshape, \itshape, \scshape and \slshape, and families
\rmfamily, \sffamily and \ttfamily. A font selection requires a family, a shape
and a series (as well as a size, of course). A few examples
{\bfseries\ttfamily and jolly good company!}

produces bold typewriter text (but note the lack of a bold typewriter font in the default
Computer Modern fonts), or
{\slshape\sffamily Never mind the weather\/}

(note the italic correction needed on slanted fonts, too).


LaTeX 2ε ’s text block commands take the first two letters of the modal commands,
and form a \textxx command from them. Thus \bfseries becomes \textbf
{text}, \itshape becomes \textit{text}, and \ttfamily becomes \texttt
{text}. Block commands may be nested, as:

\textit{\textbf{Never mind the rain}}

to produce bold italic text (note that the block commands supply italic corrections
where necessary), and they may be nested with the LaTeX 2ε modal commands, too:
\texttt{\bfseries So long as we’re together}

for bold typewriter, or


{\slshape \textbf{Whoops! she goes again}\/}

for a bold slanted instance of the current family (note the italic correction applied at
the end of the modal command group, again).
The new commands (as noted above) override commands of the same type. In
almost all cases, this merely excludes ludicrous ideas such as “upright slanted” fonts,
or “teletype roman” fonts. There are a couple of immediate oddities, though. The first
is the conflict between \itshape (or \slshape) and \scshape: while many claim
that an italic small-caps font is typographically unsound, such fonts do exist. Daniel
Taupin’s smallcap package enables use of the instances in the EC fonts, and similar
techniques could be brought to bear on many other font sets. The second is the conflict
between \upshape and \itshape: Knuth actually offers an upright-italic font which
LaTeX uses for the “£” symbol in the default font set. The combination is sufficiently
weird that, while there’s a defined font shape, no default LaTeX commands exist; to
use the shape, the (eccentric) user needs LaTeX’s simplest font selection commands:
{\fontshape{ui}\selectfont Tra la la, di dee}

smallcap.sty : macros/latex/contrib/smallcap

363 What’s wrong with \newfont?


If all else fails, you can specify a font using the LaTeX \newfont command. The font
so specified doesn’t fit into the LaTeX font selection mechanism, but the technique
can be tempting under several circumstances. The command is merely the thinnest of
wrappers around the \font primitive, and suffers from exactly the problems with font
encodings and sizes that are outlined in using Plain TeX commands in LaTeX.
Almost all fonts, nowadays, are provided with LaTeX control files (if they’re
adapted to TeX at all). There is therefore little gain in using \newfont.
One temptation arises from the way that LaTeX restricts the sizes of fonts. In fact,
this restriction only significantly applies to the default (Computer Modern) and the
Cork-encoded (T1) EC fonts, but it is widely considered to be anomalous, nowadays.
In recognition of this problem, there is a package fix-cm which will allow you to use the
fonts, within LaTeX, at any size you choose. If you’re not using scaleable versions of
the fonts, most modern distributions will just generate an appropriate bitmap for you.
So, suppose you want to use Computer Modern Roman at 30 points, you might be
tempted to write:
\newfont{\bigfont}{cmr10 at 30pt}
{\bigfont Huge text}

which will indeed work, but will actually produce a worse result than
\usepackage{fix-cm}
...
{%
\fontsize{30}{36}\selectfont
Huge text%
}

Note that the fix-cm package was not distributed until the December 2003 edition of
LaTeX; if you have an older distribution, the packages type1cm (for CM fonts) and
type1ec (for EC fonts) are available.
fix-cm.sty : Distributed as part of macros/latex/base (an unpacked version is
available at macros/latex/unpacked/fix-cm.sty)
type1cm.sty : macros/latex/contrib/type1cm
type1ec.sty : fonts/ps-type1/cm-super/type1ec.sty (the package is actually
part of the fonts/ps-type1/cm-super distribution, but it works happily in
the absence of the scaled fonts)

V The joy of TeX errors


364 How to approach errors
Since TeX is a macroprocessor, its error messages are often difficult to understand;
this is a (seemingly invariant) property of macroprocessors. Knuth makes light of the
problem in the TeXbook, suggesting that you acquire the sleuthing skills of a latter-day
Sherlock Holmes; while this approach has a certain romantic charm to it, it’s not good
for the ‘production’ user of (La)TeX. This answer (derived, in part, from an article
by Sebastian Rahtz in TUGboat 16(4)) offers some general guidance in dealing with
TeX error reports, and other answers in this section deal with common (but perplexing)
errors that you may encounter. There’s a long list of “hints” in Sebastian’s article,
including the following:
• Look at TeX errors; those messages may seem cryptic at first, but they often con-
tain a straightforward clue to the problem. See the structure of errors for further
details.
• Read the .log file; it contains hints to things you may not understand, often things
that have not even presented as error messages.
• Be aware of the amount of context that TeX gives you. The error message gives
you some bits of TeX code (or of the document itself) that show where the error
“actually happened”; it’s possible to control how much of this ‘context’ TeX ac-
tually gives you. LaTeX (nowadays) instructs TeX only to give you one line of
context, but you may tell it otherwise by saying
\setcounter{errorcontextlines}{999}
in the preamble of your document. (If you’re not a confident macro programmer,
don’t be ashamed of cutting that 999 down a bit; some errors will go on and on,
and spotting the differences between those lines can be a significant challenge.)
• As a last resort, tracing can be a useful tool; reading a full (La)TeX trace takes a
strong constitution, but once you know how, the trace can lead you quickly to the
source of a problem. You need to have read the TeXbook (see books about TeX)
in some detail, fully to understand the trace.
The command \tracingall sets up maximum tracing; it also sets the output to
come to the interactive terminal, which is somewhat of a mixed blessing (since the
output tends to be so vast — all but the simplest traces are best examined in a text
editor after the event).
The LaTeX trace package (first distributed with the 2001 release of LaTeX)
provides more manageable tracing. Its \traceon command gives you what
\tracingall offers, but suppresses tracing around some of the truly verbose
parts of LaTeX itself. The package also provides a \traceoff command (there’s
no “off” command for \tracingall), and a package option (logonly) allows
you to suppress output to the terminal.
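A minimal sketch of the package in use (\mysterymacro is an invented name for
whatever code is being investigated):

\usepackage[logonly]{trace}
...
\traceon
\mysterymacro   % invented name: the code you want traced
\traceoff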

The best advice to those faced with TeX errors is not to panic: most of the common
errors are plain to the eye when you go back to the source line that TeX tells you of.
If that approach doesn’t work, the remaining answers in this section deal with some of
the odder error messages you may encounter. You should not ordinarily need to appeal
to the wider public for assistance, but if you do, be sure to report full backtraces (see
errorcontextlines above) and so on.
trace.sty : Distributed as part of macros/latex/required/tools

365 The structure of TeX error messages


TeX’s error messages are reminiscent of the time when TeX itself was conceived (the
1970s): they’re not terribly user-friendly, though they do contain all the information
that TeX can offer, usually in a pretty concise way.
TeX’s error reports all have the same structure:

• An error message
• Some ‘context’
• An error prompt

The error message will relate to the TeX condition that is causing a problem. Sadly, in
the case of complex macro packages such as LaTeX, the underlying TeX problem may
be superficially difficult to relate to the actual problem in the “higher-level” macros.
Many LaTeX-detected problems manifest themselves as ‘generic’ errors, with error
text provided by LaTeX itself (or by a LaTeX class or package).
The context of the error is a stylised representation of what TeX was doing at the
point that it detected the error. As noted in approaching errors, a macro package can
tell TeX how much context to display, and the user may need to undo what the package
has done. Each line of context is split at the point of the error; if the error actually
occurred in a macro called from the present line, the break is at the point of the call. (If
the called object is defined with arguments, the “point of call” is after all the arguments
have been scanned.) For example:
\blah and so on
produces the error report

! Undefined control sequence.
l.4 \blah
and so on
while:
\newcommand{\blah}[1]{\bleah #1}
\blah{to you}, folks
produces the error report
! Undefined control sequence.
\blah #1->\bleah
#1
l.5 \blah{to you}
, folks
If the argument itself is in error, we will see things such as
\newcommand{\blah}[1]{#1 to you}
\blah{\bleah}, folks
producing
! Undefined control sequence.
<argument> \bleah

l.5 \blah{\bleah}
, folks
The prompt accepts single-character commands: the list of what’s available may
be had by typing ?. One immediately valuable command is h, which gives you an
expansion of TeX’s original précis message, sometimes accompanied by a hint on what
to do to work round the problem in the short term. If you simply type ‘return’ (or
whatever else your system uses to signal the end of a line) at the prompt, TeX will
attempt to carry on (often with rather little success).
366 An extra ‘}’??
You’ve looked at your LaTeX source and there’s no sign of a misplaced } on the line
in question.
Well, no: this is TeX’s cryptic way of hinting that you’ve put a fragile command in
a moving argument.
For example, \footnote is fragile, and if we put that in the moving argument of a
\section command, as
\section{Mumble\footnote{I couldn’t think of anything better}}
we get told
! Argument of \@sect has an extra }.
The same happens with captions (the following is a simplification of a comp.text.tex
post):
\caption{Energy: \[e=mc^2\]}
giving us the error message
! Argument of \@caption has an extra }.
The solution is usually to use a robust command in place of the one you are using,
or to force your command to be robust by prefixing it with \protect, which in the
\section case would show as
\section{Mumble\protect\footnote{I couldn’t think of anything better}}
In both the \section case and the \caption case, you can separate the moving ar-
gument, as in \section[moving]{static}; this gives us another standard route —
simply to omit (or otherwise sanitise) the fragile command in the moving argument.
So, one might rewrite the \caption example as:
\caption[Energy: $E=mc^2$]{Energy: \[E=mc^2\]}
for, after all, even if you want display maths in a caption, you surely don’t want it in
the list of figures.
The case of footnotes is somewhat more complex; “footnotes in LaTeX section
headings” deals specifically with that issue.
367 Capacity exceeded [semantic nest . . . ]
! TeX capacity exceeded, sorry [semantic nest size=100].
...
If you really absolutely need more capacity,
you can ask a wizard to enlarge me.
Even though TeX suggests (as always) that enlargement by a wizard may help, this
message usually results from a broken macro or bad parameters to an otherwise work-
ing macro.
The “semantic nest” TeX talks about is the nesting of boxes within boxes. A stupid
macro can provoke the error pretty easily:

\def\silly{\hbox{here’s \silly being executed}}


\silly

The extended traceback (see general advice on errors) does help, though it does rather
run on. In the case above, the traceback consists of
\silly ->\hbox {
here’s \silly being executed}
followed by 100 instances of
\silly ->\hbox {here’s \silly
being executed}
The repeated lines are broken at exactly the offending macro; of course the loop need
not be as simple as this — if \silly calls \dopy which boxes \silly, the effect is just
the same and alternate lines in the traceback are broken at alternate positions.
There are in fact two items being consumed when you nest boxes: the other is the
grouping level. Whether you exhaust your semantic nest or your permitted grouping
levels first is controlled entirely by the relative size of the two different sets of buffers
in your (La)TeX executable.
368 No room for a new ‘thing’
The technology available to Knuth at the time TeX was written is said to have been
particularly poor at managing dynamic storage; as a result much of the storage used
within TeX is allocated as fixed arrays, in the reference implementations. Many of these
fixed arrays are expandable in modern TeX implementations, but the size of the arrays of
“registers” is written into the specification as being 256 (usually); this number may not
be changed if you still wish to call the result TeX (see testing TeX implementations).
If you fill up one of these register arrays, you get a TeX error message saying

! No room for a new \<thing>.

The \things in question may be \count (the object underlying LaTeX’s \newcounter
command), \skip (the object underlying LaTeX’s \newlength command), \box (the
object underlying LaTeX’s \newsavebox command), or \dimen, \muskip, \toks,
\read, \write or \language (all types of object whose use is “hidden” in LaTeX;
the limit on the number of \read or \write objects is just 16).
There is nothing that can directly be done about this error, as you can’t extend the
number of available registers without extending TeX itself. Of course, Ω and e-TeX
both do this, as does MicroPress Inc’s VTeX.
The commonest way to encounter one of these error messages is to have broken
macros of some sort, or incorrect usage of macros (an example is discussed in epsf
problems).
However, sometimes one just needs more than TeX can offer, and when this hap-
pens, you’ve just got to work out a different way of doing things. An example is the
difficulty of loading PiCTeX with LaTeX. In cases like PiCTeX, it may be possible
to use e-TeX (all modern distributions provide it). The LaTeX package etex modifies
the register allocation mechanism to make use of e-TeX’s extended register sets (it’s a
derivative of the Plain TeX macro file etex.src, which is used in building the e-TeX Plain
format; both files are part of the e-TeX distribution). Unfortunately, e-TeX doesn’t help
with \read or \write objects.
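So, when running on an e-TeX-based engine, a sketch of the approach is simply:

\usepackage{etex}  % allocations now use e-TeX's extended register pools
...
% register-hungry packages (PiCTeX, for example) now stand a better chance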

369 epsf gives up after a bit
Some copies of the documentation of epsf.tex seem to suggest that the command
\input epsf
is needed for every figure included. If you follow this suggestion too literally, you get
an error
! No room for a new \read .
after a while; this is because each time epsf.tex is loaded, it allocates itself a new file-
reading handle to check the figure for its bounding box, and there just aren’t enough of
these things (see no room for a new thing).
The solution is simple — this is in fact an example of misuse of macros; one only
need read epsf.tex once, so change
...
\input epsf
\epsffile{...}
...
\input epsf
\epsffile{...}
(and so on) with a single
\input epsf
somewhere near the start of your document, and then decorate your \epsffile state-
ments with no more than adjustments of \epsfxsize and so on.
370 Improper \hyphenation will be flushed
For example
! Improper \hyphenation will be flushed.
\’#1->{
\accent 19 #1}
<*> \hyphenation{Ji-m\’e
-nez}
(in Plain TeX) or
! Improper \hyphenation will be flushed.
\leavevmode ->\unhbox
\voidb@x
<*> \hyphenation{Ji-m\’e
-nez}
in LaTeX.
As mentioned in “hyphenation failures”, words with accents in them may not be
hyphenated. As a result, any such word is deemed improper in a \hyphenation com-
mand.
The solution is to use a font that contains the character in question, and to express
the \hyphenation command in terms of that character; this “hides” the accent from
the hyphenation mechanisms. LaTeX users can achieve this by use of the fontenc
package (part of the LaTeX distribution). If you select an 8-bit font with the pack-
age, as in \usepackage[T1]{fontenc}, accented-letter commands such as the \’e
in \hyphenation{Ji-m\’e-nez} automatically become the single accented character
by the time the hyphenation gets to look at it.
371 “Too many unprocessed floats”
If LaTeX responds to a \begin{figure} or \begin{table} command with the error
message
! LaTeX Error: Too many unprocessed floats.

See the LaTeX manual or LaTeX Companion for explanation.


your figures (or tables) are failing to be placed properly. LaTeX has a limited amount
of storage for ‘floats’ (figures, tables, or floats you’ve defined yourself with the float
package); if you don’t let it ever actually typeset any floats, it will run out of space.
This failure usually occurs in extreme cases of floats moving “wrongly” (see floats
moving “wrongly”); LaTeX has found it can’t place a float, and floats of the same type
have piled up behind it. LaTeX’s idea is to ensure that caption numbers are sequential

in the document: the caption number is allocated when the figure (or whatever) is cre-
ated, and can’t be changed, so that placement out of order would mean figure numbers
appearing out of order in the document (and in the list of figures, or whatever). So
a simple failure to place a figure means that no subsequent figure can be placed; and
hence (eventually) the error.
Techniques for solving the problem are discussed in the floats question (floats ques-
tion) already referenced.
The error also occurs in a long sequence of figure or table environments, with no
intervening text. Unless the environments will fit “here” (and you’ve allowed them to
go “here”), there will never be a page break, and so there will never be an opportunity
for LaTeX to reconsider placement. (Of course, the floats can’t all fit “here” if the
sequence is sufficiently prolonged: once the page fills, LaTeX won’t place any more
floats, leading to the error.)
Techniques for resolution may involve redefining the floats using the float pack-
age’s [H] float qualifier, but you are unlikely to get away without using \clearpage
from time to time.
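In skeleton form, one such resolution (the figure contents are placeholders) might be:

\usepackage{float}
...
\begin{figure}[H]  % "right here": the float package's non-floating placement
  ...              % the figure's contents
  \caption{One of a long sequence of figures}
\end{figure}
\clearpage         % flush whatever floats have piled up so far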
float.sty : macros/latex/contrib/float

372 \spacefactor complaints


The errors
! You can’t use ‘\spacefactor’ in vertical mode.
\@->\spacefactor
\@m
or
! You can’t use ‘\spacefactor’ in math mode.
\@->\spacefactor
\@m
or simply
! Improper \spacefactor.
...
bite the LaTeX programmer who uses an internal command without taking “precau-
tions”. The problem is discussed in detail in “@ in macro names”, together with solu-
tions.
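By way of illustration, the usual “precaution” (described in that answer) is to bracket
any code that uses @-containing internal commands with \makeatletter and
\makeatother; a sketch, using one of the standard classes’ internal commands:
\makeatletter
\renewcommand{\@oddfoot}{\hfil\thepage\hfil}% internal command: note the @
\makeatother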
373 \end occurred inside a group
The actual error we observe is:
(\end occurred inside a group at level <n>)
and it tells us that something we started in the document never got finished before
we ended the document itself. The things involved (‘groups’) are what TeX uses for
restricting the scope of things: you see them, for example, in the “traditional” font se-
lection commands: {\it stuff\/} — if the closing brace is left off such a construct,
the effect of \it will last to the end of the document, and you’ll get the diagnostic.
TeX itself doesn’t tell you where your problem is, but you can often spot it by look-
ing at the typeset output in a previewer. Otherwise, you can usually find mismatched
braces using an intelligent editor (at least emacs and winedt offer this facility). How-
ever, groups are not only created by matching { with }: other grouping commands are
discussed elsewhere in these FAQs, and are also a potential source of unclosed groups.
\begin{⟨environment⟩} encloses the environment’s body in a group, and estab-
lishes its own diagnostic mechanism. If you end the document before closing some
other environment, you get the ‘usual’ LaTeX diagnostic
! LaTeX Error: \begin{blah} on input line 6 ended by \end{document}.
which (though it doesn’t tell you which file the \begin{blah} was in) is usually
enough to locate the immediate problem. If you press on past the LaTeX error, you
get one or more repetitions of the “occurred inside a group” message before LaTeX
finally exits. The checkend package recognises other unclosed \begin{blob} com-
mands, and generates an “ended by” error message for each one, rather than producing
the “occurred inside a group” message, which is sometimes useful (if you remember to
load the package).
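Using the package is no more than a matter of loading it (a one-line sketch):
\usepackage{checkend}% reports each unclosed environment by name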
In the absence of such information from LaTeX, you need to use “traditional” bi-
nary search to find the offending group. Separate the preamble from the body of your
file, and process each half on its own with the preamble; this tells you which half of
the file is at fault. Divide again and repeat. The process needs to be conducted with
care (it’s obviously possible to split a correctly-written group by chopping in the wrong
place), but it will usually find the problem fairly quickly.
e-TeX (and e-LaTeX — LaTeX run on e-TeX) gives you further diagnostics after
the traditional infuriating TeX one — it actually keeps the information in a similar way
to LaTeX:
(\end occurred inside a group at level 3)

### semi simple group (level 3) entered at line 6 (\begingroup)
### simple group (level 2) entered at line 5 ({)
### simple group (level 1) entered at line 4 ({)
### bottom level
The diagnostic not only tells us where the group started, but also the way it started:
\begingroup or { (which is an alias of \bgroup, and the two are not distinguishable
at the TeX-engine level).
checkend.sty : Distributed as part of macros/latex/contrib/bezos

374 “Missing number, treated as zero”


In general, this means you’ve tried to assign something to a count, dimension or skip
register that isn’t (in TeX’s view of things) a number. Usually the problem will become
clear using the ordinary techniques of examining errors.
Two LaTeX-specific errors are commonly aired on the newsgroups.
The commonest arises from attempting to use an example from The LaTeX
Companion (first edition), and is exemplified by the following error text:
! Missing number, treated as zero.
<to be read again>
\relax
l.21 \begin{Ventry}{Return values}
The problem arises because, in its first edition, the Companion’s examples always as-
sumed that the calc package is loaded: this fact is mentioned in the book, but often not
noticed. The remedy is to load the calc package in any document using such exam-
ples from the Companion. (The problem does not really arise with the second edition;
copies of all the examples are available on the accompanying CD-ROM, or on CTAN.)
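To illustrate the general point, calc extends \setlength and friends to accept infix
arithmetic; a sketch only (the particular lengths are invented for the example):
\usepackage{calc}
...
\setlength{\textwidth}{\paperwidth - 2in}% infix arithmetic like this needs calc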
The other problem, which is increasingly rare nowadays, arises from misconfigu-
ration of a system that has been upgraded from LaTeX 2.09: the document uses the
times package, and the error appears at \begin{document}. The file search paths
are wrongly set up, and your \usepackage{times} has picked up a LaTeX 2.09 ver-
sion of the package, which in its turn has invoked another which has no equivalent in
LaTeX 2ε . The obvious solution is to rewrite the paths so that LaTeX 2.09 packages are
chosen only as a last resort so that the startlingly simple LaTeX 2ε times package will
be picked up. Better still is to replace the whole thing with something more modern
still; current psnfss doesn’t provide a times package — the alternative mathptmx incor-
porates Times-like mathematics, and a sans-serif face based on Helvetica, but scaled to
match Times text rather better.
calc.sty : Distributed as part of macros/latex/required/tools
Examples for LaTeX Companion: info/examples/tlc2
The psnfss bundle: macros/latex/required/psnfss

375 “Please type a command or say \end”


Sometimes, when you are running (La)TeX, it will abruptly stop and present you with
a prompt (by default, just a * character). Many people (including this author) will
reflexively hit the ‘return’ key, pretty much immediately, and of course this is no help
at all — TeX just says:
(Please type a command or say ‘\end’)
and prompts you again.
What’s happened is that your (La)TeX file has finished prematurely, and TeX has
fallen back to a supposed including file, from the terminal. This could have happened
simply because you’ve omitted the \bye (Plain TeX), \end{document} (LaTeX), or

whatever. Other common errors are failure to close the braces round a command’s ar-
gument, or (in LaTeX) failure to close a verbatim environment: in such cases you’ve
already read and accepted an error message about encountering end of file while scan-
ning something.
If the error is indeed because you’ve forgotten to end your document, you can
insert the missing text: if you’re running Plain TeX, the advice to “say \end” is good
enough: it will kill the run; if you’re running LaTeX, the argument will be necessary:
\end{document}.
However, as often as not this isn’t the problem, and (short of debugging the source
of the document before ending) brute force is probably necessary. Excessive force
(killing the job that’s running TeX) is to be avoided: there may well be evidence in
the .log file that will be useful in determining what the problem is — so the aim is to
persuade TeX to shut itself down and hence flush all log output to file.
If you can persuade TeX to read it, an end-of-file indication (control-D under Unix,
control-Z under Windows) will provoke TeX to report an error and exit immediately.
Otherwise you should attempt to provoke an error dialogue, from which you can exit
(using the x ‘command’). An accessible error could well be inserting an illegal char-
acter: what it is will depend on what macros you are running. If you can’t make that
work, try a silly command name or two.
376 “Unknown graphics extension”
The LaTeX graphics package deals with several different types of DVI (or other) output
drivers; each one of them has a potential to deal with a different selection of graphics
formats. The package therefore has to be told what graphics file types its output driver
knows about; this is usually done in the ⟨driver⟩.def file corresponding to the output
driver you’re using.
The error message arises, then, if you have a graphics file whose extension doesn’t
correspond with one your driver knows about. Most often, this is because you’re being
optimistic: asking dvips to deal with a .png file, or PDFTeX to deal with a .eps file:
the solution in this case is to transform the graphics file to a format your driver knows
about.
If you happen to know that your device driver deals with the format of your file, you
are probably falling foul of a limitation of the file name parsing code that the graphics
package uses. Suppose you want to include a graphics file home.bedroom.eps using
the dvips driver; the package will conclude that your file’s extension is .bedroom.eps,
and will complain. To get around this limitation, you have three alternatives:

• Rename the file — for example home.bedroom.eps→home-bedroom.eps


• Mask the first dot in the file name:
\newcommand*{\DOT}{.}
\includegraphics{home\DOT bedroom.eps}
• Tell the graphics package what the file is, by means of options to the \includegraphics
command:
\includegraphics[type=eps,ext=.eps,read=.eps]{home.bedroom}

377 “Missing $ inserted”


There are certain things that only work in maths mode. If your document is not in
maths mode and you have an _ or a ^ character, TeX (and by inheritance, LaTeX too)
will say
! Missing $ inserted
as if you couldn’t possibly have misunderstood the import of what you were typing,
and the only possible interpretation is that you had committed a typo in failing to enter
maths mode. TeX, therefore, tries to patch things up by inserting the $ you ‘forgot’,
so that the maths-only object will work; as often as not this will land you in further
confusion.
It’s not just the single-character maths sub- and superscript operators: anything
that’s built in or declared as a maths operation, from the simplest lower-case \alpha
through the inscrutable \mathchoice primitive, and beyond, will provoke the error if
misused in text mode.
LaTeX offers a command \ensuremath, which will put you in maths mode for the
execution of its argument, if necessary: so if you want an \alpha in your running text,
say \ensuremath{\alpha}; if the bit of running text somehow transmutes into a bit of
mathematics, the \ensuremath will become a no-op, so it’s pretty much always safe.
378 Warning: “Font shape . . . not available”
LaTeX’s font selection scheme maintains tables of the font families it has been told
about. These tables list the font families that LaTeX knows about, and the shapes and
series in which those font families are available. In addition, in some cases, the tables
list the sizes at which LaTeX is willing to load fonts from the family.
When you specify a font, using one of the LaTeX font selection commands, LaTeX
looks for the font (that is, a font that matches the encoding, family, shape, series and
size that you want) in its tables. If the font isn’t there at the size you want, you will see
a message like:
LaTeX Font Warning: Font shape ‘OT1/cmr/m/n’ in size <11.5> not available
(Font) size <12> substituted on input line ...
There will also be a warning like:
LaTeX Font Warning: Size substitutions with differences
(Font) up to 0.5pt have occurred.
after LaTeX has encountered \end{document}.
The message tells you that you’ve chosen a font size that is not in LaTeX’s list
of “allowed” sizes for this font; LaTeX has chosen the nearest font size it knows is
allowed. In fact, you can tell LaTeX to allow any size: the restrictions come from
the days when only bitmap fonts were available, and they have never applied to fonts
that come in scaleable form in the first place. Nowadays, most of the fonts that were
once bitmap-only are also available in scaleable (Adobe Type 1) form. If your installa-
tion uses scaleable versions of the Computer Modern or European Computer Modern
(EC) fonts, you can tell LaTeX to remove the restrictions; use the type1cm or type1ec
package as appropriate.
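For example, if your document uses the Computer Modern fonts and you want them at
arbitrary sizes, it should be enough to say (a sketch):
\usepackage{type1cm}% lift size restrictions on the CM fonts
% or, for documents using the EC (T1-encoded) fonts:
%\usepackage{type1ec}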
If the combination of font shape and series isn’t available, LaTeX will usually have
been told of a fall-back combination that may be used, and will select that:
LaTeX Font Warning: Font shape ‘OT1/cmr/bx/sc’ undefined
(Font) using ‘OT1/cmr/bx/n’ instead on input line 0.
Substitutions may also be “silent”; in this case, there is no more than an “infor-
mation” message in the log file. For example, if you specify an encoding for which
there is no version in the current font family, the ‘default family for the encoding’ is
selected. This happens, for example, if you use the command \textbullet, which is
normally taken from the maths symbols font, which is in OMS encoding. My test log
contained:
LaTeX Font Info: Font shape ‘OMS/cmr/m/n’ in size <10> not available
(Font) Font shape ‘OMS/cmsy/m/n’ tried instead on input line ...
In summary, these messages are not so much error messages, as information mes-
sages, that tell you what LaTeX has made of your text. You should check what the
messages say, but you will ordinarily not be surprised at their content.
type1cm.sty : macros/latex/contrib/type1cm
type1ec.sty : fonts/ps-type1/cm-super/type1ec.sty

379 Unable to read an entire line


TeX belongs to the generation of applications written for environments that didn’t offer
the sophisticated string and i/o manipulation we nowadays take for granted (TeX was
written in Pascal, and the original Pascal standard made no mention of i/o, so that
anything but the most trivial operations were likely to be unportable).
When you overwhelm TeX’s input mechanism, you get told:
! Unable to read an entire line---bufsize=3000.
Please ask a wizard to enlarge me.
(for some value of ‘3000’ — the quote was from a comp.text.tex posting by some-
one who was presumably using an old TeX).
As the message implies, there’s (what TeX thinks of as a) line in your input that’s
“too long” (to TeX’s way of thinking). Since modern distributions tend to have tens of
thousands of bytes of input buffer, it’s somewhat rare that these messages occur “for
real”. Probable culprits are:

• A file transferred from another system, without translating record endings. With
the decline of fixed-format records (on mainframe operating systems) and the in-
creased intelligence of TeX distributions at recognising other systems’ explicit
record-ending characters, this is nowadays rather a rare cause of the problem.
• A graphics input file, which a package is examining for its bounding box, contains
a binary preview section. Again, sufficiently clever TeX distributions recognise
this situation, and ignore the previews (which are only of interest, if at all, to a
TeX previewer).
The usual advice is to ignore what TeX says (i.e., anything about enlarging), and to
put the problem right in the source.
If the real problem is over-long text lines, most self-respecting text editors will be
pleased to automatically split long lines (while preserving the “word” structure) so that
no line is any longer than a given length; so the solution is just to edit the file.
If the problem is a ridiculous preview section, try using ghostscript to reprocess the
file, outputting a “plain .eps” file. (Ghostscript is distributed with a script ps2epsi
which will regenerate the preview if necessary.) Users of the shareware program
GSview will find buttons to perform the required transformation of the file being dis-
played.
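One way of doing that reprocessing is sketched below; the eps2eps script is assumed to
be present in your ghostscript installation, and the file names are of course invented:
eps2eps figure.eps figure-plain.eps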
ghostscript: Browse nonfree/support/ghostscript
GSview : Browse nonfree/support/ghostscript/ghostgum

380 “Fatal format file error; I’m stymied”


(La)TeX applications often fail with this error when you’ve been playing with the con-
figuration, or have just installed a new version.
The format file contains the macros that define the system you want to use: anything
from the simplest (Plain TeX) all the way to the most complicated, such as LaTeX or
ConTeXt. From the command you issue, TeX knows which format you want.
The error message
Fatal format file error; I’m stymied
means that TeX itself can’t understand the format you want. Obviously, this could
happen if the format file had got corrupted, but it usually doesn’t. The commonest
cause of the message is that a new binary has been installed in the system: no two TeX
binaries on the same machine can understand each other’s formats. So the new version
of TeX you have just installed won’t understand the format generated by the one you
installed last year.
Resolve the problem by regenerating the format; of course, this depends on which
system you are using.
• On a teTeX-based system, run
fmtutil --all
or
fmtutil --byfmt=<format name>
to build only the format that you are interested in.
• On a MiKTeX system, click Start→Programs→MiKTeX version→MiKTeX
Options, and in the options window, click Update now.

381 Non-PDF special ignored!


This is a PDFTeX error: PDFTeX is running in PDF output mode, and it has encoun-
tered a \special command (\special). PDFTeX is able to generate its own output,
and in this mode of operation has no need of \special commands (which allow the
user to pass information to the driver being used to generate output).
Why does this happen? LaTeX users, nowadays, hardly ever use \special com-
mands on their own — they employ packages to do the job for them. Some packages
will generate \special commands however they are invoked: pstricks is an example
(its very raison d’être is to emit PostScript code in a sequence of \special com-
mands). Pstricks may be dealt with by other means (the pdftricks package offers a
usable technique).
More amenable to correction, but more confusing, are packages (such as color,
graphics and hyperref ) that specify a “driver”. These packages have plug-in modules
that determine what \special (or other commands) are needed to generate any given
effect: the pdftex driver for such packages knows not to generate \special com-
mands. In most circumstances, you can let the system itself choose which driver you
need; in this case everything will act properly when you switch to using PDFLaTeX. If
you’ve been using dvips (and specifying the dvips driver) or dvipdfm (for which you
have to specify the driver), and decide to try PDFLaTeX, you must remove the dvips
or dvipdfm driver specification from the package options, and let the system recognise
which driver is needed.
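For example, rather than tying the document to one driver, simply omit the option and
let the package decide (a sketch; graphicx stands here for any of the driver-aware
packages):
%\usepackage[dvips]{graphicx}% ties the document to dvips output
\usepackage{graphicx}        % the package detects PDFTeX (or dvips) itself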
pdftricks.sty : macros/latex/contrib/pdftricks
pstricks.sty : graphics/pstricks

382 Mismatched mode ljfour and resolution 8000


You’re running dvips, and you encounter a stream of error messages, starting with
“Mismatched mode”. The mode is the default used in your installation — it’s set in
the dvips configuration file, and ljfour is commonest (since it’s the default in most
distributions), but not invariable.
The problem is that dvips has encountered a font for which it must generate a
bitmap (since it can’t find it in Type 1 format), and there is no proforma available
to provide instructions to give to MetaFont.
So what to do? The number 8000 comes from the ‘-Ppdf’ option to dvips, which
you might have found from the answer “wrong type of fonts” (“wrong type of fonts”).
The obvious solution is to switch to the trivial substitute ‘-Pwww’, which selects the
necessary type 1 fonts for PDF generation, but nothing else: however, this will leave
you with undesirable bitmap fonts in your PDF file. The “proper” solution is to find a
way of expressing what you want to do, using type 1 fonts.
383 “Too deeply nested”
This error appears when you start a LaTeX list.
LaTeX keeps track of the nesting of one list inside another. There is a set of list
formatting parameters built-in for application to each of the list nesting levels; the
parameters determine indentation, item separation, and so on. The list environment
(the basis for list environments like itemize and enumerate) “knows” there are only
6 of these sets.
There are also different label definitions for the enumerate and itemize environ-
ments at their own private levels of nesting. Consider this example:
\begin{enumerate}
\item first item of first enumerate
\begin{itemize}
\item first item of first itemize
\begin{enumerate}
\item first item of second enumerate
...
\end{enumerate}
...
\end{itemize}
...
\end{enumerate}

In the example,
• the first enumerate has labels as for a first-level enumerate, and is indented as
for a first-level list;
• the first itemize has labels as for a first level itemize, and is indented as for a
second-level list; and
• the second enumerate has labels as for a second-level enumerate, and is indented
as for a third-level list.
Now, as well as LaTeX knowing that there are 6 sets of parameters for indentation, it
also knows that there are only 4 types of labels each, for the environments enumerate
and itemize (this “knowledge” spells out a requirement for class writers, since the
class supplies the sets of parameters).
From the above, we can deduce that there are several ways we can run out of
space: we can have 6 lists (of any sort) nested, and try to start a new one; we can
have 4 enumerate environments somewhere among the set of nested lists, and try to
add another one; and we can have 4 itemize environments somewhere among the set
of nested lists, and try to add another one.
What can be done about the problem? Not much, short of rewriting LaTeX — you
really need to rewrite your document in a slightly less labyrinthine way.
384 Capacity exceeded — input levels
The error
! TeX capacity exceeded, sorry [text input levels=15].
is caused by nesting your input too deeply. You can provoke it with the trivial (Plain
TeX) file input.tex, which contains nothing but:
\input input
In the real world, you are unlikely to encounter the error with a modern TeX distribu-
tion. TeTeX (used to produce the error message above) allows 15 files open for TeX
input at any one time, which is improbably huge for a document generated by real
human beings.
However, for those improbable (or machine-generated) situations, some distribu-
tions offer the opportunity to adjust the parameter max_in_open in a configuration file.
385 PDFTeX destination . . . ignored
The warning:

! pdfTeX warning (ext4): destination with the same identifier
(name{page.1}) has been already used, duplicate ignored

arises because of duplicate page numbers in your document. The problem is usually
soluble: see PDF page destinations — which answer also describes the problem in
more detail.
If the identifier in the message is different, for example name{figure.1.1}, the
problem is (usually) due to a problem of package interaction. Some packages are
simply incompatible with hyperref , but most work simply by ignoring it. In most
cases, therefore, you should load your package before you load hyperref , and hyperref
will patch things up so that they work:

\usepackage{your package}
...
\usepackage[opts]{hyperref}

You should do this as a matter of course, unless the documentation of a package


says you must load it after hyperref . (There aren’t many such packages: one such
is memoir’s “hyperref fixup” package memhfixc.)
If loading your packages in the (seemingly) “correct” order doesn’t solve the prob-
lem, you need to seek further help.
386 Alignment tab changed to \cr
This is an error you may encounter in LaTeX when a tabular environment is being
processed. “Alignment tabs” are the & signs that separate the columns of a tabular; so
the error message

! Extra alignment tab has been changed to \cr

could arise from a simple typo, such as:

\begin{tabular}{ll}
hello & there & jim \\
goodbye & now
\end{tabular}

where the second & in the first line of the table is more than the two-column ll column
specification can cope with — adding an extra “l” to it solves the problem. (As a result of
the error, “jim” will be moved to a row of his own.)
Rather more difficult to spot is the occurrence of the error when you’re using align-
ment instructions in a “p” column:

\usepackage{array}
...
\begin{tabular}{l>{\raggedright}p{2in}}
here & we are again \\
happy & as can be
\end{tabular}

the problem here (as explained in tabular cell alignment) is that the \raggedright
command in the column specification has overwritten tabular’s definition of \\, so
that “happy” appears in a new line of the second column, and the following & appears
to LaTeX just like the second & in the first example above.
Get rid of the error in the way described in tabular cell alignment — either use
\tabularnewline explicitly, or use the \RBS trick described there.
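The explicit \tabularnewline version of the second example would then read:
\usepackage{array}
...
\begin{tabular}{l>{\raggedright}p{2in}}
here  & we are again \tabularnewline
happy & as can be
\end{tabular}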
array.sty : Distributed as part of macros/latex/required/tools

387 Graphics division by zero


While the error

! Package graphics Error: Division by 0.

can actually be caused by offering the package a figure which claims to have a zero
dimension, it’s more commonly caused by rotation.
Objects in TeX may have both height (the height above the baseline) and depth (the
distance the object goes below the baseline). If you rotate an object by 180 degrees,
you convert its height into depth, and vice versa; if the object started with zero depth,
you’ve converted it to a zero-height object.
Suppose you’re including your graphic with a command like:

\includegraphics[angle=180,height=5cm]{myfig.eps}

In the case that myfig.eps has no depth to start with, the scaling calculations will
produce the division-by-zero error.
Fortunately, the graphicx package has a keyword totalheight, which allows you
to specify the size of the image relative to the sum of the object’s height and depth,
so

\includegraphics[angle=180,totalheight=5cm]{myfig.eps}

will resolve the error, and will behave as you might hope.
If you’re using the simpler graphics package, use the * form of the \resizebox
command to specify the use of totalheight:

\resizebox*{!}{5cm}{%
\rotatebox{180}{%
\includegraphics{myfig.eps}%
}%
}

graphics.sty,graphicx.sty : Both parts of the macros/latex/required/


graphics bundle

388 Missing \begin{document}


Give it a file of plain text, or a LaTeX file that really does have no \begin{document}
command in it, and LaTeX will produce this error, quite correctly. LaTeX needs
\begin{document} so as to know when to execute the commands that finish off the
document preamble.
Other than that, the error can occur as a result of an error of yours, of a corrupt
.aux file, or of a buggy class or package.
The errors you might commit are absent-mindedly typing a document command
(such as \section) in the preamble of your document, missing the comment marker
from the beginning of a line, or giving too many arguments to one of the setup com-
mands related to the class or a package that you have loaded.
A corrupt .aux file might be due to any number of things; delete the file and run
LaTeX again, twice. If the error recurs, it’s probably due to a buggy class or package.

If the error occurs while LaTeX is reading a class or package, then there’s probably
a bug in the file. The author of the class or package stands the best chance of finding the
bug, but with luck you (or someone you ask on a mailing list or on comp.text.tex)
will be able to spot the problem and provide a work-around. Always report such bugs,
even if you have found a work-around.
389 \normalsize not defined
The LaTeX error:

The font size command \normalsize is not defined:
there is probably something wrong with the class file.

is reporting something pretty fundamental (document base font size not set up). While
this can, as the message implies, be due to a broken class file, the more common
cause is that you have simply forgotten to put a \documentclass command in your
document.
390 Too many math alphabets
TeX mathematics is one of its most impressive features, yet the internal structure of the
mechanism that produces it is painfully complicated and (in some senses) pathetically
limited. One area of limitation is that one is only allowed 16 “maths alphabets”.
LaTeX offers the user quite a lot of flexibility with allocating maths alphabets, but
few people use the flexibility directly. Nevertheless, there are many packages that pro-
vide symbols, or that manipulate them, which allocate themselves one or more maths
alphabet.
If you can’t afford to drop any of these packages, there’s still hope if you’re using
the bm package to support bold maths: bm is capable of gobbling alphabets as if there is
no tomorrow. The package defines two limiter commands: \bmmax (for bold symbols;
default 4) and \hmmax (for heavy symbols, if you have them; default 3), which control
the number of alphabets to be used.
Any reduction of the \xxmax variables will slow bm down — but that’s surely
better than the document not running at all. So unless you’re using maths fonts (such as
Mathtime Plus) that feature a heavy symbol weight, suppress all use of heavy families
by

\renewcommand{\hmmax}{0}

and then steadily reduce the bold families, starting with

\renewcommand{\bmmax}{3}

until (with a bit of luck) the error goes away.


bm.sty : Distributed as part of macros/latex/required/tools

391 Not in outer par mode


The error:

! LaTeX Error: Not in outer par mode.

comes when some “main” document feature is shut up somewhere it doesn’t like.
The commonest occurrence is when the user wants a figure somewhere inside a
table:

\begin{tabular}{|l|}
\hline
\begin{figure}
\includegraphics{foo}
\end{figure}
\hline
\end{tabular}

a construction that was supposed to put a frame around the diagram, but doesn’t work,
any more than:

\framebox{\begin{figure}
\includegraphics{foo}
\end{figure}%
}

The problem is that the tabular environment and the \framebox command restrain
the figure environment from its natural métier, which is to float around the document.
The solution is simply not to use the figure environment here:

\begin{tabular}{|l|}
\hline
\includegraphics{foo}\\
\hline
\end{tabular}

What was the float for? — as written in the first two examples, it serves no useful
purpose; but perhaps you actually wanted a diagram and its caption framed, in a float.
It’s simple to achieve this — just reverse the order of the environments (or of the
figure environment and the command):

\begin{figure}
\begin{tabular}{|l|}
\hline
\includegraphics{foo}\\
\caption{A foo}\\
\hline
\end{tabular}
\end{figure}

The same goes for table environments (or any other sort of float you’ve defined for
yourself) inside tabulars or box commands; you must get the float environment out
from inside, one way or another.
392 Perhaps a missing \item?
Sometimes, the error

Something’s wrong--perhaps a missing \item

actually means what it says:

\begin{itemize}
boo!
\end{itemize}

produces the error, and is plainly in need of an \item command.


However, the error regularly appears when you would never have thought that a
\item command might be appropriate. For example, the seemingly innocent:

\fbox{%
\begin{alltt}
boo!
\end{alltt}%
}

produces the error (the same happens with \mbox in place of \fbox, or with either of
their “big brothers”, \framebox and \makebox). This is because the alltt environ-
ment uses a “trivial” list, hidden inside its definition. (The itemize environment
also has this construct inside itself, in fact, so \begin{itemize} won’t work inside an
\fbox, either.) The list construct wants to happen between paragraphs, so it makes a
new paragraph of its own. Inside the \fbox command, that doesn’t work, and subse-
quent macros convince themselves that there’s a missing \item command.
To solve this rather cryptic error, one must put the alltt inside a paragraph-style
box. The following modification of the above does work:

\fbox{%
\begin{minipage}{0.75\textwidth}
\begin{alltt}
hi, there!
\end{alltt}
\end{minipage}
}

The code above produces a box that’s far too wide for the text. One may want to use
something that allows variable size boxes in place of the minipage environment.
Oddly, although the verbatim environment wouldn’t work inside a \fbox com-
mand argument (see verbatim in command arguments), you get an error that complains
about \item: the environment’s internal list bites you before verbatim has even had a
chance to create its own sort of chaos.
Another (seemingly) obvious use of \fbox also falls foul of this error:
\fbox{\section{Boxy section}}

This is a case where you’ve simply got to be more subtle; you should either write your
own macros to replace the insides of LaTeX’s sectioning macros, or look for some
alternative in the packages discussed in “The style of section headings”.
393 Illegal parameter number in definition
The error message means what it says. In the simple case, you’ve attempted a definition
like:
\newcommand{\abc}{joy, oh #1!}

or (using TeX primitive definitions):


\def\abc{joy, oh #1!}

In either of the above, the definition uses an argument, but the programmer did not tell
(La)TeX, in advance, that she was going to. The fix is simple — \newcommand{\abc}
[1] in the LaTeX case, or \def\abc#1 in the basic TeX case.
The more complicated case is exemplified by the attempted definition:
\newcommand{\abc}{joy, oh joy!%
\newcommand{\ghi}[1]{gloom, oh #1!}%
}

will also produce this error, as will its TeX primitive equivalent:
\def\abc{joy, oh joy!%
\def\ghi#1{gloom, oh #1!}%
}

This is because special care is needed when defining one macro within the code of
another macro. This is explained elsewhere, separately for LaTeX definitions and for
TeX primitive definitions.
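In outline (the full story is in the answers just referred to), the inner definition’s
parameters have their # characters doubled; a sketch of the corrected versions:
\newcommand{\abc}{joy, oh joy!%
  \newcommand{\ghi}[1]{gloom, oh ##1!}%
}
% or, with TeX primitives:
\def\abc{joy, oh joy!%
  \def\ghi##1{gloom, oh ##1!}%
}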
394 Float(s) lost
The error
! LaTeX Error: Float(s) lost.

seldom occurs, but always seems deeply cryptic when it does appear.
The message means what it says: one or more figures, tables, etc., or marginpars
have not been typeset. (Marginpars are treated internally as floats, which is how they
come to be lumped into this error message.)
The most likely reason is that you placed a float or a \marginpar command in-
side another float or marginpar, or inside a minipage environment, a \parbox or
\footnote. Note that the error may be detected a long way from the problematic
command(s), so the techniques of tracking down elusive errors all need to be called
into play.
This author has also encountered the error when developing macros that used the
LaTeX internal float mechanisms. Most people doing that sort of thing are expected to
be able to work out their own problems. . .
395 Option clash for package
So you’ve innocently added:
\usepackage[draft]{graphics}

to your document, and LaTeX responds with


! LaTeX Error: Option clash for package graphics.

The error is a complaint about loading a package with options, more than once
(LaTeX doesn’t actually examine what options there are: it complains because it can’t
do anything with the multiple sets of options). You can load a package any number
of times, with no options, and LaTeX will be happy, but you may only supply options
when you first load the package.
So perhaps you weren’t entirely innocent — the error would have occurred on the
second line of:
\usepackage[dvips]{graphics}
\usepackage[draft]{graphics}

which could quite reasonably (and indeed correctly) have been typed:
\usepackage[dvips,draft]{graphics}

But if you’ve not made that mistake (even with several lines separating the
\usepackage commands, it’s pretty easy to spot), the problem could arise from
something else loading the package for you. How do you find the culprit? Well, it’s
down to the log analysis games discussed in “How to approach errors”; the trick to
remember is that the process of loading each file is parenthesised in the log; so if
package foo loads graphics, the log will contain something like:
(<path>/foo.sty ...
...
(<path>/graphics.sty ...
...)
...
)

(the parentheses for graphics are completely enclosed in those for foo; the same is of
course true if bar is the culprit, except that the line will start with the path to bar.cls).
If we’re dealing with a package that loads the package you are interested in, you
need to ask LaTeX to slip in options when foo loads it. Instead of:
\usepackage{foo}
\usepackage[draft]{graphics}

you would write:


\PassOptionsToPackage{draft}{graphics}
\usepackage{foo}

The command \PassOptionsToPackage tells LaTeX to behave as if its options


were passed, when it finally loads a package. As you would expect from its name,
\PassOptionsToPackage can deal with a list of options, just as you would have in
the options brackets of \usepackage.
More trickily, instead of:
\documentclass[...]{bar}
\usepackage[draft]{graphics}

you would write:


\PassOptionsToPackage{draft}{graphics}
\documentclass[...]{bar}

with \PassOptionsToPackage before the \documentclass command.


However, if the foo package or the bar class loads graphics with an option of its
own that clashes with what you need in some way, you’re stymied. For example:
\PassOptionsToPackage{draft}{graphics}

where the package or class does:

\usepackage[final]{graphics}

sets final after it’s dealt with the option you passed to it, so your draft will get forgotten.
In extreme cases, the package might generate an error here (graphics doesn’t go in for
that kind of thing, and there’s no indication that draft has been forgotten).
In such a case, you have to modify the package or class itself (subject to the terms of
its licence). It may prove useful to contact the author: she may have a useful alternative
to suggest.

W Current TeX-related projects


396 The LaTeX3 project
The LaTeX3 project team (see http://www.latex-project.org/latex3.html) is
a small group of volunteers whose aim is to produce a major new document processing
system based on the principles pioneered by Leslie Lamport in the current LaTeX. It
will remain freely available and it will be fully documented at all levels.
The LaTeX3 team’s first product (LaTeX 2ε ) was delivered in 1994 (it’s now prop-
erly called “LaTeX”, since no other version is current).
LaTeX 2ε was intended as a consolidation exercise, unifying several sub-variants
of LaTeX while changing nothing whose change wasn’t absolutely necessary. This has
permitted the team to support a single version of LaTeX, in parallel with development
of LaTeX3.
Some of the older discussion papers about directions for LaTeX3 are to be found
on CTAN; other (published) articles are to be found on the project web site
(http://www.latex-project.org/papers/), as is some of the project’s experimental code
(see http://www.latex-project.org/code.html, which allows you to read the
project’s source repository). You can participate in discussions of the future of La-
TeX through the mailing list latex-l. Subscribe to the list by sending a message
‘subscribe latex-l <your name>’ to [email protected].
LaTeX project publications: info/ltx3pub
397 Future WWW technologies and (La)TeX
An earlier answer (“converting to HTML”) addresses the issue of converting existing
(La)TeX documents for viewing on the Web as HTML. All the present techniques are
somewhat flawed: the answer explains why.
However, things are changing, with better font availability, cunning HTML pro-
gramming and the support for new Web standards.

Font technologies Direct representation of mathematics in browsers has been ham-


pered up to now by the limited range of symbols in the fonts one can rely on being
available. Some existing (La)TeX to HTML converters provide maths symbols
by hitching them to alternate font face specifications for standard code points in a
non-standard way. This does nothing for the universality of the HTML so gener-
ated.
In the near future, we can expect rather wide availability of Unicode fonts with
good coverage of symbols, which should make life easier for those who need to
represent mathematics.
XML The core of the range of new standards is XML, which provides a framework
for better structured markup; limited support for it has already appeared in some
browsers.
Conversion of (La)TeX source to XML is already available (through TeX4ht at
least), and work continues in that arena. The alternative, authoring in XML (thus
producing documents that are immediately Web-friendly, if not ready) and using
(La)TeX to typeset is also well advanced. One useful technique is transforming the
XML to LaTeX, using an XSLT stylesheet or code for an XML library, and then
simply using LaTeX; alternatively, one may typeset direct from the XML source.

Direct representation of mathematics MathML is a standard for representing maths on
the Web; its original version was distinctly limited, but version 2 of MathML has
had major browser support since 2002 with richness of mathematical content for
online purposes approaching that of TeX for print. Browser support for MathML
is provided by amaya, the ‘Open Source’ browser mozilla (and its derivatives in-
cluding NetScape, Firefox and Galeon) and Internet Explorer when equipped with
a suitable plugin such as MathPlayer. There’s evidence that (La)TeX users are
starting to use such browsers. Some believe that XHTML+MathML now provides
better online viewing than PDF. Work to produce XHTML+MathML is well ad-
vanced in both the TeX4ht and TtH projects for (La)TeX conversion.
An approach different from (La)TeX conversion is taken by the GELLMU Project.
Its article XML document type, which has a markup vocabulary close to La-
TeX that can be edited using LaTeX-like markup (even though it is not La-
TeX — so far), comes with translators that make both PDF (via pdflatex) and
XHTML+MathML. Such an approach avoids the inherent limitations of the “tra-
ditional” (La)TeX translation processes, which have traps that can be sprung by
unfettered use of (La)TeX markup.
Graphics SVG is a standard for graphics representation on the web. While the natural
use is for converting existing figures, representations of formulas are also possible,
in place of the separate bitmaps that have been used in the past (and while we wait
for the wider deployment of MathML).
Browser plug-ins, that deal with SVG are already available (Adobe offer one, for
example). More recently, the open source graphics editor Inkscape has appeared,
and has been reported to be useful for SVG-related work in at least one TeX-related
project. Be aware that the developers of Inkscape have no illusions about being
able to replace commercial software, yet. . .
Direct use of TeX markup Some time back, IBM developed a browser plug-in called
TechExplorer, which would display (La)TeX documents direct in a browser. Over
the years, it developed into a MathML browser plug-in, while still retaining its
(La)TeX abilities, but it’s now distributed (free for Linux and Windows platforms)
by Integre Technical Publishing.
The disadvantage of the TechExplorer approach is that it places the onus on the
browser user; and however technically proficient you are, it’s never safe to assume
too much of your readers. An interesting alternative is MimeTeX, which sits on
your server as a CGI script, and you use it to include your TeX, in your HTML, as
if it were an image:
<img src="../cgi-bin/mimetex.cgi?f(x)=\int\limits_{-\infty}^xe^{-t^2}dt">

GELLMU : support/gellmu
MimeTeX : support/mimetex
TeX4HT : support/TeX4ht/tex4ht-all.zip

398 Making outline fonts from MetaFont


TeXtrace, originally developed by Péter Szabó, is a bundle of Unix scripts that use
Martin Weber’s freeware boundary tracing package autotrace to generate Type 1 out-
line fonts from MetaFont bitmap font outputs. The result is unlikely ever to be of the
quality of the commercially-produced Type 1 font, but there’s always the FontForge
font editor to tidy things. Whatever, there remain fonts which many people find useful
and which fail to attract the paid experts, and auto-tracing is providing a useful service
here. Notable sets of fonts generated using TeXtrace are Péter Szabó’s own EC/TC font
set tt2001 and Vladimir Volovich’s CM-Super set, which covers the EC, TC, and the
Cyrillic LH font sets (for details of both of which sets, see “8-bit” type 1 fonts).
Another system, which arrived slightly later, is mftrace: this is a small Python
program that does the same job. Mftrace may use either autotrace (like TeXtrace) or
Peter Selinger’s potrace to produce the initial outlines to process. Mftrace is said to
be more flexible, and easier to use, than is TeXtrace, but both systems are increasingly
being used to provide Type 1 fonts to the public domain.
The MetaType1 system aims to use MetaFont font sources, by way of MetaPost and
a bunch of scripts and so on, to produce high-quality Type 1 fonts. The first results,
the Latin Modern fonts, are now well-established, and a bunch of existing designs have
been reworked in MetaType1 format.
MetaType1: fonts/utilities/metatype1

399 The TeX document preparation environment


“Why TeX is not WYSIWYG” outlines the reasons (or excuses) for the huge disparity of
user interface between “typical” TeX environments and commercial word processors.
Nowadays, at last, there is a range of tools available that try either to bridge or to
close the gap. One range modestly focuses on providing the user with a legible source
document. At the other extreme we have TeXmacs, a document processor using TeX’s
algorithms and fonts for both editor display and printing. TeXmacs does not use the TeX
language itself (though among other formats, LaTeX may be exported and imported).
A bit closer to LaTeX is LyX, which has its own editor display and file formats as well,
but does its print output by exporting to LaTeX. The editor display merely resembles
the printed output, but you have the possibility of entering arbitrary LaTeX code. If
you use constructs that LyX does not understand, it will just display them as source
text marked red, but will properly export them.
Since a lot of work is needed to create an editor from scratch that actually is good at
editing (as well as catering for TeX), it is perhaps no accident that several approaches
have been implemented using the extensible emacs editor. The low end of the pretti-
fying range is occupied by syntax highlighting: marking TeX tokens, comments and
other stuff with special colours. Many free editors (including emacs) can cater for TeX
in this way. Under Windows, one of the more popular editors with such support is
the Shareware product winedt. Continuing the range of tools prettifying your input,
we have the emacs package x-symbol, which does the WYSIWYG part of its work by
replacing single TeX tokens and accented letter sequences with appropriate-looking
characters on the screen.
A different type of tool focuses on making update and access to previews of the
typeset document more immediate. A recent addition to several viewers, editors and
TeX executables is support for so-called ‘source specials’ for cross-navigation. When TeX com-
piles a document, it will upon request insert special markers for every input line into
the typeset output. The markers are interpreted by the DVI previewer which can be
made to let its display track the page corresponding to the editor input position, or to
let the editor jump to a source line corresponding to a click in the preview window.
An emacs package that combines this sort of editor movement tracking with auto-
matic fast recompilations (through the use of dumped formats) is WhizzyTeX which is
best used with a previewer by the same author. A simpler package in a similar spirit is
called InstantPreview and makes use of a continuously running copy of TeX (under the
moniker of TeX daemon) instead of dumping formats to achieve its speed.
Another emacs package called preview-latex tries to solve the problem of visual
correlation between source and previews in a more direct way: it uses a LaTeX package
to chop the document source into interesting fragments (like figures, text or display
math) which it runs through LaTeX and replaces the source text of those fragments
with the corresponding rendered output images. Since it does not know about the
structure of the images, at the actual cursor position the source text is displayed while
editing rather than the preview. This approach is more or less a hybrid of the source
prettifying and fast preview approaches since it works in the source buffer but uses
actual previews rendered by LaTeX.
A more ambitious contender is called TeXlite. This system is only available on
request from its author; it continuously updates its screen with the help of a special
version of TeX dumping its state in a compressed format at each page and using hooks
into TeX’s line breaking mechanism for reformatting paragraphs on the fly. That way,
it can render the output from the edited TeX code with interactive speed on-screen, and
it offers the possibility of editing directly in the preview window.
That many of these systems occupy slightly different niches can be seen by compar-
ing the emacs-based solutions, which range from syntax highlighting to instant
previewing: all of them can be activated at the same time without actually interfering
in their respective tasks.
The different approaches offer various choices differing in the immediacy of their
response, the screen area they work on (source or separate window), degree of corre-
spondence of the display to the final output, and the balance they strike between visual
aid and visual distraction.
preview-latex : Browse support/preview-latex

texmacs: Browse systems/unix/TeXmacs

400 The ANT typesetting system


Achim Blumensath’s ANT project, in contrast to NTS, aims not to replicate TeX with a
different implementation technique, but rather to provide a replacement for TeX which
uses TeX-like typesetting algorithms in a very different programming environment.
ANT remains under development, but it is now approaching the status of a usable
typesetting system.
ANT’s markup language is immediately recognisable to the (La)TeX user, but the
scheme of implementing design in ANT’s own implementation language (presently
OCaml) comes as a pleasant surprise to the jaded FAQ writer. This architecture holds
the promise of a system that avoids a set of serious problems with TeX’s user interface:
those that derive from the design language being the same as the markup language.
ANT : systems/ant
401 The ExTeX project
The ExTeX project is building on the experience of the many existing TeX develop-
ment and extension projects, to develop a new TeX-like system. The system is to be
developed in Java (like the ill-fated NTS project).
While ExTeX will implement all of TeX’s primitives, some may be marked as obso-
lete, and “modern” alternatives provided (an obvious example is the primitive \input
command, whose syntax inevitably makes life difficult for users of modern operating
system file paths). Desirable extensions from e-TeX, PDFTeX and Ω have been identi-
fied.
Usability will be another focus of the work: debugging support and log filtering
mechanisms will please those who have long struggled with TeX macros.
ExTeX will accept Unicode input, from the start. In the longer term, drawing prim-
itives are to be considered.
402 Omega and Aleph
Omega (Ω) was developed as an extension of TeX, to use with multilingual texts, ex-
pressed in a variety of input encodings. Omega uses 16-bit, Unicode-encoded, charac-
ters. It provides many innovative concepts, notably including the “translation process”
that takes a character stream and transforms it according to various processes that may
be internally specified, or be a separate program.
While Omega showed a lot of promise at its mid-1990s announcement, its devel-
opment was slow, and development was essentially dead by the time that one of the
original developers withdrew (taking with him a bunch of research students).
Before that distressing event, a separate thread of development was started, to pro-
duce a program called Aleph (ℵ), which merged the facilities of e-TeX into a stable
Omega codebase and added other extensions. Aleph also proved an attractive platform
for many people; but its development, too, has dried up.
The latest news (from EuroTeX 2006) is that development of Omega is picking up
again, in parallel with research into what the (new) authors consider a rational scheme
for supporting TeX-style typesetting. The new system is to be known as Omega-2 (Ω₂),
and is being designed in a modular fashion so that support of new facilities (such as
use of advanced OpenType fonts) can be added in a relatively straightforward fashion.
The work done in the Aleph project is also being carried forward in the LUATeX
project.
403 PDFTeX becomes LUATeX
As is said elsewhere in these FAQs, development of PDFTeX is “in essence” com-
plete — development of new facilities continues, but the limitations of the present
structure impose a strong limit on what facilities are possible.
Thus arose the idea of LUATeX. LUA is a script language, chosen because its inter-
preter has a very small “footprint”, so it is rather easy to build it into other applications.
So LUATeX was launched as a PDFTeX executable with a LUA interpreter built into
it.
The LUATeX project is now proceeding (with monetary support from various
sources) and is pursuing avenues that many of the other current projects have in their
sights, notably Unicode character representations and support for OpenType fonts.
Work is also in hand to integrate the extensions pioneered by Aleph.
404 The XeTeX project
XeTeX (by Jonathan Kew) is a successor to the shareware TeX/GX program. It is a
Unicode-based (UTF-8) TeX implementation which is able to make use of Mac OS X
AAT (Apple Advanced Typography) .dfonts and OpenType fonts. It uses Apple’s
Quartz system (which facilitates the afore-mentioned font access) to generate PDF out-
put.
A Linux version has been announced (2006); that version of course can’t use AAT
fonts, and relies on OpenType fonts alone.
The project has a web site for the user who wants more than this simple answer,
and you can also sign up to a mailing list.

X You’re still stuck?


405 You don’t understand the answer
While the FAQ maintainers don’t offer a ‘help’ service, they’re very keen that you un-
derstand the answers they’ve already written. They’re (almost) written “in a vacuum”,
to provide something to cover a set of questions that have arisen; it’s always possible
that they’re written in a way that a novice won’t understand.
Which is where you can help the community. Mail the maintainers to report the
answer that you find unclear, and (if you can) suggest what we need to clarify. Time
permitting (the team is small and all its members are busy), we’ll try and clarify the
answer. This way, with a bit of luck, we can together improve the value of this resource
to the whole community.
Note that the FAQ development email address is not for answering questions: it’s
for you to suggest which questions we should work on, or new questions we should
answer in future editions.
Those who simply ask questions at that address will be referred to [email protected]
or to comp.text.tex.

406 Submitting new material for the FAQ


The FAQ will never be complete, and we always expect that there will be people out
there who know better than we do about something or other. We always need to be
put right about whatever we’ve got wrong, and suggestions for improvements, partic-
ularly covering areas we’ve missed, are always needed: mail anything you have to the
maintainers.
If you have actual material to submit, your contribution is more than ever welcome.
Submission in plain text is entirely acceptable, but if you’re really willing, you may feel
free to mark up your submission in the form needed for the FAQ itself. The markup is a
strongly-constrained version of LaTeX — the constraints come from the need to trans-
late the marked-up text to HTML on the fly (and hence pretty efficiently). There is a
file markup-syntax in the FAQ distribution that describes the structure of the markup,
but there’s no real substitute for reading at least some of the source (faqbody.tex)
of the FAQ itself. If you understand Perl, you may also care to look at the translation
code in texfaq2file and sanitize.pl in the distribution: this isn’t the code actually
used on the Web site, but it’s a close relation and is kept up to date for development
purposes.
FAQ distribution: help/uk-tex-faq
407 Reporting a LaTeX bug
The LaTeX team supports LaTeX, and will deal with bona fide bug reports. Note that
the LaTeX team does not deal with contributed packages — just the software that is
part of the LaTeX distribution itself: LaTeX and the “required” packages. Furthermore,
you need to be slightly careful to produce a bug report that is usable by the team. The
steps are:
1. Are you still using current LaTeX? Maintenance is only available for sufficiently
up-to-date versions of LaTeX — if your LaTeX is more than two versions out of date,
the bug reporting mechanisms may reject your report.
2. Has your bug already been reported? Browse the LaTeX bugs database, to find any
earlier instance of your bug. In many cases, the database will list a work-around.

3. Prepare a “minimum” file that exhibits the problem. Ideally, such a file should
contain no contributed packages — the LaTeX team as a whole takes no responsibility
for such packages (if they’re supported at all, they’re supported by their authors). The
“minimum” file should be self-sufficient: if a member of the team should run it in
a clean directory, on a system with no contributed packages, it should replicate your
problem.
4. Run your file through LaTeX: the bug system needs the .log file that this process
creates.
You now have two possible ways to proceed: either create a mail report to send to
the bug processing mechanism (5, below), or submit your bug report via the web (7,
below).
5. Process the bug-report creation file, using LaTeX itself:
latex latexbug
latexbug asks you some questions, and then lets you describe the bug you’ve found.
It produces an output file latexbug.msg, which includes the details you’ve supplied,
your “minimum” example file, and the log file you got after running the example. (I
always need to edit the result before submitting it: typing text into latexbug isn’t
much fun.)
6. Mail the resulting file to [email protected]; the subject line of
your email should be the same as the bug title you gave to latexbug. The file
latexbug.msg should be included into your message in-line: attachments are likely to
be rejected by the bug processor.
7. Connect to the latex bugs processing web page and enter details of your bug —
category, summary and full description, and the two important files (source and log
file); note that members of the LaTeX team need your name and email address, as they
may need to discuss the bug with you, or to advise you of a work-around.
408 What to do if you find a bug
For a start, make entirely sure you have found a bug. Double-check with books about
TeX, LaTeX, or whatever you’re using; compare what you’re seeing against the other
answers above; ask every possible person you know who has any TeX-related expertise.
The reasons for all this caution are various.
If you’ve found a bug in TeX itself, you’re a rare animal indeed. Don Knuth is so
sure of the quality of his code that he offers real money prizes to finders of bugs; the
cheques he writes are such rare items that they are seldom cashed. If you think you
have found a genuine fault in TeX itself (or MetaFont, or the CM fonts, or the TeX-
book), don’t immediately write to Knuth, however. He only looks at bugs infrequently,
and even then only after they are agreed as bugs by a small vetting team. In the first
instance, contact Barbara Beeton at the AMS ([email protected]), or contact TUG.
If you’ve found a bug in LaTeX 2ε , report it using mechanisms supplied by the
LaTeX team. (Please be careful to ensure you’ve got a LaTeX bug, or a bug in one of
the “required” packages distributed by the LaTeX team.)
If you’ve found a bug in a contributed LaTeX package, you could try contacting
the author (if you can find a contact address). However, it’s generally best to treat
any package as unsupported, in the first instance, and only try the author after mailing
list/news group support has failed you.
If you’ve found a bug in LaTeX 2.09, or some other such unsupported software,
you may find help or de facto support on a newsgroup such as comp.text.tex or on
a mailing list such as [email protected]; be careful to include a code example of the
failure, if relevant.
Failing all else, you may need to pay for help — TUG maintains a register of TeX
consultants.
