Author Topic: Comments  (Read 50509 times)


Online tggzzz

  • Super Contributor
  • ***
  • Posts: 20768
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Comments
« Reply #150 on: December 24, 2023, 09:27:19 am »
Unfortunately quite a few of those don't render on my tablet :(

For the others, would it be possible to link to the specific plane, not just https://en.wikipedia.org/wiki/Plane_(Unicode)
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6967
  • Country: fi
    • My home page and email address
Re: Comments
« Reply #151 on: December 24, 2023, 11:00:38 am »
Unfortunately quite a few of those don't render on my tablet :(
That is determined by the font you use: not all fonts have a good glyph coverage.  CJK may be poorly supported, for example.

The default font for normal text for this forum is Verdana, which has medium coverage; the default font for the teletype text ([tt]...[/tt])  is DejaVu Sans Mono, or if that is not installed, Monaco, and so on.  I recommend you consider installing the full font (from DejaVu Fonts github; it is a free font), as the WebFont versions tend to be limited to a subset of glyphs, but the DejaVu Sans Mono font itself has pretty good coverage (coverage in PDF form).

It is not just for this site either, because it is a common font with a nice coverage; if you install the other DejaVu Sans variants, you can set it as your default Sans Serif font in your browser, so pages like Wikipedia will have a nice glyph coverage.  And it should look nice, too.

For the others, would it be possible to link to the specific plane, not just https://en.wikipedia.org/wiki/Plane_(Unicode)
I added links to the correct blocks I mentioned, and consolidated the two links to that page into a single link ("list").
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 20768
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Comments
« Reply #152 on: December 24, 2023, 11:41:25 am »
Installing fonts is a never-ending rathole. Think IBM Selectric golf balls ;)

This runs straight into the dilemma of whether something should be displayed in a manner determined by the viewing device or determined by the author.

The former is good where the author wants to get content and information into the reader's forebrain. The latter is good where the author wants to get pixels and emotion into the reader's hindbrain.

Too much of the web is the latter.
 

Offline Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6967
  • Country: fi
    • My home page and email address
Re: Comments
« Reply #153 on: December 24, 2023, 12:45:54 pm »
installing fonts is a never-ending rathole.
Not really.  Linux users will want to install MS Core Fonts or fonts-liberation2; and fonts-dejavu-core, fonts-dejavu-extra, fonts-freefont-ttf, fonts-noto-mono, fonts-noto-ui-core, fonts-noto-cjk, fonts-noto-color-emoji, fonts-opensymbol, fonts-mathjax, plus whatever the distribution installs.

fonts-noto-* refers to the No Tofu font, at https://fonts.google.com/noto, which has extremely wide Unicode coverage.
DejaVu fonts are free fonts similar to Bitstream Vera (sans) and Bitstream Charter (serif), with very wide Unicode coverage.
GNU FreeFont (FreeMono, FreeSans, FreeSerif) are free fonts with as wide Unicode coverage as possible.
OpenSymbol fonts are used in many text documents' ornaments (in default templates for greeting cards and such), and the fonts-mathjax font package ensures one can render MathJax correctly locally (with just the JavaScript files) without an internet connection, as normally MathJax uses WebFonts.

So, I'm only recommending one installs NoTo, DejaVu, and FreeFont.  Of these, only DejaVu is widely used, so that alone suffices; the others are optional and for variety.  As I already said, DejaVu Sans Mono is already used for teletype text on this forum (and many other SMF forums); and if you set your browser's default monospace, serif, and sans-serif fonts to NoTo/DejaVu/FreeFont (whichever pleases you most), you won't see the annoying boxes instead of the correct glyphs anymore!

(For text such as this one, the preferred font order is Verdana, Arial, Helvetica, sans-serif.  Note that Verdana is metrically compatible with DejaVu Sans, having the same size glyphs.)



The way CSS specifies a font includes three generic names: "serif", "sans-serif", and "monospace".  These are generic font names, for the express purpose of using whatever font the user desires for serif/sans-serif and monospace text.  Your browser will let you select these.  They often default to the corresponding operating system fonts set by your UI theme.  However, many OS fonts have poor Unicode coverage, which leads to *some* glyphs missing (or replaced with boxes).  To avoid this in your browser, it is quite important to set those three default fonts to fonts you like, with the correct properties and wide Unicode coverage.

A lot of web pages do not specify a specific font, only a generic one.  (Even when they do specify one, they usually specify a list of preferred fonts in order of preference, terminated with one of the three generic names.)  A good example of this is Wikipedia: both the current and the previous default skins use "sans-serif" for the body text.  Thus, whether you see all glyphs on Wikipedia pages depends on what you have set those three browser defaults to, or what they default to if you haven't touched them at all.

If you open plain text files in your browser, either locally or via HTTP or HTTPS with mime type text/plain, your browser will use the "monospace" default font you have set.

I do not recommend adding many fonts to one's system: just DejaVu, and optionally NoTo and/or FreeFont, depending on what you yourself prefer.  DejaVu is commonly used on the web, and covers Unicode quite well, so it is a good choice for a default font in your browser.  If you don't like it that much, pick NoTo or FreeFont instead for your browser default fonts.  (I often use DejaVu Sans and DejaVu Serif for text, but a different font for monospace, because DejaVu Sans Mono is too large compared to the other two, and just using a smaller point size leads to Unicode Block Elements no longer connecting when mixed with DejaVu Sans or Serif text on the same line.)
« Last Edit: December 24, 2023, 12:51:33 pm by Nominal Animal »
 

Offline Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6967
  • Country: fi
    • My home page and email address
Re: Comments
« Reply #154 on: December 24, 2023, 02:04:22 pm »
I do use, and want to use, UTF-8 (and selected Unicode blocks) in comments, because they should convey the information with minimum cognitive load, and I personally do need UTF-8 to do that.

For the same reason as browsers allow users to set a default font for serif/sans-serif/monospace, I do believe it would be useful to have a "standardized" comment form that will be rendered in a separately-selectable monospace font with wide Unicode coverage.  I fully understand tggzzz's font worries above, but often the most comfortable font for code in your code editor does not have good Unicode coverage at all.  This is exactly where my suggestion about a standard Unicode comment form would be particularly useful: it would allow programmers to use their preferred Basic Latin or ASCII font for code and ordinary comments, but a wider-coverage font for Unicode comments.

Technically, U+002F U+002F U+2003 or "// ", the third character being em-space, would be near-perfect, except it is very hard to type.

This is why I suggested "//|" for C and C++ for exactly this, as it is easy to type, and is somewhat aligned with existing practices, and should be easy to add to existing editors.
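As an illustration only (the "//|" marker is a proposal here, not an existing convention), a small Python sketch of how an editor or formatter might classify comment lines so that "//|" comments get the wide-coverage Unicode font while plain "//" comments keep the programmer's preferred ASCII font:

```python
# Sketch only: "//|" is the *proposed* marker for Unicode-rich comments,
# to be rendered in a wide-coverage monospace font; plain "//" comments
# keep the editor's normal (possibly ASCII-only) code font.

def classify_line(line: str) -> str:
    """Return which font class a C/C++ source line would need."""
    stripped = line.lstrip()
    if stripped.startswith("//|"):
        return "unicode-comment"   # wide-coverage font
    if stripped.startswith("//"):
        return "ascii-comment"     # normal code font
    return "code"

print(classify_line("//| phase φ = ωt + φ₀"))   # unicode-comment
print(classify_line("// plain old comment"))    # ascii-comment
print(classify_line("x = f(y);"))               # code
```

A real editor integration would also have to handle comments beginning mid-line; this sketch only looks at line starts.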

It does not affect me personally that much, because I use UTF-8 Everywhere, and a monospace font (NoTo/DejaVu Mono/FreeMono) with wide Unicode coverage for my source code and plain-text documentation files.

I just wanted to show how using Unicode (especially for math expressions) via UTF-8 can yield better code comments, and maybe spark some ideas (and hear any objections) others have related to that.  I do know some people really don't like UTF-8.



I do strenuously reject comments that refer to documentation, except for naming a specific documentation file.  For math-heavy projects, this leads to documentation being split into multiple files.

(As an aside: what file format to use for rich text? HTML+CSS+JS works in a browser even locally, but is annoying to edit.  LaTeX and word processors save horrible HTML, but quite nice PDF files.  LaTeX, TeX, MathJax, and various markup languages can produce very nice output in various formats, but one needs those document authoring tools installed to edit the documentation, and if that takes too much time/effort, it will not get done.)

I personally do like plain text files with UTF-8 for documentation (except LibreOffice + Math for reference book style math with proofs, or LaTeX for physics), assuming the file is displayed with a monospace font that covers at least Unicode blocks
Latin 1 Supplement (Ä, ö, é, ç, ×, ÷, ·, µ, ±, °),
General Punctuation (non-breaking space, ' ' aka em-space for graphics, ―, –),
Greek and Coptic (π, φ, ω, λ, β, etc.),
Box Drawing (┤,╦, ╱, ╲, ╳, etc.),
Block Elements (▘, ▞, ▙, █, ▓, ▒, ░, etc.),
Arrows (←, ↑, →, ↓, ↔, ↕, ⇐, ⇒, ⇔, etc.),
Mathematical Operators (≃, ≈, ≠, ≡, ≢, ≤, ≥, ∂, ∑, ∏, ∫, ⊕, ⊖, ⊗, ⊘, ⊙, ⊚, ⊛, ⊜, ⊝),
and ballots (☐, ☑, ☒) from Miscellaneous Symbols.
Enclosed Alphanumerics can be very useful for dense (single-character) 2D matrices or tables with 20 (numbers ①, ②, ‥, ⑲, ⑳; ⑴, ⑵, ‥, ⒆, ⒇; ⒈, ⒉, ‥, ⒚, ⒛) or 26 (uppercase letters Ⓐ, Ⓑ, ‥, Ⓩ) single-character elements; the enclosed lower-case letters (ⓐ, ⓑ, ‥, ⓩ) are often not very legible.
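Since the Enclosed Alphanumerics above are contiguous code-point runs, the full label sets are easy to generate with the Python stdlib; a small sketch:

```python
# Enclosed Alphanumerics are contiguous code-point runs, so the
# single-character matrix/table labels mentioned above can be
# generated with chr():
circled_digits = "".join(chr(0x2460 + i) for i in range(20))  # ① ... ⑳
parenthesized  = "".join(chr(0x2474 + i) for i in range(20))  # ⑴ ... ⒇
circled_upper  = "".join(chr(0x24B6 + i) for i in range(26))  # Ⓐ ... Ⓩ
circled_lower  = "".join(chr(0x24D0 + i) for i in range(26))  # ⓐ ... ⓩ

print(circled_digits)
print(circled_upper)
```

Whether these render legibly still depends on the font, as discussed above.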

Within code comments, the set I need is basically the same, except for Block Elements, Miscellaneous Symbols, and Enclosed Alphanumerics.  (In documentation, Block Elements contains the 15 glyphs needed (+ space) to display 2×2 graphics, so it can be useful for dense boolean or binary 2D matrices.  Other than those, they're often used for banners.)
 

Offline coppice

  • Super Contributor
  • ***
  • Posts: 9559
  • Country: gb
Re: Comments
« Reply #155 on: December 24, 2023, 05:23:02 pm »
That is determined by the font you use: not all fonts have a good glyph coverage.  CJK may be poorly supported, for example.
These days CJK coverage is usually fine. It's technical things that have the most issues. The symbol coverage for maths, logic, and related things can still be pretty patchy. I usually view technical PDFs in Firefox or Chrome these days. They seem to display everything well.
 

Offline Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6967
  • Country: fi
    • My home page and email address
Re: Comments
« Reply #156 on: December 25, 2023, 04:00:09 am »
That is determined by the font you use: not all fonts have a good glyph coverage.  CJK may be poorly supported, for example.
These days CJK coverage is usually fine. It's technical things that have the most issues. The symbol coverage for maths, logic, and related things can still be pretty patchy. I usually view technical PDFs in Firefox or Chrome these days. They seem to display everything well.
PDF files typically include vector descriptions of the glyphs they use, similar to how browsers can use WebFonts.

The interesting thing here is that for HTML files, including this forum, one can trivially add webfonts, as long as one has a suitable license.

For teletype text ([tt]...[/tt] and code blocks), the DejaVu Sans Mono font is used on this forum, with Monaco (an Apple Mac font) as a backup.  As DejaVu Sans is licensed under a free license that basically requires you only to provide the license file with the font typeface, it would be trivial to add to this forum as a web font!
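For illustration, a minimal sketch of what such a web font declaration could look like (the file paths and the selector here are hypothetical; the license file is assumed to be copied next to the font files, per the DejaVu license):

```css
/* Hypothetical paths; the DejaVu license file sits next to the
   font files on the same server: /fonts/LICENSE */
@font-face {
    font-family: "DejaVu Sans Mono";
    src: url("/fonts/DejaVuSansMono.woff2") format("woff2"),
         url("/fonts/DejaVuSansMono.woff")  format("woff");
}
tt, code, pre {
    /* Prefer any locally installed copy via the same family name,
       then Monaco, then the user's generic monospace default. */
    font-family: "DejaVu Sans Mono", Monaco, monospace;
}
```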

Indeed, I created a self-explanatory example of this here, with a PDF version of the same page, for anyone to take a look at how that would work. Because I'm only interested in the monospace font, the entire page uses the monospace font as shown.
(Note that when aligning Box Drawing or Block Element characters on different lines, you need to use em-space (U+2003, ' ') or pairs of half-em/en-space (U+2002, ' ') instead of normal spaces for alignment! You only use normal space for aligning Latin characters and such.)
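The space characters involved in that alignment rule can be inspected with Python's stdlib; a small sketch (the width remarks restate the note above and of course depend on the font):

```python
import unicodedata

# Spaces for aligning Box Drawing / Block Element glyphs.
# In many monospace fonts the em-space matches the full width of the
# box/block glyphs, while the ordinary space can be narrower.
ALIGN_SPACES = {
    "\u2003": "em-space: use for box/block alignment",
    "\u2002": "en-space (half em): use in pairs",
    "\u0020": "ordinary space: Latin text only",
}

for ch, note in ALIGN_SPACES.items():
    print(f"U+{ord(ch):04X}  {unicodedata.name(ch):8s}  {note}")
```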

While this is veering a bit too far away from code comments, this is related to documentation (when using Markdown/MathJax/Sphinx etc. generators to generate HTML and/or PDF documentation from plain text markup files).  One can use FontForge (free and open source) to edit TrueType and OpenType fonts, and generate WOFF 1.0 format WebFonts from them.  (For WOFF2, I use fonttools, specifically python3 fontTools/ttLib/woff2.py compress -o fontname.woff2 fontname.ttf)
Remember: WebFonts are not installed, just referred to via CSS, and the browser will use those just as if it was actually installed locally.

FontForge is also useful when you have a preferred source code editing font with some niggle (perhaps O 0 or I 1 l are too similar): you can create your custom variant from the original TrueType/OpenType font by just renaming it and editing the niggles to better suit your needs.  Just remember to reinstall the modified TrueType/OpenType font after each edit, and to select the renamed font in your editor.
« Last Edit: December 25, 2023, 04:04:26 am by Nominal Animal »
 

Offline PlainNameTopic starter

  • Super Contributor
  • ***
  • Posts: 7314
  • Country: va
Re: Comments
« Reply #157 on: December 25, 2023, 09:40:16 am »
Quote
As DejaVu Sans is licensed under a free license that basically requires you only to provide the license file with the font typeface

How would one do that for something such as a web page? Or is the intent to make the license available on request rather than provide it (with the implication of 'with')?
 

Offline Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6967
  • Country: fi
    • My home page and email address
Re: Comments
« Reply #158 on: December 25, 2023, 10:44:37 am »
Quote
As DejaVu Sans is licensed under a free license that basically requires you only to provide the license file with the font typeface

How would one do that for such as a web page? Or is the intent to make the license available on request rather than provide (with the implication of 'with')?
Generally, as with similar BSD and MIT licenses, it suffices that it is obvious for anyone looking at the sources (or documentation, but web pages obviously don't have documentation).

In the case of my example page, the note pointing to the license file is exactly where you discover the web font URL when looking at the HTML/CSS source files, as a CSS comment; with the license file itself copied next to the font files (i.e., on this same server, not on some upstream server).  This should best fulfill the intent of the license.

(Of course, I am assuming that if you intend to use those on your own web pages, either local or a server, you'll copy the files, instead of linking to those on my server.  It's just four files, for a total of under one megabyte, a third of which is the original TrueType font file.  The same procedure would be repeated for any other free fonts licensed using a similar license.  Typically, a web site or pages should not need more than three fonts, plus optionally MathJax web fonts.)
« Last Edit: December 25, 2023, 10:48:10 am by Nominal Animal »
 

Offline PlainNameTopic starter

  • Super Contributor
  • ***
  • Posts: 7314
  • Country: va
Re: Comments
« Reply #159 on: December 25, 2023, 11:05:07 am »
Thanks.

Completely unrelated:
Quote
for a total of under one megabyte

I remember when that was unimaginably big. Nowadays, 1GB is the new MB, although even that doesn't seem so terrible to transfer.

Merry Xmas   :)
 

Offline coppice

  • Super Contributor
  • ***
  • Posts: 9559
  • Country: gb
Re: Comments
« Reply #160 on: December 25, 2023, 03:36:30 pm »
PDF files typically include vector descriptions of the glyphs they use, similar to how browsers can use WebFonts.
PDFs can include vector descriptions, but rarely do. 90% or more of the technical documents I use do not display correctly unless I use a platform with comprehensive font coverage, like Chrome or Firefox. For example, I have never seen an ITU document which displays correctly without adequate symbol coverage in the platform.
 

Offline Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6967
  • Country: fi
    • My home page and email address
Re: Comments
« Reply #161 on: December 25, 2023, 05:51:28 pm »
PDF files typically include vector descriptions of the glyphs they use, similar to how browsers can use WebFonts.
PDFs can include vector descriptions, but rarely do. 90% or more of the technical documents I use do not display correctly unless I use a platform with comprehensive font coverage, like Chrome or Firefox. For example, I have never seen an ITU document which displays correctly without adequate symbol coverage in the platform.
When the generation is via PostScript, the PS-to-PDF conversion often assumes the core PostScript fonts are installed.  This produces PDF files that refer to the PostScript Core Fonts without embedding them, so those fonts must be installed on the viewing system for the PDF to render correctly.  This is typical of large organizations that "moved" to PDF by simply appending a PS-to-PDF conversion to their existing PostScript-format publishing systems.

I checked the PDF version of the DejaVu Sans Mono webfont example HTML page I created, and verified with FontForge (it can extract fonts from PDF files) that it did include the used glyphs from the DejaVu Sans Mono font.  Basically all PDF files I create include the glyph definitions for all characters used in the document, so I do object a bit to that "but they rarely do [include the font typefaces used in them]".

If we adjust the statement to "PostScript-oriented and MS Windows -based PDF generation often omits default fonts used in Windows or defined as PS Core Fonts, leading to PDF files with missing fonts on other systems", then I do agree.



The point is that one should not assume that generating a PostScript file and then converting that to PDF yields optimal results, because it does not.  One should always use the "native" 'export to PDF' or 'print to PDF file' approach instead.
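One quick way to check which side of this a given PDF falls on is to look for /FontFile entries, the keys that hold embedded font programs in a PDF font descriptor.  A crude stdlib-only sketch; it misses fonts stored inside compressed object streams, and the sample byte strings below are synthetic, not real PDFs:

```python
def has_embedded_fonts(pdf_bytes: bytes) -> bool:
    """Crude heuristic: /FontFile, /FontFile2, and /FontFile3 keys mark
    embedded Type 1, TrueType, and CFF font programs in a PDF font
    descriptor.  Misses fonts inside compressed object streams."""
    return any(key in pdf_bytes
               for key in (b"/FontFile", b"/FontFile2", b"/FontFile3"))

# Synthetic fragments (not real PDFs), just to exercise the check:
embedded   = b"%PDF-1.4 ... /FontDescriptor << /FontFile2 12 0 R >> ..."
unembedded = b"%PDF-1.4 ... /BaseFont /Helvetica ..."
print(has_embedded_fonts(embedded))    # True
print(has_embedded_fonts(unembedded))  # False
```

For real files, pdffonts from poppler-utils reports embedding reliably (its "emb" column).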
 

Offline coppice

  • Super Contributor
  • ***
  • Posts: 9559
  • Country: gb
Re: Comments
« Reply #162 on: December 25, 2023, 05:58:53 pm »
PDF files typically include the vector description of the glyphs it uses, similar to how browsers can use WebFonts.
PDFs can include vector descriptions, but rarely do. 90% or more of the technical documents I use do not display correctly unless I use a platform with comprehensive font coverage, like Chrome or Firefox. For example, I have never seen an ITU document which displays correctly without adequate symbol coverage in the platform.
When the generation is via PostScript, the PS-to-PDF conversion often assumes the core PostScript fonts are installed.  This produces PDF files that refer to the PostScript Core Fonts without embedding them, so those fonts must be installed on the viewing system for the PDF to render correctly.  This is typical of large organizations that "moved" to PDF by simply appending a PS-to-PDF conversion to their existing PostScript-format publishing systems.

I checked the PDF version of the DejaVu Sans Mono webfont example HTML page I created, and verified with FontForge (it can extract fonts from PDF files) that it did include the used glyphs from the DejaVu Sans Mono font.  Basically all PDF files I create include the glyph definitions for all characters used in the document, so I do object a bit to that "but they rarely do [include the font typefaces used in them]".

If we adjust the statement to "PostScript-oriented and MS Windows -based PDF generation often omits default fonts used in Windows or defined as PS Core Fonts, leading to PDF files with missing fonts on other systems", then I do agree.



The point is that one should not assume that generating a PostScript file and then converting that to PDF yields optimal results, because it does not.  One should always use the "native" 'export to PDF' or 'print to PDF file' approach instead.
Yes, I think most people here know most of this, but it is completely irrelevant. The reality is that most of the technical PDFs you get need full character-set support in the display platform, and it's still uncommon to have that unless you use Firefox or Chrome (and probably Apple's alternatives). I haven't seen a Linux platform where Okular, Evince, or other tools display properly, as they all rely on the system's font set, and by default it's quite lacking. There is a reason why Firefox and Chrome on Linux machines sort fonts out for themselves. Try Acrobat on Windows: it will generally detect that some glyphs are not available and trigger the installation of additional fonts, but it can be quirky, and still leave you with unrendered stuff.
 

Offline Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6967
  • Country: fi
    • My home page and email address
Re: Comments
« Reply #163 on: December 25, 2023, 08:37:47 pm »
I haven't seen a Linux platform where Okular, Evince or other tools display properly, as they all rely on the system's font set
NOT TRUE!  They use the embedded glyphs, if available.  For proof, open the attached xnib.pdf PDF file in any of them.

I used FontForge to modify the letter x in the DejaVu Sans Mono font to have a small extra nib/serif in the upper left, and generated a WebFont from that.
(Note that after generating the webfont files (xnib.woff2 and xnib.woff), I completely deleted the modified TrueType font.  It was never installed on my system, and only exists as a Web Font at this point.)
I then reused my existing example HTML page, changing the CSS references to xnib.woff2 and xnib.woff only, and replaced the text with a single x.
In Firefox, that page when opened, shows the extra nib/serif in that x.  I then saved that page as xnib.pdf, as attached.

If you open the PDF in a browser, you'll see an A4 portrait sized page with a gray rectangular outline, with a single x near the upper left corner –– with that extra nib/serif in that x.

If you open the PDF file in any of the programs you mentioned and you see an x with a nib/serif, the viewer uses the embedded font/glyph/typeface in the PDF, contrary to what you asserted.
Only if you see a normal monospace x, does the viewer use a system font.

In all PDF viewers I have in Linux Mint 20.3, including Evince, I see the nib; they thus use the embedded glyph, and not the system font.

Note that you can still copy the letter x from the PDF when viewing it, and it will paste as a normal letter x.  It is not a graphic; it is text, just with a custom typeface (glyph for x).

I assert that the font problem is because people generate the PDF files wrong!

In Windows, you have to do some deep configuration in the PDF-emitting programs/drivers to get it to embed the fonts that are installed by default on Windows, because Microsoft only considers people using Windows, and prefers things looking shitty on all other systems.  Similarly on Mac OS wrt. standard Apple fonts.  Similarly, any document generation path that first generates PostScript, will assume that PostScript Core Fonts are installed, and when such a file is converted to PDF, those PS Core Fonts are assumed to be installed in all viewers and are not embedded in the PDF file.

I am very, very serious about this, and the attached xnib.pdf is my immediate proof that your assertion (that Linux viewers like Evince and Okular use system fonts and not the embedded glyphs in the PDF) is absolutely incorrect, in a form that is easily verifiable.



If you find it hard to accept my reasoning above, just let me know, and I'll create a second example HTML page, this time with a customized (extended) font, and instructions on how it (and the PDF version of that HTML page) was created, so you can reproduce everything for yourself, and verify this all for yourself.
I do expect you to then try and reproduce it, and report back, however.  Tit for tat!

In particular, it is not necessary to include the TrueType font file in the CSS; the WOFF2 and WOFF formats will suffice for all current and non-ancient browser versions (Firefox 3.6 and later, Chrome 6.0 and later, IE 9 and later, Konqueror since KDE 4.4.1, MS Edge (all versions), Opera 11.10 and later (Presto 2.7.81 and later), Safari 5.1, WebKit build 528 and later).  Including the TTF font is just a nicety that may help future browsers; plus, if installed (and only the TrueType/OpenType version can be "installed"), it allows those editing the HTML to see the same glyphs in their editor, instead of only in the browser.  It does not matter for the nibbed x here, but it does matter with fonts having custom Private Use Area characters.
« Last Edit: December 25, 2023, 08:39:53 pm by Nominal Animal »
 

Offline coppice

  • Super Contributor
  • ***
  • Posts: 9559
  • Country: gb
Re: Comments
« Reply #164 on: December 25, 2023, 09:10:05 pm »
I haven't seen a Linux platform where Okular, Evince or other tools display properly, as they all rely on the system's font set
NOT TRUE!  They use the embedded glyphs, if available.  For proof, open the attached xnib.pdf PDF file in any of them.

I used FontForge to modify the letter x in the DejaVu Sans Mono font to have a small extra nib/serif in the upper left, and generated a WebFont from that.
(Note that after generating the webfont files (xnib.woff2 and xnib.woff), I completely deleted the modified TrueType font.  It was never installed on my system, and only exists as a Web Font at this point.)
I then reused my existing example HTML page, changing the CSS references to xnib.woff2 and xnib.woff only, and replaced the text with a single x.
In Firefox, that page when opened, shows the extra nib/serif in that x.  I then saved that page as xnib.pdf, as attached.

If you open the PDF in a browser, you'll see an A4 portrait sized page with a gray rectangular outline, with a single x near the upper left corner –– with that extra nib/serif in that x.

If you open the PDF file in any of the programs you mentioned and you see an x with a nib/serif, the viewer uses the embedded font/glyph/typeface in the PDF, contrary to what you asserted.
Only if you see a normal monospace x, does the viewer use a system font.

In all PDF viewers I have in Linux Mint 20.3, including Evince, I see the nib; they thus use the embedded glyph, and not the system font.

Note that you can still copy the letter x from the PDF when viewing it, and it will paste as a normal letter x.  It is not graphic; it is text, just with a custom typeface (glyph for x).

I assert that the font problem is because people generate the PDF files wrong!

In Windows, you have to do some deep configuration in the PDF-emitting programs/drivers to get it to embed the fonts that are installed by default on Windows, because Microsoft only considers people using Windows, and prefers things looking shitty on all other systems.  Similarly on Mac OS wrt. standard Apple fonts.  Similarly, any document generation path that first generates PostScript, will assume that PostScript Core Fonts are installed, and when such a file is converted to PDF, those PS Core Fonts are assumed to be installed in all viewers and are not embedded in the PDF file.

I am very, very serious about this, and the attached xnib.pdf is my immediate proof that your assertion (that Linux viewers like Evince and Okular use system fonts and not the embedded glyphs in the PDF) is absolutely incorrect, in a form that is easily verifiable.



If you find it hard to accept my reasoning above, just let me know, and I'll create a second example HTML page, this time with a customized (extended) font, and instructions on how it (and the PDF version of that HTML page) was created, so you can reproduce everything for yourself, and verify this all for yourself.
I do expect you to then try and reproduce it, and report back, however.  Tit for tat!

In particular, it is not necessary to include the TrueType font file in the CSS; the WOFF2 and WOFF formats will suffice for all current and non-ancient browser versions (Firefox 3.6 and later, Chrome 6.0 and later, IE 9 and later, Konqueror since KDE 4.4.1, MS Edge (all versions), Opera 11.10 and later (Presto 2.7.81 and later), Safari 5.1, webKit build 528 and later).  Including the TTF font is just a nicety, that may help future browsers; plus, if installed (and only the TrueType/OpenType version can be "installed"), it allows those editing the HTML to see the same glyphs in their editor, instead of only in the browser.  It does not matter for the nibbed x here, but it does matter with fonts having custom Private Use Area characters.
Again, this is well known but irrelevant. We have to read the documents we are provided with. They hardly ever contain the glyph information. A simple clean install of any Linux distro I have tried - Fedora, Debian, Ubuntu - sets up system fonts with very little in the symbols area. Most PDF viewers use those fonts, and fail to display properly. So, I use Firefox or Chrome, which have very comprehensive font sets built in.

« Last Edit: December 25, 2023, 09:18:20 pm by coppice »
 

Offline Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6967
  • Country: fi
    • My home page and email address
Re: Comments
« Reply #165 on: December 25, 2023, 09:26:12 pm »
We have to read the documents we are provided with. They hardly ever contain the glyph information.
That will only change if we tell those providing the documents they are generating them wrong, and push them to do it right.

Just like with code comments, we can agree a lot of them are utter garbage, less than useful.
I'm saying that instead of stripping the code comments out or telling people not to use comments at all, we push those we can to write good/better, useful comments, and for them in turn to push others when they can.

We cannot affect everyone, neither developers nor document providers.  We can only affect those who care about our feedback, like learners in my case.

Yet, if we don't push back at all, we're essentially just accepting that the world is filling with more and more shit, and doing nothing about it.
I don't think that is useful.  I think pushing back, where possible, and at minimum pointing at the true culprit, is the reasonable path.

The true culprits are wrong methods/tools/settings used to generate the documents; and bad choices of what to describe in code comments.

There is zero value in blaming the end result –– that most code comments end up being not useful; or noting how hard it is to get all the fonts installed in Linux that Windows/Mac/PostScript document generators take for granted and assume are installed.
 

Offline coppice

  • Super Contributor
  • ***
  • Posts: 9559
  • Country: gb
Re: Comments
« Reply #166 on: December 25, 2023, 10:18:36 pm »
We have to read the documents we are provided with. They hardly ever contain the glyph information.
That will only change if we tell those providing the documents they are generating them wrong, and push them to do it right.

Just like with code comments, we can agree a lot of them are utter garbage, less than useful.
I'm saying that instead of stripping the code comments out or telling people not to use comments at all, we push those we can to write good/better, useful comments, and for them in turn to push others when they can.

We cannot affect everyone, neither developers nor document providers.  We can only affect those who care about our feedback, like learners in my case.

Yet, if we don't push back at all, we're essentially just accepting that the world is filling with more and more shit, and doing nothing about it.
I don't think that is useful.  I think pushing back, where possible, and at minimum pointing at the true culprit, is the reasonable path.

The true culprits are wrong methods/tools/settings used to generate the documents; and bad choices of what to describe in code comments.

There is zero value in blaming the end result: that most code comments end up being not useful, or that it is so hard to install on Linux all the fonts that Windows/Mac/PostScript document generators take for granted and assume are installed.
Why are they wrong? Stuffing glyphs into documents was supposed to be a temporary bodge during a transitional period. It is, perhaps, a good long-term way to deal with truly special characters that are not part of the Unicode set, although it can create some issues with comparison, sorting, and so on. After all these years we should be able to rely on a Unicode character set being complete on all but the smallest platforms. If I plug a USB stick into my car and play songs, my UK-model car shows the Chinese song titles properly, even for some of the more obscure Chinese characters. It's 30 years since I first worked with Unicode (which itself was a bodge from day 1, as MS got them to base it on 16-bit characters), and we are still dealing with these issues on shiny new systems. It's pathetic.
 

Offline Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6967
  • Country: fi
    • My home page and email address
Re: Comments
« Reply #167 on: December 26, 2023, 01:34:21 am »
Why are they wrong? Stuffing glyphs into documents was supposed to be a temporary bodge during a transitional period.
Says who?

As we see with WebFonts, it is the only way to get glyphs to look the same everywhere.  This is an incontrovertible fact.
In the print industry prior to PDF and web fonts, you had to discuss the font selection with your printer.

After all these years we should be able to rely on a Unicode character set being complete on all but the smallest platforms. If I plug a USB stick into my car and play songs, my UK-model car shows the Chinese song titles properly, even for some of the more obscure Chinese characters. It's 30 years since I first worked with Unicode (which itself was a bodge from day 1, as MS got them to base it on 16-bit characters), and we are still dealing with these issues on shiny new systems. It's pathetic.
Such bodges were dropped in 1996, when Unicode 2.0 was introduced.  By that time, UTF-8 had also evolved to its final form, with one to four bytes per code point, covering U+0000 to U+10FFFF.  Just like C has evolved since the K&R days, Unicode and UTF-8 have as well.  Just because Microsoft decided to stay in the dark ages does not mean everyone else has to suffer too.
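The one-to-four-byte behaviour is easy to verify; here is a minimal sketch using only Python's standard library (the example code points are my own choices, picked to hit each encoded length):

```python
# UTF-8 encodes each code point in one to four bytes, covering
# U+0000 through U+10FFFF (Unicode 2.0 and later).
for cp in (0x41, 0xE9, 0x20AC, 0x1D518):  # 'A', 'é', '€', fraktur 'U'
    encoded = chr(cp).encode("utf-8")
    print(f"U+{cp:04X} -> {len(encoded)} byte(s): {encoded.hex(' ')}")
# Each line uses one more byte than the previous: 1, 2, 3, then 4.
```

Note that the four-byte case (U+1D518) is outside the Basic Multilingual Plane, i.e. exactly the range a fixed 16-bit encoding cannot represent without surrogate pairs.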

Older typefaces were designed before the Unicode era, so many glyphs in current use were never designed for those typefaces/fonts at all.

DejaVu fonts (derived from the Bitstream Vera and Charter typefaces), GNU FreeFont (based on the Nimbus Sans/Serif/Mono fonts donated by URW++), and Noto / No Tofu (based on Droid Sans/Serif/Mono by the original Droid font designer) were explicitly designed to fix that.  They are all licensed under a free license that allows including them in applications and web pages (with very minor license-file related requirements).
(They also allow extending the font with new glyphs, although the exact requirements vary: generally you need to use a completely different name, and license the result under the same license as the parent font.)

The problem is that not all authors choose to use such fonts, and web site and forum managers don't bother to provide them as web fonts.

Just like PDF authors who don't bother to embed the glyphs used, and simply assume everyone has the same OS and OS version, and thus the same fonts, as they do.
In most current Linux distributions, if you print something to a PDF file, it will embed the glyphs to the PDF file, so that those files will look the same for everyone viewing the file regardless of what fonts they have installed on their system.
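Whether a given PDF embeds its fonts can be roughly checked by scanning for font-program entries in its font descriptors. This is a heuristic sketch, not a full PDF parser (compressed object streams can hide these tokens from a raw scan), and the function names are my own:

```python
# Embedded font programs appear in PDF font descriptors as /FontFile
# (Type 1), /FontFile2 (TrueType), or /FontFile3 (CFF/OpenType) entries.
# /BaseFont merely names a font, whether or not it is embedded.

def has_embedded_fonts(pdf_bytes: bytes) -> bool:
    # Matches /FontFile, /FontFile2, and /FontFile3 alike.
    return b"/FontFile" in pdf_bytes

def uses_fonts(pdf_bytes: bytes) -> bool:
    return b"/BaseFont" in pdf_bytes

# Usage sketch:
# with open("datasheet.pdf", "rb") as f:
#     data = f.read()
# if uses_fonts(data) and not has_embedded_fonts(data):
#     print("warning: document references fonts it does not embed")
```

A proper check would decompress object streams first (e.g. with a real PDF library), but even this crude scan flags most offenders.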

I consider it an error when anyone creates a PDF file that does not do the same.  It is like choosing one specific font to be used on your web page, laying it out exactly using that font, and then not caring a whit if the users have that font installed or just see a garbled web page.
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 28101
  • Country: nl
    • NCT Developments
Re: Comments
« Reply #168 on: December 26, 2023, 03:49:44 am »
I consider it an error when anyone creates a PDF file that does not do the same.  It is like choosing one specific font to be used on your web page, laying it out exactly using that font, and then not caring a whit if the users have that font installed or just see a garbled web page.
On that subject: it would be really nice if font creators also cared whether their fonts render well without anti-aliasing / font smoothing. Some websites are completely unreadable because the fonts don't render correctly without it.
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 
The following users thanked this post: Nominal Animal, DiTBho

Offline AntiProtonBoy

  • Frequent Contributor
  • **
  • Posts: 988
  • Country: au
  • I think I passed the Voight-Kampff test.
Re: Comments
« Reply #169 on: December 26, 2023, 09:18:10 am »
Why does no-one write comments any more? Not even just something to say what the function is meant to do, never mind the intricacies of how it goes about that. Some cryptic 12-character name may mean something to the programmer but it's bugger-all use when coming afresh to get a birds-eye view of things.

And variables! Really, do babies die or something if you let on WTF a variable is about?
If you need to comment your code extensively, it's either badly written code, or the function and variable names are poorly chosen. Well-structured code should be essentially self-documenting and its functionality self-evident: function names are descriptive, and variables label what they contain. Better still, the type system constrains and tells you what a variable represents (like units of measure, etc). The problem with commenting code is that comments drift out of date and tell you lies. The only time I find myself commenting code is to describe some unusual algorithm or mathematical concept, with citations/references included.
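The contrast can be sketched with a small, entirely hypothetical example (the names and the compound-interest scenario are mine, not from this thread):

```python
# Style 1: a comment compensating for poor names.
def f(t, r, n):  # compound amount: t = principal, r = rate, n = periods
    return t * (1 + r) ** n

# Style 2: self-documenting; the signature itself carries the information.
def compound_amount(principal: float, rate_per_period: float,
                    periods: int) -> float:
    return principal * (1 + rate_per_period) ** periods

# Style 3: the one comment worth keeping, citing a non-obvious algorithm.
def fast_inverse_sqrt_seed(float_bits: int) -> int:
    # Magic constant analysed in Lomont (2003), "Fast Inverse Square Root":
    # it seeds Newton's method with a good first guess for 1/sqrt(x).
    return 0x5F3759DF - (float_bits >> 1)
```

The first two functions compute the same thing; only the third genuinely needs its comment, and that comment points at a reference rather than restating the code.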
 

Offline PlainNameTopic starter

  • Super Contributor
  • ***
  • Posts: 7314
  • Country: va
Re: Comments
« Reply #170 on: December 26, 2023, 09:31:40 am »
Quote
If you need comment your code extensively, it's either badly written code, or function names and variables are poorly chosen

Yes, so best give up and do something else. Or, you could just add a comment to help out the viewer a bit. No-one is perfect, yet all we see here is the assumption that we must be. Or else... well, what?

I guess we don't need to mark on circuit diagrams what each block does, or note expected measurement ranges, or any of the nice stuff that makes circuit diagrams great to use. The wires between pins say it all anyway, right?

Go on, allow for non-gurus to actually be able to figure your shit out. It doesn't make you any less brilliant (the reverse, in fact).
 
The following users thanked this post: nctnico

Offline AntiProtonBoy

  • Frequent Contributor
  • **
  • Posts: 988
  • Country: au
  • I think I passed the Voight-Kampff test.
Re: Comments
« Reply #171 on: December 26, 2023, 09:38:23 am »
Yes, so best give up and do something else. Or, you could just add a comment to help out the viewer a bit. No-one is perfect, yet all we see here is the assumption that we must be. Or else... well, what?
Nobody says anything about giving up. It's about learning best practices. It takes almost zero effort to type the extra characters that convey a meaningful function or variable name. I've seen way too many "helpful" comments that led me down a garden path of misdirection, only because the code was refactored a few years ago, leaving the comments unchanged and completely irrelevant.
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 20768
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Comments
« Reply #172 on: December 26, 2023, 09:47:19 am »
Yes, so best give up and do something else. Or, you could just add a comment to help out the viewer a bit. No-one is perfect, yet all we see here is the assumption that we must be. Or else... well, what?
Nobody says anything about giving up. It's about learning best practices. It takes almost zero effort to type the extra characters that convey a meaningful function or variable name. I've seen way too many "helpful" comments that led me down a garden path of misdirection, only because the code was refactored a few years ago, leaving the comments unchanged and completely irrelevant.

Ah, the "some comments are wrong therefore all comments are wrong" zealotry.

Names don't indicate why and why not. Names don't indicate why you don't have to look at and comprehend a big blob of code in the first place.
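A minimal illustration of a "why" comment that no name can carry (the sensor, the errata, and the retry count are all invented for the example):

```python
import time

def read_sensor_with_retry(read_once, attempts: int = 3) -> float:
    # Why retry at all: hypothetical rev-B boards occasionally NAK the
    # first I2C transaction after wake-up (invented errata, for
    # illustration only).
    # Why 3 attempts: in this made-up scenario, field logs showed no
    # failure surviving two retries.
    for attempt in range(attempts):
        try:
            return read_once()
        except OSError:
            if attempt == attempts - 1:
                raise
            time.sleep(0.01)  # let the bus settle before retrying
```

Renaming the function cannot express any of that: the hardware quirk and the evidence behind the retry count live only in the comment.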
« Last Edit: December 26, 2023, 09:49:58 am by tggzzz »
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline AntiProtonBoy

  • Frequent Contributor
  • **
  • Posts: 988
  • Country: au
  • I think I passed the Voight-Kampff test.
Re: Comments
« Reply #173 on: December 26, 2023, 10:40:12 am »
Spare me the reductionist straw men, mate. Read my previous comment here:
https://www.eevblog.com/forum/programming/comments/msg5241498/#msg5241498

Quote
The only time I find myself commenting code is to describe some unusual algorithm or mathematical concept, with citations/references included.
« Last Edit: December 26, 2023, 10:43:46 am by AntiProtonBoy »
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 20768
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Comments
« Reply #174 on: December 26, 2023, 10:47:20 am »
Spare me the reductionist straw men, mate. Read my previous comment here:
https://www.eevblog.com/forum/programming/comments/msg5241498/#msg5241498

Quote
The only time I find myself commenting code is to describe some unusual algorithm or mathematical concept, with citations/references included.

Given what you wrote in this thread, at least two people think my response was not a straw man.

You have my apologies for not having committed all your writings to memory.
« Last Edit: December 26, 2023, 01:40:18 pm by tggzzz »
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

