Self-made glyphs won’t show up in HTML


I am quite new to the Glyphs app and don’t know my way around yet. I have created an icon and made a glyph for it. I named it uni0000 and it was assigned the Unicode value 0000.

I also created a glyph named uniE009, and it was assigned the Unicode value E009.

In the font view, #0000 shows up under Letters and #E009 under Private Use. However, neither works in HTML: when I type &#E009; (or other character references), #0000 renders as a square and #E009 renders as just a blank space.

Can someone help me out with this?

uni0000 is the Null character in the Unicode charts. Why do you want to use custom naming? It might be helpful if you could post an image of the glyphs you are creating; they may already have Unicode values assigned.

You should never use Unicode ‘0000’ for a glyph that you’d like to use. It is reserved for special purposes.

How do you test the font? Don’t install it in the system.
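Side note on the HTML part of the question: a numeric character reference for a hex codepoint needs an x after the #, so &#E009; is not a valid reference; the hex form is &#xE009; (or &#57353; in decimal). A quick sketch using Python’s stdlib html module to confirm both forms resolve to the same character:

```python
import html

cp = 0xE009  # the Private Use Area codepoint from the question

hex_ref = f"&#x{cp:X};"  # "&#xE009;" -- hex references need the 'x'
dec_ref = f"&#{cp};"     # "&#57353;" -- decimal form of the same codepoint

# Both valid forms unescape to the same character; "&#E009;" (no 'x')
# is not a numeric reference at all and would be left alone by a parser.
print(html.unescape(hex_ref) == chr(cp))  # True
print(html.unescape(dec_ref) == chr(cp))  # True
```

That alone would explain why &#E009; shows nothing in the browser, independent of any font issue.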

I have a similar question. We have tools that test for the presence of uni0000. If I name the glyph null (or similar), it doesn’t get the encoding. If I name it “uni0000”, I see the ‘0000’ encoding in the app, but it doesn’t export that way. I’ve tried to change this with custom parameters on the instances, but can’t figure it out. The end result I want is either the “null” glyph encoded as 0000, or the “uni0000” glyph actually exporting with the 0000 encoding.
What do you think?
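For context on why the name matters: per the Adobe Glyph List convention, a production name of the form uni plus four uppercase hex digits implies a codepoint, while a name like null implies nothing. A minimal sketch of that rule (the function name is mine, and real AGL parsing also allows longer forms):

```python
def implied_codepoint(glyph_name):
    """Codepoint implied by a 'uniXXXX' production name (AGL convention).

    Returns None if the name is not 'uni' + four uppercase hex digits.
    (The full convention also allows 'uni' + multiples of four digits,
    and 'uXXXX..XXXXXX'; this sketch handles only the common 4-digit case.)
    """
    if len(glyph_name) == 7 and glyph_name.startswith("uni"):
        hex_part = glyph_name[3:]
        if all(c in "0123456789ABCDEF" for c in hex_part):
            return int(hex_part, 16)
    return None

print(implied_codepoint("uniE009"))  # 57353 (0xE009)
print(implied_codepoint("uni0000"))  # 0 -- implies U+0000
print(implied_codepoint("null"))     # None -- no implied codepoint
```

This is why the app shows ‘0000’ for a glyph named uni0000 but not for one named null; whether that codepoint survives export is a separate question.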

0000 is a control code. Why would you want to draw a glyph for it?

I know dude, it’s crazy. I don’t need to draw anything for it, just leave it blank. I’m saying that we have a number of QA tests that look for a glyph that is encoded with 0000, and I can’t find a way to make Glyphs export it that way. So we have to ‘fix’ it with a table editor. I think this might be because of legacy requirements from the Microsoft-published OT spec, for the ‘first four glyphs’ (notdef, null, CR, space).

.notdef is supposed to be the uni0000 glyph. I don’t know why there is an extra null glyph. But you don’t need to encode anything to 0000. If you ask a font for a Unicode value that is not in the font, you get glyph ID zero, and that should be the .notdef glyph. So I don’t know why or how one should ‘encode’ 0000.
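That fallback behavior can be sketched as a toy lookup (the plain dict stands in for the font’s cmap table; real lookups go through the cmap subtable formats, but the fallback rule is the same):

```python
NOTDEF_GID = 0  # glyph ID 0 is .notdef by convention

def glyph_id_for(cmap, codepoint):
    """Resolve a codepoint to a glyph ID; unmapped codepoints fall back to .notdef."""
    return cmap.get(codepoint, NOTDEF_GID)

# Hypothetical codepoint -> glyph ID mapping for illustration.
cmap = {0x0041: 5, 0xE009: 12}

print(glyph_id_for(cmap, 0x0041))  # 5 -- mapped glyph
print(glyph_id_for(cmap, 0x0042))  # 0 -- unmapped, .notdef is shown
```

So the “missing character” case is handled by glyph ID zero, without any codepoint being mapped to it.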

I understand. I think Monotype has been encoding .null as 0000 for years and years. I agree, we should consider changing that requirement. Thanks for your time and comments.

I don’t see anywhere that says .notdef should be U+0000. Isn’t that the whole point of it being not defined? It has no codepoint. NULL was always U+0000, as stated in the old Microsoft recommendation for the first four glyphs:

That recommendation was removed in OpenType 1.8 last year, so maybe we don’t need to care so much about requiring those first four anymore, but I think .notdef should still be unencoded and NULL should still be U+0000.

But there is no point in having a 0000 glyph in the font. What would it do? You can’t have that code in a string.