It seems I can’t get any layers via interpolatedFontProxy from an instance that hasn’t been added to Font.instances.
Example:
```python
def make_instance(*values):
    i = GSInstance()
    i.font = Font
    for x in range(len(i.axes)):
        i.axes[x] = values[x]
    return i.interpolatedFontProxy

interpolatedFont = make_instance(50)  # or any available value
print(interpolatedFont.glyphs['a'].layers[0])
```
would print None. Adding the instance to Font.instances fixes it, but I’d rather not do that, nor use interpolatedFont. Any idea?
The problem is that the instance gets deallocated when your function returns. So you need to store it somewhere until you are done with it. A global list will do the trick.
```python
instances = []

def make_instance(*values):
    i = GSInstance()
    instances.append(i)  # keep a strong reference so the instance stays alive
    i.font = Font
    for x in range(len(i.axes)):
        i.axes[x] = values[x]
    return i.interpolatedFontProxy

interpolatedFont = make_instance(50)  # or any available value
print(interpolatedFont.glyphs['a'].layers[0])
```
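The failure mode can be illustrated in plain Python with a weak reference. This is only an analogy, not the actual Glyphs internals: the proxy behaves as if it does not keep the underlying instance alive, so once the function-local object is deallocated, lookups come back empty.

```python
import weakref

class Instance:
    """Stand-in for a GSInstance; any plain object works here."""
    pass

def make_proxy():
    i = Instance()         # local object: its last strong reference
    return weakref.ref(i)  # disappears when the function returns

ref = make_proxy()
print(ref())  # -> None: the referent has already been deallocated
```

Keeping a strong reference around (e.g. in a global list, as above) is what prevents the deallocation.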
Thanks Georg! I think I get the general logic behind this (the actual instance object doesn’t exist outside the scope of the function), but then shouldn’t I be unable to access anything via interpolatedFontProxy? I could still get some things, like glyphs or the master, just not layers.
In my particular case, I guess it’ll be easier to just have the function return the instance instead. Why didn’t I think of that before… ¯\_(ツ)_/¯
```python
def make_instance(*values):
    i = GSInstance()
    i.font = Font
    for x in range(len(i.axes)):
        i.axes[x] = values[x]
    return i

instance = make_instance(50)
interpolatedFont = instance.interpolatedFontProxy
print(interpolatedFont.glyphs['a'].layers[0])
```
I found a bug where a color layer (either layer.isFullColorLayer(), or one whose glyph satisfies layer.parent.isColorPaletteGlyph()), when interpolated, does not have a width:
```python
instances = []

def make_instance(*values):
    i = GSInstance()
    instances.append(i)
    i.font = Font
    for x in range(len(i.axes)):
        i.axes[x] = values[x]
    return i.interpolatedFontProxy

interpolatedFont = make_instance(50)
print(interpolatedFont.glyphs['a'].layers[0].width)
```
Run this on a glyph/layer of one of these color layer types, and it will always return 0.0.
Other layer types always report their actual width.
G 3227
Edit:
To narrow it down: it is the color layer of the layer.parent.isColorPaletteGlyph() type, when it is a color palette layer under a non-color master layer. Which is the proper setup, as I reckon. Still wondering whether that is a bug or intended.
Edit 2:
It seems I might need to dig through the .glyphMetrics() of that layer instead?
However, I still only get a non-zero width at the master locations. As soon as the instance sits somewhere in between, the width comes back as 0:
```python
instances = []

def make_instance(*values):
    i = GSInstance()
    instances.append(i)
    i.font = Font
    for x in range(len(i.axes)):
        i.axes[x] = values[x]
    return i.interpolatedFontProxy

interpolatedFont = make_instance(50)
print(interpolatedFont.glyphs['A'].layers[0].glyphMetrics()[0], "<- BUG ❌")
interpolatedFont = make_instance(0)
print(interpolatedFont.glyphs['A'].layers[0].glyphMetrics()[0], "<- OK")
interpolatedFont = make_instance(100)
print(interpolatedFont.glyphs['A'].layers[0].glyphMetrics()[0], "<- OK")
```
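Until this is fixed, a possible stopgap is to interpolate the width manually from the master layers' widths. A minimal sketch in plain Python, assuming a single axis and linear interpolation between adjacent masters (the master positions and widths would have to be read from the font yourself; `interpolate_width` and its arguments are made-up names, not Glyphs API):

```python
def interpolate_width(axis_value, masters):
    """Linearly interpolate a width between the two masters that
    bracket axis_value.

    masters: list of (axis_position, width) tuples, sorted by position.
    """
    for (p0, w0), (p1, w1) in zip(masters, masters[1:]):
        if p0 <= axis_value <= p1:
            if p1 == p0:
                return float(w0)
            t = (axis_value - p0) / (p1 - p0)  # 0.0 at p0, 1.0 at p1
            return w0 + t * (w1 - w0)
    raise ValueError("axis value %r outside the master range" % axis_value)

# Masters at positions 0 and 100 with widths 500 and 600:
print(interpolate_width(50, [(0, 500), (100, 600)]))  # -> 550.0
```

This only covers the single-axis case; with multiple axes you would need the full interpolation math that Glyphs does internally.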