Metadata script

Hi guys,

I have been trying to run this script I wrote to make a simple CSV document containing information on the selected fonts:

import os
import csv
import gc

toCSV = []

input_folder = ''
output_folder = ''
output_file = 'A_File.csv'

for root, dirs, files in os.walk(input_folder, topdown=False):
    for name in files:
        if name == '.DS_Store':
            continue
        path = os.path.join(root, name)
        font = None
        try:
            font = Glyphs.open(path, showInterface=False)
            print(font)
            font_dictionary = {}
            font_dictionary['Filaname'] = name
            font_dictionary['Postscript Font Name'] = str(font.instances[0].fontName)
            font_dictionary['Postscript Full Name'] = str(font.instances[0].fullName)
            font_dictionary['Weight'] = str(font.instances[0].weightClassName)
            font_dictionary['Width'] = str(font.instances[0].widthClassName)
            font_dictionary['Family'] = str(font.familyName)
            font_dictionary['Version'] = float(str(font.versionMajor) + '.' + str(font.versionMinor))
            toCSV.append(font_dictionary)
        except Exception:
            pass
        finally:
            # Close the font (if it was opened at all) and nudge the garbage collector
            if font is not None:
                font.close()
            gc.collect()

if toCSV:
    keys = toCSV[0].keys()
    with open(os.path.join(output_folder, output_file), 'w', newline='') as f:
        dict_writer = csv.DictWriter(f, keys)
        dict_writer.writeheader()
        dict_writer.writerows(toCSV)

However, when I run it on my larger collections, my Mac eventually runs out of memory. How can I stop it from crashing?

Are those .glyphs or .otf files?

Occasionally glyphs files but mainly OTFs.

It works flawlessly for small collections, but with anything larger, Glyphs starts to use up memory.

I have 40 GB of RAM, but it eventually maxes out. Is it the code?

It seems that there is a memory leak somewhere. Your code looks fine.

That is good to know.

In the meantime, I have rewritten the code using FontMeta, and it has worked. It would be great to do this in Glyphs too (as Glyphs offers more options).
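Since the collections are mainly OTFs, another way to sidestep the leak is to read the `name` table directly from the binary, without opening the font in Glyphs at all. Below is a rough stdlib-only sketch (the offsets follow the OpenType sfnt layout; it assumes well-formed files and just takes the first record found per name ID):

```python
import struct

# OpenType name IDs matching the CSV columns above
NAME_IDS = {1: 'Family', 4: 'Postscript Full Name', 5: 'Version', 6: 'Postscript Font Name'}

def parse_name_table(data):
    """Parse raw 'name' table bytes into {nameID: decoded string}."""
    fmt, count, string_offset = struct.unpack('>HHH', data[:6])
    names = {}
    for i in range(count):
        rec = data[6 + i * 12: 6 + (i + 1) * 12]
        platform_id, encoding_id, language_id, name_id, length, offset = struct.unpack('>6H', rec)
        raw = data[string_offset + offset: string_offset + offset + length]
        # Windows (platform 3) strings are UTF-16BE; Mac (platform 1) are Mac Roman
        text = raw.decode('utf_16_be' if platform_id == 3 else 'mac_roman', errors='replace')
        names.setdefault(name_id, text)  # keep the first record per name ID
    return names

def read_name_table(path):
    """Locate and parse the 'name' table of an OTF/TTF via the sfnt table directory."""
    with open(path, 'rb') as f:
        data = f.read()
    num_tables = struct.unpack('>H', data[4:6])[0]
    for i in range(num_tables):
        entry = data[12 + i * 16: 12 + (i + 1) * 16]
        tag, checksum, offset, length = struct.unpack('>4sIII', entry)
        if tag == b'name':
            return parse_name_table(data[offset: offset + length])
    return {}
```

If installing a package is an option, `fontTools` does the same thing with far less code (`TTFont(path)['name']`), and weight/width classes would come from the `OS/2` table rather than `name`.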

I think there is a typo: 'Filaname' should be 'Filename'. It should not have caused the memory leak, though.