Joe Parker-Rees

London-based graphic designer, working across both physical and digital platforms with a keen interest in typography, identity and web design. Currently seeking an internship or junior design position.

Information

Selected Projects:

01 European Open Science Cloud (winning entry)
A new logo and identity for a European Commission initiative promoting open science practices.

02 Automated Typography
A type specimen publication that demonstrates the capabilities of automated typography in its current state.

03 KALA Experience
A unique and immersive web experience for Sri Lankan rap icon M.I.A.’s album Kala.

04 F*ck Hostile Architecture
A campaign and publication protesting the use of hostile architecture in urban environments.

05 Baker Street Sans
A forensic new typeface identity for BBC’s Sherlock.

06 Fungi and Folklore
A zine exploring the traditions and rituals surrounding mushrooms.

Instagram
Email

Automated Typography
Creative computing | Editorial Design

A type specimen publication that demonstrates the capabilities of automated typography in its current state.

After conducting extensive research into the field of automated type design, I wrote a brief to produce a type specimen that would demonstrate the capabilities of the technology in its current state. I interviewed several leading type designers for their thoughts, including Fabian Harb of DINAMO, Jules Durand, Leah Maldonado, Paul McNeil of MuirMcNeil, Daytona Mess and Cihan Tamti.

The publication was printed in black and white, for economical and functional reasons. Being a type specimen, the amount of imagery is minimal, and the few images that are incorporated were treated with a grayscale filter. Meaningful quotes taken from primary research with professional type designers were used to typeset the fonts I generated. This offered more contextual relevance than using the conventional ‘quick brown fox.’

In the early stages of development and experimentation, I used an existing StyleGAN model trained on a dataset of alphabets. This provided a valuable introduction to how the technology works. I image-traced one of the ML-generated images, uploaded each glyph to a UFO font editor, and compiled a working .ttf font file. The result was a legible but uncanny typeface.
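For the curious, the UFO format that such editors work with stores each glyph as a small .glif XML file. The sketch below is a minimal illustration of what one traced glyph looks like serialised that way; the glyph name, advance width and outline points are invented for illustration and are not taken from the actual typeface.

```python
# Sketch: serialising one traced glyph as a UFO .glif (format 2) file.
# Illustrative only -- the outline data below is made up, not the real
# traced letterforms from the project.
import xml.etree.ElementTree as ET


def glyph_to_glif(name: str, width: int,
                  contour_points: list[tuple[int, int, str]]) -> str:
    """Serialise a single closed contour as UFO GLIF (format 2) XML.

    Each point is (x, y, type), where type is e.g. 'line' or 'curve'.
    """
    glyph = ET.Element("glyph", name=name, format="2")
    ET.SubElement(glyph, "advance", width=str(width))
    outline = ET.SubElement(glyph, "outline")
    contour = ET.SubElement(outline, "contour")
    for x, y, ptype in contour_points:
        ET.SubElement(contour, "point", x=str(x), y=str(y), type=ptype)
    return ET.tostring(glyph, encoding="unicode")


# A traced square counter for an imaginary glyph 'o':
square = [(100, 0, "line"), (500, 0, "line"),
          (500, 400, "line"), (100, 400, "line")]
glif_xml = glyph_to_glif("o", 600, square)
```

One .glif file per glyph, collected in a `glyphs/` directory alongside the UFO metadata files, is what the editor then compiles down to a binary .ttf.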

Once I was confident with the process of working with ML, I built my own dataset using the Google Fonts API. I scraped 500 images from the site and used them to train Nvidia’s StyleGAN ‘Faces’ model for ease of use. The early stages of the training revealed some sinister, anthropomorphic letterforms. Whilst completely unexpected, I was drawn to the uncanny nature of these images, and they were featured in a spread of my publication.
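The scraping step can be sketched roughly as follows. This is an illustrative outline of working with the public Google Fonts Developer API, not the exact script used for the project: the API key is a placeholder, and the `items` / `files` / `regular` fields come from the shape of that API’s JSON response.

```python
# Sketch: collecting font-file URLs from the Google Fonts Developer API.
# Illustrative only -- 'YOUR_API_KEY' is a placeholder, and the limit of
# 500 mirrors the size of the dataset described above.
from urllib.parse import urlencode

API_ENDPOINT = "https://www.googleapis.com/webfonts/v1/webfonts"


def listing_url(api_key: str, sort: str = "popularity") -> str:
    """Build the request URL for the full font-catalogue listing."""
    return f"{API_ENDPOINT}?{urlencode({'key': api_key, 'sort': sort})}"


def regular_file_urls(catalogue: dict, limit: int = 500) -> list[str]:
    """Extract the 'regular' weight download URL for up to `limit` families.

    `catalogue` is the parsed JSON response; its 'items' list holds one
    entry per family, each with a 'files' dict mapping variant -> URL.
    """
    urls = []
    for family in catalogue.get("items", [])[:limit]:
        url = family.get("files", {}).get("regular")
        if url:
            urls.append(url)
    return urls
```

Each collected URL can then be downloaded and rasterised into a fixed-size glyph image, which is the form StyleGAN expects its training set in.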