Idea: Event-based design tokens versus centralized design tokens
🧵
Here is a simplistic example of how design tokens are derived from Figma but serve as the central source of truth for design specifications in code.
Consuming applications treat design tokens as read-only--neither extending nor normalizing them.
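To make that concrete, here's a minimal sketch (all token names and values invented) of a Figma-derived token set being consumed read-only:

```typescript
// A hypothetical token set exported from Figma (names and values made up).
// `as const` freezes the shape at the type level, so consumers get a
// compile-time error if they try to reassign a token.
const tokens = {
  color: { primary: "#0057ff", surface: "#ffffff" },
  spacing: { sm: "4px", md: "8px" },
} as const;

// A consuming application references the tokens but never writes to them.
const buttonStyle = {
  background: tokens.color.primary, // read: fine
  padding: tokens.spacing.md,
};

// tokens.color.primary = "#ff0000"; // write: TypeScript compile error

console.log(buttonStyle);
```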
Logically, this makes sense. If we have a single source of truth for design tokens, every consuming application will have the same design specifications (which is the whole point).
However, it means that...
1) Each consumer has to source from a data set (the design tokens) that lives outside its own codebase (more risk of obscurity).
2) Each consumer depends on changes landing in Figma before it can gain access to new tokens.
Well, obscurity can be fixed with good documentation.
And, you could build some pipeline where changes to the design tokens are reflected in Figma. Or, the design tokens can extend and alias (but not mutate) the Figma specifications in a way that makes everyone happy.
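A rough sketch of that extend-and-alias-but-not-mutate idea, with made-up token names:

```typescript
// The Figma-derived specifications, treated as read-only.
const figmaSpecs = {
  color: { blue500: "#0057ff", gray100: "#f4f4f4" },
} as const;

// The design tokens layer aliases the raw specs into semantic names
// and extends them, without ever reassigning the originals.
const designTokens = {
  ...figmaSpecs, // the raw specs pass through untouched
  semantic: {
    actionPrimary: figmaSpecs.color.blue500, // alias
    surfaceMuted: figmaSpecs.color.gray100,  // alias
  },
  elevation: { card: "0 1px 2px rgba(0,0,0,0.1)" }, // extension
} as const;

console.log(designTokens.semantic.actionPrimary); // "#0057ff"
```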
However, what if Product A has its own product-specific specifications and wants to add them to the design tokens?
Developers on this team could contribute directly to the design tokens--but that is work outside their product, and product-specific tokens may not belong in the centralized set.
These considerations are analogous to the debate about microservices.
You can design a microservice that is the source of truth for some domain/concept (e.g. products, licenses).
Each product consumes the microservice when it needs to interact with the data of a given domain.
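e.g. in the centralized model, a product might query the owning service over HTTP--the URL and response shape here are invented for illustration:

```typescript
// Hypothetical: Product A needs license data, so it calls the
// licenses microservice -- the single source of truth for that domain.
interface License {
  id: string;
  productId: string;
  expiresAt: string;
}

async function fetchLicenses(customerId: string): Promise<License[]> {
  // Every read goes over the network to the central service.
  const res = await fetch(
    `https://licenses.internal/api/customers/${customerId}/licenses`
  );
  if (!res.ok) throw new Error(`licenses service returned ${res.status}`);
  return res.json();
}
```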
Design tokens are effectively a microservice serving as the source of truth for a domain/concept: the design specifications.
In the microservices world, some suggest moving away from a single source of truth.
Instead of centralized microservices that each product queries from, each product has its own data source from which it queries.
Doesn't this lead to duplication of data if a product shares domain data with another product?
Not always. Each product that manages a domain may emit an event when that domain changes.
e.g. if a product manages licenses, it can emit an event when licenses change.
Other products dependent on that domain can subscribe to the event, ingest the domain data, and store it in their own data sources as they see fit.
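A minimal sketch of that pattern, using Node's EventEmitter as a stand-in for a real message broker:

```typescript
import { EventEmitter } from "node:events";

const bus = new EventEmitter(); // stand-in for Kafka, SNS, etc.

interface LicenseChanged {
  id: string;
  status: "active" | "expired";
}

// The licensing product owns the domain and emits on every change.
function updateLicense(change: LicenseChanged) {
  // ...write to the licensing product's own database here...
  bus.emit("license.changed", change);
}

// Another product subscribes and stores the data however it sees fit.
const localLicenseCache = new Map<string, LicenseChanged>();
bus.on("license.changed", (change: LicenseChanged) => {
  localLicenseCache.set(change.id, change); // its own data source
});

updateLicense({ id: "lic-1", status: "active" });
console.log(localLicenseCache.get("lic-1")); // { id: "lic-1", status: "active" }
```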
Now, products can share domain data without relying on centralized dependencies--each copy is derived from the owning product's events rather than maintained by hand.
Ok, so if design tokens are a type of microservice--they manage the data of a domain, the design specifications--then we can apply the same event-based model to sharing design specifications.
What if...when Figma changes (the source of the design specifications domain), it emits an event with the new design specifications.
Then, each product ingests the specs and translates them into its own design tokens source.
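Sketched the same way--the event shape and the translation step are assumptions, and a real setup would need a pipeline between Figma and the bus:

```typescript
import { EventEmitter } from "node:events";

const bus = new EventEmitter(); // stand-in for whatever transports Figma's changes

// Hypothetical payload: the full specifications at the time of change.
interface SpecsChanged {
  color: Record<string, string>;
}

// Product A's own token source, owned and stored by Product A.
let productTokens: Record<string, string> = {};

bus.on("specs.changed", (specs: SpecsChanged) => {
  // Translate the raw specs into Product A's tokens: alias and extend,
  // but never write back to the specs themselves.
  productTokens = {
    actionPrimary: specs.color.blue500,  // alias
    brandHighlight: specs.color.blue500, // product-specific alias
    dangerOutline: "#c40000",            // product-specific extension
  };
});

// Figma (via some pipeline) publishes new specifications:
bus.emit("specs.changed", { color: { blue500: "#0057ff" } });
console.log(productTokens.actionPrimary); // "#0057ff"
```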
*(same image as above)*
If a product needs to reference design tokens, it references its own source.
It cannot mutate the specs, but it can alias and extend them as needed.
Since the tokens are owned by each product, they are less obscure and not dependent on a central service/team.
What are the implications of this?
Not only would this event-based model of design tokens reduce technical complexity--it would also reduce organizational complexity (since technical and organizational structures feed each other).
You eliminate the need to gatekeep design tokens through a separate team.
If you enjoy philosophical takes on software architecture and organization, you may like my latest mini-ebook (it's free+): leanpub.com/rethinkingagile