Dylan Field, CEO of Figma, has announced that the company will temporarily disable its AI design feature following allegations that it was copying designs from Apple’s Weather app.
Figma disables its AI design feature temporarily
This issue was initially identified by Andy Allen, the founder of NotBoring Software, which offers a variety of apps including a customizable Weather app. Through testing Figma’s tool, Allen discovered that it consistently replicated Apple’s Weather app when used for design assistance.
Allen accused Figma on X of extensively training its tool on existing apps.
(1) As we shared at Config last week – as well as on our blog, our website, and many other touchpoints – the Make Design feature is not trained on Figma content, community files or app designs. In other words, the accusations around data training in this tweet are false. https://t.co/jlfmroPPhm
— Dylan Field (@zoink) July 2, 2024
Field announced a temporary suspension of the AI design feature following concerns that it was mimicking Apple’s Weather app designs. Andy Allen, the founder of NotBoring Software, which creates various apps including a customizable Weather app, discovered this issue. Through testing, Allen found that Figma’s tool often replicated Apple’s Weather app when used to aid in design. Allen publicly accused Figma on X of heavily training its AI on existing apps, an accusation that Field has since denied.
The AI design feature, built into Figma’s software, lets users generate UI layouts and components from text descriptions. Figma pitched the tool as a way for developers to quickly draft their ideas and explore different design directions more efficiently. The tool launched at Figma’s recent Config conference, where the company stated that it was not trained on Figma’s own content, community files, or app designs. Field reiterated this point in his response on X, stating, “The accusations around data training in this tweet are false.”
However, in its rush to introduce new AI capabilities and stay competitive, Figma seems to have overlooked critical quality assurance processes. This has led to concerns among designers, some of whom fear that such AI design features could potentially displace jobs by making digital design widely accessible. Conversely, others believe that AI could alleviate repetitive tasks, allowing designers to focus on more innovative and creative aspects of their work.
“Just a heads up to any designers using the new Make Designs feature that you may want to thoroughly check existing apps or modify the results heavily so that you don’t unknowingly land yourself in legal trouble,” Allen stated.
Field clarified that Make Design uses off-the-shelf large language models combined with design systems that Figma commissioned for those models to use. The significant limitation of this approach, he explained, is that the variability of the output is too low; according to Field, that lack of diversity undermines the feature’s effectiveness.
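To make the “low variability” point concrete, here is a purely hypothetical sketch, not Figma’s actual implementation: the component names, layout templates, and the pickLayout function are invented for illustration. It assumes a generator that may only assemble screens from a small, fixed design system, with a trivial keyword scorer standing in for a real LLM call, and shows how such a constrained setup can collapse very different prompts into the same layout.

```typescript
// Hypothetical sketch -- not Figma's actual implementation.
// Assumption: the generator may only assemble screens from a small, fixed
// design system and a handful of pre-approved layout templates.

type Component =
  | "LocationHeader"
  | "LargeTemperature"
  | "HourlyStrip"
  | "DailyList";

interface DesignSystem {
  name: string;
  components: Component[];        // the only building blocks available
  layoutTemplates: Component[][]; // pre-approved arrangements of those blocks
}

// Stand-in for the model: score each template by keyword overlap with the
// prompt and return the best one (first template on ties). Because there are
// only a few templates, very different prompts resolve to the same layout.
function pickLayout(prompt: string, system: DesignSystem): Component[] {
  const keywords = prompt
    .toLowerCase()
    .split(/\s+/)
    .filter((k) => k.length > 2);
  const score = (template: Component[]): number =>
    template.filter((c) =>
      keywords.some((k) => c.toLowerCase().includes(k))
    ).length;
  return [...system.layoutTemplates].sort((a, b) => score(b) - score(a))[0];
}

const weatherKit: DesignSystem = {
  name: "hypothetical-weather-kit",
  components: ["LocationHeader", "LargeTemperature", "HourlyStrip", "DailyList"],
  layoutTemplates: [
    ["LocationHeader", "LargeTemperature", "HourlyStrip", "DailyList"],
    ["LocationHeader", "DailyList"],
  ],
};

// Two quite different prompts, one identical result:
console.log(pickLayout("a minimalist weather app", weatherKit));
console.log(pickLayout("a playful hourly weather dashboard", weatherKit));
```

The toy example only illustrates that a heavily constrained output space plus a deterministic selection step leaves little room for variation, which is one plausible reading of Field’s remark that variability was too low.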
(4) I have asked our team to temporarily disable the Make Design feature until we are confident we can stand behind its output. The feature will be disabled when our US based team wakes up in a few hours, and we will re-enable it when we have completed a full QA pass on the…
— Dylan Field (@zoink) July 2, 2024
“Within hours of seeing [Allen’s] tweet, we identified the issue, which was related to the underlying design systems that were created. Ultimately it is my fault for not insisting on a better QA process for this work and pushing our team hard to hit a deadline for Config,” Field said.
While Apple has not yet commented, Figma referred to Field’s tweets as its official statement on the matter. In those tweets, Field announced that Figma will temporarily disable the Make Design feature until a full quality assurance review is completed.
Firstly, the use of off-the-shelf large language models combined with custom systems in Figma’s tool highlights the complexity and potential unpredictability of AI in creative processes. Dylan Field’s admission that the system’s variability was too low points to a significant oversight in the quality assurance process. This is a crucial lesson for any company integrating AI into their products: thorough testing and verification must be prioritized to avoid such missteps.
Moreover, this situation brings into question the ethical implications of AI training on existing designs. While Field denies that Figma’s AI was trained on specific app designs, the results suggest otherwise, leading to a perception problem that can damage trust in the company. This incident should serve as a wake-up call to the tech industry about the importance of transparency and ethical standards in AI development.
Featured image credit: Zac Wolff/Unsplash