Artists question Apple’s AI data transparency


Growing concerns among Apple users, particularly artists, have surfaced over Apple’s ambiguity about how its AI technology is trained. The lack of detailed, transparent information has dampened the enthusiasm that once thrived among users.

Apple aims to roll out AI technology later in the year that will allow users to create images from text instructions. The shift signifies a substantial leap for Apple and opens new possibilities for user interaction and content creation.

Vancouver-based video game artist Jon Lam, along with other influential figures in the creative field, has criticized Apple for not disclosing the sources of its AI model’s training data. The critique highlights the risk that unrecognized contributors are being exploited: because publicly visible artwork, such as art from games available on App Stores, could have been used to train the AI, privacy and copyright concerns arise.

Understanding the origin of training data is critical, as it can foster innovation and fairness among creators, developers, and tech companies.

Apple’s AI data usage raises concerns

A transparent approach could lead to more ethical technology development, ensuring contributors to an AI model’s development are properly acknowledged and compensated.

For effective AI functionality, high-quality data is essential during the training phase. Recently, the use of approximately six billion images from the LAION-5B dataset, scraped from the internet, has attracted criticism and given rise to legal disputes over copyright infringement.

Apple uses AppleBot, a web crawler, to extract publicly available data from the web for AI training, much as other AI companies operate crawlers of their own. While Apple says it anonymizes extracted data and uses it ethically, its specific data sources remain undisclosed, raising questions about its operational transparency.
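Site owners who want to keep their work out of such crawls can typically say so in a robots.txt file. As a sketch (Apple’s documentation describes a separate "Applebot-Extended" token that governs use of crawled content for AI training, distinct from plain "Applebot", which also powers Siri and Spotlight search), a publisher’s robots.txt might look like:

```
# Let Applebot index the site for Siri/Spotlight search results,
# but ask Apple not to use the content for generative AI training.
User-agent: Applebot
Allow: /

User-agent: Applebot-Extended
Disallow: /
```

Note that this only affects crawlers that honor robots.txt, and it cannot retroactively remove data that has already been collected.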

The balance between data collection for AI training and user privacy remains a challenging issue across the AI industry. An increasing consensus appears to favor more transparency about data extraction practices and usage from AI companies.

Artists such as Andrew Leung have voiced concerns over AI companies using their work for AI training without prior approval or adequate compensation. The dissatisfaction among creators underscores the need for unambiguous data-usage policies within the AI industry, policies that would protect creators’ rights, ensure fair compensation, and foster a healthier relationship between the art world and AI.

Emily Parker is the dynamic force behind a groundbreaking startup poised to disrupt the industry. As the founder and CEO, Emily's innovative vision and entrepreneurial spirit drive her company's success.