Artificial Intelligence

Insights into Artists’ and Writers’ Data Control

Amid the growing interest in generative artificial intelligence (AI), concerns are also rising about the data used to train these models. Artists and writers are fighting for a say in how AI companies use their work: they have filed lawsuits and publicly objected to the way training data is scraped from the internet and their art is incorporated into models without permission.

Opt-out Programs and Meta

In response to this pressure, some companies have launched “opt-out” programs that let users ask for their work to be excluded from future models. OpenAI, for example, has included such a feature in the latest version of its text-generating AI model, text-dasE. Meta has set up a similar mechanism, but it has caused considerable confusion.

Ineffective Deletion Requests

Although Meta’s mechanism for removing data from its generative AI models looks like an “opt-out” program, in practice there is no effective way to get data deleted. Several artists who have tried to use it describe deep disappointment and frustration with the process. Meta has sent them boilerplate letters calling their requests “impossible demands” because the artists have not provided evidence that their personal data or works actually appear in the output of Meta’s generative AI.

The Real Goal of Meta’s Mechanism

Many artists have concluded that Meta’s data deletion mechanism is not intended to help individuals, as claimed, but is instead a marketing gimmick. Although Meta insists that it has no plans to offer an “opt-out” program, many artists point to the mechanism’s ineffectiveness and to the lack of clarity about the data used to train the company’s models.

The Data Control Issue

Artists find it difficult to provide evidence that Meta’s models were trained on their work or other personal data, because Meta has not disclosed details about the data it used. Even when artists do provide evidence, there is no guarantee that their deletion request will be approved. In response to frustration with the process, Meta has stated that people’s data can be removed from its AI models only where there are legal grounds for doing so.

Conclusion

Although Meta claims that its data deletion mechanism is not an “opt-out” tool but merely a deletion request form, the mechanism is widely seen as a poor substitute for giving artists and writers real control over their data. It is uncertain whether this kind of request will ultimately help anyone gain control over how AI companies use their data. What the example does make clear is that such tools do not give artists and writers adequate control over, or protection for, their work and data.
