Photoshop's newest terms of service have users agree to allow Adobe access to their active projects for the purposes of "content moderation," among other reasons. This has caused concern among…
The question I’d like to ask them is WHY they want to get involved in Content Moderation. They make a toolset, nothing more, so why do they care what someone is using the tools for? What could they possibly get out of this that makes it worth the time or expense?
I imagine it’s because of the generative AI stuff. If they’re using their servers to generate, they’re going to be responsible for what it puts out, even if it’s just responding to user prompts.
Yep, and with access to the working files they can not only use the final images for AI training, but also see all the background information, like the individual layers of an image.
As someone who’s used their tooling, including the generative tooling… I have to admit to trying to push its limits for giggles. It is VERY conservative already, so I don’t see why they’d need additional moderation privileges.
This is an awful change.
I tried using their generative tools a while back and they were pretty terrible. Curious what your experience has been.
The Illustrator tools are terrible. But removing and replacing backgrounds in Photoshop has been spectacular, with one caveat: it’s less great if you give it any instruction. If you use generative fill with prompts, the results are not great at all. However, if you leave the prompt blank, it does a bang-up job matching the existing background set / scene.
Equally impressive has been generating parts of photos that are missing when extending the canvas size.
It tends to work best with photos that are “inside” (interiors) with strong geometric cues - but it has expertly matched lighting, backgrounds and their level of focus (or lack thereof).
Thanks for the insight, I was using it to create something new from a prompt, so my bad experience seems to align with yours.
Feeding some other crappy AI
The content is being uploaded to Adobe’s servers; they likely have the right, and may even be legally required, to moderate it to some degree.
This is yet another reminder that the cloud is just somebody else’s computer. Somebody who might want to impose some degree of control over what is done with their computer, for whatever reason.