I’ve been experimenting with different AI photo editors purely out of curiosity, and the jump in realism is definitely noticeable. Even something like Generic Anchor shows how the models are picking up small details that older tools missed completely. The tech is clearly moving toward a more layered understanding of skin tones, lighting, folds, and all that.

But that’s exactly why I think regulation will eventually catch up. Once people start using these tools to create convincing fake images of others without their permission, lawmakers won’t ignore it; they usually step in after a few big scandals hit the news. The funny part is that some developers try adding disclaimers saying outputs are “AI-generated,” but nobody actually reads them. It also depends on the country: some places move fast on privacy-related laws, others barely care.

So yeah, the realism will keep growing, but so will the pressure to restrict what’s allowed. Kind of like how deepfake rules only started appearing after celebrities complained publicly.