AI & Photography: A Field Guide for ASMP Members (2025)

What do we mean by “AI” here?

  • Generative image models (e.g., Midjourney, Stable Diffusion, Firefly) create or transform pictures from text prompts, image-to-image workflows, or inpainting.
  • LLMs (e.g., ChatGPT, Claude) are text systems—great for briefs, treatments, estimates, captions, keywording, and research, but they don’t create photos; they plan and describe.
  • Assistive AI inside editors (masking, selection, denoise, up-rez) accelerates post-production; it doesn’t supply authorship.

This post focuses on where those worlds touch your business, your rights, and your reputation.


The Legal Landscape (fast-moving, but clearer than last year)

Copyright & “human authorship” (U.S.)

  • The U.S. Copyright Office’s position: AI outputs are protectable only where a human’s creative choices are sufficiently determinative of the expression; mere prompting isn’t enough, and the analysis is case by case. Register the human-authored portions and disclose AI assistance. U.S. Copyright Office (2024); U.S. Copyright Office (2025)

Major litigation to watch

  • NYT v. OpenAI/Microsoft (SDNY): Court allowed core copyright claims to proceed; a May 13, 2025 order compelled preservation of ChatGPT logs—big implications for data governance. Outcome affects training-data defenses and licensing norms. Justia Law
  • Getty Images v. Stability AI: Parallel UK/US actions over training on Getty content and removal of Getty’s watermarks and marks. Results will shape dataset licensing and model liability. Finnegan

Deepfakes, impersonation & right of publicity

  • FTC Impersonation Rule (final): bars impersonation of gov’t/business; the FTC is expanding enforcement around AI-enabled fraud and is exploring coverage for individual impersonation. Federal Register
  • Tennessee ELVIS Act (first of its kind): adds voice to right-of-publicity protections; civil and criminal remedies for unauthorized AI clones. Other states are copying this model. Reuters
  • “Take It Down Act” (U.S., 2025): criminalizes publication of non-consensual intimate deepfakes and imposes takedown obligations on platforms. AP News

Europe (and beyond)

  • EU AI Act (OJ July 12, 2024): risk-based regulation with specific transparency duties for generative AI and guidance for General-Purpose AI (2025 drafts). Expect disclosures for AI-generated content and stricter provenance in news/ad contexts. Artificial Intelligence Act
  • Italy (Sept 18, 2025): first EU country to enact a comprehensive national AI law aligned to the AI Act; criminal penalties for harmful deepfakes and limits on text/data mining—relevant for dataset builders and agencies sourcing AI content. The Guardian

Provenance & authenticity (industry standards)

  • C2PA/Content Credentials: a cryptographic “nutrition label” for media. Leica, newsrooms, and editing tools are shipping native support; this is the most practical, near-term way to prove what’s camera-original vs. AI-altered. Canon and Nikon have expressed support for C2PA but have yet to ship it in production cameras. See Leica info at contentauthenticity.org, and the verification sketch below.
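
For members who want to spot-check provenance on files, here is a minimal sketch that shells out to c2patool, the Content Authenticity Initiative’s open-source CLI, and reports whether a Content Credentials manifest is present. It assumes c2patool is installed and on your PATH; output format and exit behavior can vary by version, so treat it as a starting point, not a production validator.

```python
# check_credentials.py -- minimal sketch: report whether a file carries
# C2PA Content Credentials. Assumes the open-source `c2patool` CLI is
# installed; its output format may differ between versions, so adjust
# the parsing as needed.
import json
import subprocess
import sys

def read_manifest(path: str):
    """Ask c2patool to dump the manifest store for `path` as JSON."""
    result = subprocess.run(
        ["c2patool", path],
        capture_output=True,
        text=True,
    )
    if result.returncode != 0:
        # c2patool typically exits non-zero when no manifest is found.
        return None
    try:
        return json.loads(result.stdout)
    except json.JSONDecodeError:
        return None

if __name__ == "__main__":
    for file_path in sys.argv[1:]:
        manifest = read_manifest(file_path)
        if manifest is None:
            print(f"{file_path}: no Content Credentials found")
        else:
            print(f"{file_path}: Content Credentials present")
            # The manifest lists the claim generator, assertions, and edit history.
            print(json.dumps(manifest, indent=2)[:500], "...")
```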

Disclaimer: None of the above should be construed as legal advice. Always check with your lawyer before adding language to contracts.


Ethics & Professional Standards (the ASMP line)

  • Consent: no synthetic likenesses of real people (or property) without explicit permission.
  • Disclosure: tell clients when any generative process shaped the result—especially in advertising, advocacy, or journalism.
  • Attribution & licensing: don’t ingest or fine-tune on client-owned libraries without a written license spelling out scope and retention.
  • Bias & representation: test your prompts and models for stereotyping; commit to inclusive datasets and review.
  • Provenance: ship Content Credentials on deliverables; document capture/edit chain.

(These map to emerging laws and reduce risk while preserving trust.)


Practical Playbook for Members

A) Contract & Release Add-Ons (plain-English starters)

  • AI Use & Disclosure
    “Photographer may use assistive AI tools for editing. Photographer will not generate or depict any person’s likeness or any new scene with generative AI unless Client gives prior written consent. If generative AI is used, Photographer will disclose the method in writing and embed Content Credentials.”
  • Training & Fine-Tuning
    “Photographer will not use Client Assets to train, fine-tune, or feed any AI system except as expressly licensed herein, and then only with dataset isolation and deletion on project end.”
  • Provenance
    “Photographer will deliver files with C2PA Content Credentials enabled where supported.”
  • Model/Property Release (AI Clause)
    “The undersigned does not consent to AI-based cloning or synthesis of their voice, likeness, or identity unless a separate rider is signed specifying scope, media, and term.”

(Run these by counsel; tailor to your state and client type. These clauses align with U.S. trends on right of publicity and deepfake rules.) Holland & Knight

B) Workflow & Tooling

  • Capture & ingest: enable Content Credentials in-camera/editor where available (Leica today; others coming). contentauthenticity.org
  • Edit: log what’s restorative vs. generative; keep a sidecar note.
  • Deliver: embed credentials; add a one-line disclosure when generative methods shaped the final.
  • Archive: segregate AI-assisted projects; store prompts/settings like you would a lighting diagram (see the sidecar-log sketch after this list).
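
A minimal sketch of what that sidecar “AI log” could look like, written in Python so it can drop into an ingest or delivery script. The file names, field names, and folder layout are illustrative assumptions, not a standard; adapt them to your own job-folder conventions.

```python
# ai_log.py -- illustrative sketch: write a sidecar "AI log" next to a
# delivered image, recording restorative vs. generative steps, prompts,
# and disclosure status. Field names and layout are assumptions, not a
# standard; adapt to your own job-folder conventions.
import json
from datetime import datetime, timezone
from pathlib import Path

def write_ai_log(image_path: str, steps: list[dict], disclosed_to_client: bool) -> Path:
    """Write <image>.ai-log.json alongside the delivered file."""
    image = Path(image_path)
    log = {
        "file": image.name,
        "logged_at": datetime.now(timezone.utc).isoformat(),
        "steps": steps,  # each step: type, tool, prompt, notes
        "disclosed_to_client": disclosed_to_client,
    }
    sidecar = image.parent / (image.name + ".ai-log.json")
    sidecar.write_text(json.dumps(log, indent=2))
    return sidecar

if __name__ == "__main__":
    # Example: one restorative step and one generative step on a delivery.
    write_ai_log(
        "deliverables/smith_campaign_012.tif",
        steps=[
            {"type": "restorative", "tool": "denoise/upscale",
             "notes": "noise reduction only"},
            {"type": "generative", "tool": "generative fill",
             "prompt": "extend sky at top edge",
             "notes": "background extension; disclosed in delivery memo"},
        ],
        disclosed_to_client=True,
    )
```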

C) Talking Points for Clients

  • “We use AI as a tool, not a substitute for authorship.”
  • “You’ll know when generative methods are used, and we’ll document provenance.”
  • “Your libraries won’t be used to train anything without a separate license.”

The Next 5 Years: What’s Likely

  • Provenance everywhere: C2PA badges spread from cameras to CMSs and ad platforms; “no-badge” creative gets flagged or discounted in sensitive categories. contentauthenticity.org
  • Clearer copyright tests: Case law will sketch a “human-control” threshold for protectability; registration forms will continue requiring AI disclosure. U.S. Copyright Office
  • Right-of-publicity goes federal (or near-federal via harmonized states): momentum from ELVIS-style laws and NO FAKES proposals. OMM
  • Client policy clauses: Brands and agencies will mandate disclosure/provenance the way they mandate COIs and indemnities.
  • Creative advantage: Photographers who fuse authentic capture + responsible augmentation + clear disclosures win more briefs.

Further Reading & Live Trackers


Member Action Checklist

  • Turn on Content Credentials (or note provenance)
  • Add AI clauses to your contracts & releases
  • Disclose generative use whenever it shapes the final
  • Don’t train on client libraries without a license
  • Keep your own “AI log” (prompts/settings) in the job folder
  • Point clients here when they ask “Are you using AI?”

Featured image made with Midjourney V7

Related posts: “AI in the Edit Bay,” “Disney vs. Midjourney,” and “Ethical Ways to Use Generative AI.”


Quick FAQs

Are AI “headshot replacements” OK to deliver?

Only with informed consent. Get a signed release that explicitly allows synthetic alterations and likeness generation. Avoid misleading representations in employment or commercial contexts; label AI-assisted deliverables.

Can I train or fine-tune AI on a client’s photo library?

Not without a written license outlining scope, security, dataset isolation, and deletion timelines. By default, do not use client assets for training.

How do I prove my images are authentic in an AI world?

Use provenance: enable C2PA/Content Credentials where possible, keep edit logs for generative steps, and include a short disclosure in your delivery notes. For sensitive use cases, supply BTS or capture proofs.

Can I register AI-assisted images with the U.S. Copyright Office?

Yes, if your human authorship is substantial (e.g., capture, composition, retouching, selection/arrangement). Disclose any AI assistance in the application and only claim what you (not the model) authored.

Do I have to disclose generative AI use to clients?

Best practice: yes. For advertising, advocacy, and editorial work, disclose when generative elements shaped the result. Build this into your estimate/contract and deliver files with provenance (e.g., Content Credentials).