Testing the AI Image Detection Tools - Can Comic Artists Prove Their Art Isn’t AI?
- Luke McKoy
- Nov 16, 2025
- 5 min read
Updated: Mar 7

I’ve been on a lot of comic artist community forums on Facebook and Reddit recently, and I’ve seen a rise in the amount of AI artwork being posted. The response is almost universal: the community grabs its pitchforks while the creator looks on in disbelief, genuinely surprised anyone has an issue with the work, still seeing themselves as an artist, just one using a different medium.
No matter what your stance on using AI is, the comic industry is building one. At New York Comic Con in October 2025, Jim Lee, President of DC Comics, said, “DC Comics will not support AI generated storytelling or art.” Marvel Comics Editor-in-Chief C.B. Cebulski stated, "We never used it, we will not be using it, and we don't condone it in the Marvel Comics division." Top Cow President/COO Matt Hawkins came out strong with the rules for their Talent Hunt, saying "anyone using AI to either enhance their own art, create art from scratch or prints out AI images and traces them will be banned and I will personally email every editor in comics and let them know who you are."
But AI image generation is getting better by the day. I’ve seen work out there that doesn’t look like the typical “AI slop,” and it was only by going through the artwork with a fine-tooth comb that I noticed the issues. Am I the best judge? No. But with the models improving, and people learning how to use them to better generate images, we’re going to reach a point where you, as an artist, will have to prove your comic art is human-made.
So I decided to find AI image detection tools and see how good they were, using two distinct comics:
Fortune - written by me and hand-drawn by my friend, Josh Hurlburt.
Heresy - written by Jeff Gasalao and illustrated using Midjourney.
----------------------------------------------------------------------------------
Update: Retesting AI Detection Tools (Feb 2026)
Since publishing this post, I've seen more work from AI artists and discovered new AI image detectors, so I decided it was time to re-test. The tools I trialed are:
All were tested against eight new comic pages from the same creator I tested previously. These pages were all generated with Midjourney, but the creator has clearly refined their workflow, and I wanted to see whether the AI detection tools had improved as well.
Here are the current results:
Contradictory Signals
What stood out this time wasn’t just low detection rates — it was inconsistency between tools on the same images.
One page, inked in a black-and-white 1950s style, was flagged as AI by SightEngine while every other tool passed it as human.
Conversely, another page was flagged as AI by every tool except My Detector and NoteGPT.
In other words:
Some tools flagged images that nearly all others missed.
Some tools missed images that nearly all others flagged.
There was no clear agreement between systems on which images were “obviously” AI-generated.
An alternative view: how many tools did each image “beat”?
Another way to look at these results is to flip the question. Instead of asking “How accurate is each tool?”, we can ask: how many detection tools failed to flag each image as AI?
Seen this way, the issue isn’t that one or two tools are inaccurate - it’s that AI-generated comic pages, when refined by a human, are consistently evading detection across the entire detection ecosystem.
This reinforces the same conclusion from earlier tests: current AI detection tools cannot reliably determine whether a comic page is AI-generated, even when tested side-by-side on the same images.
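The per-image tally is just a count of misses across tools. A minimal sketch of that arithmetic in Python, using entirely hypothetical verdicts (the tool and page names below are placeholders, not my actual test data):

```python
# Hypothetical verdicts: True means the tool flagged the page as AI-generated.
# Tool names, page names, and results are placeholders for illustration only.
verdicts = {
    "page_1": {"ToolA": True, "ToolB": False, "ToolC": False},
    "page_2": {"ToolA": False, "ToolB": True, "ToolC": True},
}

# For each page, count how many tools it "beat" (i.e., failed to flag it).
beaten = {
    page: sum(1 for flagged in tools.values() if not flagged)
    for page, tools in verdicts.items()
}
print(beaten)  # page_1 beat two tools, page_2 beat one
```

Sorting that dictionary by value, highest first, immediately shows which pages slipped past the most detectors.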
So What Now?
It’s difficult to say whether these results reflect improvements in AI image generation, improvements in how creators use these tools, or limitations in current detection methods. In reality, it’s likely a combination of all three. Tools aren’t failing because they’re bad - the landscape has changed. Detectable signatures from older models are disappearing, and creators are becoming better at smoothing over visual cues that detectors rely on.
At the end of the day, proving your work is human comes down to what artists have always done: showing your process. Posting sketches, warm-up drawings, and half-finished panels in the Facebook and Reddit communities you’re already a part of will show the evolution of your style - and AI can't evolve its artwork, it just mimics. Style isn’t generated; it’s shaped over years of trial, error, and repetition, and those little imperfections and breakthroughs become part of your artistic fingerprint (just like how Rob Liefeld is instantly recognizable for his exaggerated anatomy).
As AI gets better, having a visible record of that growth - whether sharing in your usual online spaces or building an online portfolio on something like Emerge - will matter more than ever. It’s not about proving every page; it’s about letting people see the journey behind your work.
That said, it’s worth acknowledging the human cost of this shift. Recording your process can make sense in professional or hiring contexts, but it doesn’t scale well as a default expectation everywhere. It’s a bit sad that, because of the rise of AI image creation, entire creative communities may end up feeling pressure to change how they work just to avoid suspicion.
There’s no silver bullet here. AI creation and AI detection tools will continue to chase each other in an ongoing cycle. For now, if you’re concerned about your art being mistaken for AI - or about receiving AI-generated work from others - the safest approach is skepticism over certainty: rely on context, consistency, and process rather than any single detector, but if in doubt, check work across multiple tools.
------------------------------------------------------------------------------------------------
The Original Test
Method:
Five pages and the cover from each comic (12 images total) were uploaded to every AI image detection tool, with each image tested twice to check that the results were consistent.
AI Art Detection Tool Comparison for Comic Art (2025 Test)
| Tool | Detected AI Correctly | Recognized Human Art Correctly | Accuracy |
| --- | --- | --- | --- |
| Copyleaks | 12/12 | 10/12 (misread Fortune cover as “partially AI”) | 91.7% |
| AI or Not | 10/12 | 12/12 | 91.7% |
| | 8/12 | 10/12 | 75% |
| | 7/12 | 9/12 | 66.7% |
| Hugging Face | 11/12 | 2/12 | 54.2% |
| Hive Moderation | 0/12 | 12/12 | 50% |
| Decopy AI | 0/12 | 12/12 | 50% |
| Was it AI | 5/12 | 0/12 | 20.8% |
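The accuracy column is just combined correct calls over the 24 test runs per tool (six AI images and six human images, each tested twice, so 12 AI judgments plus 12 human judgments). A quick sketch of that arithmetic in Python, using a few rows from the table above:

```python
# Each tuple is (AI pages flagged correctly, human pages passed correctly),
# each out of 12 runs, so accuracy is over 24 total judgments per tool.
results = {
    "Copyleaks": (12, 10),
    "AI or Not": (10, 12),
    "Hugging Face": (11, 2),
    "Hive Moderation": (0, 12),
    "Was it AI": (5, 0),
}

for tool, (ai_correct, human_correct) in results.items():
    accuracy = (ai_correct + human_correct) / 24 * 100
    print(f"{tool}: {accuracy:.1f}%")  # Copyleaks prints 91.7%
```

Note how Hive Moderation lands at exactly 50% despite a perfect human-art score: a tool that labels everything "human" gets half the answers right for free, which is why the two columns matter more than the combined percentage.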
Summary of Results:
Copyleaks was our top performer. Its only misses were on Fortune's cover: both times it cautiously said "there could be some AI involved," so I marked those runs as incorrect even though it was 75% sure the art was human.
AI or Not came in second place, struggling with only one page of Heresy, which every other tool also got wrong (except Hugging Face, which called pretty much everything AI).
While Hive Moderation and Decopy AI got 100% correct on the human artwork, they scored 0/12 on AI, which suggests they aren't actually reliable detectors. Hugging Face had the opposite problem: 11/12 correct for AI but only 2/12 for human.


