When you submit a .docx, .pdf, or .pages file, your professor (or their TA) doesn't just read the content. If they're running a document forensics tool — and many are, increasingly — the tool inspects the file itself. Author fields. Edit history. Timestamps. Font signatures. Paste patterns. Embedded objects. Things you didn't know were in there.
This page is straightforward: here's exactly what those tools see, what each signal means in plain language, what raises a flag and what doesn't, and how to check your own document before you submit. Read it whether or not your school uses these tools — they're going to be more common, not less.
The fields anyone can see
These show up the moment your professor opens your file in a forensics tool. They're not hidden; you can see them yourself by going to File → Properties in Word, or by unzipping the docx and looking at the XML.
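If you want to look yourself without unzipping anything by hand: a .docx is just a zip archive of XML parts, and the visible fields live in docProps/core.xml. A minimal sketch in Python (standard library only; the function name is ours, not any tool's):

```python
import io
import zipfile
import xml.etree.ElementTree as ET

# XML namespaces used by docProps/core.xml in the OOXML package format.
NS = {
    "cp": "http://schemas.openxmlformats.org/package/2006/metadata/core-properties",
    "dc": "http://purl.org/dc/elements/1.1/",
    "dcterms": "http://purl.org/dc/terms/",
}

def read_core_properties(docx):
    """Return the visible metadata fields from a .docx (path or file-like)."""
    with zipfile.ZipFile(docx) as z:
        root = ET.fromstring(z.read("docProps/core.xml"))

    def text(tag):
        el = root.find(tag, NS)
        return el.text if el is not None else None

    return {
        "author": text("dc:creator"),
        "last_modified_by": text("cp:lastModifiedBy"),
        "created": text("dcterms:created"),
        "modified": text("dcterms:modified"),
    }
```

Point it at your own file — `read_core_properties("essay.docx")` — and compare each field against what you'd expect a reader to see.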
Author (the "creator" field)
Whoever first created the file in Word / Google Docs / Pages. If you started your essay from a template your roommate gave you, the author field might still say their name. If you wrote it on a parent's computer, it might say their name. This is fine if you can explain it; it's not fine if you submit a file authored by "John Smith" when you're Maria.
Last modified by
Whoever last saved the file. Usually you. If it's not you, that's a question to be ready for.
Created date
When the file first existed. If you started your essay before the assignment was given out, the created date is older than the assignment. That's not bad — maybe you reused a doc from another class as a starting point — but be ready to explain it if asked.
Modified date
When you last saved it. Should be on or before the submission deadline. If you're submitting at 11:58 PM and the modified date is 4:00 PM, that's fine. If the modified date is after the deadline, that's a red flag: either someone edited the file after submission, or the computer's system clock is wrong.
Application
What program made the file. Microsoft Word for Office 365 is normal. Google Docs (exported as .docx) is normal. Pages 13.2 is normal. WPS Office is fine but less common. The application string sometimes raises questions if it doesn't match the school's normal student software — not because it's wrong, but because it's worth asking about.
The fields you probably didn't know existed
These are the ones forensics tools surface that most students don't think about.
Total editing time (the "totalTime" field)
The application keeps a rough count of minutes your document was being actively edited. This is the signal that flags the most "I-did-this-in-five-minutes" submissions. A 2,000-word essay with 6 minutes of totalTime and a revision count of 1 is the classic "pasted from elsewhere" pattern. A 2,000-word essay with 4 hours of totalTime and 60 revisions is the classic "worked on this all week" pattern.
If you wrote your essay in Google Docs and exported to docx at the end, your totalTime in the docx will be near-zero, because Google Docs's edit time doesn't transfer into the export. This is a frustrating false signal; if your school knows about it, they account for it. If they don't, mention it.
Revision count
The application bumps this every time you save. A long essay with revision: 1 is unusual: it means the file was saved exactly once, which is rare when most editors auto-save every few minutes. Most genuine essays have 20-100+ revisions.
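Both counters are readable from the same archive: TotalTime lives in docProps/app.xml and the revision counter in docProps/core.xml. A hedged sketch (element names follow the OOXML properties schemas; the function name is ours):

```python
import io
import zipfile
import xml.etree.ElementTree as ET

# Namespace URIs for the extended (app.xml) and core (core.xml) property parts.
EXT = "{http://schemas.openxmlformats.org/officeDocument/2006/extended-properties}"
CORE = "{http://schemas.openxmlformats.org/package/2006/metadata/core-properties}"

def read_edit_stats(docx):
    """Return total editing minutes and save (revision) count from a .docx."""
    with zipfile.ZipFile(docx) as z:
        app = ET.fromstring(z.read("docProps/app.xml"))
        core = ET.fromstring(z.read("docProps/core.xml"))
    total = app.findtext(EXT + "TotalTime")
    revision = core.findtext(CORE + "revision")
    return {
        "total_minutes": int(total) if total else 0,
        "revisions": int(revision) if revision else 0,
    }
```

A 2,000-word file reporting `total_minutes` of 6 and `revisions` of 1 is exactly the pasted-from-elsewhere pattern described above; 4 hours and 60 saves is the worked-all-week pattern.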
Edit history (tracked changes residue)
If you ever turned on Track Changes — even briefly — the underlying XML can keep traces of accepted edits even after you clicked "accept all." Forensics tools can read these traces. If you accepted suggestions from someone else (a tutor, a parent, a friend, an LLM), the source of those suggestions sometimes shows up.
Font signatures
Most documents have one or two fonts and consistent font sizes. Pasted content sometimes brings its own fonts in — the source application's defaults. A forensics tool sees this as paste-from-elsewhere. The fix is simple: after pasting any content, select-all-and-set-the-font-to-your-document-default before submitting. This is good academic-writing hygiene regardless of forensics.
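You can approximate the font-signature check yourself by collecting every font name the runs in word/document.xml declare directly (run-level w:rFonts attributes only — style and theme defaults are ignored here, so treat this as a rough scan, not what any particular tool does):

```python
import io
import zipfile
import xml.etree.ElementTree as ET

# WordprocessingML namespace used throughout word/document.xml.
W = "{http://schemas.openxmlformats.org/wordprocessingml/2006/main}"

def fonts_used(docx):
    """Set of font names declared directly on runs in the main document part."""
    with zipfile.ZipFile(docx) as z:
        root = ET.fromstring(z.read("word/document.xml"))
    fonts = set()
    for rfonts in root.iter(W + "rFonts"):
        for attr in ("ascii", "hAnsi", "cs"):
            name = rfonts.get(W + attr)
            if name:
                fonts.add(name)
    return fonts
```

More than one or two names here is the "pasted content brought its own font" signal — select-all and reset to your document default before submitting.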
Hidden text and tracked changes
Word lets you mark text as "hidden" (white-on-white, formatted out of view). It's still in the file. Tracked changes can also persist invisibly. Forensics tools surface these.
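Hidden runs are also easy to surface yourself: a run is flagged hidden when its run properties contain a w:vanish element. A rough sketch (the helper name is ours; this checks the main document part only, not headers, footers, or comments):

```python
import io
import zipfile
import xml.etree.ElementTree as ET

W = "{http://schemas.openxmlformats.org/wordprocessingml/2006/main}"

def hidden_text(docx):
    """Text of runs whose properties carry w:vanish (Word's 'hidden' flag)."""
    with zipfile.ZipFile(docx) as z:
        root = ET.fromstring(z.read("word/document.xml"))
    hidden = []
    for run in root.iter(W + "r"):
        rpr = run.find(W + "rPr")
        if rpr is None:
            continue
        vanish = rpr.find(W + "vanish")
        # w:vanish with w:val="false"/"0" explicitly un-hides; anything else hides.
        if vanish is not None and vanish.get(W + "val") not in ("false", "0"):
            hidden.append("".join(t.text or "" for t in run.iter(W + "t")))
    return hidden
```

If this returns anything you didn't put there deliberately, delete it before submitting.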
Embedded objects
If you pasted a spreadsheet, a chart, or an image, the file embeds it. Forensics tools can see what was embedded. If your essay contains an embedded screenshot of a ChatGPT conversation, that's findable.
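Listing what a file embeds takes one pass over the zip's table of contents: pictures land under word/media/ and OLE objects (spreadsheets, charts) under word/embeddings/. A minimal sketch:

```python
import io
import zipfile

def embedded_parts(docx):
    """Names of embedded media and OLE objects inside a .docx."""
    with zipfile.ZipFile(docx) as z:
        return sorted(
            n for n in z.namelist()
            if n.startswith(("word/media/", "word/embeddings/"))
        )
```

Anything this lists can be extracted and viewed by whoever opens the file in a forensics tool, so check that every embedded image is one you'd be comfortable explaining.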
File-system path leaks
When you embed an image, the document sometimes records the path on your computer where the image came from. If that path is C:\Users\maria\Documents\ChatGPT Output\paste.png, that's findable too.
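Leaked local paths can sit in any XML part or relationship file, so a blunt but effective self-check is a regex sweep for file:// URLs and Windows drive-letter paths across the whole archive. The pattern below is a heuristic of ours, not the exact one any particular tool uses:

```python
import io
import re
import zipfile

# Matches file:// URLs and Windows drive-letter paths embedded in XML text.
PATH_RE = re.compile(r'(?:file://[^"<>\s]+|[A-Za-z]:\\[^"<>]*)')

def path_leaks(docx):
    """(part name, leaked path) pairs found anywhere in the archive's XML."""
    hits = []
    with zipfile.ZipFile(docx) as z:
        for name in z.namelist():
            if not name.endswith((".xml", ".rels")):
                continue
            text = z.read(name).decode("utf-8", errors="ignore")
            hits.extend((name, m.group()) for m in PATH_RE.finditer(text))
    return hits
```

If a hit contains a folder name you wouldn't want read aloud in a hearing, re-insert the image from a neutral location and re-save.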
What raises a flag (and what doesn't)
Things that often raise flags but are usually innocent:
- totalTime near zero — happens with Google Docs exports, with last-minute drafting, with copying from another doc you wrote
- Author field doesn't match you — happens with shared computers, templates, family members logging in
- Mismatched font signatures — happens with paste-from-anywhere, not just paste-from-AI
- Old creation date — happens with template reuse, recycled docs
Things that more reliably raise flags:
- Author field is a name you have no plausible connection to
- File-system path includes "ChatGPT," "AI," "generated," or similar
- Embedded screenshots of LLM conversations
- Inconsistent font across paragraph boundaries within the same idea
- "Hidden text" that contains visible prompts, instructions, or generated output
- Edit history showing a series of "accept" events for an entire essay's worth of suggestions
How to check your own document before submitting
The forensics tool your professor uses is probably already free for you to use too. Drop your file at forensics.autotend.io — no signup, nothing leaves your browser, no one sees the result but you.
You'll get a report showing every signal we describe above. If something raises a flag in our tool, it will probably raise a flag in your professor's tool. The point isn't to "beat" forensics — the point is to know what your document looks like before submitting, fix anything that's an honest mistake (wrong author field, missed font consistency), and have answers ready if you're asked about anything else.
What forensics tools can't tell
Forensics tools work on the file. They don't read your mind, judge your prose, or magically detect AI authorship. We can't tell whether the text in your document is yours; we can only tell whether the file itself shows signs of paste-from-elsewhere or unusual editing patterns. Many of those signs have innocent explanations.
If your school uses a forensics tool that gives a single "AI confidence" number based on the prose itself, be skeptical of that score — those tools are documented to false-positive on non-native English writers and on formal-register prose. Ask what their methodology is.
The honest meta-advice
The best protection against being wrongly accused of AI use is the same as the best practice for academic writing:
- Work in one tool from start to finish (don't paste between apps unless necessary).
- Save frequently (so your revision count and totalTime reflect real work).
- If you use AI as a brainstorming aid (with your school's permission), don't paste the output into your essay — read it, set it aside, and write in your own words.
- If you use Google Docs, write in Google Docs and submit from Google Docs (download as PDF directly, not via "export to Word"). The forensic signature is much cleaner.
- If you collaborate with a tutor or a peer, do it with track-changes on so the help is visible and properly attributed.
- Check your own document with a forensics tool before submitting. Fix anything that's an honest mistake.
You're not trying to be invisible to forensics. You're trying to make sure that what shows up in the forensics tool matches what actually happened.
Related methodology
- What docx metadata reveals about a document — the prof-facing version of this page
- Why prose-style AI detection is biased — the case against single-number AI scores
- See the full list at Learn