How does the generator handle citations and avoid fabricating sources?
The generator only proposes citations drawn from the bibliographic inputs you provide (DOI metadata, Zotero/Mendeley exports, or supplied PDF snippets). When a suggested reference is not present in the provided sources, the system flags it as an unverified candidate rather than presenting it as a confirmed source. Authors are prompted to confirm or replace proposed citations before export.
Can I produce output formatted for specific journals or LaTeX templates?
Yes. The generator produces sections formatted for copy/paste into Word or Google Docs and outputs LaTeX-friendly text suitable for common manuscript environments (figure captions, methods lists, tables). It does not automatically apply proprietary publisher templates but generates export-ready content that reduces formatting work when you adapt it to a journal template.
How do I use raw data or statistical tables as inputs for narrative results?
Provide a CSV/Excel table or paste the key statistics (means, SD/SE, CI, p-values, sample sizes) and the comparison groups. Use the "Results — narrative from statistics" prompt cluster to convert those numbers into a neutral Results paragraph; the assistant will only describe the statistics you supplied and will note which values it used in the generated text.
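As an illustration of the "describe only what was supplied" behavior, here is a minimal Python sketch of that conversion. The field names and function are purely hypothetical for this example; they are not the tool's actual API.

```python
# Hypothetical sketch: turning supplied statistics into a neutral
# Results sentence. Every value in the output comes directly from
# the input dict; nothing is inferred or extrapolated.
def results_sentence(stats: dict) -> str:
    """Describe only the statistics supplied by the author."""
    return (
        f"{stats['group_a']} (M = {stats['mean_a']}, SD = {stats['sd_a']}, "
        f"n = {stats['n_a']}) differed from {stats['group_b']} "
        f"(M = {stats['mean_b']}, SD = {stats['sd_b']}, n = {stats['n_b']}), "
        f"t({stats['df']}) = {stats['t']}, p = {stats['p']}."
    )

example = {
    "group_a": "The treatment group", "mean_a": 4.2, "sd_a": 0.8, "n_a": 30,
    "group_b": "the control group", "mean_b": 3.6, "sd_b": 0.9, "n_b": 30,
    "df": 58, "t": 2.73, "p": 0.008,
}
print(results_sentence(example))
```

The point of the sketch is the constraint, not the template: the generated paragraph can only contain numbers the author pasted in, which is what makes it straightforward to verify which values were used.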
What workflows help preserve reproducibility and methods fidelity when using AI assistance?
Start by supplying the original protocol bullets, reagent lists, instrument models, and raw data snippets. Use the Methods conversion prompt to create a stepwise protocol, then run a reproducibility checklist prompt that verifies reagent concentrations and parameter values are stated and flags ambiguous steps. Keep the source snippets attached to each Methods paragraph so reviewers can trace each sentence back to its source.
Can the tool help draft responses to peer reviewers while preserving author control?
Yes. Use the reviewer response prompt cluster to draft polite, point-by-point replies. Provide the reviewer comments, the manuscript excerpts being revised, and the concrete edits or new figures you plan to include. The output is a draft for the authors to edit: the assistant explicitly distinguishes suggested text from author-supplied changes to preserve control.
How should sensitive or unpublished data be managed when using an AI writing assistant?
Treat unpublished or sensitive data according to your institution's policies. Keep private datasets and identifiable information out of shared or public workspaces; use local file imports or secure institutional repositories that the team controls. The generator is designed to attach provenance notes, but authors remain responsible for data governance and ethics approvals.
Does the generator replace the need for domain expert review and ethical approval?
No. The tool assists with drafting and consistency but does not replace subject-matter expertise, statistical review, or institutional ethical approvals. Outputs should be reviewed and validated by domain experts and governance bodies before submission or public release.
What export options exist for collaboration with co-authors using Word, LaTeX, or reference managers?
Export options include plain text sections for Word/Google Docs, LaTeX-friendly text for integration into manuscript source files, and reference lists formatted from supplied bibliographic metadata. The generator also exports a revision log that records prompts and source snippets to support co-author review.
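To make the revision log concrete, here is one plausible shape for a log entry, sketched in Python. The FAQ does not specify the actual export format, so the field names and structure below are illustrative assumptions only.

```python
import datetime
import json

# Hypothetical revision-log entry linking a prompt to its source
# snippet and the manuscript section it produced. The real export
# schema is not documented here; this is an assumed structure.
entry = {
    "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    "prompt_cluster": "Methods conversion",
    "source_snippet": "protocol bullets, step 3",
    "output_section": "Methods, paragraph 2",
}
print(json.dumps(entry, indent=2))
```

A structured log like this is what lets co-authors audit each generated paragraph against the prompt and source material that produced it.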