Inspired Comforts


How to use ChatGPT during cancer treatment: a real workflow

Tech & AI in Recovery · The AI workflow pillar

A practical guide to using ChatGPT, Claude, Gemini, and similar tools during a medical event — to translate bloodwork, draft hard family conversations, prep questions for appointments, and keep your appointment schedule from collapsing. With explicit caveats about what they can’t do, and the patient-safety rules that matter.

The simple answer

AI assistants — ChatGPT, Claude, Gemini — are useful during medical treatment for translation tasks (decoding bloodwork, glossing medical terms, summarizing research), administrative tasks (scheduling, draft emails, prep questions), and emotional logistics (drafting hard family conversations, saying no to commitments). They are NOT a substitute for your care team, do not have access to your medical records unless you paste them in, can hallucinate medical claims, and should never be used as the source of a treatment decision. Below: the four uses that work, the rules that matter, and the prompts that get the most useful answers.

Why this article exists

Most “AI for healthcare” articles are written by tech companies. This one is written for the patient — someone with five doctor’s appointments next month, a stack of paperwork, and a partner who keeps asking “what’s the difference between Stage 2 and Stage 3 again?” The use cases below are the ones that hold up across customer feedback and patient-community discussions.

For the boundaries, this article relies on the FDA’s published guidance on AI as a medical device, the AMA’s principles for augmented intelligence in medicine, and the World Health Organization’s ethics guidance on AI in health. None of these approves general AI assistants as diagnostic tools.

Four uses that genuinely work

Use 1 · Translation

Decoding bloodwork, pathology reports, and discharge papers

Paste the result into ChatGPT and ask: “Translate this bloodwork into plain English. Tell me what each value means and which ones a non-specialist might ask follow-up questions about.” The translation is usually clear, accessible, and a good starting point for conversations with your team. What it cannot do: tell you whether a value is “good” for you specifically — that depends on your treatment history, drug interactions, and your team’s reference ranges, not the lab’s standard ranges.

Use the answer to write better questions for your doctor.
Use 2 · Question prep

Drafting questions for appointments

“I have an appointment next week with my oncologist about [drug]. Help me write 10 questions a thoughtful patient would ask, prioritized by importance.” This produces a better question list than most patients walk in with, and it surfaces topics you might not have known to ask about. What it cannot do: prioritize for your specific situation. Take the list to your appointment as a starting point, not a script.

Print the list. Bring it.
Use 3 · Hard conversations

Drafting messages you don’t have the energy to write

“Help me write a short text to my mother explaining that I won’t be able to come for Thanksgiving this year because of treatment, in a way that’s warm but firm.” AI assistants are good at this — they can produce a draft you can edit, in 15 seconds, when you don’t have 15 minutes to draft from scratch. The same applies to thank-you notes after a meal train, no-thank-you notes for unhelpful advice, and the gentle phrasing for asking a friend to stop calling every day.

Edit before sending. The draft is starting clay, not finished pottery.
Use 4 · Logistics

Scheduling, summarizing, and remembering

“Here are my treatment dates [paste], and here are my work commitments [paste]. Help me figure out which weeks I should ask for lighter loads.” Or: “Summarize this 12-page treatment plan into a one-page version I can give to my partner.” Or: “What questions should I have answered before signing this consent form?” These are administrative tasks AI handles well; the answer is yours to verify.

Trust but verify.
“AI tools can be powerful augmentations to clinical care but should never replace the patient-clinician relationship.”
— summarized from AMA principles on augmented intelligence in medicine

The rules that matter

  1. AI is not your doctor. Even when it sounds confident. Even when it’s right. The confidence-versus-accuracy gap is the single most documented issue with general AI assistants in medical contexts.
  2. Verify medical claims. If the AI tells you something specific about a drug interaction, dosing, or a treatment recommendation, verify with your care team or with a published authoritative source (NIH, Mayo Clinic, MSK, ACS).
  3. Don’t paste protected information into free consumer products without thinking. ChatGPT and most consumer AI products may use your inputs to train future models unless you opt out. OpenAI’s data controls FAQ covers how to opt out for ChatGPT; similar pages exist for other vendors. For sensitive information, opt out first.
  4. Be skeptical of “what does this scan show” questions. AI image readers exist as FDA-cleared medical devices in clinical use, but consumer AI assistants are not among them — they will speculate confidently about images they aren’t trained to read. Don’t ask ChatGPT to interpret your CT scan.
  5. Use the right tool for the task. ChatGPT is fine for translation and drafting. For drug-interaction questions, the Drugs.com interaction checker is more reliable. For specific cancer-drug research, NCI’s clinical trial database is authoritative.

The prompts that get the best answers

What you want · Prompt that works
Plain-English bloodwork · “Here are my latest CBC and metabolic panel results: [paste]. Translate each value into plain English and flag the 3 things a thoughtful patient might want to ask their doctor about.”
Question prep · “I’m seeing my oncologist next week. Diagnosis: [stage/type]. Drug: [name]. Help me write 10 prioritized questions, organized by topic.”
Hard message draft · “Draft a short, warm but firm text saying [the message]. Audience: [relationship]. Tone: [warm/professional/etc].”
Treatment plan summary · “Summarize this treatment plan in plain English at a 9th-grade reading level, in under 200 words. Include 3 things I should ask before signing.”
Side-effect tracking · “I’m experiencing [symptoms]. Suggest 5 questions I should ask my care team. Don’t tell me what’s wrong; help me describe it accurately.”
Insurance denial appeal · “My insurance denied [treatment]. Help me draft an appeal letter focusing on medical necessity. The denial reason was: [paste].”
Saying no to commitments · “Help me write a polite text declining [event] because of treatment. Recipient: [relationship]. Goal: warm, brief, leaves the door open.”

What we do not recommend using AI for

  • Diagnostic decisions. Don’t ask AI whether you have a condition or what’s wrong with you. That’s your care team’s job.
  • Drug-dosing calculations. Pharmacists and your care team. Don’t trust AI math on medications.
  • Image interpretation. Scans, dermatology photos, anything visual. Use the FDA-cleared tools your team uses, not consumer AI.
  • Mental health crisis support. If you’re in distress, contact 988 (Suicide & Crisis Lifeline in the US) or your local equivalent. AI is not a substitute for crisis care.
  • Anything where being wrong is expensive. Treatment decisions, financial decisions, anything legally binding. AI is a draft tool, not a final-decision tool.

Practical AI literacy is part of the modern patient skill set

The wardrobe is one part of recovery; managing the information flow is another. We don’t sell AI tools or workflow software, but we believe in writing about what real patients are actually doing — and a lot of patients are using ChatGPT and similar tools every week. Read more on tech and AI in recovery, including specific app and tool reviews.

Frequently asked questions

Which AI assistant is best for medical translation?
All major ones (ChatGPT, Claude, Gemini, Perplexity) handle plain-English translation of medical reports well. They differ slightly in tone and formatting; pick the one whose voice you find easiest to read.
Is it safe to paste my medical records into ChatGPT?
Free ChatGPT may use your inputs to improve models unless you opt out via the data controls page. ChatGPT Team and Enterprise plans don’t train on your data by default. For sensitive medical records, either opt out first, use a paid tier, or strip identifying information before pasting.
Can AI help me decide between treatment options?
It can help you understand the options and write better questions for your team. The decision itself is yours and your care team’s, weighing factors AI can’t see (your specific history, your values, your support system, your finances).
Are there medical-specific AI tools I should use instead?
Some FDA-cleared clinical AI tools exist (used by your hospital, not by you directly). For consumer use, general assistants are usually enough. The FDA maintains a list of AI/ML-enabled medical devices if you want to see what’s clinically validated.
My doctor said not to use ChatGPT for medical questions — am I doing something wrong?
No, but it’s a fair concern. The right framing: AI helps you prepare, translate, and draft. Your doctor decides. The tool is for your side of the conversation, not theirs.
What about hallucinations?
Real and persistent across all major AI assistants. If the AI gives you a confident-sounding medical fact, citation, or drug name, treat it as a draft to verify, not a finding. The error rate on cited medical references in particular remains high enough to matter.


By Zainab, Inspired Comforts editorial. Inspired Comforts exists because people we love went through some of these conditions, and the recovery clothing they needed did not exist the way it should have. We are not nurses. We care obsessively about helping you retain as much of yourself as possible — through surgery, chemo, dialysis, postpartum, whatever is coming. On medical questions we cite real published practitioners and link to their work in full. If you read something here that does not match what your care team is telling you, trust your care team. We will keep doing the wardrobe research. Read more about us.
A note on what this is. This article is general information drawn from the sources cited above and from real-patient experience patterns. It is not medical advice, not a diagnosis, and not a substitute for the guidance of your care team. Your situation is specific to you. Always discuss decisions about your treatment, medications, and care with your physician, surgeon, oncologist, nephrologist, OB, or relevant specialist. If you are experiencing symptoms that worry you, contact your medical team. In an emergency, call 911 or your local emergency number.