How to Talk to AI: A Prompting Guide for Student Accounts Professionals

Alexandra Lindsay
May 12, 2026

This is Lesson 1 of The Student Accounts AI Field Guide Series, a companion learning series alongside our State of AI in Student Accounts report. 

AI is already in your office. Maybe you're using it yourself, quietly, between tasks, without telling many other people. Maybe someone on your team is. According to Meadow's State of AI in Student Accounts report, 76% of student accounts professionals are already using AI at work, often without formal guidance or approved tools. Download the full report here.

This is an exciting signal: the curiosity is there. The willingness is there. What's often missing is a clear starting point — a practical answer to the question: how do I actually use this well?

That's what this series is for. And there's no better place to start than prompting — the skill that determines whether AI gives you something useful or something you immediately delete. Master prompting and you have an incredible resource at your fingertips. 

KEY DEFINITION: Large Language Models (LLMs): this lesson centers on LLMs, the technical term for the type of AI that powers tools like ChatGPT, Claude, Google Gemini, and Microsoft Copilot.

The "large" refers to the scale of training: these models are trained on enormous amounts of text from the internet, books, and other sources, which is what allows them to generate fluent, contextually relevant responses. The "language" part is straightforward: they work with text in, text out (though newer versions also handle images, audio, etc.).

In everyday terms, when someone says "I asked ChatGPT" or "I use Claude for work," they're using an LLM. We will use LLM and AI interchangeably in this lesson. 

Think of AI Like a New Hire

At Meadow, we think of AI tools as interns. Smart ones, but interns nonetheless.

A good intern can do a lot. They can draft, research, summarize, explain, and organize. But they only perform as well as the direction you give them. Hand a new intern a vague request and you'll get an imperfect result. Give them context, a clear goal, and a specific ask, and they'll often surprise you.

LLMs work exactly the same way. The single most important thing you can do to improve your results is to be more specific. Not a little more specific. A lot more specific.

This doesn't come naturally at first; it requires retraining ourselves in how we seek out information and how we explain what we need. We're used to search engines: tossing in a few keywords and letting the algorithm figure it out. Prompting is different. It's closer to briefing a colleague than running a search. The more you treat it that way, the better your results will be.

Before We Get to Examples: Four Prompt Best Practices That Apply Everywhere

These principles apply regardless of whether you are using AI to help you plan a vacation or using it to figure out how to deliver a tough message to a student about their bill.

1. Give AI a role.

Before you ask AI to do something, tell it who it is. This is one of the most underused and highest-impact techniques in prompting.

"Act as an experienced student accounts director at a mid-sized private university."

This frames every response through a relevant professional lens: the right vocabulary, the right assumptions, the right level of nuance. But the role prompt works in two directions. You can also use it to step into your student's shoes:

"Act as someone who deeply understands the experiences and feelings of a first-generation college student who just received a past-due balance notice and doesn't understand why their financial aid didn't cover the full amount."

Now you're drafting a response for that student without using sensitive information. Empathy is built into the prompt.

2. Ask AI if it has questions before it executes.

This is a simple habit that dramatically improves output quality. Before you ask AI to write or produce anything, add this line:

"Before you begin, ask me any clarifying questions that would help you do this better."

You'll be surprised how often it surfaces something you hadn't thought to include — the tone you want, the audience you're writing for, a constraint that matters. Five seconds of clarification can save you three rounds of editing, or worse, a throw-up-your-hands declaration that AI is useless.

Pro Tip: If you tend to use the same LLM often, many tools let you save a setting (often called custom instructions) that tells it to always ask you clarifying questions before it begins.

3. Include context that an LLM can't assume.

AI doesn't know your institution. It doesn't know your policies, your student population, your voice, or your history with a particular account. When that context matters (and in student accounts work, it almost always does) you have to supply it.

Bad: "Write an email to a student about their late payment."

Better: "Write an email to an undergraduate student who has a $1,200 past-due balance from last semester. This is their first overdue balance with us. Our institution has a 30-day grace period before holds are applied and this student’s bill is over 30 days overdue. The tone should be firm but supportive. We want them to act, but we don't want them to feel embarrassed or panicked."

The first prompt will produce something generic enough to be almost useless. The second will produce something you can actually edit and send.

4. Ask for sources and review everything before it reaches a student.

LLMs are built to give you a confident-sounding answer whether or not it's true. You have every right to ask for sources for every claim made or data point shared. Asking for sources upfront supports the important and necessary work of fact-checking before you send anything an LLM produces. Remember: you are accountable for what leaves your office, and AI doesn't know what it doesn't know. Build the review step into your workflow from day one.

What Happens When You Don't Follow the Rules

Let's look at two real examples of prompts that seem reasonable but will consistently produce weak results.

Weak Prompt #1:

"Help me draft an email to a student following up on them being late on their payment."

Why it fails: This prompt has no context. AI doesn't know how late, how much, what communication has already happened, whether there are extenuating circumstances, what your institution's policies are, or what outcome you're hoping to achieve. It will produce something technically correct and completely generic — the kind of email that students recognize immediately as a template and are less likely to respond to.

What could happen: You get a polished-sounding but hollow draft. You send it with minor edits. The student doesn't respond because nothing about it feels personal or specific to their situation. You follow up again anyway.

Weak Prompt #2:

"Summarize the latest FAFSA changes."

Why it fails: Which changes? From when? For what audience — students, staff, administrators? What level of detail is appropriate? What are you going to do with the summary? Without answers to these questions, AI will give you a broad, surface-level overview that may not reflect the most current guidance and won't be tailored to how your office actually needs to use the information.

What could happen: You share a summary with your team that glosses over a key provision affecting your population. Someone later has to course-correct.

Prompt Examples for the Three Most Common Use Cases

The data from Meadow's research identified where student accounts professionals are most actively using — or eager to use — AI. Here are prompts built for each, with examples of both what not to do and what actually works.

Use Case 1: Drafting Responses to Student Inquiries

In our research, 46% of student accounts professionals are already using or piloting AI for drafting responses to student inquiries — making it the most common active use case in the field. It's a natural fit: high volume, repetitive patterns, and a task where consistency matters.

Scenario: A student emails asking why their balance is higher than expected after their financial aid was applied.

Weak prompt:

"Write a response to a student asking about their balance."

Strong prompt:

"Act as a student accounts advisor at a small private liberal arts college. A student has emailed asking why their account balance is higher than they expected after financial aid was applied. They sound frustrated and confused. Write a response that: (1) acknowledges their concern without being dismissive, (2) explains in plain language the most common reasons a balance might be higher than expected after aid is applied (unmet need, indirect costs not covered by aid, mid-year changes), and (3) invites them to schedule a call to review their specific account. Keep the tone warm, clear, and concise. Avoid jargon."

Scenario: You want to step into the student's perspective before drafting communications about a new payment plan option.

Prompt:

"Act as a sophomore student at a regional public university. You're working part-time, your family doesn't have savings to help with tuition, and you just received an email from the student accounts office mentioning a payment plan. You're not sure what it means or whether you can afford to sign up. What questions would you have? What would make you feel like this office is on your side?"

Use the output to inform what your communication actually needs to address — before you write a single word of it.

Use Case 2: Interpreting Federal and State Guidelines

In our research, 22% of practitioners are using or piloting AI to interpret federal or state guidance, and another 28% are interested but not yet doing it. This is one of the highest-value use cases for the role, because the source material is dense, frequently updated, and easy to misread.

A few important notes before the prompts: AI is not a lawyer or a compliance officer. Use it to accelerate your understanding, not to replace expert review. Always verify against primary sources, and never use AI output as the final word on anything compliance-related.

Scenario: You need to quickly understand a recent update to R2T4 (Return of Title IV) guidance before a team meeting.

Weak prompt:

"Explain R2T4 to me."

Strong prompt:

"Act as a higher education financial aid compliance expert. Explain the Return of Title IV funds (R2T4) calculation process as it currently applies to institutions that use standard term programs. I'm a student accounts director preparing to brief my team. I need to understand: (1) when the calculation is triggered, (2) what the key steps are in the calculation, (3) what the most common compliance mistakes are, and (4) what documentation we should have on file. Use plain language where possible and flag anything that has changed recently or that frequently trips up compliance reviews."

Scenario: Your institution is updating its refund policy and you need to understand what federal regulations actually require versus what is institutional discretion.

Prompt:

"I'm reviewing my institution's refund and withdrawal policy. Help me distinguish between what is federally mandated (under Title IV and HEA regulations) versus what falls within institutional discretion. Specifically, I'm focused on [TOPIC — e.g., the timeframe for issuing refunds after withdrawal, credit balance disbursement requirements]. Flag any areas where state law might add additional requirements. Before you begin, ask me any clarifying questions that would help you give a more useful answer."

Use Case 3: Outbound Billing Reminders and Communications

While only a portion of respondents are actively using AI for outbound communications today, 40% identified it as a workflow they're excited to use — the highest interest level of any workflow in our research. It's easy to see why: billing communications are high-volume, high-stakes, and deeply dependent on tone. Getting them right matters for both collections outcomes and student relationships.

Scenario: You need to draft an outbound text message to students with a balance due before the semester payment deadline.

Weak prompt:

"Write a text message reminder for students who owe money."

Strong prompt:

"Write a text message reminder for undergraduate students at a mid-sized public university who have an outstanding balance due before the end of the semester. The message should: (1) be under 160 characters, (2) include a clear call to action with a link placeholder, (3) be direct without being threatening, and (4) acknowledge that students have options (payment plan, financial aid office) if they need help. Write three versions: one that is matter-of-fact, one that leads with urgency, and one that leads with support. I'll choose the best fit."

Scenario: You want to draft a series of escalating reminders for past-due accounts — starting warm and becoming firmer over time.

Prompt:

"Act as a student accounts communications specialist. I need to draft a three-email sequence for students with a past-due balance. Email 1 goes out 7 days after the due date (gentle reminder, assume it may have slipped their mind). Email 2 goes out 21 days after the due date (firmer, mentions consequences, still leaves door open). Email 3 goes out 45 days after the due date (serious tone, clear consequences, specific next steps). Our institution places a hold on registration for balances over $500. The tone across all three should feel like it's coming from a person, not a collections department. Before you write, ask me any clarifying questions."

One More Thing: This Gets Easier

The prompts above may feel long the first time you write them. That's normal. The good news is that you'll start to build a library — your own set of tested, refined prompts for the situations that come up again and again in your office. Once a prompt works well, save it. Share it with your team. Treat it like a template.

The practitioners seeing the most value from AI right now aren't the ones with the most technical skill. They're the ones who took the time to write a good brief — and then kept refining it.

This is The Student Accounts AI Field Guide Series, a companion learning series alongside our State of AI in Student Accounts report. Each lesson covers a practical topic grounded in real data from the field. Sign up to receive the full series in your inbox.

Meadow's research for this series is drawn from the State of AI in Student Accounts report, based on a survey of 147 student accounts, billing, collections, and One Stop professionals conducted in March 2026. Download the full report here.
