It can be an anxiety-ridden moment when the IMRA quality report lands in a publisher’s inbox. No matter how strong the submission felt, opening these critical documents means facing the state’s first public measure of your work. The findings can be encouraging or frustrating, but either way there will be work to do. Organizations that operate with a growth mindset will recognize that they have just received a blueprint for success: a carefully completed study of their program, the result of the most rigorous state review in the nation. What matters most is how publishers respond and execute, guided by an organized strategy and disciplined project management.
Feedback is valuable firsthand intelligence; each comment reflects how reviewers experienced your materials, what they could easily verify, and what required more evidence than you provided. Teams that treat this information as data rather than criticism come out stronger, not only for the current adoption but for every subsequent state process.
The first person who should read the report is the project lead who managed the submission. They know the version history, the naming conventions, and how each piece of evidence was packaged. Their job is to review quietly and map out what the report is really saying before sharing it with the broader team. A simple spreadsheet works best and should serve as the foundation for cross-departmental appeals preparation. For each finding, the tracker should note the indicator, copy the reviewer’s comment verbatim, record the evidence originally submitted, and flag whether the issue seems to stem from missing content, unclear wording, or misplaced evidence.
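For teams that want the tracker to stay machine-readable from day one, a structure like the following can work. This is a minimal sketch, not a prescribed format: the field names, file names, and the example finding are all hypothetical and should be adapted to your own rubric and naming conventions.

```python
# Minimal sketch of one possible feedback tracker structure.
# All field names and the example finding are hypothetical.
import csv
from dataclasses import dataclass, asdict, fields

@dataclass
class Finding:
    indicator: str           # rubric indicator, e.g. "4.1"
    reviewer_comment: str    # copied verbatim from the IMRA report
    evidence_submitted: str  # files / page numbers originally cited
    issue_type: str          # "missing content" | "unclear wording" | "misplaced evidence"
    owner: str = ""          # assigned at the kick-off meeting

def write_tracker(findings, path="imra_feedback_tracker.csv"):
    """Export the tracker as a simple spreadsheet the whole team can share."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=[fld.name for fld in fields(Finding)])
        writer.writeheader()
        writer.writerows(asdict(fnd) for fnd in findings)

write_tracker([
    Finding(
        indicator="4.1",
        reviewer_comment="Evidence of spiraled practice not located in Grade 5 unit assessments.",
        evidence_submitted="G5_Unit3_Assessment.pdf, pp. 12-14",
        issue_type="misplaced evidence",
    ),
])
```

Exporting to CSV rather than a proprietary format keeps the tracker easy to diff, version, and reuse in the checks described later.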
Within a day, the lead should gather the small group that will manage the process: content experts, standards specialists, a project manager, and one person responsible for consistency of tone. Together, they decide who owns each finding, where supporting documentation will live, and how to keep files and versioning clean. A clear structure at this point prevents confusion later when deadlines tighten. I recommend scheduling an appeals kick-off meeting with senior leadership present to ensure organizational transparency and agreement on the approach.
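One way to enforce that structure is to scaffold the working folders before any drafting begins. The sketch below assumes a hypothetical cycle name and folder layout; the point is simply that every artifact has one predictable home from the start.

```python
# Minimal sketch of one possible folder scaffold for the appeal cycle.
# Directory names are hypothetical; match them to your own conventions.
from pathlib import Path

ROOT = Path("IMRA_2025_Appeal")  # hypothetical cycle name
FOLDERS = [
    "01_feedback_tracker",
    "02_evidence_repository",
    "03_draft_responses",
    "04_peer_review_log",
    "05_tea_correspondence",
]

for name in FOLDERS:
    (ROOT / name).mkdir(parents=True, exist_ok=True)

# Version by suffix rather than overwriting, e.g.:
# 03_draft_responses/indicator_4.1_response_v2.docx
```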
Each assigned team member drafts responses for their section, keeping the language factual and complete. A good appeal restates the reviewer’s finding in plain language, explains how the program meets the requirement, and points precisely to the evidence—page numbers, file names, timestamps if needed. New or revised materials should be acknowledged without defensiveness. The state is not asking for perfection; it is assessing whether a publisher has responded responsibly and transparently.
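A shared template keeps every drafted response structurally identical, which makes the later consistency pass much faster. The sketch below shows one possible format; the wording and field names are illustrative, not TEA-prescribed language.

```python
# Minimal sketch of a shared response template. The structure mirrors the
# advice above: restate the finding, explain compliance, cite precisely.
RESPONSE_TEMPLATE = """\
Indicator {indicator}
Reviewer finding (restated): {finding_restated}
How the program meets the requirement: {explanation}
Evidence: {evidence}
Revisions, if any: {revisions}
"""

def draft_response(indicator, finding_restated, explanation, evidence, revisions="None"):
    return RESPONSE_TEMPLATE.format(
        indicator=indicator,
        finding_restated=finding_restated,
        explanation=explanation,
        evidence=evidence,
        revisions=revisions,
    )

print(draft_response(
    indicator="4.1",
    finding_restated="Reviewers could not locate spiraled practice in Grade 5 unit assessments.",
    explanation="Spiraled practice items appear in every unit assessment.",
    evidence="G5_Unit3_Assessment.pdf, pp. 12-14 (items 6-9)",
))
```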
Before appeals are submitted, responses should undergo an internal peer review. Someone who didn’t write the section checks that citations match, attachments open correctly, and the tone remains consistent. This review should also flag anything that might appear contradictory across grades or subjects. The goal is to ensure the final submission reads as one coherent response, not a patchwork of different voices.
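Parts of this review can be automated. The sketch below shows one such check, confirming that every file cited in the tracker actually exists in the evidence repository; it assumes the hypothetical tracker columns and folder layout from the earlier sketches.

```python
# Minimal sketch of an automated citation check for peer review.
# Paths, column names, and the citation format are hypothetical.
import csv
from pathlib import Path

EVIDENCE_DIR = Path("IMRA_2025_Appeal/02_evidence_repository")

def check_citations(tracker_csv="imra_feedback_tracker.csv"):
    """Return (indicator, file_name) pairs whose cited file is missing."""
    missing = []
    with open(tracker_csv, newline="") as f:
        for row in csv.DictReader(f):
            # assumes citations look like "file.pdf, pp. 12-14"
            file_name = row["evidence_submitted"].split(",")[0].strip()
            if not (EVIDENCE_DIR / file_name).is_file():
                missing.append((row["indicator"], file_name))
    return missing

for indicator, file_name in check_citations():
    print(f"Indicator {indicator}: cited file not found: {file_name}")
```

A check like this does not replace the human pass for tone and contradictions, but it catches the broken references that are easiest to miss under deadline pressure.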
If parts of the report are unclear or if indicators seem inconsistent, the team should reach out to TEA for clarification rather than guessing. A short, direct question such as “Can you confirm which rubric item corresponds to this comment?” can save hours of second-guessing later. Keep all correspondence in the IMRA folder to demonstrate diligence and professionalism.
As the appeal takes shape, it helps to remember that IMRA reviewers and the State Board of Education are two separate audiences. Reviewers focus on quality and alignment; the Board takes a broader lens, including suitability and community context. Reading the draft appeal as if it might later be discussed publicly helps the team tighten explanations and remove jargon.
When the appeal is complete, everything should be archived: the tracker, evidence files, peer review notes, and correspondence. A short debrief afterward (what went smoothly, where time was lost, which indicators reappeared) turns a stressful phase into institutional learning. Many publishers create living documents from these reflections so the next cycle leads with a strong strategy.
| Internal Tool | Purpose | Owner | Notes |
|---|---|---|---|
| Feedback Tracker | Records each finding, reviewer comment, and related evidence; forms the backbone of the appeal. | Project Lead | Built immediately after report receipt. |
| Evidence Repository | Central folder containing all cited files, labeled for easy reference. | Adoption & Product Team | Maintain strict version control. |
| Peer Review Log | Documents internal checks for accuracy, tone, and completeness. | Product Alignment or QA Lead | Completed before submission. |
| TEA Correspondence File | Stores all clarification emails and responses from the agency. | Project Manager | Demonstrates due diligence. |
| IMRA Playbook | Living document capturing lessons learned and workflow improvements for next cycle. | Leadership Team | Updated after debrief. |
Through this cross-departmental approach, the feedback and appeals process becomes an audit of readiness rather than a reactive response to criticism. It reveals where the evidence is clearest, where the guidance is strongest, and how the program performs when viewed from the outside. Proactive teams use it not to defend their work but to understand how it holds up under the weight of public review. This ensures that, by the time the Board sees it, every answer is already built into the program itself.




