AWS Public Sector Blog
Empowering personalized learning at scale: Loyola Marymount University’s AI course companion
The traditional model of academic support—which includes office hours, tutoring centers, and teaching assistants—can’t always keep pace with student needs in today’s universities. Studies have shown that 85% of students already use generative AI tools for academic support, but these generic tools often fall short of providing course-specific content and personalized guidance, and their use can raise concerns about data privacy and academic integrity.
Loyola Marymount University (LMU) envisioned something different: an AI companion that could speak in each professor’s voice, reference the exact materials from their courses, and be available around the clock. Working with Amazon Web Services (AWS), the university developed a secure, course-specific tool called the AI Study Companion, which has earned strong faculty approval while providing students with the support they need at substantial cost savings compared to commercial alternatives.
Addressing student needs with 24/7 course-specific support
LMU’s mission emphasizes personal connections in learning through a high-touch, individualized approach. With most students turning to generic, off-the-shelf AI tools, the university saw an opportunity. “One of the things that sparked this is, ‘How do we make a better version of what’s currently available?’” said Matt Frank, director of teaching, learning, and research technology at LMU.
Brian Drawert, manager of research computing at LMU and the AI Study Companion’s developer, explained the core issue: “AI was already trying to help students with their coursework, but doing it poorly. The challenge was giving them a chat interface that actually answered questions for their class.”
Modern learners also juggle complex schedules, including jobs, family commitments, and study abroad programs, which puts traditional faculty office hours out of reach for many students. That made building a 24/7 solution particularly important.
Building on AWS for data control and student privacy
LMU’s existing relationship with AWS made it a natural foundation for the project. Through ongoing data center migrations to AWS, the university had already established trust in the platform’s security capabilities—which was a crucial factor given the sensitive nature of educational data.
“We didn’t want to use a tool where their data would be going to some outside vendor, and we had no idea what was happening with that data,” Frank explained. “So, we really wanted to build an environment where LMU would be in control of the data and the security.”
This control was essential for meeting the Family Educational Rights and Privacy Act (FERPA) compliance requirements and protecting faculty intellectual property (IP). FERPA protects students’ educational records and personal information, while faculty course content—including lectures, syllabi, and teaching materials—represents significant scholarly work and IP that required robust protection. Since the system would ingest classroom recordings containing both professor and student voices, maintaining strict access controls was non-negotiable. Building within LMU’s controlled AWS environment helped keep both student data and faculty IP secure and inaccessible to external vendors or AI training models.
The technical foundation of the AI Study Companion leverages multiple AWS services: Amazon Bedrock for core generative AI capabilities, Amazon Transcribe for processing classroom recordings, Amazon Simple Storage Service (Amazon S3) for data storage, Amazon Elastic Container Service (Amazon ECS) for scalability, Amazon OpenSearch Service for search functionality, AWS Lambda for automated processes, and AWS WAF for web application security. “We wanted to start with a robust toolkit,” Frank explained. “Amazon Bedrock presented itself as that correct toolkit. We were able to build our specific needs and use cases on top of an already robust and flexible platform.”
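To make the architecture concrete, the sketch below shows how a course-grounded answer might be generated with the Amazon Bedrock Converse API through boto3. It is a minimal illustration rather than LMU’s actual code: the AWS Region, model choice, system prompt, and course excerpt are placeholder assumptions.

```python
import boto3

# Hypothetical sketch: answer a student question with the Amazon Bedrock
# Converse API. Region, model ID, prompt, and course excerpt are placeholders.
bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-west-2")

course_excerpt = "Week 3 lecture: a demand curve shifts when income, tastes, or ..."
question = "Can you explain what shifts a demand curve?"

response = bedrock_runtime.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # placeholder model choice
    system=[{
        "text": "You are a course study companion. Answer only from the "
                "provided course material and match the professor's tone."
    }],
    messages=[{
        "role": "user",
        "content": [{"text": f"Course material:\n{course_excerpt}\n\nQuestion: {question}"}],
    }],
    inferenceConfig={"maxTokens": 512, "temperature": 0.2},
)

print(response["output"]["message"]["content"][0]["text"])
```

In a production setup like the one described here, the course material would come from the knowledge base search layer rather than being passed in by hand.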
Rapid development through AWS specialist collaboration
The project timeline proved remarkably fast. LMU hired Drawert in late 2024 and held its first AWS meeting on January 27, 2025. Drawert made his first code commit on March 28, 2025, and by early May, LMU demonstrated a working solution at the Association of Jesuit Colleges & Universities IT Management Conference (AJCU-CITM). The timeline was particularly compressed because LMU wanted to launch the AI Study Companion in classrooms by August 2025 to start helping students at the beginning of the fall semester.
AWS solutions architect Lorin Miller worked closely with Drawert throughout the development process. The collaboration began with discovery sessions to understand LMU’s vision and requirements, then moved quickly into proof-of-concept development. AWS brought in specialists for AI, containers, and networking to make sure each component was properly optimized. LMU’s existing cloud team provided essential support in getting Drawert trained on the platform. Echo360, LMU’s classroom capture vendor, also supported the project with an innovation grant that helped offset hosting costs.
Weekly content ingestion mirrors faculty teaching approach
LMU’s AI Study Companion works differently from generic AI tools. Instead of providing all course information at once, the system mirrors how faculty structure learning throughout a semester. “We don’t give the tool all of the information right at the start of the semester,” Frank said. “We will give it the actual information that the students are going to learn in week one in week one, and then week two in week two.”
This approach respects faculty instructional choices while providing students with appropriately leveled guidance. In practice, Amazon Transcribe processes each classroom recording, and the resulting transcripts are reviewed for accuracy before being uploaded to the knowledge base, maintaining human oversight.
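As a rough sketch of what that weekly pipeline could look like, the snippet below transcribes a classroom recording with Amazon Transcribe, pauses for the human review step, and then syncs the reviewed transcript into the Amazon Bedrock knowledge base. The bucket names, job name, and knowledge base IDs are hypothetical placeholders, not LMU’s actual configuration.

```python
import boto3

transcribe = boto3.client("transcribe", region_name="us-west-2")
bedrock_agent = boto3.client("bedrock-agent", region_name="us-west-2")

# 1. Transcribe a classroom recording stored in Amazon S3.
#    Bucket names, keys, and the job name are hypothetical placeholders.
transcribe.start_transcription_job(
    TranscriptionJobName="econ-101-week-03",
    Media={"MediaFileUri": "s3://example-lmu-recordings/econ-101/week-03.mp4"},
    MediaFormat="mp4",
    LanguageCode="en-US",
    OutputBucketName="example-lmu-transcripts",
    OutputKey="econ-101/week-03/raw.json",
)

# 2. A staff member reviews and corrects the transcript, then uploads the
#    approved version to the knowledge base's source bucket (manual step).

# 3. Once the reviewed transcript is in place, start an ingestion job so the
#    new week's content becomes searchable in the Amazon Bedrock knowledge base.
bedrock_agent.start_ingestion_job(
    knowledgeBaseId="KB_ID_PLACEHOLDER",
    dataSourceId="DATA_SOURCE_ID_PLACEHOLDER",
)
```

Gating the ingestion step on the manual review is what keeps a human in the loop before any new content reaches students.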
From the students’ perspective, the experience is straightforward. Students log in with LMU credentials, select their course, and can interact conversationally with the AI Study Companion on any device. They can request study guides, clarify missed material, or ask questions about course content.
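Behind that conversational interface, one plausible shape for each request is a course-scoped retrieval call against the knowledge base. The sketch below uses the Amazon Bedrock RetrieveAndGenerate API with a metadata filter; the knowledge base ID, model ARN, and course_id metadata field are assumptions for illustration, not details confirmed by LMU.

```python
import boto3

bedrock_agent_runtime = boto3.client("bedrock-agent-runtime", region_name="us-west-2")

# Hypothetical sketch: answer a student's question using only documents tagged
# with their course. IDs and the "course_id" metadata key are placeholders.
response = bedrock_agent_runtime.retrieve_and_generate(
    input={"text": "Can you make a study guide for this week's lecture?"},
    retrieveAndGenerateConfiguration={
        "type": "KNOWLEDGE_BASE",
        "knowledgeBaseConfiguration": {
            "knowledgeBaseId": "KB_ID_PLACEHOLDER",
            "modelArn": "arn:aws:bedrock:us-west-2::foundation-model/anthropic.claude-3-haiku-20240307-v1:0",
            "retrievalConfiguration": {
                "vectorSearchConfiguration": {
                    "numberOfResults": 5,
                    # Restrict retrieval to the student's selected course.
                    "filter": {"equals": {"key": "course_id", "value": "ECON-101"}},
                }
            },
        },
    },
)

print(response["output"]["text"])        # generated answer
for citation in response["citations"]:   # passages the answer was grounded in
    for ref in citation["retrievedReferences"]:
        print(ref["location"])
```

Filtering at retrieval time is what keeps answers tied to the exact materials from the student’s own course rather than the wider knowledge base.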
Beyond providing accurate answers about actual course content, the AI Study Companion revealed one unexpected benefit during testing. “We discovered that it not only provided class-specific information, but also captured the professor’s unique style, such as their affectations, inflections, and even their jokes,” Drawert explained.
Early results show faculty satisfaction and cost savings
The project pilot launched in August 2025 with approximately 125 students. Faculty feedback has been consistently positive, with professors noting the tool’s accuracy and alignment with course material. While formal research data remains under Institutional Review Board (IRB) review, early impressions suggest strong potential.
“The AI Study Companion has been an exciting and innovative project,” said Kat Weaver, interim executive vice president and provost at LMU. “Faculty using the tool have reported their satisfaction with the ease of integration as well as student engagement and comprehension.”
Based on these encouraging results, the university plans aggressive expansion, and the solution’s cost-effectiveness makes that expansion feasible. Compared to enterprise AI tool licenses costing $30 per student per month, LMU’s custom solution provides substantial savings while delivering more tailored functionality.
This combination of lower costs and superior personalization has attracted attention from other institutions, leading to discussions about potential partnerships to adopt similar solutions. As a result, LMU is exploring how to package the solution for broader adoption. “This could help a lot of students, potentially thousands of students within the next 12 months,” Drawert noted.
Supporting responsible AI innovation in higher education
LMU’s AI Study Companion demonstrates how institutions can leverage cloud technology and generative AI to enhance learning without sacrificing core educational values. By building on AWS infrastructure, the university developed a solution that delivers the functionality it wanted from commercial alternatives while maintaining complete ownership of the educational experience.
For higher education leaders considering AI adoption, LMU’s approach offers a blueprint for responsible innovation that puts student success and faculty expertise at the center while leveraging cloud-native AI services.
