Conference Schedule & Session Descriptions

 

8:30-9:10 Registration and Continental Breakfast 

Gathering Room - Williamson 3422/3423

 

9:10-9:20 Welcome 

Gathering Room - Williamson 3422/3423

 

9:30-10:20 Session 1 

Track: AI Efficiencies (Room 3415)

  • AI as a Teaching Partner: Practical Supports for Planning, Engagement, and Reflection (Billy Green) - As conversations around artificial intelligence in higher education continue to shift, many faculty are less focused on defining AI and more interested in how it can support the everyday work of teaching. This interactive workshop explores how instructors can use AI as a planning and reflection support tool across the instructional cycle while maintaining faculty judgment and disciplinary expertise. Drawing from classroom-tested practices in teacher preparation and undergraduate courses, participants will engage with four practical, faculty-centered use cases. First, the workshop examines how AI can support the design of lesson hook activities and open-ended discussion questions aligned to weekly topics, helping invite multiple perspectives and deeper student thinking. Second, participants explore how AI can support the creation of flexible rubrics that apply across multiple project formats, particularly within choice-based assignments, allowing for consistency in evaluation while preserving student autonomy. Third, the workshop demonstrates how AI can assist in designing structured choice boards that align varied project options to shared learning outcomes. Finally, participants examine how AI can be used to identify common themes across student reflection data and suggest possible instructional responses, supporting faculty in making sense of feedback at scale. Throughout the session, AI is positioned as a support for instructional decision-making rather than a replacement for professional judgment. Participants will actively apply strategies to their own course materials and leave with adaptable examples and guiding principles they can use in their teaching, regardless of discipline or prior experience with AI.

Track: Integrity Perspectives (Room 3418)

  • What the Mentors Learned: Insights from a Year of Listening to Faculty and Piloting AI Integration (Rachel Faerber-Ovaska, Vamsi Borra, Courtney Poullas, and Dana Sperry) - Over the past year, YSU's Campus Mentors for AI program engaged dozens of faculty members across STEM, humanities, and professional disciplines in ongoing conversations about integrating generative AI into teaching and professional activities. This panel brings together mentors representing English, Engineering, and Art to share collective insights gained from two semesters of listening to, and learning alongside, their colleagues. The conversation on campus has shifted. Moving beyond general anxiety about "cheating," faculty are asking practical, discipline-specific questions: How do we redesign assignments so students develop core skills while leveraging AI? What tools matter in our field? How do we acknowledge AI's reality without abandoning academic integrity? This panel addresses that shift through distinct perspectives. Dr. Vamsi Borra (Engineering) will demonstrate how AI can serve as a neutral lab/class partner for hands-on exercises in Engineering education, encouraging active learning. Courtney Poullas (English) will discuss implementation "wins" and specific assignment scaffolding from developmental writing contexts. Prof. Dana Sperry (Art) will explore a paradox in contemporary creative practice: the widespread adoption of AI in industry while educators and artists work to preserve the human creative voice. Ultimately, the panel will address a cautionary question that emerged from these interactions: the danger of overconfidence in assuming certain human skills are inherently "AI-proof." Mentoring group facilitator Dr. Rachel Faerber-Ovaska will moderate a concluding Q&A on implementing these cross-disciplinary strategies in various academic contexts. Attendees will leave with practical, field-tested strategies and insights from professional peers who spent the year exploring AI alongside their departments.

Track: AI and Assignment Design (Room 2235)

  • Using Student Survey Data and Think-Aloud Data to Develop Ethical Academic Writing Instruction in an Era of Generative AI (Diana L. Awad Scrocco) - To engage students in the academic writing process effectively in an era of generative AI, we must understand students' perceptions and actual uses of AI, and we must select pedagogical practices that demotivate unethical AI use. In this presentation, the speaker begins by sharing survey data from YSU students about their motivations for using or avoiding AI, claims about what inspires them as writers, and views regarding ethical AI use. The speaker then shares preliminary data from a think-aloud protocol study in which students respond to feedback on their written drafts from generative AI, peers, and instructors. The speaker considers similarities and differences between these human and AI feedback sources and examines how students understand, interpret, and plan to use those types of feedback. Using the survey and think-aloud protocol data, the speaker offers guidance on how writing instructors can teach students to critically consider feedback on their writing from different sources and use that feedback mindfully and ethically during their revision processes. The speaker concludes by proposing teaching structures that address students' writing anxieties and misconceptions, appeal to their motivations for writing, and inspire them to engage in the writing process ethically and authentically.

Track: Assignment Design with Co-Pilot (Room 2234)

  • AI-Supported Assignment Design: Transforming a Traditional Midterm into a Communication-Rich, Collaborative Assessment (Sean Melnik) - This presentation offers a case study on how generative AI was used to redesign a midterm project in a Business Communication course. Instead of a conventional exam, students were assigned a group-based internal communication audit for a fictitious company. AI tools were employed not to replace student thinking, but to support faculty workflow, enhance assignment clarity, and model professional workplace communication scenarios reflecting industry expectations. The presentation demonstrates how AI (Microsoft Copilot) was used to: 1. Create five unique, discipline-specific case scenarios involving communication breakdowns. 2. Align each scenario with course outcomes and Essentials of Business Communication (Guffey & Loewy, Ch. 1–7). 3. Develop a transparent rubric emphasizing written/oral communication, professionalism, and critical thinking. 4. Scaffold student understanding of organizational communication while preserving academic integrity through oral presentation, teamwork, and applied problem-solving. Participants will receive transferable templates they can adapt to their own courses—particularly those seeking to balance AI-supported content creation with human-centric, performance-based assessment.


 

10:30-11:20 Session 2 

Track: AI Efficiencies (Room 3415)

  • The Augmented Instructor: Partnering With AI to Refine and Innovate Course Assessments in an Upper-Level Biology Course (Jill Tall) - As artificial intelligence (AI) becomes increasingly visible in higher education, much of the conversation has focused on student use, academic integrity, and policy enforcement. This session intentionally shifts the lens to faculty use of AI as a pedagogical design collaborator, offering a transparent, practice-based account of how partnering with AI supported course assessment innovation and redesign while preserving human judgment, rigor, and core disciplinary skills. Drawing on an upper-level undergraduate elective, BIOL 4896 Introduction to Biomedical Research, the presenter will describe how she explicitly worked with an AI collaborator ("Ari," ChatGPT) during the instructional design process, rather than in student task completion. AI was used to brainstorm alternative assignment structures, refine prompts, clarify expectations, and anticipate student challenges. All pedagogical decisions, values, and evaluative judgments remained firmly instructor-directed. Two BIOL 4896 course assignments designed to improve student written and oral communication proficiencies will serve as case studies: Penguin Tank, a new assessment, and Extemporaneous Speak-a-Thon, a refined assessment. Importantly, students were not required to use AI; instead, AI functioned behind the scenes as part of the instructor's reflective design toolkit. The session will conclude with a structured discussion inviting faculty to consider where AI might support their own course and assessment design processes, what boundaries they would establish, and how human-centered learning goals can remain central in AI-augmented teaching. Participants will leave with practical design strategies, reflective prompts, and a clearer framework for engaging AI as an instructional partner rather than a replacement.

Track: Integrity Perspectives (Room 3418)

  • Academic AI Use and Practices: Is It Legitimate? Is It Ethical? Diagnosing Ethical Legitimacy Gaps (Patrick J. Bateman and Christina Saenger) - AI is rapidly diffusing across higher education, but diffusion and normalization are not the same as ethical justification. Building on established conceptual work in legitimacy and business ethics, this session integrates microlegitimacy theory and Integrative Social Contracts Theory (ISCT) into a practical, stepwise tool for evaluating AI uses, applied in the academic setting. We introduce an "ethical legitimacy gap" lens: practices can become accepted through authority cues, perceived consensus, and institutional routines even when ethical authorization, meaningful voice, and decision-relevant recourse remain thin; conversely, ethically careful practices can remain socially contested. Participants will learn and apply a stepwise ethical legitimacy test (standing, norm content, authentic authorization, hypernorm-linked constraints, and responsibility/remedy) to six common contexts: (1) faculty use of AI in course design; (2) AI course assistants embedded in the LMS; (3) student use of AI for writing (disclosed vs. undisclosed); (4) faculty use of AI for feedback and grading; (5) AI detection tools used in academic misconduct processes; and (6) proctoring and surveillance technologies. The goal is not blanket bans or blanket acceptance. Instead, we surface the norms actually governing practice, identify where consent becomes "consent theater," and specify what due care, equal standing, dignity, and meaningful contestability require in settings where student exit is constrained. We will work through several scenarios and discuss how different policy choices shift cases across quadrants. Attendees will leave with a checklist and concrete governance moves to strengthen transparency, protected voice, timely human review, accountability, remedy, and ongoing revisability as tools and norms evolve.

Track: AI and Assignment Design (Room 2235)

  • Teaching at Scale with a Course-Specific GPT (Erica Neuman) - This presentation describes the design of a course-specific GPT to support student learning in a short, intensive, fully online course. Rather than relying on generic AI tools, I will demonstrate how to create a custom GPT grounded in existing course materials, including the syllabus, assignment descriptions, and full-length lecture transcripts from the semester-long version of the class. The goal is to provide online students with round-the-clock, course-aligned support that mirrors the instructor's terminology, sequencing of concepts, and pedagogical priorities—particularly in a format where students lack in-person access to the instructor. The course GPT was explicitly designed with instructional guardrails: it reinforces key concepts, explains ideas using the language of the course, and provides hints and clarification without completing graded work. In this way, the tool functions as a scalable, always-available teaching assistant rather than an answer generator. The presentation will walk through the low-barrier process of creating the GPT, discuss how lecture transcripts were used to align explanations with instructor intent, and reflect on how this approach supports equity, transparency, and academic integrity in online learning environments.

Track: Assignment Design with Co-Pilot (Room 2234)

  • Augmenting, Not Replacing: Scaffolded & Engaging AI Strategies for Student Success (Sarah Gary and Louise Campbell) - "The purpose of AI is to amplify human ingenuity, not replace it." This sentiment, shared by Microsoft CEO and Chairman Satya Nadella, is paramount in BUS 2610. This is a tool course that teaches foundational skills in collaborating, writing, and presenting in the professional business world. Given the rapid adoption of AI tools in the workplace and the integration of Copilot in the Microsoft Office 365 suite, we have added explicit instruction addressing why and how to effectively use AI in a business setting. The writing module has been adapted to incorporate an AI review of an assignment with a corresponding personal reflection considering AI's strengths, weaknesses, and effectiveness in writing. The AI-focused lectures, collaborative activities, and hands-on assignments challenge students to engage with AI at a deeper cognitive level. We facilitate thought-provoking class discussions on the ethical and appropriate use of AI tools in academic and business settings, considering bias and hallucinations, guidelines surrounding sensitive information and workplace data, and the necessity of human editorial oversight of AI output. Throughout the course, we balance AI's learning and career-readiness benefits with an awareness of its limitations. We strive to ensure students use it to augment, not replace, human understanding and critical thinking. Our focus is incorporating AI to help our students excel as confident and effective business professionals, and in this session, we will share our experiences adapting this course to meet our students' AI literacy needs.


 

11:20-12:30 Lunch and Student Panel 

Williamson 3422/3423

 

12:45-1:35 Session 3 

Track: AI Efficiencies (Room 3415)

  • Creating High-Quality Instructional Audio with AI: A Faculty Workflow (Patrick J. Bateman) - As online courses scale and grow in popularity, students increasingly expect frequent, high-quality instructional audio that supports engagement and clarity. Yet producing professional-sounding audio is constrained by time, limited access to quality recording equipment, and the fact that most faculty are not voice actors. This session shares a practical workflow using AI text-to-speech, with a focus on ElevenLabs. Drawing on direct online classroom use, this presentation shows how AI-generated audio can support course introductions, module overviews, discussion summaries, and end-of-module wrap-ups. Faculty draft concise scripts, then generate consistent audio in their voice without the variability of ad hoc recording (e.g., plosives, "ums," mic clipping, or multiple retakes due to an error or verbal flub) and without suboptimal setups (e.g., quick takes on a laptop or re-recordings due to background noise). The session addresses both benefits and limitations. While AI voices can occasionally sound generated, quality has improved substantially and, when configured well, the audio often sounds very good. In practice, this approach can provide tonal consistency, pacing, and clarity and can arguably deliver better overall sonic quality and professionalism than audio content created through typical recording methods, supporting accessibility and reducing cognitive load for students. Importantly, this reframes AI not as a replacement for faculty presence, but as a tool for amplifying it. By reducing technical and time burdens, faculty can focus more on pedagogy, feedback, and course design. Participants will leave with practical scripting tips and a clear process for creating higher-quality audio content. No prior experience with AI tools is required.

Track: Integrity Perspectives (Room 3418)

  • Perspectives on Academic Integrity: A Panel Discussion (Moderated by Mark Vopat)  

Track: AI and Assignment Design (Room 2235)

  • Integrating AI to Enhance Quantitative Research Literacy in Social Work Education (Ron Davis) - As artificial intelligence (AI) tools become increasingly embedded in academic and professional contexts, social work educators face the challenge of integrating these technologies in ways that strengthen—rather than replace—core research competencies. This presentation describes an AI-enhanced assignment implemented in an undergraduate social work research course to support students' quantitative research literacy while emphasizing ethical and critical engagement with AI tools. Students used AI platforms (e.g., Semantic Scholar, Elicit.org, or ChatGPT) to generate a list of peer-reviewed quantitative research articles aligned with a topic of interest. They then employed AI to summarize one selected article's purpose, methodology, and findings, followed by a structured comparison between the AI-generated summary and their own independent analysis. A reflective component required students to evaluate the accuracy, usefulness, and limitations of AI in supporting research tasks. The assignment aligns with CSWE EPAS competencies related to research-informed practice, critical thinking, and ethical decision-making. This presentation will outline the assignment design, assessment rubric, and observed outcomes related to students' critical appraisal skills and ethical awareness. Attendees will gain practical strategies for responsibly incorporating AI into research-focused social work coursework.

Track: Assignment Design with Co-Pilot (Room 2234)

  • CoPilot-Powered Business Analysis in Excel (Courtney Borruso) - Incorporating CoPilot allows our business students to move beyond the basics of data analysis in Excel. The aim is to teach students analysis with CoPilot by providing teachable examples and allowing them to practice their new skills on Midterm and Final Projects. This curriculum extension focuses on generating clear prompts, iterating on prompts to fine-tune output, requiring and analyzing explanations from AI, and leveraging CoPilot to create interactive worksheets and analyses that push past the boundaries of Excel.

Track: Using Co-Pilot (Room 2233)

  • Prompt to Presentation: Using Co-Pilot with PowerPoint (Kalyn Huff) - The session will discuss and demonstrate how to use Copilot to create an accessible PowerPoint presentation from an IT trainer's perspective. I used Copilot to build the foundation of my PowerPoint slides, which saved time and provided a good flow while keeping my content organized and accessible.


 

1:45-3:15 Keynote 

Gathering Room - Williamson 3422/3423

AI in the Classroom: Practical Strategies for Today’s Faculty (Adam Pryor)

About the Keynote Speaker: Dr. Adam Pryor is a consultant and strategist who helps non-profits and higher education institutions implement generative AI. As the founder of Pryor Consulting and Senior Advisor for the CIC’s AI Ready program, he has designed training for over 25,000 professionals across 190 campuses. Adam’s work is informed by his experience as a tenured professor and Provost at Bethany College. He designs custom AI solutions for everything from workflow automation to the pedagogy of using AI. A writer and speaker on the ethical use of technology, he authors the "Purposeful AI" Substack and has published five books on issues in society and technology, addressing topics such as theology, biotechnologies, and climate change.
