Note: The highlighted sessions are 25 minutes long.
| Time | Show All | Exploration and Discovery: Emerging Frontiers in Analytics and AI | Crossroads and Connections: Building Collaborative Data Pathways | Plotting the Course: Strategic Planning Through Insight | Mapping the Landscape: Understanding Our Data Terrain | Panel |
|---|---|---|---|---|---|---|
| 8:30 am - 9:00 am |
Opening / Speaker Introduction
Facilitator/Moderator: Lance Kennedy-Philips
Speaker: Fotis Sotiropoulos |
|||||
| 9:00 am - 9:05 am | Break | |||||
| 9:05 am - 9:55 am |
Identifying AI Related College Activity and Impact: Using Generative AI to Build a Scalable Keyword Framework
As universities rapidly expand AI-related teaching, research, and investments, institutional leaders need clear insight into how artificial intelligence (AI) appears across curriculum and research activities. However, there is no standardized vocabulary for describing AI content. Emerging technologies, evolving terminology, and vendor-branded tools (e.g., ChatGPT, Claude, Gemini) make it difficult to capture a complete and up-to-date picture of AI-related curriculum and research.
This project demonstrates a scalable, replicable method that uses generative AI to create a comprehensive list of AI-related keywords and vendor-specific terms, refine them through iterative cleaning, and map them to class enrollment, student credit hour (SCH), and research (e.g., proposals and awards) data. The resulting approach enables more accurate identification of AI-related SCH and research activities, supporting the university’s goal of transformative integration of AI across research and curricula in all disciplines. This presentation will cover the following content:
Data analysts and administrators might find the content relevant. Participants will learn to:
Facilitator/Moderator: Humaira Rahman
Speaker: Jennifer Wu, Matthew Zerphy, Robert Rabb, Chenying Wang |
How is Artificial Intelligence Utilized in Laboratory Medicine?
Artificial intelligence (AI) is technology that enables computers and machines to simulate human learning, comprehension, problem-solving, decision-making, creativity, and autonomy. Problem-solving and decision-making of AI are particularly useful in laboratory medicine.
AI can be utilized in laboratory medicine to predict laboratory test results, enhance laboratory utilization, automate laboratory testing, improve diagnostic accuracy and turnaround time, facilitate laboratory test interpretation, predict patient outcomes, and enhance laboratory information systems. Data analytics and AI scientists, clinicians, and clinical laboratorians need to learn these applications so that they can collaborate to develop and utilize AI models in medical practice. In this presentation, examples of AI applications in each discipline of laboratory medicine, including clinical chemistry, hematology, microbiology, immunology, molecular diagnostics, and transfusion medicine, will be showcased. In addition, future directions of AI applications in laboratory medicine will be explored. The main objective of this presentation is to foster discussion among data analytics and AI scientists, clinicians, and clinical laboratorians about potential collaborations on the development and utilization of AI models in laboratory medicine. The primary audience is data analytics and AI scientists, clinicians, and clinical laboratorians. After attending the presentation, the audience should be able to summarize applications of artificial intelligence across the disciplines of laboratory medicine and collaborate with each other to explore future applications of artificial intelligence in clinical laboratories. Facilitator/Moderator: Andrew Arvin
Speaker: Yusheng Zhu |
From Data to Action: Abington’s Chaiken Center for Student Success’ Assessment Journey
At Penn State Abington’s Chaiken Center for Student Success, assessment is embraced as a core element of professional practice rather than an external obligation. Data-driven strategies inform strategic planning, foster continuous improvement, and enhance the effectiveness of academic support services.
This presentation is designed for support service professionals, including those working in tutoring, success coaching, advising, counseling, career development, and student engagement. Participants from any service-oriented department will gain practical insights into building a culture of assessment and using data to drive meaningful change. The session will outline a systematic process for developing an assessment program, covering key steps such as defining departmental roles and mission, aligning assessment goals, and implementing data-informed strategies to guide decision-making. The Chaiken Center’s cohesive data collection framework will serve as a case study, featuring assessment tools such as the Academic Support Needs Survey, Academic Support Utilization Survey, DFW Course Analysis, Tutoring Utilization Data, and Client Report Form Data. These measures have informed decisions on staffing, resource allocation, and targeted academic initiatives. For professionals considering their own assessment needs, this session will demonstrate how data can be leveraged for strategic planning and overall service improvement. Objectives for Participants:
To foster engagement and gather insights into institutional practices, the presentation will incorporate an interactive Mentimeter activity. Finally, the presentation will showcase examples of the Chaiken Center's data-driven actions, including hiring a professional math tutor, launching the Math Advancement and Placement Success (MAPS) program, collaboratively training peer mentors/tutors/coaches across multiple departments, and creating a Peer Success Coaching program, illustrating the direct connection between assessment and institutional impact. Facilitator/Moderator: Marc Counterman
Speaker: Dennis Millan |
Panel: Data Pathways to Student Success
Data Pathways to Student Success brings together panelists from the Office of Planning, Assessment, and Institutional Research; Penn State Altoona; Undergraduate Education; the Graduate School; and IT to explore how Penn State can more effectively harness data to advance student success. Designed for academic leaders, cocurricular leaders, and the data analysts who support them, this session will examine the breadth of student success data currently available, identify actionable levers informed by those data, and highlight key gaps where additional information is needed. Attendees will leave with a clearer understanding of the data landscape and a sharper sense of how to leverage it to support student success across the University.
Facilitator/Moderator: Betty Harper
Speaker: Jeff Adams, Bill Clark, Peter Moran, Janet Schulenberg, Hannah Williams |
||
| 9:55 am - 10:00 am | Break | |||||
| 10:00 am - 10:50 am |
AI-Guided Forecasting for Small Business Sales
Anyone interested in forecasting techniques will find this presentation relevant. It discusses the forecasting challenges faced by small businesses and whether LLMs can help with such decision-making. The main objective is to describe the role LLMs can play in robust forecasting under uncertainty for small businesses. The main topics cover how marketing research companies forecast demand in a new market and how LLM-based forecasting can create more value under certain assumptions. The presentation aims to help small, under-resourced businesses learn how applying AI under certain assumptions can, to some extent, overcome forecasting barriers in an uncertain market.
Facilitator/Moderator: Kirsten Hochstedt
Speaker: Subhadra Ganguli |
LightCast
Facilitator/Moderator: Sandy Tak
Speaker: Aaron Tippett, Bryce Young, Morgan Halpert |
Enrollment Management – Tracking Student Success
This presentation will explore several dashboards and methodologies the Enrollment Management office uses to help track student success. We will explore pre-entry attributes and how they correlate with academic performance factors while at Penn State. This presentation would be beneficial for anyone wanting to understand the data available to help paint a picture of the student success landscape at Penn State.
Facilitator/Moderator: Sandy Sollenberger
Speaker: Ryan Brady, Haiou Hu, Brody Albregts |
Panel: Plotting the Course: Strategic Planning Through Insight
Facilitator/Moderator: Michael Rosenberg
Speaker: Steve Borrelli, Jodi Harris, Kristy Hove, Peter Moran, Daniel Newhart |
||
| 10:50 am - 10:55 am | Break | |||||
| 10:55 am - 11:20 am |
Predicting Senior Student Graduation Probability: An Ensemble Machine Learning Model Using Academic and Socioeconomic Indicators
This presentation summarizes main results from the Senior Study conducted by the OPAIR Data Science Team. The objective of the study is to estimate the likelihood that senior students will graduate within six years and to identify groups of students who may benefit from targeted support.
An ensemble machine learning model was developed to predict graduation probability using a combination of academic performance indicators (e.g., DFW ratio, total credits earned) and socioeconomic characteristics (e.g., first-generation status, Pell Grant eligibility). In addition, critical point analysis was applied to highlight key thresholds where graduation risk increases, allowing students to be grouped by level and type of risk. These insights can support data-informed decision-making and the development of targeted intervention strategies to improve student success. Final results will be delivered through an interactive Power BI dashboard to enhance practical use by stakeholders. Facilitator/Moderator: Sandy Tak
Speaker: Xianzeng Niu |
Showcasing Success: Using Post-graduation Data
This presentation will focus on the challenges posed by poor data quality in the post-graduation survey and how the college analysts, in partnership with the Career Enrichment Network, found a way to harness the data to provide meaningful insight to departments. We will focus on data reliability testing, supplemental data, data modeling, and visualizations to aid student recruitment into majors. In addition, we will discuss collaboration among members of the college throughout the development and how each presenter played a vital role from concept through deployment.
Facilitator/Moderator: Aaron Tippett
Speaker: Sandra Sollenberger, Heather Rutten, Katie Wysocki |
Credit Accumulation as a KPI for the World Campus Strategic Plan
Who is the primary audience for this presentation?
The primary audience is World Campus decision-makers and support teams who monitor and improve student progress and retention.
What specific needs or challenges does the audience face that this presentation will address?
Credit accumulation is strongly tied to students’ academic success. Therefore, this presentation provides an in-depth analysis to support its use as a KPI for the World Campus strategic plan: to understand how students at World Campus are progressing toward credit accumulation (i.e., earning credits with a C or better) across time and across key groups (e.g., UG/GR; FT/PT/Hybrid); to interpret differences in credit accumulation between the key groups; and to identify actionable metrics tied to retention by determining the correlational relationship between credit accumulation and retention.
What are the main objectives of the presentation?
To establish first-year accumulated credits as a World Campus KPI. To provide summaries of the average accumulated credits by term and academic year, broken out by students’ academic standing and career. To examine the relationship between credit accumulation and retention, highlighting which relationship is strongest for strategic planning. To provide implementation examples by creating a Power BI dashboard so stakeholders can monitor the KPI metrics routinely.
What are the main points or topics that will be covered?
Trends in average accumulated credits (e.g., by term and by student type), and the correlation results linking credit accumulation and retention.
What are the expected outcomes for the audience after attending the presentation?
Attendees should walk away able to: demonstrate the draft credit accumulation KPI dashboard; describe credit accumulation patterns over time for major WC student groups (e.g., UG/GR; FT/PT/Hybrid); understand how to use credit accumulation as a planning lens, especially recognizing first-year accumulated credits as a strong metric associated with year-1 retention; and make retention and strategic planning discussions more informed by basing decisions on observed relationships.
Facilitator/Moderator: Ashton Webb
Speaker: Siyu Liu |
From Chaos to Coordinates: Turning Operational Workflows into Reliable Data
Institutions generate large amounts of data through everyday operational workflows such as forms, emails, approvals, and ad-hoc tracking systems, yet much of this information never becomes reliable or reusable data. Analysts and administrators are often left reconciling spreadsheets, responding to one-off requests, or building dashboards on top of inconsistent inputs. This session is designed for data analysts, business intelligence professionals, functional administrators, and data-informed leaders who struggle with fragmented data sources and limited visibility into how data is created upstream.
Rather than focusing on dashboards or advanced analytics, the presentation examines how routine operational workflows can be intentionally designed to serve as dependable data sources. Using practical, real-world examples from academic operations, the session highlights common patterns such as intake tracking, recurring project management, and structured data collection. These examples demonstrate how simple design choices in everyday tools can replace email-based processes and spreadsheets with shared, auditable systems that are easier to analyze and maintain. After attending the session, participants will be better equipped to map their own data terrain, recognize where data quality issues originate, and apply repeatable workflow patterns that turn everyday operational activity into reliable data for analysis, collaboration, and strategic planning. The session also highlights how thoughtful process design can improve transparency, consistency, and equity across institutional operations. Facilitator/Moderator: Christopher Barnes
Speaker: Marc Counterman |
Mapping Early-Career Research Trajectories Using Institutional Data
Facilitator/Moderator: Ryan Brady
Speaker: Robert Rabb, Erika Swift, Anne DeChant, Alyson Eggleston |
||
| 11:30 am - 12:00 pm |
Closing Remarks
OPAIR Update 2026
|
Note: The highlighted sessions are 25 minutes long.
| Time | Show All | Exploration and Discovery: Emerging Frontiers in Analytics and AI | Crossroads and Connections: Building Collaborative Data Pathways | Plotting the Course: Strategic Planning Through Insight | Mapping the Landscape: Understanding Our Data Terrain | Panel |
|---|---|---|---|---|---|---|
| 8:00 am - 9:35 am |
Opening / Speaker Introduction
Speaker: Sandra Sollenberger
|
AI Panel
Facilitator/Moderator: Sandra Sollenberger
Speaker: Vasant Honavar, Soundar Kumara, S. Shyam Sundar, Mehrdad Mahdavi, Maggie Slattery |
||||
| 9:35 am - 9:40 am | Break | |||||
| 9:40 am - 10:30 am |
Creation of AIE Degree and Minors from Stakeholder Feedback
Facilitator/Moderator: Christopher Barnes
Speaker: Thomas La Porta, Robert Rabb, Mehrdad Mahdavi |
Mapping the Student Experience: An Introduction to Student Affairs Research & Assessment (SARA)
This presentation will provide an overview of Student Affairs Research & Assessment (SARA). We will discuss our mission and approach, explain how we design and administer surveys that capture multiple aspects of a student’s journey, and highlight how these data connect across units and student populations. Topics will include the key surveys we manage, our processes for gathering and analyzing data, how we collaborate with partners, and examples of how our insights inform programs, policies, and strategic planning. By the end of this presentation, attendees will have a clearer understanding of SARA's general work and major projects, how SARA’s work supports division and institutional goals, and how they can access SARA data insights and potentially collaborate in the future.
Facilitator/Moderator: Ryan Brady
Speaker: Yolanda Murphy, Anna Raup-Kounovsky, Eliza Bradley, Adam Christensen |
Data & Analytics Strategy Development and Approach Best Practices presented by Gartner
Facilitator/Moderator: Amy Elliott
Speaker: Amanda Showers and Jonathan Beris |
|||
| 10:30 am - 10:35 am | Break | |||||
| 10:35 am - 11:25 am |
Supporting Students' Success: Using Machine Learning Models, Explainable AI and Data Analytics to Predict Engineering Students’ Persistence
The primary audience for this presentation is data analysts; however, faculty, students, and administrators are welcome to participate in the discussion. This presentation will also show how faculty can collaborate with data and assessment professionals to integrate human insight with ML and XAI, and ensure models are not only accurate but also meaningful in supporting students’ success.
In addition to using ML and Explainable AI (XAI) for prediction and interpretation, the session will also emphasize the essential role of human knowledge, disciplinary context, and domain expertise when using machine learning (ML) models, particularly in complex educational settings, where their effectiveness and responsible use depend on human-centered design, theory-informed variable selection, and contextual interpretation of results. By the end of this presentation, participants will be able to:
Key points to be covered include ML model comparison, Explainable AI and its role in transparency and trust, and the integration of human-centered knowledge to refine feature engineering, interpret model outputs, and translate findings into policy and practice. Expected outcomes include increased awareness of ML approaches appropriate for educational research, deeper understanding of how XAI helps de-black-box predictive models, and recognition that human expertise is central to ensuring ML models lead to equitable, responsible, and impactful student success interventions. Facilitator/Moderator: Andrew Arvin
Speaker: Ibukun Osunbunmi, Jennifer Wu, Robert Rabb, Dr Stephanie |
University College Portfolio Workstream
Facilitator/Moderator: Heather Rutten
Speaker: Jeff Gable |
Measuring What Matters: Assessing NACE Competencies Through Experiential Learning in Penn State Outreach
As higher education evolves to meet the demands of an increasingly dynamic workforce, Penn State Outreach is advancing a coordinated, data-informed approach to experiential learning that supports students’ career readiness across disciplines, colleges, and campuses. This presentation will introduce Outreach’s innovative model for assessing experiential learning through both qualitative and quantitative methods, grounded in the National Association of Colleges and Employers (NACE) Career Readiness Competencies.
In this session, we will share how Outreach and partnering units are building a comprehensive assessment pathway that connects student experiences—such as community engagement, research, employment, service learning, and creative inquiry—to measurable growth in essential competencies. Participants will learn how Outreach is leveraging a robust Canvas-based reflection course where students complete structured prompts, journal activities, competency self-ratings, and purpose and journey-mapping exercises. These tools strengthen metacognition, help students articulate their skill development, and promote self-efficacy—an important predictor of persistence and workplace success. The presentation will highlight the data collection strategy that integrates reflective narratives, rubric-based evaluations, survey data, and pre/post assessments to form a holistic picture of student transformation and impact. We may also explore early findings from pilot cohorts and discuss how qualitative insights are being combined with quantitative indicators to inform program design, support faculty and community partners, and align with institutional goals related to career readiness, student success, and land-grant impact. 
Session attendees will walk away with:
- Practical examples of integrating NACE competencies and community engagement learning outcomes into experiential programs
- Sample reflection prompts and Canvas course structures
- Models for mapping student experiences to competencies, outcomes, and pathways
- Strategies for using assessment data to refine programs and elevate student success
- Approaches for building cross-unit collaboration around experiential learning
- An invitation to the new Experiential Learning Assessment Working Group, hosted by Penn State Outreach
By connecting assessment, storytelling, and student development, this work contributes to a university-wide vision of preparing Penn State students to thrive in an evolving job market where agility, purpose, and human skills matter more than ever. Facilitator/Moderator: Ashton Webb
Speaker: Michael Zeman and Andrea Allio |
Building a Data-Informed Strategic Plan
Target audience: Institutional researchers, strategic planners, and academic leaders interested in leveraging data for planning and assessment.
Main objectives: Penn State World Campus’s 2025–2030 strategic plan showcases the effective integration of diverse data sources, offering practical insights into how data can be leveraged to set measurable goals and KPIs. The plan also emphasizes equity-focused strategies designed to identify and address under-participation or performance gaps among student populations.
Audience needs: Many institutions struggle to move from data collection to actionable insights in planning. This session equips participants to tackle strategic planning challenges by demonstrating how to integrate environmental scans, benchmarking, and stakeholder input into priorities, set measurable KPI targets, and embed equity considerations in planning and monitoring processes to address gaps in student participation and performance.
Main topics: Penn State World Campus’s strategic planning process integrates a variety of data sources and methods, including environmental scanning, focus groups with students and faculty, Student Success Strategy Groups, KPI benchmarking, and equity tracking for participation gaps. This comprehensive approach informs the development of a robust KPI framework and provides actionable examples.
Expected outcomes: 1) A replicable model for data-informed planning. 2) Strategies for engaging stakeholders through data. 3) Practical tips for setting KPIs and monitoring equity.
Facilitator/Moderator: Sarah Van Oss
Speaker: Jodi Harris, Ashton Webb, Nadia Richardson, Alex Mura |
Data Strategy Update
Facilitator/Moderator: Sandy Tak
Speaker: Amy Elliott |
“What is Reference Data, and Why is it Important?”
In this session, we will discuss why it’s necessary to have a stable, standardized set of codes and values to categorize and give meaning to other data. We will cover what reference data is, how it is used, how it differs from transactional and master data, and where it shows up in everyday systems and reports. We’ll share some examples of common reference data and give participants the opportunity to follow up with their own. We hope that participants will gain an understanding of why well-managed reference data is critical for data quality, regulatory compliance, analytics, and consistent decision-making across Penn State.
Facilitator/Moderator: Betty Harper
Speaker: Michael Cooper and Frank Kachurak |
||
| 11:25 am - 11:30 am | Break | |||||
| 11:30 am - 12:00 pm |
Closing
Speaker: Carly Sunseri / Sandy Tak / Sandra Sollenberger
|