Students, faculty, and staff at 51²è¹ÝSSW are harnessing the power of AI to speed up literature reviews, make research easier to communicate to broader audiences, and strengthen experiential learning opportunities.

The Boston College School of Social Work has begun using artificial intelligence as a practical tool to enhance teaching, research, and clinical training.
Dean Gautam N. Yadama said social workers are uniquely positioned to engage with emerging technologies like AI. “Social workers are trained to be adaptive. As AI touches every industry, our training ensures that our students and faculty are at the forefront in the appropriate and productive use of new tools,†he said. “As a major research University, Boston College has the resources—from working groups to campuswide initiatives—to help us navigate and shape the latest developments in AI.â€

The School’s approach to AI centers on ensuring students, faculty, and staff understand how the technology works so they can shape tools to fit their needs, rather than relying on generic platforms. Faculty emphasize experimentation and transparency, with a particular focus on protecting client privacy.

“I encourage people to dive in, explore, and learn how to direct AI for their own benefit, rather than simply being recipients of it,†said Kirsten Davison, associate dean for research. “Once you really understand how much prompting and programming is involved, you start to see AI not as something that does the thinking for you, but as a sophisticated instrument you can shape and control.â€

Davison, who represents 51²è¹ÝSSW on the Campus AI Steering Committee alongside Professor Cathy Taylor, has designed a chatbot to help faculty translate academic research into plain-language summaries for mainstream audiences. The scholars provide the substance—uploading research papers, highlighting key points, and connecting their findings to timely news events—while the chatbot provides the structure.

Other students, faculty, and staff are testing the capabilities of AI across research, teaching, and administration.

Debbie Hogan, assistant director of the doctoral program, for example, has created a chatbot in 51²è¹Ý’s learning management system to answer PhD students’ recurring questions about the curriculum.

Carolyn Romano, assistant professor of practice, has developed AI-driven role-play simulations so students can practice therapeutic techniques with virtual personas rather than relying solely on peer-to-peer exercises.

Linda DeLauri, director of research and program development, has built a chatbot to help faculty assess whether grant proposals meet funder expectations for community engagement.

Beyond these individual initiatives, 51²è¹ÝSSW is fostering AI literacy through workshops led by the Center for Digital Innovation and Learning at 51²è¹Ý. One such initiative will train a cross-section of students, faculty, and staff to create custom AI assistants to solve particular challenges in research and practice—a first-of-its-kind initiative at 51²è¹Ý.

“We see this kind of discipline-specific AI training as perhaps the most important next step in AI at 51²è¹Ý,†said John FitzGibbon, director of digital learning innovation and AI at CDIL. “51²è¹ÝSSW is at the forefront of this, and we are honored to support such a dynamic group of faculty, staff, and students who are so closely aligned with 51²è¹Ý’s mission as they move forward in this area.â€

To guide 51²è¹ÝSSW’s efforts, Taylor created an AI working group at the School last spring. A resulting document, “Core AI Guidance and Resources for Students,†addresses foundational questions: How can I use AI responsibly across various roles and contexts? What can I do to build my AI literacy? If I do use AI, how and when should I cite it?

“The initial approach focused on ensuring that we have a cohesive understanding of how we want to guide students using AI to enhance their learning, while taking into account that there are ethical considerations,†said Taylor. “It’s acknowledging that AI is with us, and we want to make good use of it without doing harm in relation to what students are learning.â€

Looking ahead, 51²è¹ÝSSW envisions AI as an integral part of its two-pronged mission: to prepare students to tackle the world’s most pressing social problems and to advance scholarship that strengthens the social work profession.

The School is exploring how to integrate AI into the curriculum, with an emphasis on giving students hands-on opportunities to experiment with tools such as NotebookLM, an AI-powered research assistant. Faculty, for their part, hope to use AI to create visual abstracts and infographics that make scholarship more accessible to the public.

Davison envisions 51²è¹ÝSSW as a leader in innovative and socially responsible use of AI.

“I would like the School to be viewed as an innovator in this space,†she said. “We have tremendous assets at 51²è¹Ý, like CDIL, which puts you in the driver’s seat and fosters a community of learning where we share what we’re doing.â€

Stay tuned for an upcoming series highlighting how students, faculty, and staff at 51²è¹ÝSSW are putting AI into practice across teaching, research, and clinical training.