Developer Productivity with Large Language Models

Track: Artificial Intelligence
Abstract
The emergence of large language models (LLMs) has transformed how developers approach coding, creating new opportunities to improve productivity and collaboration. Combining LLMs with Java offers a chance to streamline workflows, enhance code quality, and shorten development cycles.

This session explores how LLMs such as OpenAI’s GPT models, and LLM-powered tools such as GitHub Copilot, can empower developers by automating repetitive coding tasks, improving debugging efficiency, and generating optimized, boilerplate-free code. We discuss practical applications of LLMs in Java development, including intelligent code generation, debugging assistance, and documentation creation. We also explore how LLMs can ease the learning curve for new Java developers by providing real-time guidance and best practices.
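To give a flavour of the boilerplate reduction the session refers to, the sketch below contrasts a hand-written Java data class with the compact record an LLM-powered assistant such as Copilot might suggest in its place. The class name and fields are illustrative assumptions, not examples taken from the session materials.

// Illustrative sketch: a hand-written data class with the usual boilerplate...
import java.util.Objects;

final class Customer {
    private final String id;
    private final String name;

    Customer(String id, String name) {
        this.id = id;
        this.name = name;
    }

    String id() { return id; }
    String name() { return name; }

    @Override
    public boolean equals(Object o) {
        return o instanceof Customer c && id.equals(c.id) && name.equals(c.name);
    }

    @Override
    public int hashCode() { return Objects.hash(id, name); }

    @Override
    public String toString() { return "Customer[id=" + id + ", name=" + name + "]"; }
}

// ...and the equivalent Java 16+ record an assistant might propose instead:
// the compiler generates the constructor, accessors, equals, hashCode and toString.
record CustomerRecord(String id, String name) { }

In this hypothetical flow, the assistant’s value lies in recognising the equals/hashCode/toString pattern and proposing the record form, leaving the developer to review and accept the change.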
Mo Haghighi
Dr Mo Haghighi is a distinguished engineer/director for Cloud Platform and Infrastructure at Discover Financial Services. His current focus is hybrid and multi-cloud strategy, application modernisation, and automating application and workload migration across public and private clouds. Previously, he held various leadership positions as a program director at IBM, where he led Developer Ecosystem and Cloud Engineering teams in 27 countries across Europe, the Middle East and Africa. Prior to IBM, he was a research scientist at Intel and a Java developer at Sun Microsystems/Oracle. Mo holds a PhD in computer science, and his primary areas of expertise are distributed and edge computing, cloud native, IoT and AI, with several publications and patents in those areas. Mo is a regular keynote speaker at major developer conferences including DevOpsCon, Java/Code One, Codemotion, DevRelCon, O’Reilly, The Next Web, DevNexus, IEEE/ACM, ODSC, AiWorld, CloudConf and PyCon.