• Integrate AI tools for teaching specific content or skills; 
  • Guide students in using AI for academic, research, and creative work; 
  • Apply AI-proof / technology-free teaching methods to respond to AI; 
What I did in the course and why

In my courses (MEDIART210, INFOSCI105, INFOSCI201), I typically encourage the use of AI to accelerate learning and boost productivity in complex technical tasks related to coding. This approach re-scales human capabilities (and expectations), so that with the same amount of study, energy, and time, far more complex results can be achieved. I see LLMs as powerful companions that can enhance students’ skills and make a wide breadth of knowledge (tailor-made to very specific needs) directly accessible. In this way, projects that would previously have required many prerequisites and possibly long development time (simply because students would have needed to know much more) can now be realized without prior experience, within a single course with no prerequisites.  

At the same time, students need to learn how to discriminate, understand, edit, improve, re-purpose, connect, and critically enhance AI-generated results, so that they are in control of the technology rather than controlled by it (which is the risk with such powerful tools).  

Overall, I try to leverage the potential of AI to “scale up” challenges: connecting different development frameworks, using different programming languages, adopting advanced technical principles, and directly accessing complex algorithms tailored to a specific need are now goals within reach of beginners, if properly guided.  

On the other hand, it is very important that students develop the critical thinking needed to interface properly with AI. One exercise that is quite helpful for coding is pen-and-paper pseudo-code: it allows students to exercise their understanding of the underlying algorithmic logic without needing to learn language-specific idiosyncrasies and syntax.

Outcomes and observations

I cannot comment on changes, as my approach has been consistent throughout my time at DKU (one year). Given the specific format of teaching, it would also be hard for me to compare DKU to other situations I have experienced in the past; moreover, it was when I joined DKU that I started to organically integrate AI into teaching. However, one clear observation is that students will use AI for basically any activity, whether we allow it or not. Another is that students can be smarter and more up to date than teachers about possible applications of AI. I have to say I have learned a thing or two from some great students about how to use AI. 

Examples

The video is from the project of a (very dedicated and talented) student, a freshman who had not taken any previous course in CS-related subjects. The final project, the result of seven weeks of class but only four weeks of development, merges 3D printing, physical interaction (through an Arduino Uno board, with software developed in the Arduino IDE in C++), and real-time graphics: a simple low-poly avatar’s eyes are controlled from the same Arduino board (this part realized in Unity, using C#). 

The technical pipeline is quite virtuosic and would normally be well beyond a beginner’s reach. 

I only included one project, but in general such complex results are manageable by students if they are left free to use AI, with some specific instruction on the most complex “choke points” of the development. Indeed, I started requiring such elaborate outcomes, and students have all been able to manage. 

I also include a photo of two students of mine (Jingyan Peng and Yijia Sun), who attended MEDIART210 and INFOSCI105 respectively. I have also been mentoring them (alongside Prof. Ziv Cohen) for the 2nd Innovation Competition for Students: Start Your Dreams with Sony in Shanghai, in which they recently won 1st prize. The technical realization of the demo product for the competition was conducted in Unity using AI as a companion, with an approach similar to the one illustrated in both MEDIART210 and INFOSCI105. 

Challenges and Lessons Learned

Challenges are related to several areas:  

  • Mismatch between users’ needs and AI output. For example, in software development, AI might provide results that are over-engineered (accounting for more use cases than the project actually needs), poorly engineered (bad or unusual practices), hallucinated (rare, and typically caused by bad prompting or too little information provided to the model), or tied to a different version of the development framework (an old training set, or assumptions made by the model). 
  • Lack of critical analysis of AI output. The AI-generated result is taken and used as-is, without comprehension of its role in the bigger picture or of its quality, blindly trusting the model.  
  • Prompting maelstrom. Although this is becoming increasingly rare, there are still situations in which, if the first AI-generated result is not optimal, trying to fine-tune it through prompting leads to increasingly deteriorated results. That can leave students (or any user) stuck indefinitely. 
  • Lack of learning in some areas. AI can take up some roles so effectively that students feel released from the need to actually learn certain skills themselves…until they bump into a wall. In the long term, this reduces their ability to work independently. But will they need to work without AI in their future careers? Shouldn’t we accept that some skills and teaching principles might simply be obsolete, or at least less important than they were even five years ago?  
  • Impossibility of containing the phenomenon. One thing that certainly does not work (or would only work with some students and not others) is telling them: don’t use AI. If we don’t want them to use AI, we need to design activities that do not require a computer. But then, for some disciplines, are students gaining valuable experience, or will they simply discard what they learned “analogically” as soon as they step out of academia?