ChatGPT is the best-known of a new set of text-generating tools popping up all over the Internet. It is a chatbot that gives often human-like responses to almost any kind of question. Here is a simple example.

The answers given by ChatGPT are not ‘correct’ in the sense of being validated by any human or artificial system; they are generated by the algorithm on the spot.
Hundreds of articles have already been written about the application of this new technology to the university classroom and to professional work everywhere. On the one hand, people are concerned that students will use these systems to answer parts, or even all, of their assignments. On the other hand, people are excited by the possibility that these new tools could take on the rote work that is often part of people’s learning and professional lives. At the University of Windsor, we’ve had discussions at Senate and at APC to start the process of addressing this issue on our campus.
We have two sessions directly addressing this issue this month. One, ‘ChatGPT – why won’t some students do their own work,’ directly addresses the issue of students using systems, whether ChatGPT, Chegg, or otherwise, to help them complete their assigned classwork. The other, ‘Inclusive assessment practice in a digital age,’ addresses the broader issue of assessment, focusing on how adapting our assessment practices can alleviate many of the concerns these new systems create.