Think & Build Bigger Faster Better

Ashley Lam Cheuk-yiu, a second-year marketing student at the Hong Kong University of Science and Technology (HKUST), was tasked with creating an advertising campaign for a company using the theories she had learned in class.

The 20-year-old produced a 600-word essay suggesting that tech giant Apple use its slogan “Think Different” to promote gender equality and human rights alongside its iPhone products.

She then asked ChatGPT, the generative artificial intelligence (AI) program launched last year, to complete the same task. Almost instantly, it produced a cohesive campaign plan titled “Efficiency Unleashed: Experience the Future with iPhone”.

“I think AI did a better job than me at naming the campaign,” she said. But some of its paragraphs were repetitive and lacked originality. Spurred by the rapid advancement of AI, most of Hong Kong’s universities have adopted generative tools for the upcoming academic year.

OpenAI’s ChatGPT can converse in a human-like manner and generate essays from web content. Academics navigating the uncharted waters of AI said that, besides having to change their teaching methods, they struggled to ensure students acknowledged their use of the technology.

It took universities just six months to go from outright banning AI to allowing some wiggle room and devising ways to incorporate the technology. The University of Hong Kong (HKU) has overturned a February ban and will, starting this month, permit students to use AI platforms for classwork and assignments.

Microsoft, which backs OpenAI, earlier announced it would collaborate with eight universities in Hong Kong to create educational solutions and run a series of workshops promoting the use of AI. Students at HKUST have access to GPT-4, OpenAI’s most recent model for generating written material, and DALL-E 2, which can turn concepts into AI-generated visuals.

Academics noted that a significant challenge was identifying plagiarism and judging originality, since they had to assume most students would turn to AI-driven chatbots for help. Chung Shan-shan, a senior lecturer of environmental and public health management at Baptist University, said: “I tried uploading two essays to five different (pieces of plagiarism) detection software, and all the ‘similarity percentages’ were different. Determining whether a student actually used generative AI is challenging.”

Turnitin, a web-based similarity detection service widely used in academia to evaluate assignments and articles, is one of the software tools used to identify plagiarism; it has been updated to recognize text written by AI. Chinese University has developed its own plagiarism detection tool, VeriGuide.

Baptist University students are currently permitted to use the GPT-3.5 model through a token system created by the university. However, they must disclose when they have used generative AI for a task, name the tool employed, and indicate which parts of an essay were AI-generated. Chung highlighted ChatGPT’s ability to simulate human conversation, saying it could help students understand concepts more fully and write essays more fluently.

She said she had also changed her teaching strategies, introducing informal quizzes on current events. When students tried to answer questions on more recent subjects, one drawback of AI tools became clear: they relied on readily available but out-of-date information online, she added.

GPT-4 is trained on data available online only up to September 2021. Universities have also had to consider how to discipline students who break the rules, whether by copying generated content or failing to acknowledge non-original contributions.

Penalties may include a reduced assignment grade. Bruce Li Kar-lok, a senior teaching fellow in accounting at Polytechnic University (PolyU), was less concerned, saying that in his field student aptitude still prevails over chatbots.

He said that in an assignment he had created on calculating shares and changes tied to dates, prices, interest, tax and dividends, ChatGPT answered only two of six questions correctly. Li explained that while the AI program could handle simple, straightforward formulas, it struggled to follow the reasoning behind complex problems or perform further calculations.

“AI chatbots are similar to first-grade students. They must learn logic from us,” he continued. Li said he would advise students to compare chatbot-generated responses and look for inconsistencies. Still, he said that over time AI tools might help students become more logical thinkers and problem solvers.

Arkers Wong Kwan-ching, an assistant professor of nursing at PolyU, said he had already seen potential advantages of using AI in education. After using AI tools that offered “conversational feedback” to prepare for lessons, he noticed that students were more eager to communicate and ask questions, he added.

“Students can use chatbots to swiftly explain difficult subjects before the class even starts. This enables deeper dialogues, as we may spend less time covering the basics,” he said. To prevent plagiarism, Wong said he planned to add more “contextual” questions that the chatbots’ databases would not be able to answer.

The institutions’ decision to allow AI tools was less about technological advancement than about preparing students for a rapidly changing world, according to Jean Lai Hok-yin, a senior lecturer in Baptist University’s department of computer science.