A new study by researchers from Microsoft and Carnegie Mellon University has found that workers are relying increasingly on generative AI for their work tasks. This reliance, in turn, can harm their critical thinking skills, causing the very skills that need to be developed to degenerate.
The researchers noted an ironic twist to automation: by mechanizing routine tasks and leaving only exception handling to the human user, it deprives that user of the routine opportunities to exercise judgment and build cognitive musculature, leaving them atrophied and ill-prepared when the exceptions do arise.
For the study, the researchers engaged 319 knowledge workers, who shared 936 first-hand examples of using generative AI tools in their work. Participants completed a survey detailing how they used the tools, their confidence in the AI's capabilities, their ability to evaluate the AI's output, and their confidence in completing the same task without AI assistance.
Examples cited included a teacher using an AI image generator to create images for a handwashing presentation, a commodities trader using ChatGPT for resource and strategy recommendations, and a nurse consulting a pamphlet designed by ChatGPT for newly diagnosed diabetic patients.
The results showed that workers who were more confident in the AI's capabilities reported engaging in less critical thinking. In contrast, skepticism toward the AI's output prompted users to think more critically.
As the researchers noted, “The data shows a shift in cognitive effort as knowledge workers increasingly move from task execution to oversight when using GenAI.”