ChatGPT, Copilot, and other AI tools

Learn about using and researching AI tools like ChatGPT.

Microsoft Copilot (UMN)

Tutorial: AI tools and research

AI tools and research tutorial (3 minutes).
After completing this tutorial, you will be able to:
  • Describe how AI works
  • Understand how (and if) you can use AI for your research
  • List some major concerns experts have about the current state of AI

What is ChatGPT?

Microsoft Copilot and ChatGPT are examples of artificial intelligence "large language models" (LLMs), similar to a chatbot but more robust. These tools are not search engines that return results for a specific search; instead, they create "new" content by predicting the word most likely to come next, based on publicly available Internet sources such as Wikipedia and YouTube transcripts.

Large Language Models (LLMs), like ChatGPT and Copilot, are designed to model human language. They use mathematical models to predict the word most likely to come next based on what you ask for, and because they respond in natural language, it can feel like they are thinking. But keep in mind: they don't think. They don't understand, read, choose, or give you the "best" information. It may sometimes feel that way, but that isn't how the technology works. These tools also usually won't tell you where the information came from or who is doing the work behind the scenes, and many, if not most, are unregulated and shaped by how we all interact with them.

The UMN Office of Information Technology (OIT) has developed guidelines on appropriate use of ChatGPT and other generative AI tools, especially concerning UMN data. 

How does ChatGPT or Copilot work?

Copilot, ChatGPT, and other LLMs continue to evolve.

One main way users interact with ChatGPT is to ask it a question or give it a prompt and receive a quick answer.

How do they work? 
Unlike a search engine, which searches existing information and returns results, Large Language Models (LLMs) create "new" content by predicting the word most likely to come next, based on huge datasets of publicly available Internet sources such as Wikipedia and YouTube transcripts (which include racist sites, conspiracy sites, and other unreliable content). They are designed to model human language and use mathematical models to predict the most likely next word based on what you are asking for. Keep in mind: they don't think. They do NOT understand, read, choose, or give you the "best" information. It may sometimes feel that way, but that isn't how the technology works.
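To make the "next word prediction" idea concrete, here is a minimal sketch in Python of a toy word predictor. The tiny table of word counts (and every word in it) is invented purely for illustration; a real LLM learns billions of statistical patterns from its training data rather than using a hand-written table.

```python
import random

# Toy "language model": counts of which word tends to follow which,
# as if tallied from a tiny training text. Real LLMs learn patterns
# like these (at enormous scale) from huge web datasets.
next_word_counts = {
    "the": {"cat": 3, "dog": 2, "library": 5},
    "cat": {"sat": 4, "ran": 1},
    "library": {"is": 6, "offers": 2},
}

def predict_next(word):
    """Pick a next word in proportion to how often it followed `word`."""
    candidates = next_word_counts.get(word)
    if not candidates:
        return None  # the toy model has never seen this word
    words = list(candidates)
    weights = list(candidates.values())
    return random.choices(words, weights=weights, k=1)[0]

# Generate a short continuation starting from "the".
word, text = "the", ["the"]
while word:
    word = predict_next(word)
    if word:
        text.append(word)
print(" ".join(text))  # e.g. "the library is" (plausible-sounding, but nothing was "understood")
```

The output can look fluent, but the program never checks whether what it produces is true; it only follows the statistics, which is the same basic limitation described above.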

Challenges and possibilities of generative AI tools

Challenges of using ChatGPT or other LLMs 

  • Responses mix correct and incorrect information
  • Limited knowledge of very recent events (e.g., those from the last week or month)
  • High likelihood of biased content, especially on controversial topics
  • Privacy concerns: what is the company doing with the data it collects from users?
  • Beware of asking for any information that would have serious consequences if it were incorrect (such as health, financial, or legal advice). These tools tend to make up answers, or give a mix of correct and incorrect information, while still sounding very confident.

Benefits of using ChatGPT or other LLMs 

  • Can provide simple explanations of well-known, non-controversial topics
  • Can provide sample text
  • Can create a list of keywords or search terms
  • Can explain information in ways that are easy to understand
  • Can help write or debug computer code
  • Can summarize and outline texts

What are prompts? What is prompt engineering?

Prompts are what you type into the tool to get it to do what you want. Better prompts can help you get better outputs. These tools need very specific instructions, and they need you to verify and critically evaluate the information or output they give you. Learn more about prompts with a course like Prompt Engineering for ChatGPT.
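As a concrete illustration of how wording changes what you get back, here is a minimal sketch using the OpenAI Python library to send a vague prompt and a more specific one. The model name and the example prompts are assumptions chosen for illustration, not a recommendation; the same vague-versus-specific idea applies to Copilot's chat box or any other tool.

```python
# Minimal sketch: the same request phrased vaguely vs. specifically.
# Assumes the `openai` Python package is installed and an API key is
# set in the OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()

vague_prompt = "Tell me about climate change."
specific_prompt = (
    "List 8 keywords or search phrases an undergraduate could use in a "
    "library database to find peer-reviewed articles on the effects of "
    "climate change on Minnesota agriculture. Format them as a bulleted list."
)

for prompt in (vague_prompt, specific_prompt):
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption: any chat-capable model would do
        messages=[{"role": "user", "content": prompt}],
    )
    print(response.choices[0].message.content)
    print("---")
    # Whichever answer comes back, you still need to verify it yourself.
```

The specific prompt states the task, the audience, the scope, and the desired format, which usually produces output that needs less cleanup, though it still needs to be checked.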

Last Updated: Sep 17, 2024 9:10 PM