Using Microsoft Copilot

Take your learning to the next level with Copilot and safeguard your data
Microsoft Copilot is one of many generative Artificial Intelligence (AI) tools available in the market, and each AI tool is slightly different. 黑料网大事记 supports the ethical and responsible use of AI in research and learning. For this reason, we provide current students with free access to Copilot with Commercial Data Protection, which protects your data and makes it safer to use than free versions.
Check your assessment requirements before using AI
The use of Artificial Intelligence (AI) in your assessments may or may not be allowed. Speak to your lecturer or course coordinator to determine whether AI tools are permitted, and to what degree they may be used.
Discover the different modes of Copilot and tailor its responses based on your needs
Balanced mode
This is the default setting, providing a mix of creativity and accuracy. Ideal for most academic tasks. Use this mode first if you're unsure.
Creative mode
Best for brainstorming and generating innovative ideas. Use this when you're looking for inspiration or an unconventional approach.
Precise mode
Use this when you need detailed, specific information. It focuses on accuracy and detail, perfect for technical explanations and data analysis.
What you need to know about Copilot
Each time you plan to use AI tools in your studies, remember to use them ethically and responsibly. Unethical use of Copilot could lead to academic misconduct and penalties. Do not plagiarise, and always verify AI outputs before incorporating them into your work. Learn more about the accepted uses of AI.
Copilot, like all AI language models, has its limitations. Be aware of the following limitations when you're using the tool:
- Context understanding: while Copilot is good at following instructions, it may not fully grasp the deeper context or nuances of complex academic subjects.
- Up-to-date information: Copilot's training data has a cutoff date, meaning it might not reflect the latest research or current events.
- Bias and errors: Copilot, as with all AI, can inherit biases and produce incorrect information (also known as hallucinations). This means it might generate incorrect, biased or misleading content. Always cross-verify information with trusted sources, and check every reference cited by AI to ensure it exists and is accurate.
Try prompts like these to get started:
- Summarise information: "Summarise the key points of this article in bullet points."
- Generate questions: "What are some critical questions one should ask after reading this chapter?"
- Concept explanation: "Explain the concept of [complex topic] in simple terms for a beginner."
- Comparative analysis: "Compare and contrast [Topic A] and [Topic B] in terms of [specific criteria]."
To understand the depth and utility of Copilot, and to see its limitations, consider these tests:
- Create an image: try asking it the following prompt: "create a diagram showcasing the neural networks of the human brain".
- Familiar paper test: find a paper you know well, copy and paste the text into Copilot, and use the following prompt: "Explain how this paper might be useful to, or important to understand for, the average University student." This test demonstrates Copilot's ability to interpret and explain academic papers. You can then ask yourself:
- "Has any critical or important information been left out, made up, or misrepresented?"
- "Has the output resulted in a response that is in line with what I might have crafted myself?"
Have a question?
If you have any questions about the 黑料网大事记 AI policy or referencing AI in your university work, we're here to help.