AI BASICS
What You Need to Know About GenAI
By the end of this unit, you will understand what generative AI is, recognize the different types of tools available, and know what to consider—including privacy and data security—when choosing and using AI tools in academic settings.
Key Takeaways
- GenAI tools generate original content in text, image, and other formats based on your prompts.
- Many tools are multimodal and designed for specific tasks—choose based on your goals.
- Always review a tool’s privacy policy before using it, especially with sensitive or academic content.
- Licensed tools like CUNY Copilot are safer for educational use.
- Faculty should set clear AI use policies in class; students should follow and clarify when needed.
- Use GenAI to support your work, not replace your judgment.
What Is Generative AI?
Generative AI, often called GenAI, refers to systems that create original content in response to user input. This content can take many forms: written text, spoken responses, code, images, charts, or even music. Unlike search engines, which retrieve existing information, GenAI produces new outputs based on patterns it has learned from massive datasets. These outputs are generated on the spot, tailored to the prompt you give it.
AI tools have become increasingly multimodal, meaning they can understand and generate not just text, but also images, audio, and more. Some tools let you upload files, interpret charts, analyze images, or even answer questions about videos. This makes GenAI an extremely flexible assistant for a wide range of academic and professional tasks.
While GenAI tools are increasingly common, using them well takes more than curiosity. It takes digital judgment. Whether you are writing a paper, building a presentation, planning a lesson, or analyzing a dataset, GenAI can help—but only if you choose the right tool, understand what it is doing, and use it responsibly.
Data Privacy & AI Training: What You Enter Matters
When using AI tools, many people focus on keeping their login secure. But the greater privacy risk lies in what you input. Any content you enter—student work, institutional documents, drafts, emails, or even your own notes—can be stored, analyzed, or used to train future models, especially if you are using a public or free tool.
AI training is the process by which these tools learn to generate content. Models are fed vast amounts of data (books, articles, websites, code) and then refined with example prompts and human feedback. This overall process is a form of machine learning.
However, it is important to understand that AI does not “think.” These models are statistical systems that predict what content is likely to come next, based on what they have seen before. They sound intelligent, but they do not understand meaning or truth. That is why GenAI tools can still generate false, biased, or misleading information. Your role is to guide the AI and critically assess what it gives you.
Questions to Ask Before You Use GenAI
First, identify what you need help with: generating ideas, writing content, analyzing data, or visualizing something. GenAI tools are often optimized for different purposes, and not all tools handle every format equally well.
Next, consider what kind of information you will be entering. If it includes student work, personally identifiable information (PII), research notes, or institutional content, prioritize tools that align with FERPA (the Family Educational Rights and Privacy Act) and follow clear data privacy practices. For example, CUNY Copilot is integrated into Microsoft 365 and adheres to institutional policies, making it safer for academic use than many public-facing tools.
Also, ask yourself: Do I understand what this tool is doing? Can I take ownership of the final product? Responsible AI use means you are not outsourcing your judgment or voice—you are enhancing it.
Types of GenAI Tools You May Encounter
Some tools are conversational, like chatbots. Others are visual, like image generators or design assistants. Some are embedded into everyday tools you already use, like writing aids in Microsoft Word or Google Docs. Many now combine features, offering chat, image interpretation, file analysis, and audio transcription in a single platform.
Here are some of the most widely used GenAI tools available today:
CUNY-Licensed
- Microsoft Copilot – Embedded in tools like Word, Excel, and Outlook; available to CUNY users with a Microsoft 365 license.
Public
- ChatGPT – A conversational AI from OpenAI that can generate text, code, summaries, and explanations; multimodal features vary by plan.
- Claude – Developed by Anthropic; known for its large context window and strong handling of long documents.
- Gemini – Google’s GenAI platform, integrated with Google Workspace; strong at combined image and text prompts.
- Llama – A family of open-source language models developed by Meta.
- Perplexity – A conversational search tool that combines AI generation with real-time citation of sources; useful for research.
- DeepSeek – A suite of open-source models from China that support text generation, code generation, and other GenAI tasks. Competitive with mainstream tools and often used for testing alternatives to proprietary models.
- Adobe Firefly – A visual GenAI tool for generating and editing graphics and images.
- Sora – A video-generation tool from OpenAI that creates short videos from text prompts; access is included with paid ChatGPT plans.
- Gamma – Generates clean, modern presentations or slide decks from brief text input. Supports visual and written content layout.
- Khanmigo – A teaching assistant powered by GPT and used within Khan Academy.
Each tool comes with strengths, limitations, and its own privacy terms. Be especially cautious with free or experimental tools that request logins or offer few details about how your data is used.
Making Informed Choices with GenAI
You do not need to become an AI expert to use these tools well, but you do need to know what they do, how they handle your data, and when they are appropriate. Always read a tool’s privacy policy before entering sensitive information. When working with academic materials, choose tools that follow institutional privacy standards, like CUNY-licensed Copilot.
Just as important, expectations around GenAI use should be clearly defined. Faculty should communicate how GenAI tools may or may not be used in their courses. Students should follow those expectations and ask for clarification when needed. Shared understanding supports responsible use, and responsible use builds trust.
Quiz – What You Need to Know About GenAI