
Transforming Life with AI: ChatGPT Insights & Applications
Discover how ChatGPT, an AI-powered chatbot, is revolutionizing personal productivity, personalized learning, relationship management, health care, wealth management, legal assistance, social interactions, and creativity. Learn about the history of ChatGPT, guest speaker insights, and the impact of effective prompts. Uncover the power of AI to enhance various aspects of your life.
Presentation Transcript
ChatGPT: How AI May Change Your Life
Tsun Chow
Supported by ChatGPT
April 29, 2023
Outline
- A quick introduction
- History of ChatGPT (a recorded guest presentation)
- Personal use cases
- Tips for effective prompts (with demo)
- Impact on jobs
- Key technical concepts behind ChatGPT
- Latest advances
Introduction of ChatGPT
- An AI-powered chatbot that can converse like a human and easily passes the famous Turing test of artificial intelligence.
- Released in late November 2022 by OpenAI, pretrained on knowledge available on the Internet.
- OpenAI is a company funded by a group of investors; the biggest investor is Microsoft (more than 10 billion dollars).
- It has taken the world by storm: the dawn of AI has arrived, with many competitors right at its heels.
- It puts a powerful, general-purpose AI tool in the hands of consumers and will change your life, for both good and bad.
History of ChatGPT
Guest speaker: Professor Steve Feynman from MIT (a recorded presentation)
Personal Use Cases
- Personal productivity: editing, summarization, meeting preparation, trip planning
- Personalized learning: learn to program in Python, teach yourself linear algebra or Spanish
- Relationship management: learn how to talk to your teenager, or how to date
- Health management: preventive tests, or managing common health issues
- Wealth management: investment strategies
- Legal assistance: estate planning, divorce
- Social: toast speeches at weddings and graduations, invitations
- Creativity: write blogs, poems, customized children's stories
What Is a Prompt? Why Should You Care?
- A prompt is a piece of text used to generate a continuation or completion: it is the input to a language model, and the model generates text that follows or continues the given prompt.
- The choice and structure of the prompt determine the quality and relevance of the generated text.
- Effective prompt design is an important aspect of using language models for applications such as text generation, summarization, and question answering.
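To make this concrete, here is a minimal sketch of sending a prompt to ChatGPT from code, assuming the openai Python package (the pre-1.0 interface available in spring 2023) and an API key stored in the OPENAI_API_KEY environment variable; the model name and prompt text are illustrative choices, not part of the slides.

```python
# Minimal sketch: a prompt goes in, a continuation comes back.
# Assumes the pre-1.0 openai package and an OPENAI_API_KEY environment variable.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

prompt = "Summarize the benefits of effective prompt design in two sentences."

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",  # the model behind ChatGPT
    messages=[{"role": "user", "content": prompt}],
)

# The generated continuation of the prompt:
print(response.choices[0].message["content"])
```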
Tips for Effective Prompts
Low-level interactive prompts (suggested by ChatGPT):
Informational prompts:
- "Can you tell me about [topic]?"
- "What is [topic]?"
- "I need more information about [topic]."
Transactional prompts:
- "I want to [task], how can I do that?"
- "How do I [task]?"
- "Can you help me [task]?"
Low-Level Prompts (continued)
Personal prompts:
- "How are you today?"
- "What's your name?"
- "Can you tell me a little bit about yourself?"
Confirmation prompts:
- "Is this information correct?"
- "Did I understand that correctly?"
- "Are you sure you want to [action]?"
Error-handling prompts:
- "Sorry, I didn't understand that. Can you please rephrase?"
- "I'm sorry, I couldn't find any results for that. Can you try a different search term?"
- "I'm sorry, I'm not sure what you're asking. Can you please provide more information?"
Prompt Engineering Principles
- ChatGPT does not yet learn from your past behavior, and it does not know you personally: in every new conversation you need to tell it as precisely as possible what to do.
- It will not ask you for clarifications or further instructions.
- It does not know what it does not know: it will do its best to comply with your request and will make things up in the effort to do so.
- It sits on top of a knowledge base covering the entire Internet as of 2021. It knows a lot about what has been written, from Shakespeare to the operating manual of your lawn mower, but it is ignorant of any events after 2021.
- Be prepared to be amazed and disappointed.
Prompt Types*
Standard prompt: task description + object of the task.
Example: Edit the following: [text]
Prompt with examples: task description + examples + object of the task.
Example:
What is the capital of a country?
Spain: Madrid
USA: Washington, D.C.
What is the capital of Ukraine?
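A prompt with examples can be assembled as a single string; the sketch below reuses the same openai setup as the earlier example (the package, model name, and API-key handling are assumptions, not part of the slides).

```python
# Sketch of a "prompt with examples" (few-shot prompt): a couple of worked
# examples are placed ahead of the actual question.
# Assumes openai.api_key is already set as in the earlier sketch.
import openai

few_shot_prompt = (
    "What is the capital of a country?\n"
    "Spain: Madrid\n"
    "USA: Washington, D.C.\n"
    "What is the capital of Ukraine?"
)

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": few_shot_prompt}],
)
print(response.choices[0].message["content"])  # expected: Kyiv
```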
Prompt Engineering: Role Prompt
Role prompt: "I want you to be [role]" + task description.
Example: I want you to be my wedding planner. Help me to plan my wedding by asking me questions one at a time.
Prompt Engineering: Add Context to the Prompt
Context + task.
Example: I am planning a vacation trip to Las Vegas for 4 days and 3 nights; provide me with an itinerary that includes dancing and shows.
Example: I have experience programming in Python; teach me how to program in R.
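One way to combine a role prompt with context in code is to put the role in the chat API's "system" message and the context plus task in the "user" message; that mapping, like the model name and API setup, is an assumption of this sketch rather than something the slides prescribe.

```python
# Sketch: role prompt in the system message, context + task in the user message.
# Assumes the pre-1.0 openai package with openai.api_key already set.
import openai

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "I want you to be my travel planner."},       # role
        {"role": "user", "content": (
            "I am planning a vacation trip to Las Vegas for 4 days and 3 nights. "  # context
            "Provide me with an itinerary that includes dancing and shows."          # task
        )},
    ],
)
print(response.choices[0].message["content"])
```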
Modify Your Prompt
Add style.
Example: Write in the style and quality of an expert in [field] with 20 years of experience. Prioritize unorthodox, lesser-known advice in your answer. Explain with detailed examples, and minimize tangents and humor.
Add descriptors to change the tone.
Example: Add adjectives such as "funny", "friendly", or "academic tone" at the end of the prompt.
Prompt Engineering: Chain of Reasoning
Chain of reasoning: intermediate steps for solving a task + the task.
Example:
Q: What is 965*950?
ChatGPT will give a wrong answer such as 568350 (the right answer is 916750).
To help it get to the right answer, use this prompt instead:
Intermediate step: 965*950 = 965*(900+50) = 965*900 + 965*50.
Task: What is 965*950?
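The decomposition in the intermediate step can be verified directly; the snippet below is just that arithmetic, not an API call.

```python
# Check the decomposition used in the chain-of-reasoning prompt:
# 965*950 = 965*900 + 965*50
a, b = 965, 950
step = a * 900 + a * 50        # 868500 + 48250
assert step == a * b == 916750
print(a * b)                   # 916750
```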
Modular Descriptors for Complex Prompts
Pattern: role + context + descriptor definitions + prompt in terms of descriptors + task.
Example: I want you to be a novelist writing a scene in a restaurant.
Anything in () indicates the style of a dialogue.
Anything in [] indicates the participants in the dialogue.
Anything in {} indicates the scenario.
(romantic, flirting)[a man and a woman]{dating} Describe what you see and what you hear.
(caring and loving)[a mother and a young child]{a dining-out treat} Describe what you see and what you hear.
(heated, friendly)[a man and his male friend]{arguing about a football game} Describe what you see and what you hear.
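Because the descriptors are modular, they compose mechanically; the helper below is hypothetical, shown only to illustrate how the (style), [participants], and {scenario} slots combine into one prompt string.

```python
# Hypothetical helper that assembles a modular-descriptor prompt.
def build_scene(style: str, participants: str, scenario: str) -> str:
    setup = (
        "I want you to be a novelist writing a scene in a restaurant. "
        "Anything in () indicates the style of a dialogue, anything in [] "
        "indicates the participants, and anything in {} indicates the scenario. "
    )
    # {{ and }} produce literal braces around the scenario.
    return f"{setup}({style})[{participants}]{{{scenario}}} Describe what you see and what you hear."

print(build_scene("romantic, flirting", "a man and a woman", "dating"))
```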
"temperature" a parameter used in generating text output. controls the randomness and creativity of the generated text. Temperature =0.1 conservative or more precision or accuracy Temperature =0.9 creative or adventurous Temperature=0.5 balanced. (this is the default) E.g. temperature = 0.1 explain what is a LLM? E.g. temperature =0.9, describe what you see and hear in a busy restaurant on a weekend night Temperature 16
Jobs Impacted
- Teaching: ChatGPT can assist with tasks such as grading essays and providing feedback on student work, potentially reducing the workload of professors and freeing up more time for other tasks.
- Legal services providers: ChatGPT can assist with tasks such as drafting legal documents and conducting legal research, potentially increasing efficiency and productivity in the legal field.
- Insurance agents: ChatGPT can assist with tasks such as processing claims and providing customer support, potentially reducing the need for human agents and allowing for faster, more efficient service.
Jobs Impacted (continued)
Other occupations impacted include:
- Telemarketers: ChatGPT can potentially replace human telemarketers by using natural language processing to make automated calls and interact with customers in a more personalized way.
- Grant writers, marketers, consultants, human-resources professionals, public relations specialists, court reporters, and programmers.
Key Terms
- LLM: stands for Large Language Model,** a type of deep learning model that is trained on large amounts of text data and can generate coherent and relevant text outputs based on input prompts.
- Generative: refers to the fact that the model is capable of generating new text (the next words), similar to the text it was trained on. This is in contrast to discriminative models, which are trained to distinguish between different classes or categories.
- Pre-trained: pre-training is the process of training a deep learning model on a large dataset to learn general features or representations, which can then be fine-tuned on a smaller, task-specific dataset. Pre-trained models are often used in transfer learning, where knowledge learned from one task is transferred to another related task.
- Transformer (to be expanded later): a deep learning architecture introduced in a 2017 paper by Vaswani et al. It uses multi-head self-attention mechanisms to model the relationships between input tokens, allowing it to effectively capture dependencies between words in a sequence.
Key Terms (continued)
- Neural network: consists of layers of interconnected artificial neurons that receive input, perform a weighted sum of the inputs, and apply an activation function to produce an output. The weights in the network are typically learned through a process called backpropagation, which adjusts the weights to minimize the difference between the predicted output and the true output.
- Deep learning: neural networks with multiple layers (more than 3) that learn from data and make predictions or decisions. They are able to learn complex patterns and relationships in data by progressively extracting higher-level features from the input.
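The "weighted sum plus activation" step can be written in a few lines; the sketch below uses NumPy with made-up inputs and weights, purely to illustrate what a single artificial neuron computes.

```python
# Sketch of one artificial neuron: weighted sum of inputs + activation.
# Inputs, weights, and bias are made-up values for illustration.
import numpy as np

x = np.array([0.5, -1.2, 3.0])      # inputs
w = np.array([0.8, 0.1, -0.4])      # weights (learned via backpropagation in practice)
b = 0.2                             # bias

z = np.dot(w, x) + b                # weighted sum of the inputs
output = 1.0 / (1.0 + np.exp(-z))   # sigmoid activation function
print(output)
```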
Key Terms (continued)
- Few-shot learning: the model is trained with a small amount of data, typically fewer than a few hundred examples, to perform a specific task.
- Zero-shot learning: the model is trained on one task and is able to generalize and perform a related task with no additional training data.
- Hallucination: responses that are grammatically correct but semantically incorrect or irrelevant; the responses may not be related to the input given by the user.
GPT Architecture
The Transformer model***$: a stack of transformer blocks.
- Each block contains multiple self-attention layers and feedforward neural network layers.
- Self-attention layers allow the model to attend to different parts of the input sequence and learn dependencies between them.
- Feedforward layers process and transform the input.
Trained in two stages:
- First stage: unsupervised pre-training on a large corpus of text, using a language-modeling objective that predicts the next word in a sequence of words.
- Second stage: fine-tuning on a specific task, such as text classification or language generation, using a smaller labeled dataset. Fine-tuning involves updating the model's weights with backpropagation and stochastic gradient descent to improve task performance.
Word Embeddings and Self-Attention
- Word embeddings: used in natural language processing (NLP) to represent words as dense vectors in a high-dimensional space. The vectors (arrays of numbers) capture semantic and syntactic relationships between words and are useful for NLP tasks such as language translation, sentiment analysis, and text classification.
- Self-attention: a mechanism used in some neural network architectures to capture dependencies between different parts of a sequence. In language modeling it allows the model to consider all words in a sentence or document when making predictions, not just the words that immediately precede or follow the current word.
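As a rough illustration of the mechanism (not OpenAI's actual implementation), the sketch below computes single-head scaled dot-product self-attention over a toy sequence; the "embeddings" and projection matrices are random stand-ins.

```python
# Toy single-head scaled dot-product self-attention in NumPy.
# Embeddings and projection matrices are random stand-ins for illustration.
import numpy as np

rng = np.random.default_rng(0)
seq_len, d_model = 4, 8                        # 4 "words", 8-dimensional vectors

X = rng.normal(size=(seq_len, d_model))        # toy word embeddings
W_q = rng.normal(size=(d_model, d_model))      # query projection
W_k = rng.normal(size=(d_model, d_model))      # key projection
W_v = rng.normal(size=(d_model, d_model))      # value projection

Q, K, V = X @ W_q, X @ W_k, X @ W_v
scores = Q @ K.T / np.sqrt(d_model)            # similarity between every pair of words
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True) # softmax: attention weights per word
output = weights @ V                           # context-aware representation of each word

print(weights.round(2))                        # each row sums to 1
print(output.shape)                            # (4, 8)
```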
Self-Attention vs. Word Embeddings
Advantages of self-attention over other word-embedding methods:
- Contextualized representations: takes into account the words before and after a given word in a sentence, improving accuracy on tasks such as language translation or sentiment analysis.
- Ability to handle long-range dependencies: effectively models dependencies between words that are far apart in the input sequence, which can be challenging for other methods such as bag-of-words or n-grams.
- Computational efficiency: an efficient mechanism for capturing dependencies compared to methods such as recurrent neural networks, enabling training of larger models on larger datasets to improve accuracy.
Latest Advances
OpenAI:
- GPT-4: better at math, more capable at reasoning, image-to-text; context limit of up to 25K words vs. 2K words.
- Released an API for ChatGPT.
- Announced support for ChatGPT plugins.
Microsoft:
- GitHub Copilot
- Bing, which combines search and ChatGPT
- Office 365 Copilot
Google:
- Released Bard to a limited audience.
Open source:
- Dolly, plus many others.
Examples of other AI tools:
- Midjourney: text to image
- Balabolka: text to speech
- D-ID: text to video
References
* Learn Prompting: https://learnprompting.org/docs/intro
** What Are Generative AI Models: https://youtu.be/hfIUstzHs9A
*** Illustrated Guide to Transformers: https://youtu.be/4Bdc55j80l8
$ The Illustrated Transformer: https://jalammar.github.io/illustrated-transformer/