Wow, the readings for Lecture 1 alone are quite a pile!
All Readings: Introduction to Generative AI (G-GENAI-I)
Here are the assembled readings on generative AI:
● Ask a Techspert: What is generative AI?
● Build new generative AI powered search & conversational experiences with Gen App Builder
● What is generative AI?
● Google Research, 2022 & beyond: Generative models
● Building the most open and innovative AI ecosystem
● Generative AI is here. Who Should Control It?
● Stanford U & Google’s Generative Agents Produce Believable Proxies of Human Behavior
● Generative AI: Perspectives from Stanford HAI
● Generative AI at Work
● The future of generative AI is niche, not generalized
Here are the assembled readings on large language models:
● NLP's ImageNet moment has arrived
● Google Cloud supercharges NLP with large language models
● LaMDA: our breakthrough conversation technology
● Language Models are Few-Shot Learners
● PaLM-E: An embodied multimodal language model
● Pathways Language Model (PaLM): Scaling to 540 Billion Parameters for Breakthrough Performance
● PaLM API & MakerSuite: an approachable way to start prototyping and building generative AI applications
● The Power of Scale for Parameter-Efficient Prompt Tuning
● Google Research, 2022 & beyond: Language models
● Accelerating text generation with Confident Adaptive Language Modeling (CALM)
● Solving a machine-learning mystery
Additional Resources:
● Attention is All You Need
● Transformer: A Novel Neural Network Architecture for Language Understanding
● Transformer (machine learning model) on Wikipedia
● What is Temperature in NLP?
● Bard now helps you code
● Model Garden
● Auto-generated Summaries in Google Docs
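Since the list above includes "What is Temperature in NLP?", here is a minimal, illustrative sketch (plain Python, not any model's actual API) of how the temperature parameter reshapes a language model's next-token probability distribution. The logit values are made up for the example:

```python
import math

def softmax_with_temperature(logits, temperature=1.0):
    """Divide logits by the temperature, then apply softmax.

    Low temperature sharpens the distribution (more deterministic output);
    high temperature flattens it (more diverse, random output).
    """
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for three candidate tokens.
logits = [2.0, 1.0, 0.1]
print(softmax_with_temperature(logits, 1.0))   # moderate spread
print(softmax_with_temperature(logits, 0.1))   # nearly all mass on the top token
print(softmax_with_temperature(logits, 10.0))  # close to uniform
```

The same idea carries over to real sampling settings: dialing temperature toward 0 makes the model pick its single most likely token almost every time, while larger values let lower-probability tokens through.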