AWS GenAI Engineer - Basics Quiz

This quiz covers the fundamentals of Amazon Bedrock, Foundation Models (FMs), and basic inference parameters.


# What is Amazon Bedrock?

# Which Amazon Titan model is best suited for search and semantic similarity tasks?
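Embeddings for semantic search are generated by calling InvokeModel with an embeddings model. A minimal sketch with boto3, assuming the Titan Text Embeddings V2 model ID and request format shown below (check the current model catalog for your region):

```python
import json
import boto3

# Assumes credentials and region are already configured.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.invoke_model(
    modelId="amazon.titan-embed-text-v2:0",          # assumed model ID
    body=json.dumps({"inputText": "What is Amazon Bedrock?"}),
)

payload = json.loads(response["body"].read())
embedding = payload["embedding"]   # dense vector used for similarity search
print(len(embedding))              # vector dimension
```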

# What does the "Temperature" inference parameter control?

# What is "RAG" (Retrieval-Augmented Generation)?

# Which pricing model for Amazon Bedrock guarantees a specific level of throughput for steady-state workloads?

# What is a "Foundation Model" (FM)?

# Which Bedrock feature allows you to block PII (Personally Identifiable Information) from reaching the model?
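A guardrail with a PII filter (created beforehand in the console or via the CreateGuardrail API) can be attached to individual requests. A minimal sketch with the Converse API, where the guardrail and model identifiers are placeholders:

```python
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

# Attach an existing guardrail (assumed to have a PII filter configured)
# to a single Converse request. Identifiers below are placeholders.
response = client.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",
    messages=[{"role": "user", "content": [{"text": "My email is jane@example.com, please summarize my request."}]}],
    guardrailConfig={
        "guardrailIdentifier": "gr-placeholder-id",
        "guardrailVersion": "1",
    },
)
print(response["stopReason"])   # e.g. reports when the guardrail intervened
```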

# What is the primary difference between Anthropic's Claude 3 and Amazon Titan?

# What does "Top-P" (Nucleus Sampling) do?

# What is "Prompt Engineering"?

# Which fully managed, serverless vector database is recommended for use with Bedrock Knowledge Bases?

# What is the "Context Window" of an LLM?

# Which specialized AWS chip is designed to accelerate deep learning inference?

# How can you invoke a Bedrock model privately from within your VPC?
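One common pattern is an interface VPC endpoint (AWS PrivateLink) for the Bedrock runtime. A sketch with boto3, where all resource IDs are placeholders and the service name follows the com.amazonaws.&lt;region&gt;.bedrock-runtime convention:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Interface endpoint so InvokeModel/Converse traffic stays on the AWS network
# instead of traversing the public internet. All IDs below are placeholders.
response = ec2.create_vpc_endpoint(
    VpcId="vpc-0123456789abcdef0",
    VpcEndpointType="Interface",
    ServiceName="com.amazonaws.us-east-1.bedrock-runtime",
    SubnetIds=["subnet-0123456789abcdef0"],
    SecurityGroupIds=["sg-0123456789abcdef0"],
    PrivateDnsEnabled=True,
)
print(response["VpcEndpoint"]["VpcEndpointId"])
```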

# What is "Fine-Tuning"?

# What is a "Token"?

# Which model provider on Bedrock offers "Jurassic-2" models?

# What is "Zero-Shot" prompting?

# Which Amazon Bedrock feature allows you to evaluate model performance?

# What is the "System Prompt"?
