# QuizGenius: Full-Stack AI-Powered Quiz Platform with Cloud & Infrastructure as Code (IaC)
QuizGenius is a modern web application that leverages Generative AI to automatically transform uploaded lecture notes (PDF/TXT) into rigorous, Master's-level academic quizzes. Built with React and FastAPI, the entire application is containerized with Docker and deployed to AWS using Terraform for a fully automated, scalable infrastructure.
This project maps directly to modern cloud concepts, fulfilling the following AWS requirements:
- Compute (Amazon EC2): Hosts the Dockerized frontend (Nginx) and backend (FastAPI).
- Storage (Amazon S3): Securely stores uploaded user documents (PDF/TXT).
- Networking (Amazon VPC): Custom Virtual Private Cloud, Subnets, Internet Gateway, and Security Groups isolating traffic.
- Database (Amazon DynamoDB): NoSQL persistence for user authentication, API responses, and historical quiz scores.
- Infrastructure as Code (Terraform): Fully automated deployment of all AWS resources.
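The resources above might be declared in `terraform/main.tf` along these lines. This is a minimal sketch, not the project's actual configuration: the CIDR ranges, bucket name, and table schema are illustrative assumptions.

```hcl
# Sketch only: names, CIDR ranges, and attributes are illustrative assumptions.
provider "aws" {
  region = "us-east-1"
}

resource "aws_vpc" "main" {
  cidr_block = "10.0.0.0/16"
}

resource "aws_subnet" "public" {
  vpc_id                  = aws_vpc.main.id
  cidr_block              = "10.0.1.0/24"
  map_public_ip_on_launch = true
}

resource "aws_internet_gateway" "gw" {
  vpc_id = aws_vpc.main.id
}

resource "aws_s3_bucket" "uploads" {
  bucket = "quizgenius-uploads-example" # assumed name
}

resource "aws_dynamodb_table" "quizzes" {
  name         = "quizgenius-quizzes" # assumed name
  billing_mode = "PAY_PER_REQUEST"
  hash_key     = "user_id"
  range_key    = "quiz_id"

  attribute {
    name = "user_id"
    type = "S"
  }
  attribute {
    name = "quiz_id"
    type = "S"
  }
}
```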
- AI-Powered Generation: Upload any PDF/TXT and the Groq AI engine generates questions tiered into Easy (Foundational), Medium (Applied Concept), and Hard (Analytical).
- Exam Mode: Answer questions sequentially, track unattempted questions, and review detailed explanations for correct/incorrect answers.
- Persistent History: All past quizzes are saved to DynamoDB and can be reviewed from the Dashboard at any time.
- Secure Authentication: JWT-based user login and signup flow.
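The three difficulty tiers could be requested from the model with a prompt builder along these lines. This is a hedged sketch of the idea, not the project's actual code: `TIERS` and `build_quiz_prompt` are illustrative names.

```python
# Sketch: how a backend might ask the AI engine for tiered questions.
# TIERS and build_quiz_prompt are illustrative names, not QuizGenius internals.
TIERS = {
    "Easy": "Foundational recall of definitions and key facts",
    "Medium": "Applied Concept questions requiring use of the material",
    "Hard": "Analytical questions requiring synthesis across topics",
}

def build_quiz_prompt(notes_text: str, questions_per_tier: int = 3) -> str:
    """Build a single prompt asking for Master's-level questions in three tiers."""
    tier_specs = "\n".join(
        f"- {name} ({desc}): {questions_per_tier} questions"
        for name, desc in TIERS.items()
    )
    return (
        "Generate a rigorous, Master's-level quiz from the lecture notes below.\n"
        f"Produce questions in these tiers:\n{tier_specs}\n"
        "Return JSON with fields: question, options, answer, explanation, tier.\n\n"
        f"Lecture notes:\n{notes_text}"
    )

prompt = build_quiz_prompt("Photosynthesis converts light energy into chemical energy.")
```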
```text
QuizGenius/
├── backend/              # FastAPI Python application
│   ├── main.py           # Core API logic and Groq/AWS integrations
│   ├── Dockerfile        # Python 3.11 build instructions
│   └── requirements.txt
├── frontend/             # React + Vite frontend
│   ├── src/              # Components, pages, and UI logic
│   ├── Dockerfile        # Multi-stage Node.js build -> Nginx server
│   └── nginx.conf        # Serves the React SPA and proxies /api/ requests
├── terraform/            # Infrastructure as Code
│   └── main.tf           # Defines VPC, S3, DynamoDB, EC2, and user_data boot script
├── docker-compose.yml    # Local orchestration
└── README.md
```
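The `nginx.conf` mentioned above likely follows the standard SPA-plus-proxy pattern. A sketch under assumptions (the upstream service name and backend port are guesses, not the project's actual values):

```nginx
server {
    listen 80;

    # Serve the built React SPA; fall back to index.html for client-side routes
    root /usr/share/nginx/html;
    index index.html;
    location / {
        try_files $uri $uri/ /index.html;
    }

    # Proxy API calls to the FastAPI backend container
    location /api/ {
        proxy_pass http://backend:3001/;   # "backend" service name is an assumption
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```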
To test the application locally with Docker Desktop:

Create a `.env` file inside the `backend/` directory with the following structure:
```bash
# backend/.env

# AWS Credentials (required for S3 and DynamoDB)
AWS_ACCESS_KEY_ID=your_access_key
AWS_SECRET_ACCESS_KEY=your_secret_key
AWS_SESSION_TOKEN=your_session_token   # Only if using temporary Lab credentials
AWS_REGION=us-east-1

# S3 Storage Configuration
S3_BUCKET=your_s3_bucket_name

# Authentication (can be any random secure string)
JWT_SECRET=your_super_secret_jwt_string

# AI Engine
GROQ_API_KEY=gsk_your_groq_api_key_here
```

From the root directory of the project, run:

```bash
docker-compose up -d --build
```

- Website: http://localhost
- Backend API: http://localhost:3001 (Nginx proxies `http://localhost/api/...` directly to the backend)
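The `JWT_SECRET` above signs the authentication tokens. Conceptually, HS256 JWT signing and verification work like this stdlib-only sketch; the real backend presumably uses a library such as PyJWT, and the helper names here are hypothetical:

```python
import base64
import hashlib
import hmac
import json

def _b64url(data: bytes) -> str:
    """Base64url-encode without padding, as the JWT spec requires."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload: dict, secret: str) -> str:
    """Produce a compact HS256 JWT: header.payload.signature."""
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64url(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = hmac.new(secret.encode(), signing_input, hashlib.sha256).digest()
    return f"{header}.{body}.{_b64url(sig)}"

def verify_jwt(token: str, secret: str) -> bool:
    """Recompute the signature and compare in constant time."""
    try:
        header, body, sig = token.split(".")
    except ValueError:
        return False
    expected = hmac.new(
        secret.encode(), f"{header}.{body}".encode(), hashlib.sha256
    ).digest()
    return hmac.compare_digest(_b64url(expected), sig)

token = sign_jwt({"sub": "user@example.com"}, "your_super_secret_jwt_string")
```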
Because this project uses Terraform, deploying the entire stack takes less than 5 minutes.
- Ensure your AWS CLI is configured with active credentials.
- Navigate to the Terraform directory:

```bash
cd terraform
```

- Edit the `terraform.tfvars` file to include your GitHub URL and API keys.
- Run:

```bash
terraform init
terraform apply -auto-approve
```

- Wait about 3 minutes for the EC2 boot script (`user_data`) to finish installing Docker and cloning the repository.
- Visit the printed EC2 Public IP address in your browser!
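The `terraform.tfvars` edit above might look like this. The variable names are assumptions inferred from the resources described earlier, not taken from the project:

```hcl
# Sketch only: variable names are assumed, not taken from the project.
github_repo_url = "https://github.com/your-user/QuizGenius.git"
groq_api_key    = "gsk_your_groq_api_key_here"
jwt_secret      = "your_super_secret_jwt_string"
aws_region      = "us-east-1"
```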
Built by Chahna Patel for Cloud Computing.

