Demo recording: `screen-capture.1.webm`
- Backend (live): https://reports-7qdt.onrender.com
- Node.js with Express.js
- MongoDB with Mongoose for the database
- BullMQ with Redis for background job processing
- Cloudinary for file storage
- Multer for file uploads
- Frontend (live): https://reports-blue-pi.vercel.app/
- React.js with modern hooks
- Axios for API calls
- CSS3 for styling
- Single Report Submission: Submit individual monthly reports
- Bulk CSV Upload: Upload multiple reports via CSV with background processing
- Real-time Progress Tracking: Monitor CSV processing progress (e.g., "12 out of 20 completed")
- Admin Dashboard: View aggregated data with month filtering
- Job Status API: Track background job processing status
- Database Idempotency: Prevents duplicate reports from the same NGO for the same month
- Error Handling: Graceful handling of partial failures in CSV processing
- Node.js (v16 or higher)
- MongoDB
- Redis server
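
The backend needs connection details for MongoDB, Redis, and Cloudinary. A minimal configuration sketch is shown below; the environment variable names are assumptions for illustration, not the project's actual `.env` keys.

```js
// config.js — minimal environment-based configuration (variable names are assumed).
require("dotenv").config();

module.exports = {
  port: process.env.PORT || 3000,
  mongoUri: process.env.MONGODB_URI || "mongodb://localhost:27017/reports",
  redisUrl: process.env.REDIS_URL || "redis://localhost:6379",
  // Cloudinary reads CLOUDINARY_URL (cloudinary://<key>:<secret>@<cloud_name>) by default.
  cloudinaryUrl: process.env.CLOUDINARY_URL,
};
```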
Demo video (Google Drive): https://drive.google.com/file/d/1P38wGDYFsjHYdkkus4CZrnO_UQTIIFOK/view?usp=sharing
- `POST /api/v1/report` - Submit a single report
- `POST /api/v1/reports/upload` - Upload a CSV file (returns a job ID)
- `GET /api/v1/job-status/:jobId` - Get job processing status
- `GET /api/v1/dashboard?month=YYYY-MM` - Get dashboard data
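
A sketch of how these routes could be wired up in Express is shown below; the controller names and the multipart field name `file` are placeholders, not the repository's actual identifiers.

```js
// routes.js — illustrative route wiring (controller and module names are placeholders).
const express = require("express");
const multer = require("multer");
const {
  submitReport,
  uploadCsv,
  getJobStatus,
  getDashboard,
} = require("./controllers"); // hypothetical controllers module

const router = express.Router();
const upload = multer({ dest: "uploads/" }); // temporary local storage before Cloudinary

router.post("/api/v1/report", submitReport);                             // single report
router.post("/api/v1/reports/upload", upload.single("file"), uploadCsv); // CSV upload, returns job ID
router.get("/api/v1/job-status/:jobId", getJobStatus);                   // background job progress
router.get("/api/v1/dashboard", getDashboard);                           // aggregated data (?month=YYYY-MM)

module.exports = router;
```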
Submit Single Report:

```bash
curl -X POST http://localhost:3000/api/v1/report \
  -H "Content-Type: application/json" \
  -d '{
    "ngoId": "68bb213f998ad3d67e1d8d6f",
    "month": "2024-01",
    "peopleHelped": 100,
    "eventsConducted": 5,
    "fundsUtilized": 50000
  }'
```

Upload CSV (replace `reports.csv` with your own file):

```bash
curl -X POST http://localhost:3000/api/v1/reports/upload \
  -F "file=@reports.csv"
```

Check Job Status:

```bash
curl http://localhost:3000/api/v1/job-status/csv_12345
```

Get Dashboard Data:

```bash
curl "http://localhost:3000/api/v1/dashboard?month=2024-01"
```

The CSV file should have the following columns:

```csv
ngoId,month,peopleHelped,eventsConducted,fundsUtilized
68bb213f998ad3d67e1d8d6f,2024-01,100,5,50000
68bb213f998ad3d67e1d8d6f,2024-02,150,8,75000
```
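
The bulk upload returns a job ID, which the frontend polls to show progress such as "12 out of 20 completed". A minimal Axios-based polling sketch follows; the response field names (`status`, `processed`, `total`) are assumptions for illustration, not the exact API contract.

```js
const axios = require("axios");

// Hypothetical polling helper; the response shape is assumed, not guaranteed.
async function pollJobStatus(jobId) {
  const { data: job } = await axios.get(
    `http://localhost:3000/api/v1/job-status/${jobId}`
  );
  console.log(`${job.processed} out of ${job.total} completed (${job.status})`);
  return job;
}

// Example: re-check every 2 seconds until the job finishes or fails.
async function waitForJob(jobId) {
  let job = await pollJobStatus(jobId);
  while (job.status !== "completed" && job.status !== "failed") {
    await new Promise((resolve) => setTimeout(resolve, 2000));
    job = await pollJobStatus(jobId);
  }
  return job;
}
```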
- Used BullMQ with Redis for scalable background job processing
- Jobs are processed asynchronously to handle large CSV files
- Real-time progress tracking with detailed status updates
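
A minimal sketch of this queue/worker split, assuming BullMQ's `Queue` and `Worker` classes and an illustrative job payload of parsed CSV rows (queue, job, and helper names are placeholders):

```js
const { Queue, Worker } = require("bullmq");

const connection = { host: "127.0.0.1", port: 6379 };
const csvQueue = new Queue("csv-processing", { connection });

// Producer: the upload endpoint enqueues the work and returns job.id to the client.
async function enqueueCsv(rows) {
  const job = await csvQueue.add("process-csv", { rows });
  return job.id;
}

// Consumer: rows are processed asynchronously; progress is reported per row,
// and a failing row is recorded without aborting the whole job.
const worker = new Worker(
  "csv-processing",
  async (job) => {
    const { rows } = job.data;
    const errors = [];
    for (let i = 0; i < rows.length; i++) {
      try {
        await saveReport(rows[i]); // hypothetical helper that persists one report
      } catch (err) {
        errors.push({ row: i + 1, message: err.message });
      }
      await job.updateProgress({ processed: i + 1, total: rows.length });
    }
    return { total: rows.length, failed: errors.length, errors };
  },
  { connection }
);
```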
- MongoDB with a compound unique index on `ngoId + month` for idempotency
- Prevents duplicate reports from the same NGO for the same month
- Aggregation pipelines for efficient dashboard data calculation
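
A sketch of the schema and dashboard query, assuming Mongoose; the field names mirror the API examples above, but the model name and exact schema are assumptions.

```js
const mongoose = require("mongoose");

const reportSchema = new mongoose.Schema({
  ngoId: { type: mongoose.Schema.Types.ObjectId, required: true },
  month: { type: String, required: true }, // "YYYY-MM"
  peopleHelped: Number,
  eventsConducted: Number,
  fundsUtilized: Number,
});

// Compound unique index: a second report for the same NGO and month is rejected
// by MongoDB with a duplicate-key error instead of creating a duplicate document.
reportSchema.index({ ngoId: 1, month: 1 }, { unique: true });

const Report = mongoose.model("Report", reportSchema);

// Dashboard totals for one month, computed in the database with an aggregation pipeline.
async function dashboardTotals(month) {
  return Report.aggregate([
    { $match: { month } },
    {
      $group: {
        _id: "$month",
        totalPeopleHelped: { $sum: "$peopleHelped" },
        totalEventsConducted: { $sum: "$eventsConducted" },
        totalFundsUtilized: { $sum: "$fundsUtilized" },
        reportCount: { $sum: 1 },
      },
    },
  ]);
}
```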
- Individual row failures don't stop the entire job
- Detailed error logging and user feedback
- Queue-based architecture can handle high volumes
- Database indexing for performance
- File storage via Cloudinary for scalability
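
A sketch of the upload path implied by these points: the CSV received by Multer is stored in Cloudinary, and only a lightweight job referencing the file URL goes onto the queue (`csvQueue` refers to the queue from the worker sketch above, which used pre-parsed rows instead; both payload shapes are assumptions).

```js
const cloudinary = require("cloudinary").v2;

cloudinary.config(); // reads CLOUDINARY_URL from the environment

// Hypothetical upload controller: store the CSV in Cloudinary, enqueue a job,
// and respond immediately with a job ID the client can poll.
async function uploadCsv(req, res) {
  const uploaded = await cloudinary.uploader.upload(req.file.path, {
    resource_type: "raw", // CSV is not an image, so upload it as a raw file
  });

  const job = await csvQueue.add("process-csv", { fileUrl: uploaded.secure_url });
  res.status(202).json({ jobId: job.id });
}
```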
- Cursor AI: Used for code generation, debugging, and architectural guidance
- GitHub Copilot: Assisted with boilerplate code and API bug fixing
- ChatGPT: Assisted with small syntax errors
With more time, I would implement:
- Authentication & Authorization: cookie-based authentication for the admin dashboard
- Rate Limiting: Prevent API abuse
- Logging: Structured logging with Winston
- Monitoring: Health checks and metrics
- Testing: Unit and integration tests
- Docker: Containerization for easy deployment
- CI/CD: Automated testing and deployment pipeline
- Database Optimization: Connection pooling and query optimization
- Security: Input sanitization and CORS configuration
The application provides:
- Clean, responsive UI for all features
- Real-time progress tracking for bulk uploads
- Comprehensive dashboard with filtering
- Error handling and user feedback