GAIA is the first product of Eco AI.ly, an innovative startup founded by Duarte Alexandrino and Guilherme Grancho, two Imperial College London students. Eco AI.ly is dedicated to revolutionizing the environmental impact of AI and technology through data-driven insights and sustainable solutions.
GAIA (your Green AI Assistant) serves as your loyal companion in the journey toward environmental sustainability. This comprehensive platform assists developers, companies, and organizations in reducing their environmental impact when training AI models and making energy-conscious decisions.
Eco AI.ly is a forward-thinking startup dedicated to bridging the gap between artificial intelligence and environmental sustainability. Founded by two passionate Imperial College London students, Duarte Alexandrino and Guilherme Grancho, the company is committed to:
- ๐ฑ Environmental Innovation: Developing cutting-edge solutions that make AI and technology more sustainable
- ๐ฌ Research Excellence: Leveraging academic rigor and industry best practices to create impactful products
- ๐ค Collaborative Impact: Partnering with developers, companies, and organizations to drive meaningful environmental change
- ๐ Data-Driven Solutions: Using real-time data and advanced analytics to inform sustainable decision-making
- ๐ Scalable Technology: Building platforms that can adapt and grow with the evolving needs of the green tech ecosystem
GAIA represents our first major product - a comprehensive platform that transforms how organizations approach environmental responsibility in their AI and energy consumption decisions.
- Real-time Environmental Monitoring: Live tracking of carbon intensity, renewable energy production, and power grid analytics
- AI-Powered Predictions: Advanced LSTM neural networks providing 24-hour forecasts with 90.9% accuracy
- Interactive Dashboards: Multiple frontend interfaces (Streamlit & Next.js) for different user needs
- Automated Reporting: Professional PDF generation with comprehensive analytics
- Data Visualization: Interactive charts, graphs, and real-time metrics
Portugal Energy Grid Analysis - Our initial deployment focuses on Portugal's electrical grid, providing comprehensive insights into:
- Carbon intensity trends and predictions
- Renewable energy percentage forecasting
- Power production vs consumption analysis
- Cross-border energy import/export tracking
- AI/ML: TensorFlow, LSTM Neural Networks, Scikit-learn
- Backend: FastAPI, Python, Docker, Google Cloud Run
- Frontend: Next.js, React, TypeScript, Streamlit
- Styling: Tailwind CSS, Framer Motion, Modern UI Components
- Data: ElectricityMaps API, Real-time Energy Data
- DevOps: GitHub Actions, Docker, Cloud Deployment
As the flagship product of Eco AI.ly, GAIA offers unparalleled advantages for organizations seeking to minimize their environmental impact:
- ๐ Real-time Monitoring: Live environmental data with automatic updates
- ๐ค AI-Powered Insights: Machine learning predictions with high accuracy
- ๐ Interactive Visualization: Dynamic charts and customizable dashboards
- ๐ Sustainability Focus: Dedicated to environmental responsibility
- ๐ฑ Multi-Platform Access: Web, mobile-responsive interfaces
- ๐ Automated Updates: Continuous data refresh and processing
- ๐ Energy Arbitrage: Opportunity detection for optimal energy decisions
- ๐ Professional Reporting: Automated PDF generation with detailed analytics
- ๐ Startup Innovation: Cutting-edge solutions backed by academic research and entrepreneurial agility
Two Complementary Frontend Interfaces:
Real-time analytics and comprehensive environmental monitoring
-
Carbon Intensity Analytics ๐ก๏ธ
- Live carbon intensity monitoring (gCOโeq/kWh)
- 24-hour AI forecasting with 90.9% accuracy
- Historical trend analysis and pattern recognition
- Energy arbitrage opportunity detection
- Automated PDF reporting with professional formatting
- Model performance statistics and confidence metrics
-
Renewable Percentage Tracking ๐ฑ
- Real-time renewable energy percentage monitoring
- AI-powered 24-hour predictions using LSTM models
- Historical data analysis with interactive time series
- Energy optimization recommendations
- Comprehensive PDF reports with trend analysis
- Performance metrics and accuracy tracking
-
Production vs Consumption Analysis โก
- Real-time power grid production breakdown
- Consumption pattern analysis across time periods
- Interactive pie charts with hover details
- Flexible time range selection (1, 3, 6, 12, 24 hours)
- Detailed energy balance metrics
- Professional PDF report generation
-
Import vs Export Tracking ๐
- Cross-border energy flow visualization
- Real-time import/export breakdown by country
- Interactive visualizations with detailed tooltips
- Energy balance and self-sufficiency metrics
- Time-based analysis with custom periods
- Comprehensive reporting capabilities
Cutting-edge user experience with modern design
-
Advanced UI Components โจ
- Reactive particle systems with optimized animations
- Liquid energy flow visualizations
- Glass morphism design language
- Advanced 3D components with React Three Fiber
- Framer Motion animations and micro-interactions
-
Portugal Energy Dashboard ๐
- Real-time metrics with beautiful card layouts
- Interactive Recharts visualizations (Area charts, Line charts)
- AI prediction badges with color-coded status
- Auto-refresh functionality (5-minute intervals)
- Dark/Light theme support with smooth transitions
- Fully responsive design for all devices
-
Modern Technology Stack ๐ ๏ธ
- Next.js 15.3.2 with Turbopack for fast development
- TypeScript for type safety and better DX
- Tailwind CSS 4.0 for modern styling
- React 19 with latest features
- Advanced animation libraries (GSAP, Lottie)
- Architecture: Deep LSTM with optimized hyperparameters
- Training Data: Historical energy data from ElectricityMaps
- Accuracy: 90.9% test accuracy on both models
- Classification: 6-class prediction system (0-5 scale)
- Real-time Inference: Sub-second prediction response times
- Carbon Intensity Model: Predicts emissions intensity trends
- Renewable Percentage Model: Forecasts renewable energy adoption
- Continuous Learning: Models updated with new data patterns
- Validation: Comprehensive testing with confusion matrices
- Monitoring: Real-time performance tracking and alerts
- Lightweight & Fast: Optimized for low latency and high throughput
- RESTful API: Clean, well-documented endpoints
- Real-time Data: Integration with ElectricityMaps API
- Auto-scaling: Docker containerization for cloud deployment
- CORS Enabled: Secure cross-origin resource sharing
GET /api/carbon-intensity # Carbon emissions predictions
GET /api/renewable-percentage # Renewable energy forecasts
GET /docs # Interactive API documentation- Environment-based configuration management
- API key security with proper secret handling
- Error handling and graceful degradation
- Health checks and monitoring capabilities
- Data Source: ElectricityMaps API for Portugal (PT)
- Update Frequency: Every 5 minutes for real-time accuracy
- Data Validation: Automated quality checks and error handling
- Caching: Smart caching strategies for optimal performance
- ETL Pipeline: Extract, Transform, Load processes
- Feature Engineering: Advanced preprocessing for ML models
- Data Storage: Optimized data structures for fast retrieval
- Backup Systems: Redundant data sources and fallback mechanisms
Streamlit Dashboard (Production Ready)
# Clone and setup Streamlit dashboard
git clone https://github.com/guilhermegranchopro/Eco-AI.ly.git
cd Eco-AI.ly/frontend/streamlit
python -m venv .venv && source .venv/bin/activate # or .\.venv\Scripts\activate on Windows
pip install -r requirements.txt
streamlit run Home.pyโ Access at http://localhost:8501
Next.js Modern Web App
# Setup modern Next.js dashboard
cd frontend/my-next-app
npm install
npm run devโ Access at http://localhost:3000
FastAPI Backend Service
# Setup API service
cd backend/api/CI_RP
python -m venv .venv && source .venv/bin/activate
pip install -r requirements.txt
# Create .env file with ELECTRICITYMAP_API_KEY=your_key
uvicorn app.main:app --reload --port 8080โ Access API docs at http://localhost:8080/docs
| Component | URL | Status | Description |
|---|---|---|---|
| Streamlit Dashboard | http://localhost:8501 | โ Production Ready | Full-featured analytics dashboard |
| Next.js Web App | http://localhost:3000 | โ Enhanced UI | Modern web interface with advanced features |
| FastAPI Backend | http://localhost:8080 | โ Operational | AI prediction API service |
| Live Deployment | ecoaily.streamlit.app | ๐ Live | Public Streamlit deployment |
System Requirements:
- Python: 3.10 or higher
- Node.js: 18+ (for Next.js frontend)
- RAM: 4GB minimum (8GB recommended)
- Storage: 2GB free space
- OS: Windows 10+, macOS 10.15+, Ubuntu 18.04+
Required API Keys:
- ElectricityMaps API: For real-time energy data (Get API Key)
Development Tools (Optional):
- Docker & Docker Compose
- Git
- VS Code with Python extension
The production-ready analytics platform
# 1. Clone repository
git clone https://github.com/guilhermegranchopro/Eco-AI.ly.git
cd Eco-AI.ly
# 2. Navigate to Streamlit directory
cd frontend/streamlit
# 3. Create and activate virtual environment
python -m venv .venv
# Activate virtual environment
source .venv/bin/activate # Unix/MacOS
# .\.venv\Scripts\activate # Windows PowerShell
# 4. Install dependencies
pip install -r requirements.txt
# 5. Configure environment (optional)
# Create .env file for custom API keys if needed
cp .env.example .env # if .env.example exists
# 6. Launch dashboard
streamlit run Home.pyAccess: http://localhost:8501
Enhanced UI with modern components and animations
# 1. Navigate to Next.js directory
cd frontend/my-next-app
# 2. Install Node.js dependencies
npm install
# or using yarn: yarn install
# or using pnpm: pnpm install
# 3. Set up environment variables
cp .env.local.example .env.local # if example exists
# 4. Start development server
npm run dev
# or: yarn dev / pnpm dev
# 5. Build for production (optional)
npm run build
npm startDevelopment: http://localhost:3000
AI prediction API with real-time data processing
# 1. Navigate to API directory
cd backend/api/CI_RP
# 2. Create virtual environment
python -m venv .venv
source .venv/bin/activate # Unix/MacOS
# .\.venv\Scripts\activate # Windows
# 3. Install Python dependencies
pip install -r requirements.txt
# 4. Configure environment variables
# Create .env file with your ElectricityMaps API key
echo "ELECTRICITYMAP_API_KEY=your_api_key_here" > .env
# 5. Start API server
uvicorn app.main:app --reload --port 8080
# 6. Access API documentation
# Open http://localhost:8080/docsContainerized deployment for production
# Backend API Docker setup
cd backend/api/CI_RP
# Build Docker image
docker build -t eco-ai-ly-api .
# Run container
docker run --env-file .env -p 8080:8080 eco-ai-ly-api
# Docker Compose (if available)
docker-compose up -dFor contributors and advanced users
# Install development tools
pip install ruff black pytest
npm install -g eslint prettier
# Pre-commit hooks (optional)
pip install pre-commit
pre-commit install
# Run tests
python -m pytest # Python tests
npm test # Node.js tests
# Code formatting
ruff format . # Python formatting
ruff check . # Python linting
npm run lint # Next.js lintingFile Location: frontend/streamlit/.env
# ElectricityMaps API (if calling directly from frontend)
ELECTRICITYMAP_API_KEY=your_electricitymaps_api_key
# Optional: Backend API endpoints
BACKEND_API_URL=http://localhost:8080
# Application Settings
DEBUG=False
LOG_LEVEL=INFO
STREAMLIT_SERVER_PORT=8501
# Optional: Additional API integrations
OPENAI_API_KEY=your_openai_api_key
WEATHER_API_KEY=your_weather_api_keyFile Location: frontend/my-next-app/.env.local
# API Endpoints
NEXT_PUBLIC_API_URL=http://localhost:8080
NEXT_PUBLIC_STREAMLIT_URL=http://localhost:8501
# Application Settings
NEXT_PUBLIC_APP_NAME=Eco AI.ly
NEXT_PUBLIC_APP_VERSION=2.0.0
# Development Settings
NODE_ENV=development
NEXT_TELEMETRY_DISABLED=1
# Optional: Analytics and Monitoring
NEXT_PUBLIC_ANALYTICS_ID=your_analytics_idFile Location: backend/api/CI_RP/.env
# Required: ElectricityMaps API Configuration
ELECTRICITYMAP_API_KEY=your_electricitymaps_api_key
# Optional: ElectricityMaps API Settings (defaults provided)
ELECTRICITYMAP_BASE_URL=https://api.electricitymap.org/v3
ELECTRICITYMAP_REGION=PT
# Model Paths (defaults provided)
SCALER_RP_PATH=models/renewable_percentage/scaler_renewable_percentage.pkl
MODEL_RP_PATH=models/renewable_percentage/model_renewable_percentage.keras
SCALER_CI_PATH=models/carbon_intensity/scaler_carbon_intensity.pkl
MODEL_CI_PATH=models/carbon_intensity/model_carbon_intensity.keras
# Server Configuration
PORT=8080
HOST=0.0.0.0
RELOAD=true
# Security Settings
CORS_ORIGINS=["http://localhost:3000", "http://localhost:3002", "http://localhost:8501"]
API_RATE_LIMIT=100
# Logging and Monitoring
LOG_LEVEL=INFO
ENABLE_METRICS=trueAPI Key Management:
# Never commit API keys to version control
echo ".env" >> .gitignore
echo "*.env" >> .gitignore
# Use environment variables in production
export ELECTRICITYMAP_API_KEY="your_key"
# For Docker deployment
docker run --env-file .env your_imageCORS Configuration:
- Development:
http://localhost:3000,http://localhost:8501 - Production: Your domain URLs only
- API endpoints: Whitelist specific origins
- Get API Key: Visit ElectricityMaps API
- Choose Plan: Free tier available for development
- Configure: Add key to
.envfiles - Test Connection: Use API documentation endpoints
API Rate Limits:
- Free Tier: 1000 requests/month
- Paid Plans: Higher limits available
- Caching: Implemented to reduce API calls
Eco-AI.ly/
โโโ LICENSE
โโโ README.md # This file - Main project documentation
โโโ backend/ # Backend services, models, and data processing
โ โโโ api/ # API services
โ โ โโโ CI_RP/ # FastAPI for Carbon Intensity & Renewable Percentage
โ โ โโโ app/ # FastAPI application source code
โ โ โ โโโ main.py # FastAPI endpoints definition
โ โ โ โโโ utils.py # Utility functions (data fetching, preprocessing, model loading)
โ โ โ โโโ models/ # Directory for pre-trained ML models and scalers
โ โ โ โโโ renewable_percentage/
โ โ โ โ โโโ model_renewable_percentage.keras
โ โ โ โ โโโ scaler_renewable_percentage.pkl
โ โ โ โโโ carbon_intensity/
โ โ โ โโโ model_carbon_intensity.keras
โ โ โ โโโ scaler_carbon_intensity.pkl
โ โ โโโ Dockerfile # Docker configuration for the API service
โ โ โโโ README.md # Detailed README for the API service
โ โ โโโ requirements.txt # Python dependencies for the API
โ โ โโโ .env.example # Example environment variables for the API
โ โโโ mvp/ # Minimum Viable Product: Jupyter notebooks, initial models, data exploration
โ โโโ carbon_intensity/ # Notebooks related to carbon intensity analysis/modeling
โ โ โโโ ... # (e.g., Live_Predictions_LSTM_Carbon_Intensity.ipynb)
โ โโโ power_breakdown/ # Notebooks for power import/export analysis
โ โ โโโ Power_Export_Breakdown.ipynb
โ โ โโโ Power_Import_Breakdown.ipynb
โ โโโ renewable_percentage/ # Notebooks for renewable percentage analysis/modeling
โ โโโ Live_Predictions_LSTM_Renewable_Percentage.ipynb
โโโ branding/ # Branding assets (logos, color palettes, images)
โ โโโ gaia/ # Main gaia logos and visual assets
โโโ frontend/ # Frontend applications
โ โโโ next/ # Next.js frontend (placeholder or future development)
โ โ โโโ main.py # Example file
โ โ โโโ ...
โ โโโ streamlit/ # Streamlit dashboard application
โ โโโ Home.py # Main Streamlit application entry point
โ โโโ README.md # Streamlit application-specific README
โ โโโ assets/ # Static assets for Streamlit (images, custom styles)
โ โโโ backend/ # Helper modules and backend logic specific to Streamlit
โ โโโ pages/ # Individual pages/modules of the Streamlit dashboard
โ โโโ requirements.txt # Python dependencies for the Streamlit application
โ โโโ pyproject.toml # Project configuration for Streamlit (e.g., for linters/formatters)
โ โโโ .env.example # Example environment variables for the Streamlit app
โโโ .devcontainer/ # Development container configuration (e.g., for VS Code Remote - Containers)
โโโ .gitignore # Specifies intentionally untracked files that Git should ignore
โโโ requirements.txt # Root level Python dependencies (if any, typically for dev tools)
โโโ pyproject.toml # Root level project configuration (e.g., for Ruff, Black)
โโโ uv.lock # Dependency lock file (if using uv package manager at root)
Accessing the Dashboard:
cd frontend/streamlit
source .venv/bin/activate # Activate virtual environment
streamlit run Home.pyโ Open http://localhost:8501
Dashboard Features:
- Real-time Monitoring: Current carbon intensity (gCOโeq/kWh) for Portugal
- AI Predictions: 24-hour forecasts with 90.9% accuracy using LSTM models
- Interactive Controls:
- Time range selection (1-24 hours)
- Prediction confidence thresholds
- Data refresh intervals
- Visual Analytics: Line charts, trend indicators, confidence bands
- Export Options: Download predictions as CSV or generate PDF reports
- Live Metrics: Current renewable energy percentage in Portugal's grid
- Forecasting: AI-powered predictions for renewable energy trends
- Historical Analysis: Compare current values with past performance
- Energy Optimization: Recommendations for optimal energy usage times
- Reporting: Comprehensive PDF reports with trend analysis
- Real-time Grid Status: Live power production breakdown by source
- Consumption Patterns: Detailed analysis of energy consumption
- Interactive Visualizations:
- Pie charts with hover details
- Time-based comparisons
- Source-specific breakdowns
- Flexible Time Ranges: 1, 3, 6, 12, or 24-hour analysis periods
- Energy Balance: Production surplus/deficit calculations
- Cross-border Energy Flow: Real-time import/export data
- Country Breakdown: Detailed flow analysis by neighboring countries
- Energy Independence: Self-sufficiency metrics and trends
- Interactive Maps: Geographic visualization of energy flows
- Balance Analysis: Import/export equilibrium calculations
Navigation Tips:
- Use the sidebar menu to switch between analysis pages
- Hover over charts for detailed tooltips and data points
- Click refresh buttons to get the latest data
- Use the settings panel to customize display preferences
Accessing the Modern Interface:
cd frontend/my-next-app
npm run devโ Open http://localhost:3000
Modern Features:
- Reactive Animations: Particle systems and liquid energy flow visualizations
- Glass Morphism Design: Modern, translucent interface elements
- Dark/Light Themes: Seamless theme switching with user preference storage
- Responsive Design: Optimized for desktop, tablet, and mobile devices
- Real-time Metrics Cards: Live energy data with beautiful card layouts
- Interactive Charts: Recharts visualizations with smooth animations
- Area charts for historical trends
- Line charts for real-time data
- Pie charts for energy source breakdown
- AI Status Indicators: Color-coded prediction confidence badges
- Auto-refresh: Automatic data updates every 5 minutes
- Energy Flow Animations: Visual representation of energy movement
- 3D Visualizations: React Three Fiber components for immersive experience
- Micro-interactions: Framer Motion animations for enhanced UX
- Progressive Loading: Smart loading states and skeleton screens
Starting the API Service:
cd backend/api/CI_RP
source .venv/bin/activate
uvicorn app.main:app --reload --port 8080โ Access API docs at http://localhost:8080/docs
API Endpoints:
# Get current and predicted carbon intensity
GET /api/carbon-intensity
# Example Response:
{
"current_value": 245.8,
"predicted_class": 3,
"confidence": 0.89,
"prediction_time": "2025-01-22T10:30:00Z",
"units": "gCOโeq/kWh"
}# Get renewable energy percentage predictions
GET /api/renewable-percentage
# Example Response:
{
"current_percentage": 67.2,
"predicted_class": 4,
"confidence": 0.91,
"prediction_time": "2025-01-22T10:30:00Z",
"trend": "increasing"
}Python Integration:
import requests
# Get carbon intensity prediction
response = requests.get('http://localhost:8080/api/carbon-intensity')
data = response.json()
print(f"Current CI: {data['current_value']} gCOโeq/kWh")JavaScript/Node.js Integration:
// Fetch renewable percentage data
fetch('http://localhost:8080/api/renewable-percentage')
.then(response => response.json())
.then(data => {
console.log(`Renewable %: ${data.current_percentage}%`);
});cURL Commands:
# Test carbon intensity endpoint
curl -X GET "http://localhost:8080/api/carbon-intensity"
# Test renewable percentage endpoint
curl -X GET "http://localhost:8080/api/renewable-percentage"- Swagger UI: Available at
/docsendpoint - ReDoc: Alternative docs at
/redocendpoint - Schema: OpenAPI 3.0 specification at
/openapi.json - Testing: Built-in API testing interface with example requests
# Example: Automated data collection script
python scripts/collect_data.py --interval 300 # Every 5 minutes
# Example: Model retraining pipeline
python scripts/retrain_models.py --data-path ./data --epochs 50- Data Export: Export historical data in CSV, JSON, or Excel formats
- Custom Dashboards: Create personalized views with selected metrics
- API Integration: Connect with external systems using RESTful endpoints
- Webhook Support: Real-time notifications for threshold breaches
- Threshold Alerts: Set custom alerts for carbon intensity or renewable percentage
- Performance Monitoring: Track model accuracy and system performance
- Health Checks: Automated system health monitoring and reporting
- Logging: Comprehensive logging for debugging and analysis
Gaia leverages state-of-the-art LSTM (Long Short-Term Memory) neural networks to provide accurate environmental predictions. Our AI system processes 24-hour historical data windows to generate reliable forecasts for carbon intensity and renewable energy percentage.
Model Specifications:
- Architecture: Deep LSTM with optimized hyperparameters
- Input Features: 24-hour rolling window of carbon intensity data
- Output: 6-class classification (0-5 scale representing intensity levels)
- Test Accuracy: 90.9% on validation dataset
- Training Data: Historical carbon intensity data from ElectricityMaps API
- Update Frequency: Models retrained monthly with new data
Performance Metrics:
Model Performance Summary:
โโโ Test Accuracy: 90.9%
โโโ Precision: 0.91
โโโ Recall: 0.89
โโโ F1-Score: 0.90
โโโ Inference Time: <100ms
โโโ Confidence Scoring: ImplementedClassification Scale:
- Class 0: Very Low (0-100 gCOโeq/kWh) ๐ข
- Class 1: Low (100-200 gCOโeq/kWh) ๐ก
- Class 2: Moderate (200-300 gCOโeq/kWh) ๐
- Class 3: High (300-400 gCOโeq/kWh) ๐ด
- Class 4: Very High (400-500 gCOโeq/kWh) ๐ฃ
- Class 5: Extreme (500+ gCOโeq/kWh) โซ
Model Specifications:
- Architecture: Deep LSTM neural network with dropout regularization
- Input Features: 24-hour renewable energy percentage history
- Output: 6-class classification representing renewable energy levels
- Test Accuracy: 90.9% on validation dataset
- Training Data: Historical renewable percentage data from Portugal's grid
- Real-time Processing: Sub-second prediction response times
Performance Metrics:
Renewable Percentage Model:
โโโ Test Accuracy: 90.9%
โโโ Precision: 0.90
โโโ Recall: 0.91
โโโ F1-Score: 0.91
โโโ MAE: 3.2%
โโโ RMSE: 4.8%Classification Ranges:
- Class 0: Very Low Renewable (0-20%) ๐ด
- Class 1: Low Renewable (20-40%) ๐
- Class 2: Moderate Renewable (40-60%) ๐ก
- Class 3: High Renewable (60-80%) ๐ข
- Class 4: Very High Renewable (80-95%) ๐
- Class 5: Exceptional Renewable (95-100%) โญ
# Example preprocessing steps
def preprocess_data(raw_data):
# 1. Data cleaning and validation
cleaned_data = remove_outliers(raw_data)
# 2. Feature engineering
features = create_time_features(cleaned_data)
# 3. Normalization using MinMaxScaler
scaled_data = scaler.transform(features)
# 4. Sequence creation for LSTM
sequences = create_sequences(scaled_data, window_size=24)
return sequences# LSTM Model Architecture (Conceptual)
model = Sequential([
LSTM(50, return_sequences=True, input_shape=(24, n_features)),
Dropout(0.2),
LSTM(50, return_sequences=False),
Dropout(0.2),
Dense(25),
Dense(6, activation='softmax') # 6-class classification
])- Optimizer: Adam with learning rate scheduling
- Loss Function: Categorical crossentropy
- Batch Size: 32
- Epochs: 100 with early stopping
- Validation Split: 20%
- Cross-validation: 5-fold time series CV
Confusion Matrix Analysis:
- Both models show excellent diagonal performance
- Minimal misclassification between adjacent classes
- Strong performance across all classification ranges
Time Series Validation:
- Models tested on out-of-sample data (unseen time periods)
- Consistent performance across different seasons
- Robust to concept drift and data distribution changes
Top Contributing Features:
โโโ Historical values (last 6 hours): 35%
โโโ Time of day patterns: 25%
โโโ Day of week seasonality: 20%
โโโ Recent trend direction: 15%
โโโ Long-term moving averages: 5%- Hosting: FastAPI service with uvicorn ASGI server
- Scaling: Docker containerization for horizontal scaling
- Monitoring: Real-time performance tracking and alerts
- Caching: Intelligent caching to reduce inference latency
Inference Flow:
1. Data Collection โ ElectricityMaps API
2. Preprocessing โ Feature engineering & scaling
3. Model Prediction โ LSTM inference
4. Post-processing โ Classification & confidence scoring
5. API Response โ JSON output with predictions- Model Monitoring: Track prediction accuracy over time
- Data Drift Detection: Monitor for changes in data distribution
- Automated Retraining: Monthly model updates with new data
- A/B Testing: Compare model versions in production
The backend/mvp/ directory contains comprehensive Jupyter notebooks documenting our model development process:
Live_Predictions_LSTM_Carbon_Intensity.ipynb- Complete model development from data exploration to deployment
- Hyperparameter tuning and architecture experiments
- Validation strategies and performance analysis
- Feature engineering and data preprocessing techniques
Live_Predictions_LSTM_Renewable_Percentage.ipynb- Renewable energy forecasting model development
- Time series analysis and seasonality detection
- Model comparison and selection process
- Prediction uncertainty quantification
Power_Import_Breakdown.ipynb&Power_Export_Breakdown.ipynb- Cross-border energy flow analysis
- Power grid balance and stability metrics
- Import/export pattern recognition
- Energy independence calculations
- Temporal Patterns: Strong diurnal and weekly patterns in both metrics
- Seasonality Effects: Renewable percentage varies significantly by season
- Cross-correlations: Carbon intensity inversely correlated with renewable percentage
- Prediction Horizons: 24-hour forecasts provide optimal accuracy/utility balance
models/
โโโ carbon_intensity/
โ โโโ model_carbon_intensity.keras # Trained LSTM model
โ โโโ scaler_carbon_intensity.pkl # MinMaxScaler for preprocessing
โ โโโ metadata.json # Model metadata and metrics
โโโ renewable_percentage/
โโโ model_renewable_percentage.keras # Trained LSTM model
โโโ scaler_renewable_percentage.pkl # MinMaxScaler for preprocessing
โโโ metadata.json # Model metadata and metrics# Example model loading in FastAPI
import tensorflow as tf
import pickle
# Load models and scalers
ci_model = tf.keras.models.load_model('models/carbon_intensity/model_carbon_intensity.keras')
ci_scaler = pickle.load(open('models/carbon_intensity/scaler_carbon_intensity.pkl', 'rb'))
# Make predictions
def predict_carbon_intensity(data):
scaled_data = ci_scaler.transform(data)
prediction = ci_model.predict(scaled_data)
return prediction- Real-time Accuracy: Track prediction accuracy against actual values
- Latency Metrics: Monitor inference response times
- Error Analysis: Identify and analyze prediction errors
- Data Quality: Monitor input data quality and detect anomalies
We encourage any developer that as foud this project interesting to creates issues and pull requests! Helps us build the future of environmental monitoring of AI models.
# Required Software
- Python 3.10+ (with pip)
- Node.js 18+ (with npm/yarn)
- Git 2.30+
- Docker & Docker Compose (optional)
# Recommended Development Tools
- VS Code with Python & TypeScript extensions
- Jupyter Lab/Notebook for ML development
- Postman/Insomnia for API testing# 1. Fork and clone the repository
git clone https://github.com/your-username/Eco-AI.ly.git
cd Eco-AI.ly
# 2. Set up Python development environment
python -m venv .venv
source .venv/bin/activate # or .\.venv\Scripts\activate on Windows
# 3. Install root-level development dependencies
pip install -r requirements.txt
# 4. Install pre-commit hooks
pip install pre-commit
pre-commit install
# 5. Set up Streamlit frontend
cd frontend/streamlit
pip install -r requirements.txt
cd ../..
# 6. Set up Next.js frontend
cd frontend/my-next-app
npm install
cd ../..
# 7. Set up FastAPI backend
cd backend/api/CI_RP
pip install -r requirements.txt
cd ../../..
# 8. Configure environment variables
cp frontend/streamlit/.env.example frontend/streamlit/.env
cp frontend/my-next-app/.env.local.example frontend/my-next-app/.env.local
cp backend/api/CI_RP/.env.example backend/api/CI_RP/.env- Backend API: >85% code coverage required
- Frontend Components: >80% coverage for critical paths
- Model Functions: >90% coverage for ML pipelines
- Integration Tests: Full workflow testing
# Python code formatting with Ruff (preferred)
ruff format .
ruff check .
# Alternative: Black formatting
black .
isort .
# Python linting with additional tools
flake8 .
mypy backend/
# TypeScript/JavaScript linting
cd frontend/my-next-app
npm run lint
npm run type-check- Branch Creation: Create feature branch from
main - Development: Implement changes with tests
- Testing: Ensure all tests pass locally
- Code Review: Submit PR with clear description
- CI/CD: Automated testing and quality checks
- Merge: Squash and merge after approval
Architecture Components:
โโโ Frontend Layer
โ โโโ Streamlit Dashboard (Production Analytics)
โ โโโ Next.js Web App (Modern UI/UX)
โโโ Backend Layer
โ โโโ FastAPI Service (AI Predictions)
โ โโโ Data Pipeline (ETL Processing)
โโโ AI/ML Layer
โ โโโ LSTM Models (Carbon Intensity & Renewable %)
โ โโโ Training Pipeline (Model Development)
โโโ Infrastructure Layer
โโโ Docker Containers
โโโ CI/CD Pipeline- RESTful Design: Standard HTTP methods and status codes
- Versioning: API versioning with
/v1/prefix - Documentation: Auto-generated OpenAPI/Swagger docs
- Error Handling: Consistent error response format
- Rate Limiting: API rate limiting and authentication
๐ง Code Contributions:
- Bug fixes and performance improvements
- New features and enhancements
- Test coverage improvements
- Documentation updates
๐ Data Science Contributions:
- Model performance improvements
- New prediction algorithms
- Feature engineering enhancements
- Data analysis and insights
๐จ Design Contributions:
- UI/UX improvements
- Data visualization enhancements
- Branding and graphic design
- User experience optimization
๐ Documentation Contributions:
- README improvements
- API documentation
- Tutorial creation
- Code commenting
Before submitting a pull request, ensure:
- Code follows project style guidelines
- All tests pass locally
- New features include appropriate tests
- Documentation updated for significant changes
- Commit messages follow convention
- PR description clearly explains changes
- Breaking changes are clearly noted
When reporting bugs, please include:
- Environment Details: OS, Python version, Node.js version
- Steps to Reproduce: Clear, numbered steps
- Expected Behavior: What should happen
- Actual Behavior: What actually happens
- Error Messages: Full error logs and stack traces
- Screenshots: If applicable, visual evidence
For feature requests, provide:
- Problem Statement: What problem does this solve?
- Proposed Solution: Detailed description of the feature
- Use Cases: Who would benefit and how?
- Technical Considerations: Any implementation thoughts
- Alternative Solutions: Other approaches considered
- Major Releases: Quarterly (Q1, Q2, Q3, Q4)
- Minor Releases: Monthly feature updates
- Patch Releases: As needed for critical fixes
- Beta Releases: Two weeks before major releases
We follow Semantic Versioning (SemVer):
- Major (X.0.0): Breaking changes
- Minor (0.X.0): New features, backward compatible
- Patch (0.0.X): Bug fixes, backward compatible
- All tests passing on CI/CD
- Documentation updated
- Version numbers bumped
- Changelog updated
- Docker images built and tested
- Deployment tested on staging
- Security review completed
- API Keys: Never commit API keys to version control
- Environment Variables: Use
.envfiles for sensitive data - Dependencies: Regular security audits of dependencies
- Input Validation: Validate all user inputs
- CORS: Proper CORS configuration for production
For security vulnerabilities:
- Do NOT open a public issue
- Email directly: security@ecoai.ly
- Include detailed description and reproduction steps
- Allow reasonable time for response and fix
We recognize contributors through:
- GitHub Contributors Graph: Automatic recognition
- CONTRIBUTORS.md: Detailed contributor profiles
- Release Notes: Acknowledgment in release announcements
- Community Showcases: Featured contributions
- Early Access: Beta features and releases
- Community Discord: Access to contributor-only channels
- Mentorship: Connect with core team members
- Recommendations: LinkedIn recommendations for significant contributions
- GitHub Issues: Bug reports and feature requests
- GitHub Discussions: General questions and community chat
- Email: dev@ecoai.ly for development questions
- Office Hours: Weekly video calls (schedule TBD)
- Developer Documentation:
/docs/development/ - API Reference: Live docs at
/api/docs - Code Examples:
/examples/directory - Video Tutorials: YouTube channel (coming soon)
Gaia supports multiple deployment strategies, from local development to production cloud environments. Choose the deployment method that best fits your needs.
Currently Deployed:
- Streamlit Dashboard: https://ecoaily.streamlit.app/
- Status: โ Live and Operational
- Hosting: Streamlit Community Cloud
- Auto-deployment: Connected to GitHub main branch
- Features: Full analytics dashboard with AI predictions
# Navigate to API directory
cd backend/api/CI_RP
# Build Docker image
docker build -t eco-ai-ly-api:latest .
# Run container with environment variables
docker run -d \
--name eco-ai-ly-api \
-p 8080:8080 \
--env-file .env \
eco-ai-ly-api:latest
# Check container status
docker ps
docker logs eco-ai-ly-api# docker-compose.yml (example configuration)
version: '3.8'
services:
api:
build: ./backend/api/CI_RP
ports:
- "8080:8080"
environment:
- ELECTRICITYMAP_API_KEY=${ELECTRICITYMAP_API_KEY}
volumes:
- ./backend/api/CI_RP/models:/app/models
restart: unless-stopped
streamlit:
build: ./frontend/streamlit
ports:
- "8501:8501"
depends_on:
- api
environment:
- BACKEND_API_URL=http://api:8080
restart: unless-stopped
nextjs:
build: ./frontend/my-next-app
ports:
- "3000:3000"
depends_on:
- api
environment:
- NEXT_PUBLIC_API_URL=http://localhost:8080
restart: unless-stoppedDeploy with Docker Compose:
# Deploy all services
docker-compose up -d
# View logs
docker-compose logs -f
# Scale services
docker-compose up -d --scale api=2
# Stop services
docker-compose down# Deployment Steps:
1. Fork/Clone repository to your GitHub
2. Visit https://share.streamlit.io/
3. Connect GitHub repository
4. Select frontend/streamlit/Home.py as main file
5. Add environment variables in advanced settings
6. Deploy with one clickConfiguration for Streamlit Cloud:
# .streamlit/config.toml
[server]
headless = true
port = 8501
[theme]
primaryColor = "#00C851"
backgroundColor = "#0E1117"
secondaryBackgroundColor = "#262730"
textColor = "#FAFAFA"# Deploy Next.js app to Vercel
cd frontend/my-next-app
# Install Vercel CLI
npm install -g vercel
# Deploy
vercel --prod
# Configure environment variables in Vercel dashboard
# NEXT_PUBLIC_API_URL=your-api-url# Deploy FastAPI to Google Cloud Run
cd backend/api/CI_RP
# Build and deploy
gcloud run deploy eco-ai-ly-api \
--source . \
--platform managed \
--region us-central1 \
--allow-unauthenticated \
--set-env-vars ELECTRICITYMAP_API_KEY=${API_KEY}# FastAPI on Heroku
cd backend/api/CI_RP
# Create Heroku app
heroku create eco-ai-ly-api
# Add environment variables
heroku config:set ELECTRICITYMAP_API_KEY=your_key
# Deploy
git push heroku main# Deploy on AWS EC2
# 1. Launch EC2 instance (Ubuntu 20.04 LTS)
# 2. Install Docker and Docker Compose
sudo apt update
sudo apt install docker.io docker-compose
# 3. Clone repository
git clone https://github.com/guilhermegranchopro/Eco-AI.ly.git
cd Eco-AI.ly
# 4. Configure environment variables
cp backend/api/CI_RP/.env.example backend/api/CI_RP/.env
# Edit .env with your API keys
# 5. Deploy with Docker Compose
docker-compose up -d
# 6. Configure reverse proxy (nginx)
sudo apt install nginx
# Configure nginx for domain routing# Production Environment Variables
ENVIRONMENT=production
DEBUG=False
LOG_LEVEL=WARNING
# Security Headers
SECURE_SSL_REDIRECT=True
SECURE_BROWSER_XSS_FILTER=True
SECURE_CONTENT_TYPE_NOSNIFF=True
# API Rate Limiting
API_RATE_LIMIT=1000
API_BURST_LIMIT=50# Application Monitoring
SENTRY_DSN=your_sentry_dsn
NEW_RELIC_LICENSE_KEY=your_key
# Health Check Endpoints
GET /health # Application health
GET /metrics # Prometheus metrics
GET /status # Detailed statusGitHub Actions Workflow (.github/workflows/deploy.yml):
name: Deploy to Production
on:
push:
branches: [main]
jobs:
test:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Run Tests
run: |
python -m pytest tests/
deploy-api:
needs: test
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Deploy to Google Cloud Run
uses: google-github-actions/deploy-cloudrun@v1
with:
service: eco-ai-ly-api
source: ./backend/api/CI_RP
deploy-streamlit:
needs: test
runs-on: ubuntu-latest
steps:
- name: Deploy to Streamlit Cloud
# Automatic deployment via GitHub integration
run: echo "Streamlit deployment triggered"# FastAPI Optimizations
workers: 4 # Gunicorn workers
max_requests: 1000 # Worker recycling
timeout: 30 # Request timeout
keepalive: 5 # Connection keepalive
# Caching Strategy
redis_cache: enabled # Redis for API caching
cache_ttl: 300 # 5-minute cache TTL# Model Storage
model_cache: enabled # Cache loaded models
model_warm_up: true # Pre-load models on startup
prediction_cache: 1800 # Cache predictions for 30 min# API Health Checks
GET /health # Basic health status
GET /health/detailed # Detailed component status
GET /health/models # Model loading status
GET /health/dependencies # External dependency status# Monitoring Tools Integration
- Prometheus metrics collection
- Grafana dashboards
- PagerDuty incident management
- Slack notifications for critical alerts
# Key Metrics to Monitor
- API response times
- Model prediction accuracy
- Error rates and exceptions
- Resource utilization (CPU, Memory)
- ElectricityMaps API quota usage- No Personal Data Collection: Gaia does not collect personal user information
- Public Energy Data: Uses publicly available energy grid data from ElectricityMaps
- API Data: ElectricityMaps data subject to their terms of service
- Local Processing: All AI predictions processed locally/server-side
- No Data Retention: Real-time data not stored permanently
- Minimal Data Processing: Only energy grid data, no personal information
- Transparent Processing: Open source with clear data flow documentation
- User Rights: Users can access all code and understand data processing
- No Cookies: Streamlit/Next.js applications use minimal essential cookies only
- ElectricityMaps API: Subject to ElectricityMaps Terms of Service
- Usage Requirements: Valid API key required for data access
- Rate Limits: API calls subject to plan-based rate limiting
- Attribution: ElectricityMaps data attribution included in UI
All dependencies listed in requirements.txt and package.json files maintain their respective licenses:
Key Dependencies:
- Streamlit: Apache License 2.0
- FastAPI: MIT License
- TensorFlow: Apache License 2.0
- Next.js: MIT License
- React: MIT License
By contributing to Gaia, you agree that:
- Your contributions are your original work or properly attributed
- You have the right to submit your contributions under the MIT License
- Your contributions will be licensed under the same MIT License
- You understand this is a voluntary contribution without compensation expectation
Guilherme Grancho
- Email: guilhermegranchopro@gmail.com
- GitHub: @guilhermegranchopro
- LinkedIn: guilherme-grancho
- Location: Portugal ๐ต๐น
- Live Application: https://ecoaily.streamlit.app/
- GitHub Repository: https://github.com/guilhermegranchopro/Eco-AI.ly
- Project Website: https://eco-ai.ly (coming soon)
- Documentation: Available in repository
/docsfolder
Getting Help:
- GitHub Issues: Report bugs or request features
- GitHub Discussions: Community questions and discussions
- Email Support: support@ecoai.ly (coming soon)
Community Guidelines:
- Be Respectful: Treat all community members with respect and kindness
- Stay On-Topic: Keep discussions relevant to environmental monitoring and AI
- Help Others: Share knowledge and assist fellow developers
- Report Issues: Help maintain a positive community environment
For business partnerships, licensing, or commercial support:
- Business Email: business@ecoai.ly (coming soon)
- Consulting: Custom environmental AI solutions available
- Enterprise Support: Dedicated support packages available
- Training: Workshops and training sessions on environmental AI
Organizations We Partner With:
- Environmental research institutions
- Sustainable energy companies
- Climate technology startups
- Academic research programs
- Government environmental agencies
Collaboration Opportunities:
- Research partnerships
- Data sharing agreements
- Joint development projects
- Educational initiatives
- Open source contributions
- Enhanced Next.js Dashboard: Complete modern UI implementation
- Additional Countries: Expand beyond Portugal to EU countries
- Mobile App: React Native mobile application
- Real-time Alerts: Push notifications for environmental thresholds
- Advanced AI Models: Transformer-based prediction models
- Weather Integration: Weather data correlation with energy patterns
- Carbon Offset Calculator: Personal and business carbon footprint tools
- API Marketplace: Public API for third-party integrations
- Global Coverage: Worldwide environmental monitoring platform
- Multi-modal AI: Computer vision for satellite environmental data
- Blockchain Integration: Carbon credit and renewable energy certificates
- Enterprise Platform: B2B environmental monitoring solutions
- ElectricityMaps: For providing excellent real-time energy data API
- Streamlit Team: For creating an amazing framework for data applications
- Open Source Community: For all the incredible tools and libraries
- Beta Testers: Early users who provided valuable feedback
- Environmental Scientists: Domain experts who guided our approach
- Python Ecosystem: NumPy, Pandas, Scikit-learn, TensorFlow
- Web Technologies: HTML5, CSS3, JavaScript ES6+, TypeScript
- AI/ML Tools: Jupyter, Keras, Matplotlib, Plotly
- DevOps Tools: Docker, GitHub Actions, Git
- Design Tools: Figma, Adobe Creative Suite
Gaia is committed to environmental responsibility:
- Carbon Neutral Hosting: Using green hosting providers when possible
- Efficient Code: Optimized algorithms to minimize computational resources
- Sustainable Development: Remote-first development to reduce travel emissions
- Open Source: Sharing knowledge to accelerate environmental solutions globally
GAIA is the first product of Eco AI.ly - a forward-thinking startup dedicated to making AI and technology more environmentally sustainable.
Founded by Duarte Alexandrino and Guilherme Grancho, two Imperial College London students passionate about combining cutting-edge AI with environmental responsibility.
"Technology is best when it brings people together for a sustainable tomorrow."
๐ Join us in building the future of sustainable AI
