Finish step1

dvirlabs · 2026-02-15 20:04:11 +02:00
parent 24e1e7aae2 · commit eff48933d3
12 changed files with 574 additions and 66 deletions

Dockerfile (new file)

@@ -0,0 +1,15 @@
```dockerfile
FROM python:3.11-slim

ENV PYTHONDONTWRITEBYTECODE=1 \
    PYTHONUNBUFFERED=1

WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .

EXPOSE 8000
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
```

README.md (new file)

@@ -0,0 +1,195 @@
# Open-Meteo Coordinates Service
A FastAPI-based microservice that queries the Open-Meteo Geocoding API to retrieve coordinates for various cities. The service includes caching, Prometheus metrics, and Grafana dashboards for monitoring.
## Features
- **RESTful API** for retrieving city coordinates
- **Intelligent caching** to reduce external API calls
- **Prometheus metrics** for observability
- **Pre-configured Grafana dashboards** with 5 panels
- **Docker Compose** setup for easy deployment
## Prerequisites
- Docker
- Docker Compose
## Quick Start
1. **Clone the repository**
```bash
cd open-meteo-service
```
2. **Start all services**
```bash
docker compose up --build
```
3. **Access the services**
- API: http://localhost:8000/docs
- Prometheus: http://localhost:9090
- Grafana: http://localhost:3000 (admin/admin)
## API Documentation
### Endpoints
#### `GET /coordinates`
Retrieve coordinates for all configured cities.
**Response:**
```json
{
  "source": "cache",
  "data": {
    "Tel Aviv": {
      "name": "Tel Aviv",
      "latitude": 32.08088,
      "longitude": 34.78057,
      "country": "Israel"
    },
    ...
  }
}
```
#### `GET /coordinates/{city}`
Retrieve coordinates for a specific city.
**Parameters:**
- `city` (path) - City name (e.g., "Ashkelon", "London")
**Example:**
```bash
curl http://localhost:8000/coordinates/Paris
```
**Response:**
```json
{
  "source": "open-meteo",
  "data": {
    "name": "Paris",
    "latitude": 48.85341,
    "longitude": 2.3488,
    "country": "France"
  }
}
```
#### `GET /metrics`
Prometheus metrics endpoint exposing service metrics.
**Example:**
```bash
curl http://localhost:8000/metrics
```
#### `GET /healthz`
Health check endpoint.
**Response:**
```json
{
  "status": "ok"
}
```
## Metrics
The service exposes the following Prometheus metrics:
### HTTP Metrics
- **`http_requests_total`** - Counter of total HTTP requests
  - Labels: `endpoint`, `method`, `status`
- **`http_request_duration_seconds`** - Histogram of request durations
  - Labels: `endpoint`, `method`
### Cache Metrics
- **`coordinates_cache_hits_total`** - Counter of cache hits
- **`coordinates_cache_misses_total`** - Counter of cache misses
### External API Metrics
- **`openmeteo_api_calls_total`** - Counter of calls to the Open-Meteo Geocoding API
  - Labels: `city`
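These counters compose into derived signals in PromQL; for example, a cache hit ratio over the last five minutes could be computed with a query along these lines (a sketch, not one of the pre-configured panels):

```promql
sum(rate(coordinates_cache_hits_total[5m]))
  /
(sum(rate(coordinates_cache_hits_total[5m])) + sum(rate(coordinates_cache_misses_total[5m])))
```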
## Grafana Dashboard
The pre-configured dashboard includes 5 panels:
1. **Request Rate** - Requests per second by endpoint
2. **Request Duration p95** - 95th percentile latency
3. **Cache Hits vs Misses** - Cache effectiveness
4. **Open-Meteo Calls by City** - External API usage per city
5. **Requests by Status** - HTTP status code distribution
Access the dashboard at http://localhost:3000 after logging in with `admin/admin`.
## Caching
The service uses a local JSON file (`coordinates_cache.json`) to cache city coordinates:
- Reduces external API calls
- Shared across all API endpoints
- Persists between requests, but not across container restarts (the file lives inside the container filesystem)
- Automatically updated when new cities are queried
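The read-through flow described above can be sketched as a standalone helper (hypothetical names; the real logic lives in `app/service.py`):

```python
import json
import os
import tempfile

def read_through(cache_file, key, fetch):
    """Return (source, value): serve key from the JSON file cache,
    calling fetch(key) and persisting the result only on a miss."""
    cache = {}
    if os.path.exists(cache_file):
        with open(cache_file, "r", encoding="utf-8") as f:
            cache = json.load(f)
    if key in cache:
        return "cache", cache[key]
    value = fetch(key)
    cache[key] = value
    with open(cache_file, "w", encoding="utf-8") as f:
        json.dump(cache, f, indent=2)
    return "open-meteo", value

path = os.path.join(tempfile.mkdtemp(), "coordinates_cache.json")
first = read_through(path, "Paris", lambda city: {"latitude": 48.85341})
second = read_through(path, "Paris", lambda city: {"latitude": 48.85341})
# first comes from the fetch callback, second from the cache file
```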
## Development
### Project Structure
```
.
├── app/
│   ├── main.py            # FastAPI application
│   ├── service.py         # Business logic & caching
│   └── metrics.py         # Prometheus metrics definitions
├── grafana/
│   ├── provisioning/      # Auto-configured datasources & dashboards
│   └── dashboards/        # Dashboard JSON definitions
├── docker-compose.yml     # Service orchestration
├── Dockerfile             # Python app container
├── prometheus.yml         # Prometheus scrape configuration
└── requirements.txt       # Python dependencies
```
### Stop Services
```bash
docker compose down
```
### View Logs
```bash
docker compose logs -f open-meteo-service
```
### Rebuild After Code Changes
```bash
docker compose up --build
```
## Configuration
### Environment Variables
- `CACHE_FILE` - Path to cache file (default: `coordinates_cache.json`)
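To point the cache somewhere else under Docker Compose, the variable could be set on the service (a sketch; the shipped compose file does not set it, and `/data/coordinates_cache.json` is an illustrative path):

```yaml
services:
  open-meteo-service:
    environment:
      - CACHE_FILE=/data/coordinates_cache.json
```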
### Scrape Interval
Edit `prometheus.yml` to adjust the scrape interval (default: 15s).
## Testing
Generate test traffic to populate metrics:
```bash
# Test all endpoints
curl http://localhost:8000/coordinates
curl http://localhost:8000/coordinates/Paris
curl http://localhost:8000/coordinates/London
# Generate load
for i in {1..10}; do curl -s http://localhost:8000/coordinates > /dev/null; done
```
## License
MIT

app/main.py (new file)

@@ -0,0 +1,36 @@
```python
from fastapi import FastAPI, Response
from prometheus_client import generate_latest, CONTENT_TYPE_LATEST

from .service import get_all_coordinates, get_coordinates_for_city
from .metrics import RequestTimer

app = FastAPI(title="Open-Meteo Coordinates Service")


@app.get("/coordinates")
def coordinates():
    with RequestTimer(endpoint="/coordinates", method="GET") as t:
        try:
            return get_all_coordinates()
        except Exception:
            t.set_status("500")
            raise


@app.get("/coordinates/{city}")
def coordinates_for_city(city: str):
    with RequestTimer(endpoint="/coordinates/{city}", method="GET") as t:
        try:
            return get_coordinates_for_city(city)
        except Exception:
            t.set_status("500")
            raise


@app.get("/metrics")
def metrics():
    return Response(generate_latest(), media_type=CONTENT_TYPE_LATEST)


@app.get("/healthz")
def healthz():
    return {"status": "ok"}
```

app/metrics.py (new file)

@@ -0,0 +1,54 @@
```python
import time

from prometheus_client import Counter, Histogram

HTTP_REQUESTS_TOTAL = Counter(
    "http_requests_total",
    "Total HTTP requests",
    ["endpoint", "method", "status"],
)

HTTP_REQUEST_DURATION_SECONDS = Histogram(
    "http_request_duration_seconds",
    "HTTP request duration in seconds",
    ["endpoint", "method"],
)

CACHE_HITS_TOTAL = Counter(
    "coordinates_cache_hits_total",
    "Total cache hits for coordinates",
)

CACHE_MISSES_TOTAL = Counter(
    "coordinates_cache_misses_total",
    "Total cache misses for coordinates",
)

OPENMETEO_CALLS_TOTAL = Counter(
    "openmeteo_api_calls_total",
    "Total calls made to Open-Meteo Geocoding API",
    ["city"],
)


class RequestTimer:
    """Small helper to measure request duration and emit metrics."""

    def __init__(self, endpoint: str, method: str):
        self.endpoint = endpoint
        self.method = method
        self.start = None
        self.status = "200"

    def __enter__(self):
        self.start = time.time()
        return self

    def set_status(self, status: str):
        self.status = status

    def __exit__(self, exc_type, exc, tb):
        HTTP_REQUESTS_TOTAL.labels(
            endpoint=self.endpoint, method=self.method, status=self.status
        ).inc()
        HTTP_REQUEST_DURATION_SECONDS.labels(
            endpoint=self.endpoint, method=self.method
        ).observe(time.time() - self.start)
```
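The context-manager pattern above can be illustrated without `prometheus_client` by accumulating into plain dicts (a dependency-free sketch, not the shipped code):

```python
import time
from collections import defaultdict

REQUESTS = defaultdict(int)    # (endpoint, method, status) -> count
DURATIONS = defaultdict(list)  # (endpoint, method) -> [seconds]

class Timer:
    """Minimal stand-in for RequestTimer: counts the request and
    records its duration on exit, whatever the outcome."""
    def __init__(self, endpoint, method):
        self.endpoint, self.method, self.status = endpoint, method, "200"
        self.start = None
    def __enter__(self):
        self.start = time.perf_counter()
        return self
    def set_status(self, status):
        self.status = status
    def __exit__(self, exc_type, exc, tb):
        REQUESTS[(self.endpoint, self.method, self.status)] += 1
        DURATIONS[(self.endpoint, self.method)].append(time.perf_counter() - self.start)
        return False  # never swallow exceptions

with Timer("/coordinates", "GET"):
    pass
```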

app/service.py (new file)

@@ -0,0 +1,78 @@
```python
import json
import os
from typing import Any, Dict

import requests

from .metrics import CACHE_HITS_TOTAL, CACHE_MISSES_TOTAL, OPENMETEO_CALLS_TOTAL

API_URL = "https://geocoding-api.open-meteo.com/v1/search"
CITIES = ["Tel Aviv", "Beer Sheva", "Jerusalem", "Szeged"]
CACHE_FILE = os.environ.get("CACHE_FILE", "coordinates_cache.json")


def _fetch_coordinates(city: str) -> Dict[str, Any]:
    OPENMETEO_CALLS_TOTAL.labels(city=city).inc()
    params = {"name": city, "count": 1}
    r = requests.get(API_URL, params=params, timeout=10)
    r.raise_for_status()
    data = r.json()
    if "results" not in data or not data["results"]:
        raise ValueError(f"No results found for {city}")
    result = data["results"][0]
    return {
        "name": result.get("name"),
        "latitude": result.get("latitude"),
        "longitude": result.get("longitude"),
        "country": result.get("country"),
    }


def _load_cache() -> Dict[str, Any] | None:
    if os.path.exists(CACHE_FILE):
        with open(CACHE_FILE, "r", encoding="utf-8") as f:
            return json.load(f)
    return None


def _save_cache(data: Dict[str, Any]) -> None:
    with open(CACHE_FILE, "w", encoding="utf-8") as f:
        json.dump(data, f, indent=2)


def get_all_coordinates() -> Dict[str, Any]:
    cached = _load_cache()
    if cached:
        CACHE_HITS_TOTAL.inc()
        return {"source": "cache", "data": cached}
    CACHE_MISSES_TOTAL.inc()
    results: Dict[str, Any] = {}
    for city in CITIES:
        results[city] = _fetch_coordinates(city)
    _save_cache(results)
    return {"source": "open-meteo", "data": results}


def get_coordinates_for_city(city: str) -> Dict[str, Any]:
    cached = _load_cache()
    if cached and city in cached:
        CACHE_HITS_TOTAL.inc()
        return {"source": "cache", "data": cached[city]}
    CACHE_MISSES_TOTAL.inc()
    result = _fetch_coordinates(city)
    # Update cache with the new city
    if cached is None:
        cached = {}
    cached[city] = result
    _save_cache(cached)
    return {"source": "open-meteo", "data": result}
```

docker-compose.yml (new file)

@@ -0,0 +1,54 @@
```yaml
version: '3.8'

services:
  open-meteo-service:
    build:
      context: .
      dockerfile: Dockerfile
    image: open-meteo-service:local
    container_name: open-meteo-service
    ports:
      - "8000:8000"
    networks:
      - monitoring

  prometheus:
    image: prom/prometheus:latest
    container_name: prometheus
    ports:
      - "9090:9090"
    volumes:
      - ./prometheus.yml:/etc/prometheus/prometheus.yml
      - prometheus_data:/prometheus
    command:
      - "--config.file=/etc/prometheus/prometheus.yml"
      - "--storage.tsdb.path=/prometheus"
    networks:
      - monitoring
    depends_on:
      - open-meteo-service

  grafana:
    image: grafana/grafana:latest
    container_name: grafana
    ports:
      - "3000:3000"
    environment:
      - GF_SECURITY_ADMIN_PASSWORD=admin
      - GF_SECURITY_ADMIN_USER=admin
    volumes:
      - grafana_data:/var/lib/grafana
      - ./grafana/provisioning:/etc/grafana/provisioning
      - ./grafana/dashboards:/var/lib/grafana/dashboards
    networks:
      - monitoring
    depends_on:
      - prometheus

networks:
  monitoring:
    driver: bridge

volumes:
  prometheus_data:
  grafana_data:
```

Grafana dashboard JSON, under grafana/dashboards/ (new file; exact filename not shown)

@@ -0,0 +1,107 @@
```json
{
  "uid": "open-meteo-service",
  "title": "Open-Meteo Service",
  "timezone": "browser",
  "schemaVersion": 38,
  "version": 1,
  "refresh": "10s",
  "time": {
    "from": "now-15m",
    "to": "now"
  },
  "panels": [
    {
      "id": 1,
      "type": "timeseries",
      "title": "Request Rate",
      "datasource": { "type": "prometheus", "uid": "prometheus" },
      "gridPos": { "x": 0, "y": 0, "w": 12, "h": 8 },
      "targets": [
        {
          "expr": "sum(rate(http_requests_total[5m])) by (endpoint, method)",
          "legendFormat": "{{endpoint}} {{method}}",
          "refId": "A"
        }
      ]
    },
    {
      "id": 2,
      "type": "timeseries",
      "title": "Request Duration p95",
      "datasource": { "type": "prometheus", "uid": "prometheus" },
      "gridPos": { "x": 12, "y": 0, "w": 12, "h": 8 },
      "targets": [
        {
          "expr": "histogram_quantile(0.95, sum(rate(http_request_duration_seconds_bucket[5m])) by (le, endpoint, method))",
          "legendFormat": "{{endpoint}} {{method}}",
          "refId": "A"
        }
      ]
    },
    {
      "id": 3,
      "type": "timeseries",
      "title": "Cache Hits vs Misses",
      "datasource": { "type": "prometheus", "uid": "prometheus" },
      "gridPos": { "x": 0, "y": 8, "w": 12, "h": 8 },
      "targets": [
        {
          "expr": "rate(coordinates_cache_hits_total[5m])",
          "legendFormat": "hits",
          "refId": "A"
        },
        {
          "expr": "rate(coordinates_cache_misses_total[5m])",
          "legendFormat": "misses",
          "refId": "B"
        }
      ]
    },
    {
      "id": 4,
      "type": "timeseries",
      "title": "Open-Meteo Calls by City",
      "datasource": { "type": "prometheus", "uid": "prometheus" },
      "gridPos": { "x": 12, "y": 8, "w": 12, "h": 8 },
      "targets": [
        {
          "expr": "sum(rate(openmeteo_api_calls_total[5m])) by (city)",
          "legendFormat": "{{city}}",
          "refId": "A"
        }
      ]
    },
    {
      "id": 5,
      "type": "timeseries",
      "title": "Requests by Status",
      "datasource": { "type": "prometheus", "uid": "prometheus" },
      "gridPos": { "x": 0, "y": 16, "w": 24, "h": 8 },
      "targets": [
        {
          "expr": "sum(rate(http_requests_total[5m])) by (status)",
          "legendFormat": "{{status}}",
          "refId": "A"
        }
      ]
    }
  ],
  "templating": {
    "list": []
  }
}
```

Grafana dashboard provider, under grafana/provisioning/ (new file; exact filename not shown)

@@ -0,0 +1,10 @@
```yaml
apiVersion: 1

providers:
  - name: default
    type: file
    disableDeletion: false
    editable: true
    updateIntervalSeconds: 10
    options:
      path: /var/lib/grafana/dashboards
```

Grafana datasource definition, under grafana/provisioning/ (new file; exact filename not shown)

@@ -0,0 +1,9 @@
```yaml
apiVersion: 1

datasources:
  - name: Prometheus
    type: prometheus
    access: proxy
    url: http://prometheus:9090
    isDefault: true
    uid: prometheus
```

main.py (deleted file)

@@ -1,66 +0,0 @@
```python
import requests
import json
import os

API_URL = "https://geocoding-api.open-meteo.com/v1/search"
CITIES = ["Tel Aviv", "Beersheba", "Jerusalem", "Szeged"]
CACHE_FILE = "coordinates_cache.json"


def fetch_coordinates(city):
    params = {
        "name": city,
        "count": 1
    }
    response = requests.get(API_URL, params=params)
    response.raise_for_status()
    data = response.json()
    if "results" not in data:
        raise ValueError(f"No results found for {city}")
    result = data["results"][0]
    return {
        "name": result["name"],
        "latitude": result["latitude"],
        "longitude": result["longitude"],
        "country": result["country"]
    }


def load_cache():
    if os.path.exists(CACHE_FILE):
        with open(CACHE_FILE, "r") as f:
            return json.load(f)
    return None


def save_cache(data):
    with open(CACHE_FILE, "w") as f:
        json.dump(data, f, indent=4)


def main():
    # Try loading cached data first
    cached_data = load_cache()
    if cached_data:
        print("Loaded from cache:")
        print(json.dumps(cached_data, indent=4))
        return

    # If no cache, fetch from API
    print("Fetching from Open-Meteo API...")
    results = {}
    for city in CITIES:
        results[city] = fetch_coordinates(city)
    save_cache(results)
    print("Saved to cache:")
    print(json.dumps(results, indent=4))


if __name__ == "__main__":
    main()
```

prometheus.yml (new file)

@@ -0,0 +1,12 @@
```yaml
global:
  scrape_interval: 15s

scrape_configs:
  - job_name: "prometheus"
    static_configs:
      - targets: ["prometheus:9090"]
  - job_name: "open-meteo-service"
    metrics_path: "/metrics"
    static_configs:
      - targets: ["open-meteo-service:8000"]
```

requirements.txt (new file)

@@ -0,0 +1,4 @@
```
fastapi==0.115.8
uvicorn[standard]==0.30.6
requests==2.32.3
prometheus-client==0.21.1
```