Compare commits

...

38 Commits

Author SHA1 Message Date
1df1c6bd83 [P] Add primary use-case activity diagram
All checks were successful
Component testing / Hub testing (push) Successful in 24s
Component testing / Store testing (push) Successful in 24s
Component testing / Integration smoke testing (push) Successful in 2m29s
2026-03-24 16:04:08 +02:00
8160f030cd [P] Simplify general sequence diagram 2026-03-24 16:04:08 +02:00
a5dcf09bf2 [P] Move diagrams into a dedicated directory 2026-03-24 16:04:08 +02:00
8548714537 [P] Convert use case diagram files to PlantUML format 2026-03-24 16:04:08 +02:00
942ce98c4b [P] Convert sequence diagram files to PlantUML format 2026-03-24 16:04:08 +02:00
d4e369d5f8 Add use-case and sequence diagrams 2026-03-24 16:04:08 +02:00
764fb77f27 Merge from upstream 'github/dev' into dev
All checks were successful
Component testing / Hub testing (push) Successful in 21s
Component testing / Store testing (push) Successful in 18s
Component testing / Integration smoke testing (push) Successful in 2m14s
2026-03-24 15:21:56 +02:00
VladiusVostokus
a55fc17711 Merge pull request #27 from Rhinemann/lab4/hrynko-SCRUM-80-hubadapter
implement Edge-Hub integration with user_id validation (SCRUM-93, SCRUM-94)
2026-03-24 13:20:39 +00:00
b34e385128 [L4] Remove irrelevant environment variables 2026-03-24 14:42:13 +02:00
a8a0ef5e15 [L4] Remove unused import 2026-03-24 14:09:53 +02:00
00b037a243 [L4] Remove unnecessary environment variable left over after print -> logging usage switch 2026-03-24 14:07:49 +02:00
d1b6c0eed1 [L4] Fix logging level on error message 2026-03-24 14:03:29 +02:00
5e890d4f03 [L4] Remove obvious single code line comments to reduce risk of misleading comments in the future 2026-03-24 14:02:03 +02:00
a8e50d0386 [L4] Remove excessive library import and clean up edge codebase 2026-03-24 13:58:32 +02:00
1b42be264d [L4] Remove misleading batch_size parameter from AgentMQTTAdapter 2026-03-24 13:37:33 +02:00
esk4nz
b12bdc334c fix: improve logging clarity and ensure data delivery in AgentMQTTAdapter 2026-03-23 23:41:25 +02:00
esk4nz
e8ff1c6cbd Replaced busy-wait loop with threading.Event to fix 100% CPU load 2026-03-23 23:28:17 +02:00
esk4nz
ad70519f47 SCRUM-80
Removed manual user_id fallback in data processing
2026-03-23 23:00:19 +02:00
esk4nz
b10aec1020 SCRUM-80
Changed print to logging in agent main
2026-03-23 22:57:15 +02:00
esk4nz
c085a49c8c implement Edge-Hub integration with user_id validation (SCRUM-93, SCRUM-94)
- Agent: Updated config and main
- Edge: Implemented adapter factory in main.py to switch between MQTT and HTTP.
- Edge: Updated AgentData entity and processing logic to support user_id.
- Infrastructure: Configured docker-compose for dynamic protocol switching and environment management.
2026-03-23 21:31:31 +02:00
0b8d2eb18b [P] Add container rebuilding on every rerun
All checks were successful
Component testing / Hub testing (push) Successful in 18s
Component testing / Store testing (push) Successful in 22s
Component testing / Integration smoke testing (push) Successful in 2m33s
2026-03-23 18:31:43 +02:00
2846130e4e [P] Add docker reset workflow
All checks were successful
Component testing / Hub testing (push) Successful in 20s
Component testing / Store testing (push) Successful in 21s
Component testing / Integration smoke testing (push) Successful in 1m34s
2026-03-23 18:26:38 +02:00
30af132033 [P] Add general smoke test and Store incremental test
All checks were successful
Component testing / Hub testing (push) Successful in 22s
Component testing / Store testing (push) Successful in 19s
Component testing / Integration smoke testing (push) Successful in 1m28s
2026-03-23 18:01:41 +02:00
60a846d8b8 [P] Refactor testing code 2026-03-23 16:10:09 +02:00
fe6bb6ab3a [P] Add CI for updated Hub component part
All checks were successful
Hub component testing / hub-test (push) Successful in 25s
2026-03-23 15:53:44 +02:00
ІО-23 Shmuliar Oleh
30f81ec1ae Merge pull request #26 from Rhinemann/lab4/shved-SCRUM-95-test-repo-functionality
set up global docker-compose
2026-03-22 21:59:07 +02:00
1b6f47fa0d [L4] Fix relative paths after file move 2026-03-22 21:13:44 +02:00
b1e6ad7c94 set up global docker-compose 2026-03-22 14:07:29 +01:00
VladiusVostokus
1eddfd966b Merge pull request #24 from Rhinemann/lab5/shmuliar-SCRUM-92-mapview-store-integration
SCRUM 92: mapview store integration
2026-03-14 16:07:30 +00:00
8af68d6dd9 hotfix: index overflow on user_id 2026-03-13 20:47:09 +02:00
63aca15824 add multiuser rendering support 2026-03-13 20:41:16 +02:00
ee509f72a4 pull data in MapView/main.py from actual data source 2026-03-13 19:02:07 +02:00
da9fe69d4e add initial server->client update with all current DB data 2026-03-13 19:01:33 +02:00
1c856dca0e fix MapView/main.py crash due to wrong check condition 2026-03-13 18:58:28 +02:00
VladiusVostokus
17738d07fe Merge pull request #21 from Rhinemann/lab5/gryshaiev-SCRUM-90-set-bump-marker
SCRUM-90: implement set_bump_marker
2026-03-11 17:02:07 +00:00
VladiusVostokus
6b5831ff1b Merge branch 'dev' into lab5/gryshaiev-SCRUM-90-set-bump-marker 2026-03-11 17:01:54 +00:00
VladiusVostokus
54505db70e Merge pull request #23 from Rhinemann/lab5/gryshaiev-SCRUM-89-set-pothole-marker
SCRUM-89: implement set_pothole_marker()
2026-03-11 16:59:35 +00:00
SimonSanich
6f4b3b0ea6 SCRUM-90: implement set_bump_marker 2026-03-11 18:36:40 +02:00
28 changed files with 455 additions and 249 deletions


@@ -0,0 +1,16 @@
name: Reset docker state
on: workflow_dispatch
jobs:
  reset:
    runs-on: host-arch-x86_64
    name: Reset docker state
    steps:
      - name: Stop all containers
        run: docker stop $(docker ps -a | cut -d " " -f 1 | tail -n +2)
      - name: Remove all containers
        run: docker rm $(docker ps -a | cut -d " " -f 1 | tail -n +2)
      - name: Remove extra volumes
        run: docker volume rm road_vision_postgres_data road_vision_pgadmin-data
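The workflow above derives container IDs with `cut`/`tail` rather than `docker ps -aq`. A self-contained replay of that pipeline on fabricated `docker ps -a` output (no Docker required): `cut -d " " -f 1` keeps the first space-delimited column, the container ID, and `tail -n +2` drops the header row.

```shell
# Fabricated sample of `docker ps -a` output, for illustration only.
sample='CONTAINER ID   IMAGE               STATUS          NAMES
1a2b3c4d5e6f   eclipse-mosquitto   Up 2 minutes    mqtt
0f9e8d7c6b5a   postgres:17         Exited (1)      postgres_db'

# First column = container ID; skip the header line.
printf '%s\n' "$sample" | cut -d " " -f 1 | tail -n +2
```

In practice `docker ps -aq` yields the same ID list in one flag, without depending on column layout.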


@@ -0,0 +1,71 @@
name: Component testing
on: [push, workflow_dispatch]
jobs:
  hub-test:
    name: Hub testing
    runs-on: host-arch-x86_64
    steps:
      - name: Clone repository
        run: git clone --revision ${{ gitea.sha }} --depth 1 ${{ gitea.server_url }}/${{ gitea.repository }}
      - name: Build Hub testing container
        working-directory: IoT-Systems
        run: docker build -t local/hub/${{gitea.sha}} -f hub/Dockerfile-test .
      - name: Run Hub tests
        working-directory: IoT-Systems
        run: docker run --rm -it local/hub/${{gitea.sha}}
      - name: Clean up containers
        if: ${{always()}}
        run: docker image rm local/hub/${{gitea.sha}}
  store-test:
    name: Store testing
    runs-on: host-arch-x86_64
    steps:
      - name: Clone repository
        run: git clone --revision ${{ gitea.sha }} --depth 1 ${{ gitea.server_url }}/${{ gitea.repository }}
      - name: Build Store testing container
        working-directory: IoT-Systems
        run: docker build -t local/store/${{gitea.sha}} -f store/Dockerfile-test .
      - name: Run Store tests
        working-directory: IoT-Systems
        run: docker run --rm -it local/store/${{gitea.sha}}
      - name: Clean up containers
        if: ${{always()}}
        run: docker image rm local/store/${{gitea.sha}}
  integration-smoke-test:
    name: Integration smoke testing
    runs-on: host-arch-x86_64
    needs:
      - hub-test
      - store-test
    steps:
      - name: Clone repository
        run: git clone --revision ${{ gitea.sha }} --depth 1 ${{ gitea.server_url }}/${{ gitea.repository }}
      - name: Build all production containers
        working-directory: IoT-Systems
        run: docker-compose build
      - name: Start all production containers
        working-directory: IoT-Systems
        run: docker-compose up -d
      - name: Wait for crashes to happen
        run: sleep 30
      - name: Check for dead containers
        working-directory: IoT-Systems
        run: docker ps -a | python3 utils/check-up.py
      - name: Clean up
        if: ${{always()}}
        working-directory: IoT-Systems
        run: docker-compose down -v


@@ -75,6 +75,7 @@ class Datasource:
                 processed_agent_data.latitude,
                 processed_agent_data.longitude,
                 processed_agent_data.road_state,
+                processed_agent_data.user_id
             )
             for processed_agent_data in processed_agent_data_list
         ]


@@ -5,6 +5,14 @@ from kivy.clock import Clock
 from lineMapLayer import LineMapLayer
 from datasource import Datasource
+line_layer_colors = [
+    [1, 0, 0, 1],
+    [1, 0.5, 0, 1],
+    [0, 1, 0, 1],
+    [0, 1, 1, 1],
+    [0, 0, 1, 1],
+    [1, 0, 1, 1],
+]
 class MapViewApp(App):
     def __init__(self, **kwargs):
@@ -12,17 +20,19 @@ class MapViewApp(App):
         self.mapview = None
         self.datasource = Datasource(user_id=1)
-        self.line_layer = None
-        self.car_marker = None
+        self.line_layers = dict()
+        self.car_markers = dict()
         # add the necessary variables
+        self.bump_markers = []
         self.pothole_markers = []
     def on_start(self):
         """
         Sets the required markers and calls the map update function
         """
-        Clock.schedule_interval(self.update, 0.3)
+        self.update()
+        Clock.schedule_interval(self.update, 5)
     def update(self, *args):
         """
@@ -35,13 +45,17 @@ class MapViewApp(App):
         for point in new_points:
-            lat, lon, road_state = point
+            lat, lon, road_state, user_id = point
             # Updates the route line
-            self.line_layer.add_point((lat, lon))
+            if user_id not in self.line_layers:
+                self.line_layers[user_id] = LineMapLayer(color=line_layer_colors[user_id % len(line_layer_colors)])
+                self.mapview.add_layer(self.line_layers[user_id])
+            self.line_layers[user_id].add_point((lat, lon))
             # Updates the car marker
-            self.update_car_marker((lat, lon))
+            self.update_car_marker(lat, lon, user_id)
             # Check the road state
             self.check_road_quality(point)
@@ -54,26 +68,24 @@ class MapViewApp(App):
         if len(point) < 3:
             return
-        lat, lon, road_state = point
+        lat, lon, road_state, user_id = point
         if road_state == "pothole":
             self.set_pothole_marker((lat, lon))
         elif road_state == "bump":
             self.set_bump_marker((lat, lon))
-    def update_car_marker(self, point):
+    def update_car_marker(self, lat, lon, user_id):
         """
         Updates the display of the car marker on the map
         :param point: GPS coordinates
         """
-        lat, lon = point[0], point[1]
-        if not hasattr(self, 'car_marker'):
-            self.car_marker = MapMarker(lat=lat, lon=lon, source='./images/car')
-            self.mapview.add_marker(self.car_marker)
+        if user_id not in self.car_markers:
+            self.car_markers[user_id] = MapMarker(lat=lat, lon=lon, source='./images/car.png')
+            self.mapview.add_marker(self.car_markers[user_id])
         else:
-            self.car_marker.lat = lat
-            self.car_marker.lon = lon
+            self.car_markers[user_id].lat = lat
+            self.car_markers[user_id].lon = lon
         self.mapview.center_on(lat, lon)
@@ -97,10 +109,24 @@ class MapViewApp(App):
         self.pothole_markers.append(marker)
     def set_bump_marker(self, point):
-        """
-        Sets a marker for a speed bump
-        :param point: GPS coordinates
-        """
+        if isinstance(point, dict):
+            lat = point.get("lat")
+            lon = point.get("lon")
+        else:
+            lat, lon = point
+        if lat is None or lon is None:
+            return
+        marker = MapMarker(
+            lat=lat,
+            lon=lon,
+            source="images/bump.png"
+        )
+        self.mapview.add_marker(marker)
+        self.bump_markers.append(marker)
     def build(self):
         """
@@ -113,9 +139,6 @@ class MapViewApp(App):
             lon=30.5234
         )
-        self.line_layer = LineMapLayer()
-        self.mapview.add_layer(self.line_layer)
         return self.mapview
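The per-user bookkeeping this diff introduces can be sketched without Kivy. In this minimal, Kivy-free sketch, `FakeLineLayer` is a hypothetical stand-in for `LineMapLayer`: each `user_id` lazily gets its own layer, and colors cycle via `user_id % len(line_layer_colors)` so concurrent users stay visually distinct.

```python
# Minimal Kivy-free sketch of the diff's per-user layer bookkeeping.
# FakeLineLayer is a hypothetical stand-in for LineMapLayer.
line_layer_colors = [
    [1, 0, 0, 1], [1, 0.5, 0, 1], [0, 1, 0, 1],
    [0, 1, 1, 1], [0, 0, 1, 1], [1, 0, 1, 1],
]

class FakeLineLayer:
    def __init__(self, color):
        self.color = color
        self.points = []

    def add_point(self, point):
        self.points.append(point)

line_layers = {}

def add_track_point(user_id, lat, lon):
    # Lazily create one layer per user, as MapViewApp.update() now does.
    if user_id not in line_layers:
        line_layers[user_id] = FakeLineLayer(
            color=line_layer_colors[user_id % len(line_layer_colors)]
        )
    line_layers[user_id].add_point((lat, lon))

add_track_point(1, 50.45, 30.52)
add_track_point(8, 50.46, 30.53)  # 8 % 6 == 2, so user 8 wraps to the third color
add_track_point(1, 50.47, 30.54)
```

The modulo keeps the lookup safe for any `user_id`, at the cost of distant IDs sharing a color once more than six users are on screen.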


@@ -1,34 +0,0 @@
name: "road_vision"
services:
  mqtt:
    image: eclipse-mosquitto
    container_name: mqtt
    volumes:
      - ./mosquitto:/mosquitto
      - ./mosquitto/data:/mosquitto/data
      - ./mosquitto/log:/mosquitto/log
    ports:
      - 1883:1883
      - 9001:9001
    networks:
      mqtt_network:
  fake_agent:
    container_name: agent
    build:
      context: ../../
      dockerfile: agent/Dockerfile
    depends_on:
      - mqtt
    environment:
      MQTT_BROKER_HOST: "mqtt"
      MQTT_BROKER_PORT: 1883
      MQTT_TOPIC: "agent_data_topic"
      DELAY: 0.1
    networks:
      mqtt_network:
networks:
  mqtt_network:


@@ -8,7 +8,7 @@ def try_parse(type, value: str):
     return None
-USER_ID = 1
+USER_ID = try_parse(int, os.environ.get("USER_ID")) or 1
 # MQTT config
 MQTT_BROKER_HOST = os.environ.get("MQTT_BROKER_HOST") or "mqtt"
 MQTT_BROKER_PORT = try_parse(int, os.environ.get("MQTT_BROKER_PORT")) or 1883
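The `try_parse(...) or default` pattern this diff extends can be reduced to a small standalone sketch. The `try_parse` body below is inferred from its call sites, not copied from the repo, so treat it as an assumption.

```python
# Sketch of the config's try_parse pattern: parse an environment variable if
# it is present and well-formed, otherwise fall back to a default.
# The function body is inferred from the call sites in config.py.
import os

def try_parse(type_, value):
    try:
        return type_(value)          # e.g. int("7") -> 7
    except (TypeError, ValueError):  # None or malformed input
        return None

USER_ID = try_parse(int, os.environ.get("USER_ID")) or 1
```

One caveat of `or`: a successfully parsed `0` is falsy and also falls back to the default, which is harmless for 1-based user IDs but worth knowing when reusing the pattern.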


@@ -1,6 +1,7 @@
 from paho.mqtt import client as mqtt_client
 from schema.aggregated_data_schema import AggregatedDataSchema
 from file_datasource import FileDatasource
+import logging
 import config
@@ -28,6 +29,7 @@ def publish(client, topic, datasource):
     data = datasource.read()
     msg = AggregatedDataSchema().dumps(data)
     result = client.publish(topic, msg)
+    logging.info(f"Published to {topic}: {msg[:50]}...")
     status = result[0]
     if status != 0:
         print(f"Failed to send message to topic {topic}")


@@ -0,0 +1,19 @@
@startuml
actor Worker as worker
participant Agent as agent
participant "Edge" as edge
participant "Hub" as hub
participant "Store" as store
participant "MapView" as mapview
actor Spectator as spectator
worker -> agent : Start agent
agent -> edge : Send sensor data
edge -> edge : Classify road state
edge -> hub : Send processed data
hub -> hub : Save processed data
hub -> store : Send processed data batch
store -> store : Save processed data batch
store -> mapview : Send new rows
mapview -> spectator : Update paths and markers
@enduml


@@ -0,0 +1,15 @@
@startuml
|User|
start
:Taps the pothole icon;
|System|
:Asks to confirm deletion of the pothole;
|User|
:Confirms deletion of the pothole;
|System|
:Updates the pothole record;
note right #ff9999: Possible exception scenario 1
:Deletes the pothole;
|User|
stop
@enduml

diagrams/use-case.puml Normal file

@@ -0,0 +1,23 @@
@startuml
rectangle IoT-Systems {
usecase "Collect telemetry (accelerometer + GPS)" as uc1
usecase "Send telemetry" as uc2
usecase "Process telemetry" as uc3
usecase "Determine road condition (pothole / bump /\nnormal)" as uc4
usecase "View road defect marks" as uc5
usecase "View route on map" as uc6
}
rectangle "The user is the map operator" as uc10
rectangle "Sensor Agent\n(Device/STM32/Emulator)" as uc11
uc11 - uc1
uc11 - uc2
uc10 - uc5
uc10 - uc6
uc2 -.|> uc3 : <<include>>
uc3 -.|> uc4 : <<include>>
@enduml


@@ -1,12 +1,12 @@
-name: "road_vision__hub"
+name: "road_vision"
 services:
   mqtt:
     image: eclipse-mosquitto
     container_name: mqtt
     volumes:
-      - ./mosquitto:/mosquitto
-      - ./mosquitto/data:/mosquitto/data
-      - ./mosquitto/log:/mosquitto/log
+      - ./agent/docker/mosquitto:/mosquitto
+      - ./agent/docker/mosquitto/data:/mosquitto/data
+      - ./agent/docker/mosquitto/log:/mosquitto/log
     ports:
       - 1883:1883
       - 9001:9001
@@ -14,6 +14,43 @@ services:
       mqtt_network:
+  fake_agent:
+    container_name: agent
+    build:
+      context: .
+      dockerfile: agent/Dockerfile
+    depends_on:
+      - mqtt
+    environment:
+      PYTHONUNBUFFERED: 1
+      MQTT_BROKER_HOST: "mqtt"
+      MQTT_BROKER_PORT: 1883
+      MQTT_TOPIC: "agent_data_topic"
+      DELAY: 0.1
+    networks:
+      mqtt_network:
+  edge:
+    container_name: edge
+    build:
+      context: .
+      dockerfile: edge/Dockerfile
+    depends_on:
+      - mqtt
+    environment:
+      MQTT_BROKER_HOST: "mqtt"
+      MQTT_BROKER_PORT: 1883
+      MQTT_TOPIC: "agent_data_topic"
+      HUB_HOST: "hub"
+      HUB_PORT: 8000
+      HUB_CONNECTION_TYPE: "http"
+      HUB_MQTT_BROKER_HOST: "mqtt"
+      HUB_MQTT_BROKER_PORT: 1883
+      HUB_MQTT_TOPIC: "processed_data_topic"
+    networks:
+      mqtt_network:
+      edge_hub:
   postgres_db:
     image: postgres:17
     container_name: postgres_db
@@ -24,13 +61,12 @@ services:
       POSTGRES_DB: test_db
     volumes:
       - postgres_data:/var/lib/postgresql/data
-      - ./db/structure.sql:/docker-entrypoint-initdb.d/structure.sql
+      - ./store/docker/db/structure.sql:/docker-entrypoint-initdb.d/structure.sql
     ports:
       - "5432:5432"
     networks:
       db_network:
   pgadmin:
     container_name: pgadmin4
     image: dpage/pgadmin4
@@ -49,12 +85,13 @@ services:
   store:
     container_name: store
     build:
-      context: ../../
+      context: .
       dockerfile: store/Dockerfile
     depends_on:
      - postgres_db
     restart: always
     environment:
+      PYTHONUNBUFFERED: 1
       POSTGRES_USER: user
       POSTGRES_PASSWORD: pass
       POSTGRES_DB: test_db
@@ -79,13 +116,14 @@ services:
   hub:
     container_name: hub
     build:
-      context: ../../
+      context: .
       dockerfile: hub/Dockerfile
     depends_on:
       - mqtt
       - redis
       - store
     environment:
+      PYTHONUNBUFFERED: 1
       STORE_API_HOST: "store"
       STORE_API_PORT: 8000
       REDIS_HOST: "redis"
@@ -101,10 +139,11 @@ services:
       hub_store:
       hub_redis:
 networks:
   mqtt_network:
   db_network:
+  edge_hub:
+  hub:
   hub_store:
   hub_redis:


@@ -13,9 +13,7 @@ class AgentMQTTAdapter(AgentGateway):
         broker_port,
         topic,
         hub_gateway: HubGateway,
-        batch_size=10,
     ):
-        self.batch_size = batch_size
         # MQTT
         self.broker_host = broker_host
         self.broker_port = broker_port
@@ -35,42 +33,21 @@ class AgentMQTTAdapter(AgentGateway):
         """Processing agent data and sent it to hub gateway"""
         try:
             payload: str = msg.payload.decode("utf-8")
-            # Create AgentData instance with the received data
             agent_data = AgentData.model_validate_json(payload, strict=True)
-            # Process the received data (you can call a use case here if needed)
             processed_data = process_agent_data(agent_data)
-            # Store the agent_data in the database (you can send it to the data processing module)
-            if not self.hub_gateway.save_data(processed_data):
-                logging.error("Hub is not available")
+            if self.hub_gateway.save_data(processed_data):
+                logging.info("Processed data successfully forwarded to the Hub.")
+            else:
+                logging.error("Failed to send data: Hub gateway is unavailable.")
         except Exception as e:
-            logging.info(f"Error processing MQTT message: {e}")
+            logging.error(f"Error processing MQTT message: {e}")
     def connect(self):
         self.client.on_connect = self.on_connect
         self.client.on_message = self.on_message
         self.client.connect(self.broker_host, self.broker_port, 60)
-    def start(self):
-        self.client.loop_start()
-    def stop(self):
-        self.client.loop_stop()
+    def loop_forever(self):
+        self.client.loop_forever()
-# Usage example:
-if __name__ == "__main__":
-    broker_host = "localhost"
-    broker_port = 1883
-    topic = "agent_data_topic"
-    # Assuming you have implemented the StoreGateway and passed it to the adapter
-    store_gateway = HubGateway()
-    adapter = AgentMQTTAdapter(broker_host, broker_port, topic, store_gateway)
-    adapter.connect()
-    adapter.start()
-    try:
-        # Keep the adapter running in the background
-        while True:
-            pass
-    except KeyboardInterrupt:
-        adapter.stop()
-        logging.info("Adapter stopped.")
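The removed `while True: pass` loop is what commit e8ff1c6cbd describes as the 100% CPU bug: the interpreter spins instead of sleeping. A minimal sketch of the `threading.Event` fix that commit names; paho's `loop_forever()` used in this diff achieves the same by blocking in its own network loop. The `Runner` class here is illustrative, not the project's adapter.

```python
# Sketch: replace a busy-wait with a blocking Event. Event.wait() sleeps in
# the kernel until set() fires (or the timeout expires), consuming no CPU.
import threading

class Runner:
    def __init__(self):
        self._stop = threading.Event()

    def loop_forever(self, timeout=None):
        # Returns True if stop() was called, False on timeout.
        return self._stop.wait(timeout)

    def stop(self):
        # Safe to call from another thread or a signal handler.
        self._stop.set()

runner = Runner()
runner.stop()
print(runner.loop_forever(timeout=1.0))  # True: wakes immediately, no spinning
```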


@@ -14,6 +14,7 @@ class GpsData(BaseModel):
 class AgentData(BaseModel):
+    user_id: int
     accelerometer: AccelerometerData
     gps: GpsData
     timestamp: datetime


@@ -26,15 +26,8 @@ class AgentGateway(ABC):
         pass
     @abstractmethod
-    def start(self):
+    def loop_forever(self):
         """
-        Method to start listening for messages from the agent.
+        Method to await for new messages.
         """
         pass
-    @abstractmethod
-    def stop(self):
-        """
-        Method to stop the agent gateway and clean up resources.
-        """
-        pass


@@ -13,3 +13,7 @@ def process_agent_data(
         processed_data_batch (ProcessedAgentData): Processed data containing the classified state of the road surface and agent data.
     """
     # Implement it
+    return ProcessedAgentData(
+        road_state="normal",
+        agent_data=agent_data
+    )
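The stub above always returns `"normal"`. The sequence diagram's "Classify road state" step implies something threshold-based on the accelerometer; a purely hypothetical sketch of that idea follows — the `FakeAgentData` dataclass, the thresholds, and the z-axis heuristic are all illustrative assumptions, not the project's real entities or algorithm.

```python
# Hypothetical road-state classifier sketch. A bump spikes vertical
# acceleration upward, a pothole downward; everything else is "normal".
# Thresholds are made up for illustration.
from dataclasses import dataclass

@dataclass
class FakeAgentData:
    z: float  # vertical acceleration, stand-in for agent_data.accelerometer.z

def classify_road_state(agent_data, bump_threshold=2.0, pothole_threshold=-2.0):
    if agent_data.z > bump_threshold:
        return "bump"
    if agent_data.z < pothole_threshold:
        return "pothole"
    return "normal"

print(classify_road_state(FakeAgentData(z=0.3)))   # normal
print(classify_road_state(FakeAgentData(z=3.1)))   # bump
```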


@@ -16,9 +16,12 @@ MQTT_TOPIC = os.environ.get("MQTT_TOPIC") or "agent_data_topic"
 # Configuration for hub MQTT
 HUB_MQTT_BROKER_HOST = os.environ.get("HUB_MQTT_BROKER_HOST") or "localhost"
 HUB_MQTT_BROKER_PORT = try_parse_int(os.environ.get("HUB_MQTT_BROKER_PORT")) or 1883
-HUB_MQTT_TOPIC = os.environ.get("HUB_MQTT_TOPIC") or "processed_agent_data_topic"
+HUB_MQTT_TOPIC = os.environ.get("HUB_MQTT_TOPIC") or "processed_data_topic"
 # Configuration for the Hub
 HUB_HOST = os.environ.get("HUB_HOST") or "localhost"
-HUB_PORT = try_parse_int(os.environ.get("HUB_PORT")) or 12000
+HUB_PORT = try_parse_int(os.environ.get("HUB_PORT")) or 8000
 HUB_URL = f"http://{HUB_HOST}:{HUB_PORT}"
+# For choosing type of connection
+HUB_CONNECTION_TYPE = os.environ.get("HUB_CONNECTION_TYPE") or "mqtt"


@@ -1,50 +0,0 @@
version: "3.9"
# name: "road_vision"
services:
  mqtt:
    image: eclipse-mosquitto
    container_name: mqtt
    volumes:
      - ./mosquitto:/mosquitto
      - ./mosquitto/data:/mosquitto/data
      - ./mosquitto/log:/mosquitto/log
    ports:
      - 1883:1883
      - 19001:9001
    networks:
      mqtt_network:
  edge:
    container_name: edge
    build:
      context: ../../
      dockerfile: edge/Dockerfile
    depends_on:
      - mqtt
    environment:
      MQTT_BROKER_HOST: "mqtt"
      MQTT_BROKER_PORT: 1883
      MQTT_TOPIC: " "
      HUB_HOST: "store"
      HUB_PORT: 8000
      HUB_MQTT_BROKER_HOST: "mqtt"
      HUB_MQTT_BROKER_PORT: 1883
      HUB_MQTT_TOPIC: "processed_data_topic"
    networks:
      mqtt_network:
      edge_hub:
networks:
  mqtt_network:
  db_network:
  edge_hub:
  hub:
  hub_store:
  hub_redis:
volumes:
  postgres_data:
  pgadmin-data:


@@ -10,42 +10,51 @@ from config import (
     HUB_MQTT_BROKER_HOST,
     HUB_MQTT_BROKER_PORT,
     HUB_MQTT_TOPIC,
+    HUB_CONNECTION_TYPE,
 )
 if __name__ == "__main__":
     # Configure logging settings
     logging.basicConfig(
-        level=logging.INFO,  # Set the log level to INFO (you can use logging.DEBUG for more detailed logs)
+        level=logging.INFO,
         format="[%(asctime)s] [%(levelname)s] [%(module)s] %(message)s",
         handlers=[
-            logging.StreamHandler(),  # Output log messages to the console
-            logging.FileHandler("app.log"),  # Save log messages to a file
+            logging.StreamHandler(),
+            logging.FileHandler("app.log"),
         ],
     )
-    # Create an instance of the StoreApiAdapter using the configuration
-    # hub_adapter = HubHttpAdapter(
-    #     api_base_url=HUB_URL,
-    # )
-    hub_adapter = HubMqttAdapter(
-        broker=HUB_MQTT_BROKER_HOST,
-        port=HUB_MQTT_BROKER_PORT,
-        topic=HUB_MQTT_TOPIC,
-    )
+    # Logic to select the adapter based on configuration (SCRUM-93 & SCRUM-94)
+    # This allows easy switching between HTTP and MQTT protocols
+    if HUB_CONNECTION_TYPE.lower() == "http":
+        logging.info("Initializing HubHttpAdapter (SCRUM-93 integration)")
+        hub_adapter = HubHttpAdapter(
+            api_base_url=HUB_URL,
+        )
+    else:
+        logging.info("Initializing HubMqttAdapter (SCRUM-94 integration)")
+        hub_adapter = HubMqttAdapter(
+            broker=HUB_MQTT_BROKER_HOST,
+            port=HUB_MQTT_BROKER_PORT,
+            topic=HUB_MQTT_TOPIC,
+        )
-    # Create an instance of the AgentMQTTAdapter using the configuration
+    # Create an instance of the AgentMQTTAdapter using the selected hub adapter
+    # This adapter acts as a bridge between the Agent and the Hub
     agent_adapter = AgentMQTTAdapter(
         broker_host=MQTT_BROKER_HOST,
         broker_port=MQTT_BROKER_PORT,
         topic=MQTT_TOPIC,
         hub_gateway=hub_adapter,
     )
     try:
-        # Connect to the MQTT broker and start listening for messages
+        logging.info(f"Connecting to MQTT broker at {MQTT_BROKER_HOST}:{MQTT_BROKER_PORT}")
         agent_adapter.connect()
-        agent_adapter.start()
-        # Keep the system running indefinitely (you can add other logic as needed)
-        while True:
-            pass
+        logging.info("Broker connection success. Waiting for data...")
+        agent_adapter.loop_forever()
     except KeyboardInterrupt:
-        # Stop the MQTT adapter and exit gracefully if interrupted by the user
-        agent_adapter.stop()
-        logging.info("System stopped.")
+        logging.info("Interrupt signal received. Shutting down...")
+        agent_adapter.disconnect()
+        logging.info("Disconnected from MQTT broker.")
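The adapter-selection logic in this diff is a small factory keyed on `HUB_CONNECTION_TYPE`. A stripped-down sketch of that shape, with `FakeHttpAdapter`/`FakeMqttAdapter` as stand-ins for the project's `HubHttpAdapter`/`HubMqttAdapter` (an assumption for illustration):

```python
# Factory sketch: pick the transport adapter from a config string.
class FakeHttpAdapter:
    def __init__(self, api_base_url):
        self.api_base_url = api_base_url

class FakeMqttAdapter:
    def __init__(self, broker, port, topic):
        self.broker, self.port, self.topic = broker, port, topic

def make_hub_adapter(connection_type, hub_url, broker, port, topic):
    # Anything other than "http" falls back to MQTT, matching the diff's else-branch.
    if connection_type.lower() == "http":
        return FakeHttpAdapter(api_base_url=hub_url)
    return FakeMqttAdapter(broker=broker, port=port, topic=topic)

adapter = make_hub_adapter("HTTP", "http://hub:8000", "mqtt", 1883, "processed_data_topic")
```

Because both adapters satisfy the same `HubGateway` interface, the consuming `AgentMQTTAdapter` never needs to know which transport was chosen.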

hub/Dockerfile-test Normal file

@@ -0,0 +1,12 @@
# Use the official Python image as the base image
FROM python:3.9-slim
# Set the working directory inside the container
WORKDIR /app
# Copy the requirements.txt file and install dependencies
COPY hub/requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
# Copy the entire application into the container
COPY hub/. .
# Run the test entry script inside the container when it starts
CMD ["./test-entry.sh"]


@@ -13,7 +13,7 @@ class StoreApiAdapter(StoreGateway):
     def __init__(self, api_base_url):
         self.api_base_url = api_base_url
-    def save_data(self, processed_agent_data_batch: List[ProcessedAgentData]):
+    def processed_agent_data_batch_to_payload(self, processed_agent_data_batch: List[ProcessedAgentData]):
         if not processed_agent_data_batch:
             return False
@@ -25,6 +25,14 @@ class StoreApiAdapter(StoreGateway):
             "user_id": user_id
         }
+        return payload
+    def save_data(self, processed_agent_data_batch: List[ProcessedAgentData]):
+        payload = self.processed_agent_data_batch_to_payload(processed_agent_data_batch)
+        if payload == False:
+            return False
         try:
             # Perform a POST request to the Store API with a 10-second timeout
             response = requests.post(


@@ -0,0 +1,41 @@
from app.adapters.store_api_adapter import StoreApiAdapter
from app.entities.agent_data import AccelerometerData, AgentData, GpsData
from app.entities.processed_agent_data import ProcessedAgentData

def _test_processed_agent_data_batch_to_payload():
    processed_data_batch = [
        ProcessedAgentData(road_state = "normal",
                           agent_data = AgentData(user_id = 1,
                                                  accelerometer = AccelerometerData(x = 0.1, y = 0.2, z = 0.3),
                                                  gps = GpsData(latitude = 10.123, longitude = 20.456),
                                                  timestamp = "2023-07-21T12:34:56Z")
                           ),
        ProcessedAgentData(road_state = "normal",
                           agent_data = AgentData(user_id = 2,
                                                  accelerometer = AccelerometerData(x = 0.1, y = 0.2, z = 0.3),
                                                  gps = GpsData(latitude = 10.123, longitude = 20.456),
                                                  timestamp = "2023-07-21T12:34:56Z")
                           ),
        ProcessedAgentData(road_state = "normal",
                           agent_data = AgentData(user_id = 3,
                                                  accelerometer = AccelerometerData(x = 0.1, y = 0.2, z = 0.3),
                                                  gps = GpsData(latitude = 10.123, longitude = 20.456),
                                                  timestamp = "2023-07-21T12:34:56Z")
                           ),
    ]
    res = StoreApiAdapter(None).processed_agent_data_batch_to_payload(processed_data_batch)
    assert res["data"][0]["agent_data"]["user_id"] == 1
    assert res["data"][1]["agent_data"]["user_id"] == 2
    assert res["data"][2]["agent_data"]["user_id"] == 3
    assert StoreApiAdapter(None).processed_agent_data_batch_to_payload([]) == False

if __name__ == "__main__":
    test_functions = [i for i in dir() if i.startswith('_test_')]
    for i in test_functions:
        print(i)
        eval(i)()

hub/test-entry.sh Executable file

@@ -0,0 +1,3 @@
#!/bin/sh
PYTHONPATH=$PWD python3 app/adapters/store_api_adapter_test.py

store/Dockerfile-test Normal file

@@ -0,0 +1,13 @@
# Use the official Python image as the base image
FROM python:latest
# Set the working directory inside the container
WORKDIR /app
# Copy the requirements.txt file and install dependencies
COPY store/requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
# Copy the entire application into the container
COPY store/. .
# Run the test entry script inside the container when it starts
#CMD ["uvicorn", "main:app", "--host", "0.0.0.0"]
CMD ["./test-entry.sh"]


@@ -1,61 +0,0 @@
name: "road_vision__database"
services:
  postgres_db:
    image: postgres:17
    container_name: postgres_db
    restart: always
    environment:
      POSTGRES_USER: user
      POSTGRES_PASSWORD: pass
      POSTGRES_DB: test_db
    volumes:
      - postgres_data:/var/lib/postgresql/data
      - ./db/structure.sql:/docker-entrypoint-initdb.d/structure.sql
    ports:
      - "5432:5432"
    networks:
      db_network:
  pgadmin:
    container_name: pgadmin4
    image: dpage/pgadmin4
    restart: always
    environment:
      PGADMIN_DEFAULT_EMAIL: admin@admin.com
      PGADMIN_DEFAULT_PASSWORD: root
    volumes:
      - pgadmin-data:/var/lib/pgadmin
    ports:
      - "5050:80"
    networks:
      db_network:
  store:
    container_name: store
    build:
      context: ../../
      dockerfile: store/Dockerfile
    depends_on:
      - postgres_db
    restart: always
    environment:
      POSTGRES_USER: user
      POSTGRES_PASSWORD: pass
      POSTGRES_DB: test_db
      POSTGRES_HOST: postgres_db
      POSTGRES_PORT: 5432
    ports:
      - "8000:8000"
    networks:
      db_network:
networks:
  db_network:
volumes:
  postgres_data:
  pgadmin-data:


@@ -40,10 +40,24 @@ subscriptions: Dict[int, Set[WebSocket]] = {}
 @app.websocket("/ws/{user_id}")
 async def websocket_endpoint(websocket: WebSocket, user_id: int):
     await websocket.accept()
     if user_id not in subscriptions:
         subscriptions[user_id] = set()
     subscriptions[user_id].add(websocket)
     try:
+        # send already available data
+        r = processed_agent_data.select()
+        stored_data = SessionLocal().execute(r).fetchall()
+        jsonable_data = [{c.name: getattr(i, c.name) for c in processed_agent_data.columns} for i in stored_data]
+        for i in jsonable_data:
+            i['timestamp'] = i['timestamp'].strftime("%Y-%m-%dT%H:%M:%SZ")
+        await websocket.send_json(json.dumps(jsonable_data))
+        # receive forever
         while True:
             await websocket.receive_text()
     except WebSocketDisconnect:
@@ -59,15 +73,11 @@ async def send_data_to_subscribers(user_id: int, data):
 # FastAPI CRUDL endpoints
-@app.post("/processed_agent_data/")
-async def create_processed_agent_data(data: List[ProcessedAgentData], user_id: int = Body(..., embed=True)):
-    session = SessionLocal()
-    try:
-        created_data = [
+def ProcessedAgentData_to_td(data: List[ProcessedAgentData]):
+    return [
         {
             "road_state": item.road_state,
-            "user_id": user_id,
+            "user_id": item.agent_data.user_id,
             "x": item.agent_data.accelerometer.x,
             "y": item.agent_data.accelerometer.y,
             "z": item.agent_data.accelerometer.z,
@@ -77,6 +87,13 @@ async def create_processed_agent_data(data: List[ProcessedAgentData], user_id: i
         }
         for item in data
     ]
+@app.post("/processed_agent_data/")
+async def create_processed_agent_data(data: List[ProcessedAgentData], user_id: int = Body(..., embed=True)):
+    session = SessionLocal()
+    try:
+        created_data = ProcessedAgentData_to_td(data)
     stmt = processed_agent_data.insert().values(created_data).returning(processed_agent_data)
     result = session.execute(stmt)
     created_records = [dict(row._mapping) for row in result.fetchall()]

store/test-entry.sh Executable file

@@ -0,0 +1,3 @@
#!/bin/sh
PYTHONPATH=$PWD python3 test/main_test.py

store/test/main_test.py Normal file

@@ -0,0 +1,39 @@
from schemas import AccelerometerData, AgentData, GpsData, ProcessedAgentData
import main
def _test_ProcessedAgentData_to_td():
    processed_data_batch = [
        ProcessedAgentData(road_state = "normal",
                           agent_data = AgentData(user_id = 1,
                                                  accelerometer = AccelerometerData(x = 0.1, y = 0.2, z = 0.3),
                                                  gps = GpsData(latitude = 10.123, longitude = 20.456),
                                                  timestamp = "2023-07-21T12:34:56Z")
                           ),
        ProcessedAgentData(road_state = "normal",
                           agent_data = AgentData(user_id = 2,
                                                  accelerometer = AccelerometerData(x = 0.1, y = 0.2, z = 0.3),
                                                  gps = GpsData(latitude = 10.123, longitude = 20.456),
                                                  timestamp = "2023-07-21T12:34:56Z")
                           ),
        ProcessedAgentData(road_state = "normal",
                           agent_data = AgentData(user_id = 3,
                                                  accelerometer = AccelerometerData(x = 0.1, y = 0.2, z = 0.3),
                                                  gps = GpsData(latitude = 10.123, longitude = 20.456),
                                                  timestamp = "2023-07-21T12:34:56Z")
                           ),
    ]
    res = main.ProcessedAgentData_to_td(processed_data_batch)
    assert res[0]["user_id"] == 1
    assert res[1]["user_id"] == 2
    assert res[2]["user_id"] == 3

if __name__ == "__main__":
    test_functions = [i for i in dir() if i.startswith('_test_')]
    for i in test_functions:
        print(i)
        eval(i)()

utils/check-up.py Normal file

@@ -0,0 +1,19 @@
import sys

print("Checking for dead containers...")
l = [i for i in sys.stdin.read().split("\n") if i]
header, statuses = l[0], l[1:]
status_index = header.find('STATUS')
name_index = header.find('NAMES')
exit_code = 0
for i in statuses:
    if not i[status_index:].startswith("Up "):
        service_name = i[name_index:]
        print(f"Crash detected in {service_name}")
        exit_code = 1
sys.exit(exit_code)
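check-up.py parses `docker ps -a` positionally: it locates the STATUS and NAMES column offsets in the header row and flags any container whose STATUS column does not start with "Up ". A self-contained replay of that core logic on fabricated, column-aligned output (no Docker required):

```python
# Fabricated, column-aligned stand-in for `docker ps -a` output.
header = "CONTAINER ID".ljust(15) + "STATUS".ljust(30) + "NAMES"
rows = [
    "1a2b3c4d5e6f".ljust(15) + "Up 2 minutes".ljust(30) + "hub",
    "0f9e8d7c6b5a".ljust(15) + "Exited (1) 10 seconds ago".ljust(30) + "store",
]

# Same positional parsing as check-up.py: column offsets come from the header.
status_index = header.find("STATUS")
name_index = header.find("NAMES")
dead = [row[name_index:] for row in rows if not row[status_index:].startswith("Up ")]
print(dead)  # ['store']
```

The positional approach works because `docker ps` pads its columns; `docker ps --format` would be a sturdier alternative if the column layout ever changes.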