
Triangulation POC #460

Open
garnser opened this issue Feb 6, 2025 · 2 comments
garnser commented Feb 6, 2025

Howdy all,

I started on a POC for actually triangulating devices within my house. I've attached a code sample below:

import yaml
import numpy as np
import argparse
import matplotlib.pyplot as plt
from scipy.optimize import least_squares
from mpl_toolkits.mplot3d import Axes3D  # noqa: F401 (registers the 3D projection on older matplotlib)
from mpl_toolkits.mplot3d.art3d import Poly3DCollection
import matplotlib.patches as mpatches
import hashlib
from shapely.geometry import Polygon, Point

# Load YAML file
def load_yaml(file_path):
    with open(file_path, 'r') as file:
        data = yaml.safe_load(file)
    return {k.lower(): v for k, v in data.items()}  # Normalize MAC addresses to lowercase

def generate_room_color(room_name):
    """Generate a stable color for a room based on its name."""
    hash_val = int(hashlib.md5(room_name.encode()).hexdigest(), 16)  # Convert room name to a hash
    r = (hash_val % 256) / 255  # Extract Red component
    g = ((hash_val >> 8) % 256) / 255  # Extract Green component
    b = ((hash_val >> 16) % 256) / 255  # Extract Blue component
    return (r, g, b, 0.5)  # Use alpha=0.5 for transparency

# Trilateration function
def trilaterate(sensors, distances):
    def residuals(pos, sensors, distances):
        return [np.linalg.norm(pos - np.array(sensor)) - dist for sensor, dist in zip(sensors, distances)]

    # Initial guess (center of all sensors)
    initial_guess = np.mean(sensors, axis=0)
    result = least_squares(residuals, initial_guess, args=(sensors, distances))
    return result.x

floor_boundaries = {
    0: np.array([[10.1, 0.0], [0.0, 0.0], [0.0, 6.34], [3.7, 6.34], [3.7, 10.1], [10.1, 10.1], [10.1, 0.0]]),  # Ground floor
    1: np.array([[10.1, 0.0], [0.0, 0.0], [0.0, 6.34], [10.1, 6.34], [10.1, 0.0]]),
}

# Define room positions and sizes
room_data = [
    {"name": "Kitchen", "floor": 0, "corners": [(0, 0), (3.2, 0), (3.2, 2.7), (0, 2.7)]},
    {"name": "Dining Area", "floor": 0, "corners": [(0, 2.7), (4.2, 2.7), (5.7, 4), (5.7, 6.34), (0, 6.34)]},
    {"name": "Living Room", "floor": 0, "corners": [(3.7, 6.34), (10.1, 6.34), (10.1, 10.1), (3.7, 10.1)]},
    {"name": "Hallway", "floor": 0, "corners": [(3.2,0), (7.0, 0), (7.0, 2.7), (8.5, 2.7), (8.5, 4), (5.7, 4.0), (4.2, 2.7), (3.2, 2.7)]},
    {"name": "Ara room", "floor": 0, "corners": [(7.0, 0), (10.1, 0), (10.1, 2.7), (7.0, 2.7)]},
    {"name": "Toilet", "floor": 0, "corners": [(5.7, 4), (7.6, 4), (7.6, 6.34), (5.7, 6.34)]},
    {"name": "Washroom", "floor": 0, "corners": [(7.6, 4), (8.5, 4), (8.5, 2.7), (10.1, 2.7), (10.1, 6.34), (7.6, 6.34)]},
    {"name": "Melissa room", "floor": 1, "corners": [(0,0), (0, 2.7), (3.65, 2.7), (3.65, 0)]},
    {"name": "Boys room", "floor": 1, "corners": [(0, 2.7), (0, 6.34), (3.65, 6.34), (3.65, 2.7)]},
    {"name": "TV room", "floor": 1, "corners": [(3.65, 0), (3.65, 6.34), (7, 6.34), (7, 5), (8.3, 5), (8.3, 3.3), (7, 3.3), (7, 0)]},
    {"name": "Bedroom", "floor": 1, "corners": [(7, 0), (7, 3.3), (10.1, 3.3), (10.1, 0)]},
    {"name": "Wardrobe", "floor": 1, "corners": [(7, 5), (7, 6.34), (8.3, 6.34), (8.3, 5)]},
    {"name": "Bathroom", "floor": 1, "corners": [(8.3, 3.3), (8.3, 6.34), (10.1, 6.34), (10.1, 3.3)]},
]

sensor_data = {
    "1c:69:20:cc:f2:c4": {"name": "Kitchen Sensor", "position": (0, 0, 1)},
    "2c:bc:bb:0d:09:74": {"name": "Washroom Sensor", "position": (9, 2.7, 1)},
    "f8:b3:b7:2a:d8:60": {"name": "Living Room Sensor", "position": (8.2, 9.9, 1)},
    "f8:b3:b7:2a:d5:bc": {"name": "Allrum Sensor", "position": (3.8, 5.8, 3.2)},
    "2c:bc:bb:0e:20:88": {"name": "Bedroom Sensor", "position": (8, 0, 4)}
}

# Generate unique colors for rooms
room_colors = {room["name"]: generate_room_color(room["name"]) for room in room_data}

def remove_overlapping_ceiling(floor_boundaries):
    """Removes overlapping sections of the ceiling based on the floor above."""
    ceiling_polygons = {}

    for floor in floor_boundaries:
        if floor + 1 in floor_boundaries:  # Check if there's a floor above
            lower_floor_polygon = Polygon(floor_boundaries[floor])
            upper_floor_polygon = Polygon(floor_boundaries[floor + 1])
            non_overlapping_ceiling = lower_floor_polygon.difference(upper_floor_polygon)
            ceiling_polygons[floor] = non_overlapping_ceiling
        else:
            ceiling_polygons[floor] = Polygon(floor_boundaries[floor])  # No floor above, keep full ceiling

    return ceiling_polygons

def find_room(estimated_position):
    """Determine which room the estimated position is in."""
    estimated_point = Point(estimated_position[:2])
    estimated_floor = int(estimated_position[2] // 3)  # Assuming each floor is 3m high

    for room in room_data:
        if room["floor"] == estimated_floor:
            room_polygon = Polygon(room["corners"])
            if room_polygon.contains(estimated_point):
                return room["name"]

    return "Unknown Room"

# Visualization function
def plot_3d_position(sensors, estimated_position):
    fig = plt.figure(figsize=(10, 7))
    ax = fig.add_subplot(111, projection='3d')

    # Get max floor height dynamically
    floor_heights = {room["floor"]: room["floor"] * 3 for room in room_data}  # Assuming 3m per floor
    max_floor = max(floor_heights.keys())  # Highest floor
    max_floor_height = floor_heights[max_floor]  # Top floor height

    # Compute non-overlapping ceiling sections
    non_overlapping_ceilings = remove_overlapping_ceiling(floor_boundaries)
    custom_legend_handles = []

    for floor, boundary in floor_boundaries.items():
        floor_z = floor_heights.get(floor, 0)

        # Draw floor outline
        ax.plot(boundary[:, 0], boundary[:, 1], np.full_like(boundary[:, 0], floor_z),
                color='black', linestyle='dashed', label=f'Floor {floor}')

        # Draw walls between floors
        for i in range(len(boundary)):
            ax.plot([boundary[i, 0], boundary[i, 0]],
                    [boundary[i, 1], boundary[i, 1]],
                    [floor_z, floor_z + 3], color='black', linestyle='dotted')

        # Draw non-overlapping ceiling sections (drawing the full ceiling would
        # overdraw the floor above, so only the exposed parts are rendered)
        if floor in non_overlapping_ceilings and not non_overlapping_ceilings[floor].is_empty:
            x, y = non_overlapping_ceilings[floor].exterior.xy
            ceiling_corners = [(x[i], y[i], (floor + 1) * 3) for i in range(len(x))]
            ceiling_surface = Poly3DCollection([ceiling_corners], color='gray', alpha=0.3, label="Ceiling")
            ax.add_collection3d(ceiling_surface)

    # Draw rooms as flat surfaces with custom shapes
    for room in room_data:
        floor_z = floor_heights[room["floor"]]
        room_corners = [(x, y, floor_z) for x, y in room["corners"]]

        # Draw the custom-shaped room
        room_surface = Poly3DCollection([room_corners], color=room_colors[room["name"]], alpha=0.5)
        ax.add_collection3d(room_surface)

        # Add to legend
        custom_legend_handles.append(mpatches.Patch(color=room_colors[room["name"]], label=f'{room["name"]} (Floor {room["floor"]})'))

    # Plot sensors
    for mac, data in sensor_data.items():
        pos = data["position"]
        name = data["name"]
        ax.scatter(*pos, c='blue', marker='o', s=80, label="Sensor" if mac == list(sensor_data.keys())[0] else "")
        ax.text(pos[0], pos[1], pos[2], name, color='black', fontsize=8)

    # Plot estimated position
    if estimated_position is not None:
        ax.scatter(*estimated_position, c='red', marker='x', s=100, label='Estimated Device Position')
        room_name = find_room(estimated_position)
        ax.text(estimated_position[0] + 0.3, estimated_position[1], estimated_position[2], room_name, color='red', fontsize=10, fontweight='bold')

    # Restore legend with rooms, sensors and estimated position
    ax.legend(handles=custom_legend_handles + [
        mpatches.Patch(color='blue', label="Sensor"),
        mpatches.Patch(color='red', label="Estimated Position"),
        mpatches.Patch(color='black', label="House Boundaries"),
        mpatches.Patch(color='gray', alpha=0.3, label="Ceiling")
    ], loc='upper left', bbox_to_anchor=(1.05, 1), fontsize='small')

    ax.set_xlabel('X Axis')
    ax.set_ylabel('Y Axis')
    ax.set_zlabel('Z Axis')

    plt.show()

# Main function to process data
def get_device_position(yaml_file, sensor_data, mac_address):
    data = load_yaml(file_path=yaml_file)
    mac_address = mac_address.lower()  # Normalize input MAC address

    if mac_address not in data:
        print(f"MAC address {mac_address} not found in dataset.")
        return None

    device_data = data[mac_address].get('scanners', {})
    readings = []
    sensor_names = []

    for sensor_mac, sensor_info in device_data.items():
        source_address = sensor_info.get('source', '').lower()  # Normalize source MAC address
        distance = sensor_info.get('rssi_distance')

        if distance is not None and source_address in sensor_data:
            readings.append((sensor_data[source_address]["position"], distance))
            sensor_names.append(sensor_data[source_address]["name"])

    if len(readings) < 3:
        print(f"Not enough valid sensor readings for MAC {mac_address} to perform trilateration.")
        return None

    sensors, distances = zip(*readings)
    estimated_position = trilaterate(np.array(sensors), np.array(distances))
    plot_3d_position(sensors, estimated_position)
    return estimated_position

# Argument parser
if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Trilateration of a device using RSSI data.")
    parser.add_argument("mac_address", help="MAC address of the device to locate")
    args = parser.parse_args()

    yaml_file = 'test.yml'
    position = get_device_position(yaml_file, sensor_data, args.mac_address)
    if position is not None:
        print(f"Estimated position for {args.mac_address}: {position}")

Input looks like this:

"28:ff:3c:90:07:4b": # device
  scanners:
    "f8:b3:b7:2a:d5:bc": # scanner mac
      source: "f8:b3:b7:2a:d5:bc"  # scanner mac
      rssi_distance: 2.4 # distance to scanner
    "1c:69:20:cc:f2:c4":
      source: "1c:69:20:cc:f2:c4"
      rssi_distance: 3.3
    "f8:b3:b7:2a:d8:60":
      source: "f8:b3:b7:2a:d8:60"
      rssi_distance: 2.2
    "2c:bc:bb:0d:09:74":
      source: "2c:bc:bb:0d:09:74"
      rssi_distance: 4.8

The result is a 3D model that plots the floors, the rooms, and the location of the device.

Video sample available here:
https://drive.google.com/file/d/1zeSL-F9ZstPWr1dL6h5YzwMMJIoQCOm6/view?usp=sharing

Screenshots


If I get some time, I'll try to implement this with three.js as a Lovelace card.

agittins (Owner) commented Feb 7, 2025

Very cool!

I like how you've handled rooms and floors, too. I just started making some movement again on trilateration in the last day(!), so I'm likely to pop in to chat about ideas, but for now I need to get some inter-scanner measuring happening and sort things into numpy, at which point I'll be ready to really start playing with this sort of stuff. I'll be quiet for a bit, though, as I have a habit of getting bogged down talking instead of coding, so I'm trying to direct my focus a bit right now!

But rest assured this is some exciting stuff and I'll be back to pilfer from your ideas! 😁

@andreasgily

Hi, this looks quite similar to the Espresense project I have been following for a while (would be nice to have a compatible map configuration): https://github.com/ESPresense/ESPresense

Some insights from my Espresense setup at home:

  • BLE sensors are quite difficult to calibrate: they sit on furniture or are attached to lamps, so readings vary depending on how you're wearing e.g. your Apple Watch and how you're positioned in the room, and your own body shades you from the sensors (unless you can put the sensors somewhere on the ceiling where they have the best coverage of your devices).
  • Walls: depending on how thick they are and what materials they're made of, they also have a huge impact on your range readings, especially between different rooms.
  • Floors: these also have a huge impact if they are made of concrete etc.

So in the end, I haven't been able to use Espresense for live positioning or automations yet; I'm currently just using the closest BLE sensor reading, as Bermuda does. But I really appreciate your efforts and POC, and I look forward to what's to come.
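For context on why calibration matters so much: the `rssi_distance` values in the input typically come from a log-distance path-loss model, where a reference power (RSSI at 1 m) and an environment exponent `n` must be tuned per sensor. The values below are illustrative defaults, not calibrated ones; walls and bodies effectively change `n`, which is exactly the problem described above.

```python
def rssi_to_distance(rssi, measured_power=-59.0, n=2.0):
    """Log-distance path-loss model.

    measured_power: RSSI at 1 m (assumed value, needs per-sensor calibration)
    n: path-loss exponent (~2 free space, higher indoors through walls)
    """
    return 10 ** ((measured_power - rssi) / (10 * n))

print(rssi_to_distance(-59.0))            # → 1.0 (at the reference power)
print(round(rssi_to_distance(-75.0), 2))  # → 6.31 with n=2; larger n shrinks this
```

Because a small error in `n` scales the distance exponentially, multilaterating from several imperfect distances (as the POC does) tends to be more robust than trusting any single reading.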
