
Using Azure to Collect Data from Raspberry Pi Pico for $0 in Maintenance

TLDR: I bought a Raspberry Pi Pico W and a temperature sensor for around $20. It’s hanging on my balcony and sends telemetry (temperature and barometric pressure) to Azure every minute. From there, a self-hosted Grafana dashboard picks and presents the data, and all this costs zero dollars per month. Below is what February-March 2024 looked like in Malmö, Sweden:

Weather Data

The Hardware

For the hardware part, I take no credit at all. I did the whole thing using this open-source GitHub repo. Even the following image, showing in detail the connectivity, comes from that repository:

Hardware Connectivity

The Software

It happens in two steps.

Step 1 - Collecting the data from the sensor

The Pico (connected to my home WiFi) collects the measurements using the sensor and sends the values to an Azure Function every minute. Here’s the code for that:

import time
from machine import I2C, Pin, SPI
import network
import utime
from bmp280 import BMP280SPI
import urequests

while True:
    try:    # connect to wifi
        wlan = network.WLAN(network.STA_IF)
        led_onboard = Pin("LED", Pin.OUT)

        if not wlan.isconnected():
            print('Connecting...')
            # Enable the interface
            wlan.active(True)
            # Connect
            wlan.connect("{WiFi-Name}", "{WiFi-Password}")
            # Wait till connection is established
            while not wlan.isconnected():
                led_onboard.value(1)
                time.sleep_ms(300)
                led_onboard.value(0)
                time.sleep_ms(300)
            print('Connection established.')

        print('Network Settings:', wlan.ifconfig())

        def send_data_to_azure(temp, pressure):
            azure_function_url = "{https://func-xxxxxxx.azurewebsites.net/api/Function1?code=xxxxxxxxxxxxxxx}"
            data = {'temperature': temp, 'pressure': pressure}
            headers = {'Content-Type': 'application/json'}
            response = urequests.post(azure_function_url, json=data, headers=headers)
            print("Azure Function Response:", response.text)

        # Set up SPI1 and the BMP280 once, outside the measurement loop
        spi1_sck = Pin(10)
        spi1_tx = Pin(11)
        spi1_rx = Pin(12)
        spi1_csn = Pin(13, Pin.OUT, value=1)
        spi1 = SPI(1, sck=spi1_sck, mosi=spi1_tx, miso=spi1_rx)
        bmp280_spi = BMP280SPI(spi1, spi1_csn)

        while True:
            # Read the sensor and push the values to the Azure Function
            readout = bmp280_spi.measurements
            temperature = readout['t']
            pressure = readout['p']
            print(f"Temperature: {temperature} °C, pressure: {pressure} hPa.")
            send_data_to_azure(temperature, pressure)
            # Blink the onboard LED as a heartbeat, then wait a minute
            led_onboard.value(1)
            utime.sleep_ms(500)
            led_onboard.value(0)
            utime.sleep(60)
    except Exception as e:
        # Log the error and let the outer loop retry from scratch
        print("Error:", e)
        print("Retrying in 5 seconds...")
        time.sleep(5)
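For reference, the body the Pico posts is plain JSON with two numeric fields. The snippet below (regular CPython rather than MicroPython, with made-up readings) just shows the shape:

```python
import json

# Hypothetical readings standing in for a real BMP280 readout
payload = {"temperature": 3.7, "pressure": 1013.2}
body = json.dumps(payload)
print(body)  # {"temperature": 3.7, "pressure": 1013.2}
```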

Step 2 - Streaming the data to Azure (Table Storage)

The Azure Function receives the measurements and saves them to Azure Table Storage. Here’s the code for that:

[Function("Function1")]
public async Task<IActionResult> Run(
    [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
    [FromBody] TelemetryReading reading)
{
    _logger.LogInformation($"Reported temperature: {reading.Temperature} °C and pressure: {reading.Pressure} hPa.");

    // With DefaultAzureCredential this setting must hold the table *endpoint* URI
    // (https://<account>.table.core.windows.net), not a connection string.
    string tableEndpoint = Environment.GetEnvironmentVariable("TableStorageConnString")!;
    string tableName = Environment.GetEnvironmentVariable("TableName")!;

    var tableClient = new TableClient(
        new Uri(tableEndpoint),
        tableName,
        new DefaultAzureCredential());

    DateTime now = DateTime.UtcNow;

    // One partition per UTC day; a GUID row key keeps rows unique.
    // "Timestamp" is a reserved system property that the service sets itself,
    // so the measurement time goes into a custom property instead.
    var tableEntity = new TableEntity(now.ToString("yyyy-MM-dd"), Guid.NewGuid().ToString())
    {
        { "MeasuredAt", now.ToString("s") + "Z" },
        { "Temperature", double.Parse(reading.Temperature, CultureInfo.InvariantCulture) },
        { "Pressure", double.Parse(reading.Pressure, CultureInfo.InvariantCulture) }
    };

    await tableClient.AddEntityAsync(tableEntity);

    _logger.LogInformation(
        $"Created row {tableEntity.RowKey}: Temperature: {tableEntity["Temperature"]} °C. Pressure: {tableEntity.GetDouble("Pressure")} hPa.");

    return new OkObjectResult("Data saved to Azure Table Storage");
}

public record TelemetryReading(string Temperature, string Pressure);
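The partition/row key scheme is worth calling out: one partition per UTC day keeps each day's rows together, which later makes date-based cleanup a single-partition query. A quick CPython sketch of the same derivation:

```python
import uuid
from datetime import datetime, timezone

now = datetime.now(timezone.utc)
partition_key = now.strftime("%Y-%m-%d")  # one partition per UTC day
row_key = str(uuid.uuid4())               # unique within the partition
# ISO-8601 seconds precision, matching C#'s now.ToString("s") + "Z"
measured_at = now.strftime("%Y-%m-%dT%H:%M:%S") + "Z"
```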

Here’s what the values look like—nothing weird:

Telemetry Data

Step 3 - Visualizing the Data

I love Power BI, I do, but it costs, while Grafana’s free, so…

Weather Data

For some reason, Azure Table Storage is not one of the default data sources, and I haven't used Azure Data Explorer much yet. So I went for the second-best option: the JSON API plugin as a data source. Long story short, it needs a base URL and expects certain endpoints to exist and deliver values in a specific format, nothing too complicated. So what did I do? Even more Azure Functions! I built that API in the same Function App. Alongside the query endpoints, I added a timer-triggered function that prunes entries older than a year so the table doesn't grow forever. Here's the code for that cleanup function:
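I haven't reproduced the full endpoint code here, but to give an idea of the contract: JSON-style Grafana datasources typically expect a time-series response like the one below. The exact shape depends on the plugin and version; the field names here follow the SimpleJson-style format and are illustrative, not copied from my Function App:

```python
import json

# Illustrative only: one series of [value, unix-timestamp-in-ms] pairs.
response = [
    {
        "target": "temperature",
        "datapoints": [
            [3.7, 1709251200000],  # [value, epoch ms]
            [3.9, 1709251260000],
        ],
    }
]
print(json.dumps(response))
```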

[Function("Function2")]
public async Task RunF2([TimerTrigger("0 0 4 * * *")] TimerInfo timer)
{
    _logger.LogInformation("Time to remove some entries...");

    // Same settings as Function1: table endpoint URI + table name
    string tableEndpoint = Environment.GetEnvironmentVariable("TableStorageConnString")!;
    string tableName = Environment.GetEnvironmentVariable("TableName")!;

    var tableClient = new TableClient(
        new Uri(tableEndpoint),
        tableName,
        new DefaultAzureCredential());

    DateTime now = DateTime.UtcNow;

    // Rows are partitioned by UTC date, so everything that is exactly
    // one year old lives in a single partition.
    var query = tableClient.QueryAsync<TableEntity>(
        filter: $"PartitionKey eq '{now.AddYears(-1):yyyy-MM-dd}'",
        select: new[] { "PartitionKey", "RowKey" });

    await foreach (TableEntity entity in query)
    {
        await tableClient.DeleteEntityAsync(entity.PartitionKey, entity.RowKey);

        _logger.LogInformation($"Deleted row {entity.RowKey} with PartitionKey {entity.PartitionKey}");
    }
}
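The cleanup leans on the date-based partition key: subtracting a year from today's date names exactly one partition to wipe. A small CPython sketch of that calculation (C#'s `AddYears(-1)` clamps Feb 29 to Feb 28; Python's `replace` raises instead, so the sketch guards for it):

```python
from datetime import datetime, timezone

def partition_to_purge(now: datetime) -> str:
    """Partition key (UTC date string) of rows that are exactly one year old."""
    try:
        a_year_ago = now.replace(year=now.year - 1)
    except ValueError:  # now is Feb 29; clamp like C#'s AddYears(-1)
        a_year_ago = now.replace(year=now.year - 1, day=28)
    return a_year_ago.strftime("%Y-%m-%d")

print(partition_to_purge(datetime(2024, 3, 15, tzinfo=timezone.utc)))  # 2023-03-15
```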

Pricing

Although this setup runs all day long, I'm still under the radar. All the Azure Functions live in the same Function App, hosted on a serverless Linux Consumption (Y1) plan, whose monthly free grant easily covers my execution count. I host Grafana myself, and at this volume the Table Storage charges round to zero.
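A back-of-the-envelope check shows why the Functions stay free, assuming the standard Consumption-plan free grant of one million executions per month:

```python
posts_per_day = 24 * 60             # the Pico sends once a minute
executions_per_month = posts_per_day * 31
free_grant = 1_000_000              # monthly free executions on the Consumption plan
print(executions_per_month)         # 44640, well under the grant
```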

Future Improvements

The whole API thing is a bit clunky, and I do want to test Azure Data Explorer. When I do, I’ll make sure to blog about it. I also want to try MQTT instead of the Functions, just to make it more relevant to Azure IoT.
