S5D9 Anomaly Monitoring Tutorial Walkthrough

After going through the S5D9 Diagnostics Intelligence Tutorial, I decided to go through the Anomaly Monitoring Quick Start Guide.

If you don’t have an S5D9 already, make sure you get the $20 discount coupon, good through Sept 30, 2017:

$20 off S5D9 IoT Fast Prototyping Kit

These tutorials use the Renesas IoT Sandbox Data Intelligence cloud, powered by Medium One.

I first activated the kit and then moved the new binary onto the S5D9. My progress is documented here.

Log in and connect to your S5D9 project in the upper right corner.

Create Data Stream: In the left panel, click on Config -> Data Streams

Assign Data Stream type normal_range

Use Workflow Studio to Create New Workflow

Under Tags and Triggers, select raw

Move tags to workspace

temp3.avg, humidity.avg, pressure.avg, x_accel.avg, y_accel.avg, z_accel.avg

Add Base Python Module, Found Under Foundation

Add 6 Inputs

Add Processed Stream - Single

Connect Workflow

Assign Processed Stream to normal_range

In the script section, go to Datastreams and select normal_range from the dropdown menu to the right of the box. The Label will be filled in automatically.

If the datastream does not appear at all, you can read @Dan’s note about activating the stream. Although I followed Dan’s note, I’m not sure it was necessary to send an event with the mobile app. Prior to assigning normal_range to the Processed Stream, my datastream looked like this:

Add Python to Base Python

Copying and pasting from the Medium One blog did not work because the quotes in the sample code get mangled by the blog’s formatting.
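
If you hit the same problem, here’s a quick generic sketch (my own, not part of the tutorial) that normalizes typographic quotes back to plain ASCII before you paste code:

```python
# Replace typographic ("smart") quotes with plain ASCII quotes so
# that sample code copied from a blog parses correctly.
def fix_quotes(text):
    replacements = {
        u'\u2018': "'", u'\u2019': "'",   # curly single quotes
        u'\u201c': '"', u'\u201d': '"',   # curly double quotes
    }
    for curly, straight in replacements.items():
        text = text.replace(curly, straight)
    return text

print(fix_quotes(u'log(\u201chello\u201d)'))  # -> log("hello")
```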

Here’s the code I used that worked.

OUTLIER_FILTER_BIN_COUNT_PCT = 10
OUTLIER_FILTER_TOTAL_PCT = 10
HISTORY_DATA_RANGE = 60
MINIMUM_DATA_POINTS = 1

import Analytics
import DateRange
import datetime
import Filter
from datetime import timedelta

# This function sorts a dictionary so it can be compared
def ordered(obj):
    if isinstance(obj, dict):
        return sorted((k, ordered(v)) for k, v in obj.items())
    if isinstance(obj, list):
        return sorted(ordered(x) for x in obj)
    else:
        return obj

# This function returns key stats on a list of bins: [event count, number of buckets, min, max]
def get_list_stats(bins):
    total_count = 0
    total_buckets = len(bins)
    absolute_min = None
    absolute_max = None
    for item in bins:
        total_count += item['count']
        if absolute_min is None or item['value'] < absolute_min:
            absolute_min = item['value']
        if absolute_max is None or item['value'] > absolute_max:
            absolute_max = item['value']

    # round floats to two decimal places for readability
    if isinstance(absolute_min, float):
        absolute_min = round(absolute_min, 2)
    if isinstance(absolute_max, float):
        absolute_max = round(absolute_max, 2)
    return [total_count, total_buckets, absolute_min, absolute_max]

# This function reduces a list of bins in place by removing outlier bins
def filter_outlier(bins):
    bins.reverse()
    bin_count_tally = 0
    percentage_tally = 0
    total_buckets = len(bins)
    for item in bins[:]:

        # exit once OUTLIER_FILTER_BIN_COUNT_PCT percent of the bins have been removed
        if bin_count_tally * 100.0 / total_buckets >= OUTLIER_FILTER_BIN_COUNT_PCT:
            log("exiting loop, bucket limit reached")
            break

        # exit if percentage_tally reached the outlier threshold limit
        if percentage_tally + item['percent'] >= OUTLIER_FILTER_TOTAL_PCT:
            log("exiting loop, percentage_tally reached")
            break

        # remove the bin and update the tallies
        bins.remove(item)
        bin_count_tally += 1
        percentage_tally += item['percent']
       
# get last normal event
try:
    last_normal_range_event = Analytics.events('normal_range', 
                             Filter.string_tag('processed.normal_range'),
                             None, 1, ['event_rcv', 'DESC'])
except Exception:
    last_normal_range_event = {}
    
# check if one exists, otherwise initialize dict
if len(last_normal_range_event) > 0:
    last_normal_range_event = last_normal_range_event[0]['event_data']['normal_range']
else:
    last_normal_range_event = {}

# set query window 
daterange = DateRange.date_range(datetime.datetime.utcnow() - timedelta(days=HISTORY_DATA_RANGE), datetime.datetime.utcnow() )

# query bins 
humidity_list = Analytics.bin_by_value("raw.humidity.avg", daterange)
pressure_list = Analytics.bin_by_value("raw.pressure.avg", daterange)
temperature_list = Analytics.bin_by_value("raw.temp3.avg", daterange)
x_accel_list = Analytics.bin_by_value("raw.x_accel.avg", daterange)
y_accel_list = Analytics.bin_by_value("raw.y_accel.avg", daterange)
z_accel_list = Analytics.bin_by_value("raw.z_accel.avg", daterange)

# find key stats for each sensor type
initial_humidity_list_stats = get_list_stats(humidity_list)
initial_pressure_list_stats = get_list_stats(pressure_list)
initial_temperature_list_stats = get_list_stats(temperature_list)
initial_x_accel_list_stats = get_list_stats(x_accel_list)
initial_y_accel_list_stats = get_list_stats(y_accel_list)
initial_z_accel_list_stats = get_list_stats(z_accel_list)

log("initial_humidity_list_stats "+str(initial_humidity_list_stats))
log("initial_pressure_list_stats "+str(initial_pressure_list_stats))
log("initial_temperature_list_stats "+str(initial_temperature_list_stats))
log("initial_x_accel_list_stats "+str(initial_x_accel_list_stats))
log("initial_y_accel_list_stats "+str(initial_y_accel_list_stats))
log("initial_z_accel_list_stats "+str(initial_z_accel_list_stats))

# filter outliers from stats list
filter_outlier(humidity_list)
filter_outlier(pressure_list)
filter_outlier(temperature_list)
filter_outlier(x_accel_list)
filter_outlier(y_accel_list)
filter_outlier(z_accel_list)

# save new filtered stats
filtered_humidity_list_stats = get_list_stats(humidity_list)
filtered_pressure_list_stats = get_list_stats(pressure_list)
filtered_temperature_list_stats = get_list_stats(temperature_list)
filtered_x_accel_list_stats = get_list_stats(x_accel_list)
filtered_y_accel_list_stats = get_list_stats(y_accel_list)
filtered_z_accel_list_stats = get_list_stats(z_accel_list)

log("filtered_humidity_list_stats "+str(filtered_humidity_list_stats))
log("filtered_pressure_list_stats "+str(filtered_pressure_list_stats))
log("filtered_temperature_list_stats "+str(filtered_temperature_list_stats))
log("filtered_x_accel_list_stats "+str(filtered_x_accel_list_stats))
log("filtered_y_accel_list_stats "+str(filtered_y_accel_list_stats))
log("filtered_z_accel_list_stats "+str(filtered_z_accel_list_stats))

normal_range = {}

# build the normal_range dict / json
# If the initial count is < MINIMUM_DATA_POINTS, use the unfiltered stats; otherwise use the filtered stats
if initial_humidity_list_stats[2] is not None and initial_humidity_list_stats[0] < MINIMUM_DATA_POINTS:
    # set min and max normal range
    normal_range['humidity'] = [initial_humidity_list_stats[2],initial_humidity_list_stats[3]]
elif initial_humidity_list_stats[2] is not None:
    normal_range['humidity'] = [filtered_humidity_list_stats[2],filtered_humidity_list_stats[3]]

if initial_pressure_list_stats[2] is not None and initial_pressure_list_stats[0] < MINIMUM_DATA_POINTS:
    normal_range['pressure'] = [initial_pressure_list_stats[2],initial_pressure_list_stats[3]]
elif initial_pressure_list_stats[2] is not None:
    normal_range['pressure'] = [filtered_pressure_list_stats[2],filtered_pressure_list_stats[3]]

if initial_x_accel_list_stats[2] is not None and initial_x_accel_list_stats[0] < MINIMUM_DATA_POINTS:
    normal_range['x_accel'] = [initial_x_accel_list_stats[2],initial_x_accel_list_stats[3]]
elif initial_x_accel_list_stats[2] is not None:
    normal_range['x_accel'] = [filtered_x_accel_list_stats[2],filtered_x_accel_list_stats[3]]

if initial_y_accel_list_stats[2] is not None and initial_y_accel_list_stats[0] < MINIMUM_DATA_POINTS:
    normal_range['y_accel'] = [initial_y_accel_list_stats[2],initial_y_accel_list_stats[3]]
elif initial_y_accel_list_stats[2] is not None:
    normal_range['y_accel'] = [filtered_y_accel_list_stats[2],filtered_y_accel_list_stats[3]]

if initial_z_accel_list_stats[2] is not None and initial_z_accel_list_stats[0] < MINIMUM_DATA_POINTS:
    normal_range['z_accel'] = [initial_z_accel_list_stats[2],initial_z_accel_list_stats[3]]
elif initial_z_accel_list_stats[2] is not None:
    normal_range['z_accel'] = [filtered_z_accel_list_stats[2],filtered_z_accel_list_stats[3]]

if initial_temperature_list_stats[2] is not None and initial_temperature_list_stats[0] < MINIMUM_DATA_POINTS:
    normal_range['temperature'] = [initial_temperature_list_stats[2],initial_temperature_list_stats[3]]
elif initial_temperature_list_stats[2] is not None:
    normal_range['temperature'] = [filtered_temperature_list_stats[2],filtered_temperature_list_stats[3]]

# only update the normal range if different
if ordered(last_normal_range_event) != ordered(normal_range):
    IONode.set_output('out1', {'normal_range': normal_range})
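
To sanity-check the binning and outlier-filtering logic offline, here is a standalone sketch of the same idea (my own simplification: `log()` is dropped, the bins are hand-made, and I assume the bins arrive with the rarest ones at the end, which the `reverse()` call suggests):

```python
OUTLIER_FILTER_BIN_COUNT_PCT = 10
OUTLIER_FILTER_TOTAL_PCT = 10

def get_list_stats(bins):
    # Return [event count, number of buckets, min value, max value]
    values = [item['value'] for item in bins]
    total_count = sum(item['count'] for item in bins)
    mn = round(min(values), 2) if values else None
    mx = round(max(values), 2) if values else None
    return [total_count, len(bins), mn, mx]

def filter_outlier(bins):
    # Drop the rarest bins in place until either percentage cap is reached.
    bins.reverse()
    bin_count_tally = 0
    percentage_tally = 0
    total_buckets = len(bins)
    for item in bins[:]:
        if bin_count_tally * 100.0 / total_buckets >= OUTLIER_FILTER_BIN_COUNT_PCT:
            break
        if percentage_tally + item['percent'] >= OUTLIER_FILTER_TOTAL_PCT:
            break
        bins.remove(item)
        bin_count_tally += 1
        percentage_tally += item['percent']

# 18 well-populated temperature bins plus two sparse outlier bins at the end
bins = [{'value': 20 + i, 'count': 50, 'percent': 4.9} for i in range(18)]
bins += [{'value': 45, 'count': 1, 'percent': 0.1},
         {'value': 50, 'count': 1, 'percent': 0.1}]

print(get_list_stats(bins))   # -> [902, 20, 20, 50]
filter_outlier(bins)
print(get_list_stats(bins))   # -> [900, 18, 20, 37] (outlier bins removed)
```

Running this shows the two sparse bins being stripped, which narrows the reported normal range from 20–50 down to 20–37.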

Check the syntax and verify it passes, then press Save and Activate.

Verify Data Stream in Data Viewer

The S5D9 IoT Fast Prototyping Kit board must be plugged into both Ethernet and USB for power. The Ethernet must be connected to the public Internet.

Configure Data Viewer

Save it.

Create Temperature Monitoring Workflow

Add temp3.avg, Base Python, Processed Stream Single to new Workflow

Add Code to Base Python

import Analytics
import Filter
import Store
import json

outputMsgList = []
temperature = IONode.get_input('in1')['event_data']['value']

#~~~~~~~~~~~~~~~~~~~~~~~
#
# Get Last Normal Range
#
#~~~~~~~~~~~~~~~~~~~~~~~~

# Get the normal range needed for one of the rules
normal_range = Analytics.events('normal_range', 
                             Filter.string_tag('normal_range.normal_range.temperature'),
                             None, 1, ['event_rcv', 'DESC'])

#~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#
# Process each rule
#
#~~~~~~~~~~~~~~~~~~~~~~~~~~~~
# check if normal range exists
if len(normal_range) > 0:
                
    # check if json is valid
    if 'temperature' in normal_range[0]['event_data']['normal_range']:
        min_threshold = normal_range[0]['event_data']['normal_range']['temperature'][0] 
        max_threshold = normal_range[0]['event_data']['normal_range']['temperature'][1]

        last_state = Store.get("temperature_outside_normal")
        if last_state is None:
            last_state = "false"
            
        if temperature < min_threshold or temperature > max_threshold: 
            log("Outside range")
            if last_state != "true":
                IONode.set_output('out1', {"alert": "Temperature outside normal range"})
                Store.set_data("temperature_outside_normal","true",-1)
                log("Alert transmitted")
        else:
            log("Inside range")
            if last_state != "false":
                Store.set_data("temperature_outside_normal","false",-1)
            log("Alert not transmitted")
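
The interesting part of this workflow is the state latch: an alert only fires when the temperature crosses from inside to outside the normal range, not on every out-of-range reading. Here is a standalone sketch of that logic (a plain dict stands in for the sandbox’s Store service; that substitution is mine):

```python
store = {}  # stands in for the sandbox's Store key/value service

def check_temperature(temperature, min_threshold, max_threshold):
    # Return a list of alerts; only the inside->outside transition alerts.
    last_state = store.get("temperature_outside_normal", "false")
    alerts = []
    if temperature < min_threshold or temperature > max_threshold:
        if last_state != "true":
            alerts.append("Temperature outside normal range")
            store["temperature_outside_normal"] = "true"
    else:
        if last_state != "false":
            store["temperature_outside_normal"] = "false"
    return alerts

print(check_temperature(25, 20, 30))  # inside the range -> []
print(check_temperature(35, 20, 30))  # crosses outside  -> ['Temperature outside normal range']
print(check_temperature(36, 20, 30))  # still outside    -> [] (no repeat alert)
print(check_temperature(25, 20, 30))  # back inside      -> []
```

The same latch pattern is reused in the vibration workflow below, keyed on "vibration_outside_normal" instead.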

Debug

After activating all the modules individually, I was able to see the debug output.

Create Vibration Monitoring Workflow

Base Python Code

import Analytics
import Filter
import Store
import json

outputMsgList = []
x_accel = IONode.get_input('in1')['event_data']['value']
y_accel = IONode.get_input('in2')['event_data']['value']
z_accel = IONode.get_input('in3')['event_data']['value']

#~~~~~~~~~~~~~~~~~~~~~~~
#
# Get Last Normal Range
#
#~~~~~~~~~~~~~~~~~~~~~~~~

# Get the normal range needed for one of the rules
normal_range = Analytics.events('normal_range', 
                             Filter.string_tag('normal_range.normal_range.x_accel'),
                             None, 1, ['event_rcv', 'DESC'])

#~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#
# Process each rule
#
#~~~~~~~~~~~~~~~~~~~~~~~~~~~~
# check if normal range exists
if len(normal_range) > 0:
                
    # check if json is valid
    if 'x_accel' in normal_range[0]['event_data']['normal_range']:
        x_accel_min_threshold = normal_range[0]['event_data']['normal_range']['x_accel'][0] 
        x_accel_max_threshold = normal_range[0]['event_data']['normal_range']['x_accel'][1]

        y_accel_min_threshold = normal_range[0]['event_data']['normal_range']['y_accel'][0] 
        y_accel_max_threshold = normal_range[0]['event_data']['normal_range']['y_accel'][1]

        z_accel_min_threshold = normal_range[0]['event_data']['normal_range']['z_accel'][0] 
        z_accel_max_threshold = normal_range[0]['event_data']['normal_range']['z_accel'][1]

        last_state = Store.get("vibration_outside_normal")
        if last_state is None:
            last_state = "false"
            
        if x_accel < x_accel_min_threshold or x_accel > x_accel_max_threshold \
            or y_accel < y_accel_min_threshold or y_accel > y_accel_max_threshold \
            or z_accel < z_accel_min_threshold or z_accel > z_accel_max_threshold :
            log("Outside range")
            if last_state != "true":
                IONode.set_output('out1', {"alert": "Vibration outside normal range"})
                Store.set_data("vibration_outside_normal","true",-1)
                log("Alert transmitted")
        else:
            log("Inside range")
            if last_state != "false":
                Store.set_data("vibration_outside_normal","false",-1)
            log("Alert not transmitted")

Debug Vibration Monitoring

Tap the board to generate vibration data that exceeds the normal range. Enable Debug Logging and press Send.

Refresh logs

Click on logs to see data.

Create Alerts

Click on Dashboard

Click on Single User Table widget.

Click on the gear icon

Select processed:alert

Save Dashboard View at Top of Page

Configure Email Reports

Python code from Medium One is here

In Daily, set up users and schedule

Debug

Payload is {"sample_report": 0}

Toggle “Debug Logging Enabled”, then click Send.

Verify Email Report

From my Gmail account.

Snippets from the report



$20 off S5D9 IoT Fast Prototyping Kit

Contact @jcasman for a code, or post your request below

Coupon Deadline Extended Until Sept 30, 2017

Codes are single use. We have a lot of codes.


S5D9 IoT Fast Prototyping Kit Coupons - $20 off!
Learn IoT Meetup - September 27, San Jose