This project turns your S3A7 IoT sandbox into a home temperature monitor. It takes a temperature measurement at a time interval of your choice and stores it in a persistent log file. After a specified number of readings, the data are emailed to you. I used the system to monitor the temperature in my home while I was away for the holidays. The temperature was measured every hour, and the data were emailed to me every 24 hours.
This was my ‘Hello World’ project for the IoT sandbox: the goal was to familiarize myself with the system so I could take on more complex projects later. It is intended for beginners.
In this project you will:
- Create new workflows
- Modify existing workflows
- Learn how to customize workflow triggers
- Learn how to use the Renesas ‘Store’ Library to save information
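The two Store calls this project relies on, Store.set_global_data(key, value) and Store.get_global(key), follow a plain key/value pattern. If you want to dry-run workflow logic outside the sandbox, a minimal in-memory stand-in is enough. This is a sketch: the ttl argument and the dict-based persistence here are my simplifications, not the real library (which persists data for a week by default).

```python
# Minimal in-memory stand-in for the Renesas IoT Sandbox Store library,
# useful for dry-running workflow code on your own machine.
# Only the two calls used in this project are mocked.

_store = {}

def set_global_data(key, value, ttl=None):
    # The real library persists the value (default ttl = 1 week);
    # here we just keep it in a dict and ignore ttl.
    _store[key] = value

def get_global(key):
    # The real library returns saved data as a unicode string.
    return _store.get(key, u"")

# Example: the append pattern used later by the temperature logger
set_global_data("temp_data", "")
set_global_data("temp_data", get_global("temp_data") + "Time: ... Temperature: 68.5 \r")
```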
Step 1: Complete the Smart Chef demo
The Home Temperature Monitor is nothing more than a modification to the code for the Smart Chef demo. Once you set up the S3A7 board for Smart Chef, it will be connected to the internet and you will have the API user/key that you need to run the temperature monitor.
The Smart Chef tutorial is located here: http://renesas-blog.mediumone.com/renesas-s3a7-fast-iot-prototyping-kit-with-smart-chef-demo-quick-start-guide/
Step 2: De-activate all Smart Chef workflows except two
The two that we will reuse are titled “Temperature and Humidity Sampling” and “Temperature and Humidity Processing”.
On the left-hand side of the sandbox, navigate to ‘Workflow Studio.’ That will show you a grid of boxes representing all of the Smart Chef workflows. For each workflow that you wish to deactivate, first click on it. You will see the workflow code module with its triggers and optional processed output streams.
On the right-hand side of the screen, navigate to ‘Revisions.’ Clicking on that will show you the active revision and allow you to click the x to deactivate it. Follow these steps for all of the roughly 25 Smart Chef workflows except the two mentioned above.
Step 3: Deactivate all unnecessary data streams
On the left-hand side of the Sandbox screen, navigate to “Config – Data Streams” and you’ll see the following screen.
Moving the cursor into the ‘Actions’ column reveals an ‘Edit’ pencil; click it to reveal the data stream tags. Then:
- In the ‘processed’ stream, un-check the ‘Active’ box next to every tag except ‘temperature’ and ‘humidity.’
- Un-check every ‘Active’ box in the geo_location, notifications, recipes, settings, and weather data streams.
- In the raw events stream, un-check every tag except ‘temp_and_humidity’ and ‘connected.’
- In the ‘sensor_read’ data stream, leave only ‘temp’ and ‘humidity’ checked.
The sensor reports both temperature and humidity on every read, so I’ve left the humidity tags active.
Step 4: Create the Temp Project Initialization workflow
This workflow initiates operation of the home temperature monitor. When the IoT board is powered up, it sends a ‘connected’ message which triggers this workflow. The code in the workflow creates a virtual file where temperature data is logged. It also sends you an email so you know the process has started.
In the Workflow Studio, click on the ‘Create’ workflow box, then title the workflow “Temp Project Initialization.” On the right-hand side of the empty work area, select ‘Tags & Triggers’ and drag a raw:connected trigger box into the work area.
Next, drag a Modules:Foundation:Base Python module into the work area and connect it to the trigger as shown.
Here is the code to cut and paste into the Base Python module:
'''
This workflow initiates the home temperature monitor. It is triggered
when the IoT board with modified Smart Chef code first comes online.
The trigger is raw:connected.

The workflow creates the global persistent storage value key "temp_data"
and initializes it. This can be accessed from other workflows. Read the
Renesas IoT Sandbox 'Store Library' documentation for all of the details.

This workflow also sends a notification via email that temperature
monitoring has started.

Created December 15, 2016 by DK
'''
import Email
import Store

# Fill in your to/from email addresses here
TO_EMAIL = "email@example.com"
FROM_EMAIL = "firstname.lastname@example.org"

# Initialize data storage with key 'temp_data.'
# By default ttl=1 week, which is more than enough for this project,
# which records in 24-hour blocks.
# ttl can be made infinite if necessary - see library docs.
Store.set_global_data("temp_data", "")

input_dictionary = IONode.get_input('in1')
is_connected = input_dictionary['event_data']['value']
is_connected_str = str(is_connected)

# Note that times are in UTC. US Pacific Standard is 8 hours behind
connected_time_str = input_dictionary['observed_at']

# The Exception message below should show up in the debug log when
# debugging is enabled. I haven't had much success with the debugging
# log, but I left it in for now.
try:
    email_text = "Connected = " + is_connected_str + " Time: " + connected_time_str + "\r"
    test_email = Email.Email(FROM_EMAIL, 'El Guapo', TO_EMAIL, 'Temperature Logging Initiated')
    test_email.text_message(email_text)
    test_email.send()
except Exception:
    log("Unable to send Email")
Be sure to click ‘Save and Activate’ in both the trigger and code modules before exiting.
Step 5: Modify the “Temperature and Humidity Sampling” workflow
In this case, we don’t change any of the code in the Python module. The code requests a temperature/humidity reading from the ENS210 temperature and humidity sensor. All we need to do is change the triggers. In Smart Chef, many different events trigger a temp/humidity read; we only want reads to happen on a single schedule. The original workflow looks as follows.
First, delete all four triggers by clicking on the x in the upper right-hand corner of each one. Then, open the Base Python module and click on the arrow next to “Inputs/Outputs.” Delete all of them except ‘in1’ by clicking the x’s to the right.
Now, drag a single hourly scheduled trigger on to the work space and connect it to the workflow as follows.
You must make a few choices about precisely when to schedule the triggers. At the end, the workflow should look like this. As long as the IoT board is powered up, it will take a raw temperature and humidity measurement according to the trigger schedule you set.
Step 6: Modify the Temperature and Humidity Processing workflow if desired
The ENS210 temperature and humidity sensor will report a raw measurement to the S3A7 according to the schedule set in step 5. The raw measurement event triggers this workflow which decodes the measurement and outputs a processed stream with the temperature in units you choose. If degrees Celsius work for you, no code change is required. I think in Fahrenheit, so I made the following small modification to the Base Python module code.
# out['temperature'] = round(float(t_data) / 64 - 273.15,2)
# I commented out the above line and added the four lines below
# to output in Fahrenheit
TinK = float(t_data) / 64
TinC = TinK - 273.15
TinF = round(TinC * 1.8 + 32, 2)
out['temperature'] = TinF
Step 7: Create the final Temp and Humidity Data Handling workflow
As in step 4, create a new workflow and name it “Temp and Humidity Data Handling.” This final workflow will be triggered by a processed temperature and humidity event generated by the last workflow. The final workflow looks like this.
Here’s how you add the processed temperature trigger. As before, just drag the crossed-arrows cursor onto the work space and the trigger box will appear. Do the same with the Base Python module from the modules:foundation drop-down menu.
Finally, here is the Python code to paste into the module. This records the data every hour for 24 hours and then emails it. After emailing, it wipes the data clean and starts over.
'''
This workflow takes processed temperature data and stores it in a
persistent global location using the Renesas Store class library.
Data is logged at an interval set by the workflow trigger definition
(one hour in this case) and emailed after a certain number of
datapoints are recorded. The data file is wiped clean after the file
is emailed.

Created on: December 15, 2016 by DK
'''
import Email
import Store

# Enter the email addresses you'd like to use
FROM_EMAIL = "email@example.com"
TO_EMAIL = "firstname.lastname@example.org"

# Number of data readings before the log is emailed and erased
NUM_READINGS = 24

# The call to global key "temp_data" returns the saved data_log in unicode
data_log = Store.get_global("temp_data")

# Convert to a str object ... now the file is a long string that can be
# appended. The instructions for doing this are in the 'Store Library'
# documentation on the Medium One website.
data_log_str = ""  # Initialize the string variable which will hold file contents
data_log_str = data_log.encode("latin-1")

input_dictionary = IONode.get_input('in1')
temp_in_f = input_dictionary['event_data']['value']
reading_time_str = input_dictionary['observed_at']
temp_in_f_str = str(temp_in_f)

# Note that time is in UTC. Pacific Standard Time is 8 hours behind UTC
reading = "Time: " + reading_time_str + " Temperature: " + temp_in_f_str + " \r"

# Now, append the global file string with the new reading
data_log_str = data_log_str + reading

# Count the number of readings in the string by counting "\r" return characters
number_of_readings = data_log_str.count("\r")

if number_of_readings < NUM_READINGS:
    # Less than the required number of readings: save the appended file
    Store.set_global_data("temp_data", data_log_str)
else:
    # Number of readings = NUM_READINGS. Time to email and reset
    try:
        test_email = Email.Email(FROM_EMAIL, 'El Guapo', TO_EMAIL, 'Temperature Data Log')
        test_email.text_message(data_log_str)
        test_email.send()
        Store.set_global_data("temp_data", "")  # Clear the data so the process can start over
    except Exception:
        log("Unable to send Email")
        Store.set_global_data("temp_data", "")  # Clear the data so the process can start over
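The append, count, and rotate logic in this workflow is easy to verify offline. Stripped of the sandbox-specific Email and Store calls, the core of it reduces to a pure function. This is a sketch for local testing; the function name and return convention are mine, not part of the workflow API.

```python
# Pure-Python sketch of the workflow's log-rotation logic, for testing
# outside the sandbox. Returns the updated log string plus an optional
# email body (None until NUM_READINGS readings have accumulated).
NUM_READINGS = 24

def handle_reading(data_log_str, reading_time_str, temp_str):
    # Append one reading, delimited by "\r" as in the workflow
    reading = "Time: " + reading_time_str + " Temperature: " + temp_str + " \r"
    data_log_str = data_log_str + reading
    # Count readings by counting the "\r" delimiters
    if data_log_str.count("\r") < NUM_READINGS:
        return data_log_str, None       # keep accumulating
    return "", data_log_str             # rotate: email the full log, start over

# Simulate 25 hourly readings
log_text, email_body = "", None
for hour in range(25):
    log_text, body = handle_reading(log_text, "2016-12-15 %02d:00" % hour, "68.5")
    if body is not None:
        email_body = body
```

After the 24th reading the function hands back the full log for emailing and resets the stored string, so the 25th reading starts a fresh block, which mirrors what the workflow does with Store.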
That’s it! When you power up the board, it should send you an email that looks something like this.
Then you should receive data logs according to the intervals you specify in code. While I was away for 6 days, I received 6 of these emails. I didn’t have time to write Python to translate the time stamps into a more readable format, so for now they are in UTC which is 8 hours ahead of PST.
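If you do want the timestamps in local time, a few lines of standard-library Python will do the conversion. This is a sketch that assumes the timestamps are ISO-8601 strings like 2016-12-15T18:00:00.000Z (check the actual format in your own event log) and applies a fixed UTC-8 offset, so it ignores daylight saving time.

```python
from datetime import datetime, timedelta

def utc_to_pst(observed_at):
    # Parse an ISO-8601 UTC timestamp such as "2016-12-15T18:00:00.000Z"
    # (assumed format - verify against your own data) and shift it to
    # Pacific Standard Time, a fixed 8 hours behind UTC (no DST handling).
    utc = datetime.strptime(observed_at, "%Y-%m-%dT%H:%M:%S.%fZ")
    pst = utc - timedelta(hours=8)
    return pst.strftime("%b %d %I:%M %p PST")

print(utc_to_pst("2016-12-15T18:00:00.000Z"))  # Dec 15 10:00 AM PST
```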
Fortunately, there is very little code here to debug. I was not able to find a way to set breakpoints or to print output to a screen during debugging. For lack of a better solution, I sent myself emails from within the code wherever I needed to inspect a value. Although cumbersome, this did work.
Here’s an example of how to use the debugger to simulate a raw:connected event from the workflow studio without having to power up the IoT board. This should trigger the first workflow we created, Temp Project Initialization, to send you an email as outlined in step 4 above. This can also be used to simulate other events such as ‘processed temperature’ which trigger other workflows.
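For reference, the only fields the Temp Project Initialization workflow actually reads are event_data.value and observed_at, so a hand-built event like the one below is enough to exercise its code path. The field names come from the workflow code; the exact envelope the debugger wraps around an event may carry additional fields.

```python
# A minimal stand-in for the dictionary the workflow receives via
# IONode.get_input('in1'). Field names come from the workflow code;
# the real event may include additional fields.
simulated_event = {
    "event_data": {"value": True},              # raw:connected payload
    "observed_at": "2016-12-15T18:00:00.000Z",  # UTC timestamp
}

# The same extraction lines used in the workflow:
is_connected_str = str(simulated_event['event_data']['value'])
connected_time_str = simulated_event['observed_at']
email_text = "Connected = " + is_connected_str + " Time: " + connected_time_str + "\r"
```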
I’m interested in adding the ability to use my smart phone to check the temperature at any time. I’ll probably also add humidity since that comes for free. The possibilities are endless!