ESP8266 NodeMCU firmware updating

My ESP-01 board had not been used for a few weeks due to other projects, and today, after updating my local NodeMCU firmware repository, I found out that the latest build directory, where the firmware binary used to be located, was gone.
No problem.
We can now find the pre-compiled binaries at this link: NodeMcu Firmware Releases

There is also a cloud compiling option at this link: NodeMcu custom builds
If by any chance you are having second thoughts about providing your e-mail, use one from Fake Mails and wait for the firmware on the provided (temporary) address.

Anyway, the procedure is always the same:

  • Close anything that is using the serial port, for example the ESPlorer tool
  • Put the ESP8266 into firmware flashing mode by connecting GPIO0 to ground and resetting or power cycling the board.
  • Flash the firmware with ./esptool.py -p /dev/ttyUSB0 write_flash 0x000000 nodemcu_float_0.9.6-dev_20150704.bin
  • Remove GPIO0 from GND and power cycle or reset the board
  • Connect to the serial port and check the NodeMCU version (see the example below)
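
A minimal sketch of that last check, assuming the board enumerates as /dev/ttyUSB0 and the 9600 baud default of the 0.9.x firmware:

# Open a serial terminal on the board (adjust port and baud rate to your setup):
screen /dev/ttyUSB0 9600
# At the NodeMCU Lua prompt, typing =node.info() should print the firmware
# version, chip id and flash details of the freshly flashed build.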

Update done!

Mosquitto Broker with Websockets enabled on the Synology NAS

Synology NAS are great devices, mainly due to the bundled software (CloudStation and Photo Station, to name a few), and they can also run other software provided by non-official package sources.
One such provider is the SynoCommunity, which offers a Mosquitto MQTT broker packaged for the Synology NAS. While I was able to successfully cross compile the Mosquitto broker with websockets enabled for my DS212+ NAS (I have a draft post with the steps that I never published), the SynoCommunity package is a simpler way to install and use a Mosquitto broker that has WebSockets protocol support compiled in.

So, what is needed to be done?

Setting up:

  • 1st: Add the SynoCommunity package source to your Synology server. See the detailed instructions on the community web page.
  • 2nd: Install the Mosquitto package from the Synology package manager.
  • 3rd: Edit the file located at /usr/local/mosquitto/var/mosquitto.conf. You need to connect through ssh as root to do this.

At the end of this file add the following lines:

listener 1883
listener 9001
protocol websockets

Note that 1883 is the standard MQTT broker port, and 9001 is the port that I've chosen to have the websockets listener answering on.

After the change, stop and start the Mosquitto broker from the package manager action dropdown for the Mosquitto package.

Still using ssh and logged in as root, check that port 9001 is now active:

root@DiskStation:/volume1/homes/root# netstat -na | grep 9001 
tcp        0      0 0.0.0.0:9001            0.0.0.0:*               LISTEN       
tcp        0      0 192.168.1.250:9001      192.168.1.32:55331      ESTABLISHED

In the above case we can see that there is a websocket client connected to the broker from my workstation.

Testing:

For testing we can use the HiveMQ websocket client: http://www.hivemq.com/demos/websocket-client/ which allows communicating with the broker over the websockets transport.

Although the page is served from an external, internet-based server, the websocket client runs in YOUR web browser, on your machine in your own (internal) network. So in the connection settings for the broker, for Host I use my internal IP address, 192.168.1.250, and for Port I use the 9001 value.

I can now subscribe and publish to topics using the browser, and with MQTT-SPY we can also watch and check the MQTT websockets communication.
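
As an extra sanity check, and assuming the mosquitto command line clients are installed somewhere on the network, the standard MQTT port can be exercised directly (the host, port and topic below are just the values used in this post):

# Subscribe in the background and publish a test message to the same topic:
mosquitto_sub -h 192.168.1.250 -p 1883 -t test/topic &
mosquitto_pub -h 192.168.1.250 -p 1883 -t test/topic -m "hello from the NAS broker"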

And that’s it!

ESP8266 – Updating the SDK to the latest version

So version 1.0 of the ESP8266 SDK has been released: https://espressif.com/new-sdk-release-2/.
This means that I need to update my cross compiling tools for the ESP8266 to the latest version.
I’ve installed and configured my environment according to this post: https://primalcortex.wordpress.com/2015/01/09/esp8266-setting-up-native-sdks-on-linux-and-exploring-some-applications/ and this post documents how I’ve updated my environment.

Updating the crosstools
This is quite easy: just go to the crosstool-NG directory and execute git pull. This will bring the latest release of the cross compiling tools into your machine.
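
Assuming the crosstool-NG checkout lives under /opt/Espressif, as in my original setup post, this boils down to:

# Update the cross compiling toolchain sources to the latest release:
cd /opt/Espressif/crosstool-NG
git pull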

Downloading and installing the Espressif ESP8266 SDK
At the date of this post I've downloaded the SDK version announced in this post: http://bbs.espressif.com/viewtopic.php?f=5&t=321 which refers to version 1.0.1_b1_15_04_02. At the bottom of that post there is a link for downloading the SDK: http://bbs.espressif.com/download/file.php?id=276

In my configuration the SDK lives in the ESP8266_SDK directory at /opt/Espressif, so I just back up the current SDK version and expand the new one:

cd /opt/Espressif
mv ESP8266_SDK ESP8266_SDK_old
unzip ~/Downloads/esp_iot_sdk_v1.0.1_b1_15_04_02.zip
mv esp_iot_sdk_v1.0.1_b1/ ESP8266_SDK
mv License ESP8266_SDK

Finally, we need to install some libraries:

cd /opt/Espressif/ESP8266_SDK
wget -O lib/libc.a https://github.com/esp8266/esp8266-wiki/raw/master/libs/libc.a
wget -O lib/libhal.a https://github.com/esp8266/esp8266-wiki/raw/master/libs/libhal.a
wget -O include.tgz https://github.com/esp8266/esp8266-wiki/raw/master/include.tgz
tar -xvzf include.tgz

And that’s it.

Compiling applications and demos.
For compiling the IoT_Demo sample:

cd /opt/Espressif/ESP8266_SDK
mv example/IoT_Demo .
sed -i -e 's/xt-ar/xtensa-lx106-elf-ar/' -e 's/xt-xcc/xtensa-lx106-elf-gcc/' -e 's/xt-objcopy/xtensa-lx106-elf-objcopy/' -e 's/xt-objdump/xtensa-lx106-elf-objdump/' -e 's/xt-nm/xtensa-lx106-elf-nm/' Makefile
export COMPILE=GCC
export PATH=/opt/Espressif/crosstool-NG/builds/xtensa-lx106-elf/bin/:$PATH

But before running make, check what version of python is the default:

python -V

If it is version 3, we need a workaround, since the SDK's python scripts only run on python 2:


cd /opt/Espressif/ESP8266_SDK/bin
ln -s /usr/bin/python2 python
export PATH=/opt/Espressif/ESP8266_SDK/bin:/opt/Espressif/crosstool-NG/builds/xtensa-lx106-elf/bin/:$PATH

Running the python -V command should now report version 2.7.

We can now run make:

...
...
...
make[2]: Entering directory '/opt/Espressif/ESP8266_SDK/IoT_Demo/driver'
make[2]: Leaving directory '/opt/Espressif/ESP8266_SDK/IoT_Demo/driver'

!!!
No boot needed.
Generate eagle.flash.bin and eagle.irom0text.bin successfully in folder bin.
eagle.flash.bin-------->0x00000
eagle.irom0text.bin---->0x40000
!!!
make[1]: Leaving directory '/opt/Espressif/ESP8266_SDK/IoT_Demo'

Success. We can now flash the IoT_Demo.
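
A possible flash command, reusing esptool.py as in the first section and the addresses printed above (the serial port is an example and I'm assuming esptool.py is on the PATH; adjust both to your setup):

# The generated binaries are in the SDK bin folder:
cd /opt/Espressif/ESP8266_SDK/bin
# Flash each binary at the address reported by the build:
esptool.py -p /dev/ttyUSB0 write_flash 0x00000 eagle.flash.bin 0x40000 eagle.irom0text.bin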

md5deep and hashdeep on Synology DS212+

My personal photos are located at my desktop computer, and I have backups of them on my Synology DS212+ and my old faithful Linksys NSLU2 with an external disk.

The issue that I have now is that, to keep backup times short, I only back up the current year, because, well, the other years are static data. Previous years are already backed up and that data is not going to be modified. Right? So what if corruption happens? How do I detect it? How can I be warned of such an event? Right now I have no method to check if my digital photos of 2003 are totally intact in all of my three copies. And if I detect corruption in one bad file/photo, which copy is the correct one?

So I'm investigating this issue, and among the tools available to create checksums and then verify that everything is OK and no corruption has appeared are the md5deep and hashdeep programs. These are available as sources at this link: http://md5deep.sourceforge.net/start-hashdeep.html

These are the instructions for cross compiling these tools for the ARM based Synology DS212+. As usual this is done on a Linux Desktop/server machine.

1st) Set up the cross-compiling environment: Check out this link on the topic: https://primalcortex.wordpress.com/2015/01/04/synology-ds-crosscompiling-eclipseibm-rsmb-mqtt-broker/

2nd) Setting up the cross compiling environment variables is now a bit different:

export INSTALLDIR=/usr/local/arm-marvell-linux-gnueabi
export PATH=$INSTALLDIR/bin:$PATH
export TARGETMACH=arm-marvell-linux-gnueabi
export BUILDMACH=i686-pc-linux-gnu
export CROSS=arm-marvell-linux-gnueabi
export CC=${CROSS}-g++
export LD=${CROSS}-ld
export AS=${CROSS}-as
export AR=${CROSS}-ar
export GCC=${CROSS}-g++
export CXX=${CROSS}-g++

We will use the C++ compiler.

3rd) Create a working directory and clone the repository to your local machine: git clone https://github.com/jessek/hashdeep.git
4th) Change to the hashdeep directory and execute the following commands:

[pcortex@pcortex:hashdeep]$ sh bootstrap.sh
[pcortex@pcortex:hashdeep]$ ./configure --host=arm-marvell-linux-gnueabi
[pcortex@pcortex:hashdeep]$ make

And that's it. If everything went well, the commands are now available in the hashdeep/src directory:

[pcortex@pcortex:src]$ file hashdeep
hashdeep: ELF 32-bit LSB executable, ARM, EABI5 version 1 (SYSV), dynamically linked, interpreter /lib/ld-linux.so.3, for GNU/Linux 2.6.16, not stripped

We can now copy the commands to the DiskStation and start using them.

Now all I need is to devise a method that creates and checks the hashes/signatures of each year's photos, and warns me if a difference is detected. I'm thinking of using the audit mode of the hashdeep command, and doing a check for one year each day, for example, on Mondays check 2003 and 2013, on Tuesdays check 2004 and 2014, and so on.
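
A minimal sketch of that idea with hashdeep's audit mode (the photo paths are just examples of my layout):

# Create the hash set for one year of photos and keep it somewhere safe:
hashdeep -r /volume1/photo/2003 > 2003.hashes
# Later, audit the same directory against the stored hash set; hashdeep
# reports whether the audit passed and flags files that changed or disappeared:
hashdeep -r -a -k 2003.hashes /volume1/photo/2003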

Two Factor Authentication for Synology and others – Alternative to mobile Apps

One way to secure the access to our Synology Diskstation Web Management interface and File Manager tool is to enable the two factor authentication (2FA). This means that we need to have something we know (the username and password) and something that we have (a mobile phone) to access these interfaces.

Check out the following Synology page: https://www.synology.com/en-us/knowledgebase/tutorials/615

For that we need to install Google Authenticator, a mobile application, so that we can get the time-dependent code (TOTP) needed in the two factor authentication process.

This works fine, but what if I lose my mobile phone? And what if I'm too lazy to get up and fetch my mobile phone or tablet for the TOTP to log in?

In the first case, if you have e-mail notification correctly configured on your Synology DiskStation, you can get an emergency code to log in again. But if you haven't, the only way to recover the 2FA key (and get a valid TOTP again) is by accessing the DiskStation through ssh/telnet. The keys and available emergency codes are located at /usr/syno/etc/preference/admin/google_authenticator
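
For example, still over ssh and logged in as root, simply printing that file recovers the secret key and the remaining emergency codes (the exact file layout may vary between DSM versions):

cat /usr/syno/etc/preference/admin/google_authenticator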

For the second situation there is a solution (well, two, but I'll use the simplest one). What we need is to install on our PC (a trusted one, at least) an application that mimics the mobile Google Authenticator application. This application is GAuth: https://5apps.com/gbraad/gauth We can install it as a browser add-on, launch it directly from a web site or, in my case, run it from a local directory. For this I downloaded it and added a shortcut to the index.html file. The application is available at https://github.com/gbraad/gauth and we can get a copy with the command git clone https://github.com/gbraad/gauth

When we access the application, it stores the secret key in the browser's local storage. It's not stored anywhere else, so it is safe. Now we only need to get the key from the Synology file mentioned above and we are all set. We can check whether the GAuth generated code matches the code generated by the mobile device, and if it does, we have a backup!

Node-Red: Push Notifications to Google Cloud Messaging GCM

The following instructions will allow Android devices to receive notifications from Node-red flows using the Google Cloud Messaging infrastructure. This is achieved by using node-gcm, the Node.js library for accessing GCM (Google Cloud Messaging).

Installing node-gcm:

We need to install the node-gcm supporting module either locally in Node-red or globally. If installing locally, just cd to the node-red directory and do npm install node-gcm; globally, the command needs the -g switch added, like this: sudo npm install node-gcm -g.
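
In my case the local install boils down to something like this (the Node-red directory below is an assumption; use wherever your Node-red copy lives):

cd ~/node-red
npm install node-gcm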

Configuring Node-red:

To be able to invoke node-gcm from Node-red nodes, such as function nodes, we need to make the node-gcm module available. For that, go to the base directory of Node-red and edit the settings.js file.

Modify the file so that the following configuration is now defined:

functionGlobalContext: {
    // os:require('os'),
    // bonescript:require('bonescript'),
    // arduino:require('duino')
    gcm:require('node-gcm')
},
Starting up Node-red and our first push notification to GCM:
Start up Node-red from the command line: node red.js -s settings.js. It's important and imperative that we now start up with the modified settings.js file, or with some other settings file containing the above configuration.
We can now add a function node with the following code to push a notification:
var gcm = context.global.gcm;

// Define our message:
var message = new gcm.Message({
    collapseKey: 'node-red',
    delayWhileIdle: true,
    timeToLive: 3,
    data: {
        key1: 'Node-Red Notification',
        key2: 'Hello android from Node-red'
    }
});

// Set up the sender with your API key
var sender = new gcm.Sender('AIza...');  // Replace with your KEY!!!!!

// Add the registration IDs of the devices you want to send to
var registrationIds = [];
registrationIds.push('APA...');  // Replace with your device registration ID!!!!!

// Send the message
// ... trying only once
sender.sendNoRetry(message, registrationIds, function(err, result) {
    if (err) console.error(err);
    else     console.log(result);
});
return msg;

And that’s it:
I'm using an inject node to activate the above function, and I can see at the console:
{ multicast_id: 9123452345452345,
  success: 1,
  failure: 0,
  canonical_ids: 0,
  results: [ { message_id: '0:7957832452%3676573df353' } ] }

If you receive other results, like failure: 1, there is something wrong with your Google GCM key or your device registration id is invalid.

For example:

{ multicast_id: 5439642763222344000,
  success: 0,
  failure: 1,
  canonical_ids: 0,
  results: [ { error: 'InvalidRegistration' } ] }

This means that the value pushed into registrationIds doesn't belong to any device registered with our GCM project.

If you just receive a 401, that means you're using an invalid browser GCM key. Check the Google developer console to get the correct key.
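
To take Node-red out of the equation when debugging these errors, the key and the registration id can also be tested directly against the GCM HTTP endpoint with curl; a rough sketch, where the key and registration id placeholders must be replaced with your own values:

curl -s https://android.googleapis.com/gcm/send \
  -H "Authorization: key=AIza..." \
  -H "Content-Type: application/json" \
  -d '{"registration_ids":["APA..."],"data":{"key1":"test","key2":"hello from curl"}}'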

Testing:

I’ve done my basic testing with the sample Google GCM client available at https://code.google.com/p/gcm/source/checkout.

I’ve used Android Studio and the provided gcm-client source code to compile and test the communication.

The fact is that we first need to compile and run the GCM client application at least once to get the registration id, so we can then paste it into the above function.

We can also add the registration process to Node-red, so that a device informs the Node-red flows of its registration ID; for that we need to change the gcm-client program.

IoT framework: Using NodeRed.

In my previous post https://primalcortex.wordpress.com/2015/02/25/setting-up-an-iot-frameworkdashboard-with-nodered-moscamosquitto-and-freeboard-io-dashboard/ we installed and configured an MQTT broker and a web based dashboard to see our data in real time. The real time MQTT data is viewed through the MQTT protocol over websockets from our browser using the freeboard dashboard. This configuration means that in reality our browser is now an MQTT client.

I also talked about installing Nodered, but ended up not mentioning it again in that post, neither how to use it nor why it belongs in this IoT framework. Mainly because the post was already too long, but another reason is that Nodered is not directly related to presenting data, but to consuming it, processing it, storing it and communicating with other systems/platforms. Nodered might be the central point that orchestrates our devices, systems and platforms. Also, Nodered can provide the native websocket protocol, if we can call it that, as opposed to MQTT over websockets (which needs a broker with MQTT over WS enabled), allowing us to bridge MQTT to native websockets if we do not have an MQTT websocket enabled broker.

Expanding Freeboard.io dashboard to consume websockets:

Before getting into Nodered, let's expand our Freeboard.io dashboard to consume native websockets. This is needed because in the previous post we configured it to consume MQTT over websockets, and now we need the native websockets protocol. We are installing our IoT server in a directory named IoTServer in our home directory.


cd ~/IoTServer
git clone https://github.com/Freeboard/plugins.git

We now need to copy the two support files from these freeboard plugins to our freeboard plugins directory:

cd ~/IoTServer/freeboard/plugins
cp ~/IoTServer/plugins/datasources/plugin_node.js .
cp ~/IoTServer/plugins/datasources/plugin_json_ws.js .

Now we need to change the index.html file to enable the two new plugins like this:

...
...
head.js("js/freeboard+plugins.min.js",
        "plugins/mqtt/paho.mqtt.plugin.js",
        "plugins/plugin_json_ws.js",
        "plugins/plugin_node.js",
        // *** Load more plugins here ***
        function(){
...
...

And that’s it. We can now consume/display native websocket data on the freeboard dashboard using the ws:// URI.

Before we do anything with this, namely configuring a dashboard widget to consume data from a Nodered websocket, we will just set up this little project to make some sense of it:

We receive the current temperature value from a device over MQTT. From time to time, the device publishes the current temperature onto the MQTT /temperature topic.
We will use Nodered to receive the published data and to store it in a MySQL table. At the same time, Nodered will compute the mean value of all the received temperatures and publish that on a websocket. Simple, right?
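
The flow below assumes a MySQL database named nodered with a table named DataTable. A minimal sketch of a matching schema (the column names and types are my assumption, inferred from the insert query used in the flow) could be created like this:

mysql -u root -p nodered <<'EOF'
CREATE TABLE DataTable (
    tstamp VARCHAR(32),   -- ISO timestamp string built by the flow
    sensor VARCHAR(16),   -- fixed 'temp' label in this example
    value  DOUBLE         -- the published temperature
);
EOF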

For testing purposes I'm using Mosquitto 1.4 with websockets enabled, and I'm publishing data on the same machine where the broker is running with the command: mosquitto_pub -t /temperature -m 20

So the code for Node-red is the following (to see it, just launch the Node-red web interface, select Import from Clipboard, and paste the code below):

[{"id":"db4020c1.3b133","type":"websocket-listener","path":"/ws/data/meantemp","wholemsg":"false"},{"id":"39c70e78.3a446a","type":"mqtt-broker","broker":"localhost","port":"1883","clientid":"NodeRed"},{"id":"2319ac20.bdda04","type":"MySQLdatabase","host":"127.0.0.1","port":"3306","db":"nodered","tz":""},{"id":"5c60ad12.fd3c54","type":"mysql","mydb":"2319ac20.bdda04","name":"Query BD","x":524,"y":447,"z":"78c46bf9.04906c","wires":[["24e2e587.55d6b2"]]},{"id":"a7132398.e6dd38","type":"inject","name":"","topic":"select * from DataTable","payload":"","payloadType":"string","repeat":"","crontab":"","once":false,"x":284,"y":447,"z":"78c46bf9.04906c","wires":[["5c60ad12.fd3c54"]]},{"id":"24e2e587.55d6b2","type":"debug","name":"","active":true,"console":"false","complete":"false","x":759,"y":447,"z":"78c46bf9.04906c","wires":[]},{"id":"5e11fbbf.b94c04","type":"mqtt in","name":"Receive MQTT Temperature","topic":"/temperature","broker":"39c70e78.3a446a","x":226,"y":271,"z":"78c46bf9.04906c","wires":[["38bbe4cd.bc04e4","e91f382a.016928"]]},{"id":"9817c3a7.d2627","type":"mysql","mydb":"2319ac20.bdda04","name":"Store Temperature","x":757,"y":274,"z":"78c46bf9.04906c","wires":[[]]},{"id":"38bbe4cd.bc04e4","type":"function","name":"Build MySql Query","func":"var sdata = new Date().toISOString();\nvar query = \"Insert into DataTable values ( '\" + sdata + \"','temp',\" + msg.payload + \");\"\nmsg.topic = query;\nmsg.payload = {};\nreturn msg;","outputs":1,"x":508,"y":272,"z":"78c46bf9.04906c","wires":[["9817c3a7.d2627"]]},{"id":"e91f382a.016928","type":"function","name":"Calc Mean Temp","func":"var query = \"SELECT avg(t1.value) as median_val FROM (SELECT @rownum:=@rownum+1 as `row_number`, d.value FROM DataTable d, (SELECT @rownum:=0) r WHERE 1 ORDER BY d.value ) as t1, ( SELECT count(*) as total_rows FROM DataTable d WHERE 1 ) as t2 WHERE 1 AND t1.row_number in ( floor((total_rows+1)/2), floor((total_rows+2)/2) ); \";\nmsg.topic = query;\nreturn msg;","outputs":1,"x":499,"y":117,"z":"78c46bf9.04906c","wires":[["5438e8f2.12939"]]},{"id":"5438e8f2.12939","type":"mysql","mydb":"2319ac20.bdda04","name":"Get and Calc mean temp","x":757,"y":116,"z":"78c46bf9.04906c","wires":[["a3ca0282.6a518"]]},{"id":"21e6d8fe.7ab568","type":"websocket out","name":"WebSocket: Mean Temp","server":"db4020c1.3b133","client":"","x":1209,"y":116,"z":"78c46bf9.04906c","wires":[]},{"id":"a3ca0282.6a518","type":"function","name":"Extract mean","func":"var meanval = msg.payload[0];\nmsg.payload = meanval.median_val;\nreturn msg;","outputs":1,"x":987,"y":116,"z":"78c46bf9.04906c","wires":[["21e6d8fe.7ab568"]]}]
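
Before wiring the dashboard, the websocket output of the flow can be checked from the command line, for example with the wscat tool from npm (the node-red hostname is the same one used for freeboard below; adjust it and the port to your install):

npm install -g wscat
wscat -c ws://node-red:1880/ws/data/meantemp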

For watching the mean temperature value that is published on the websocket we will use the new freeboard plugins installed previously:

1st – Add a data source:

Go to your freeboard dashboard, select Add in the Datasources section, and select the JSON Websocket Push Datasource. Give it a name and then the URL.

VERY IMPORTANT: because this is a websocket URL, it has the ws:// prefix and NOT http://.

In our case the url for node red is: ws://node-red:1880/ws/data/meantemp

2nd – Add a widget: And now we can add a widget based on the above data source.

That's it. We can now consume native websocket data from Node-red, and MQTT over websockets from a Mosquitto/Mosca MQTT broker.