As we pointed out in part I of this series, the days when hardware and software were separate entities are long gone. Along with the ongoing worldwide digitization and digitalization, the integration between the virtual and physical worlds is getting stronger, making the Internet of Things (IoT) a real thing. Going further, we can leverage blockchain features like data immutability and its distributed character, and exploit the smart contracts it offers, to connect it with IoT applications and build a trusted, all-inclusive solution. We’ve created a two-part tutorial to showcase how to deploy and connect all the pieces. In part I, we set up a private Hyperledger Besu network and configured a Raspberry Pi to sign transactions with its own keys. We used this to write a Python program, hosted on the Raspberry Pi, that sent Ether to other participants of the network. The final solution used quite a broad technological stack: Terraform, Amazon Web Services, MetaMask, and, finally, the Raspberry Pi. In part II, we don’t want to lower the bar, so we add some extra technology on top of what we had.
Let us consider a contract between a supplier and a buyer. They both agree on a delivery time, conditions, pick-up and drop-off locations, and, of course, the price the buyer is willing to pay. Among the conditions, they can agree on a penalty for delayed delivery, or that the parcel must not be exposed to a temperature higher than some threshold for longer than X minutes. While the delivery time is easy to verify, the temperature condition is not. Even if we suspect that the threshold has been exceeded (because, for example, the delivered food is spoiled), how can we prove that the transport conditions were inappropriate?
In part II, Ratified Smart Contract for Constrained Delivery with Oraclized Data, we create an automated solution covering the entire supply process, including:
- publishing the delivery offer with specified conditions,
- acceptance of the conditions by the contractor,
- starting the delivery by an authorized person (using personalized hardware signature),
- monitoring the delivery temperature and contract voiding conditions, confirming the delivery, and
- calculating the final payment for the supplier.
To achieve this, we take advantage of the Hyperledger Besu network that was deployed in part I and write a smart contract on top of it that handles the delivery operations.
Smart contract design
To develop and deploy the smart contract we will use the Remix IDE. We will write the contract in Vyper, a pythonic language, and compile it with the Vyper compiler. Instead of Remix, you could also use the Truffle Suite. To better visualize the purpose of the contract we are going to develop, let us recap the functionalities it should have:
- publish smart contract to the network with specified delivery conditions
- allow contractors to accept conditions and take the contract
- start the delivery by an authorized person
- monitor temperature of the package and check voiding conditions
- finalize the delivery
- calculate the final payment for the contractor
First, you have to install both Remix and the Vyper compiler. Remix can be used either as the web-hosted version or cloned from GitHub and deployed locally. I’m using the second option, the version with docker-compose (as in the installation guide in the repository). Vyper can be installed according to the instructions in the documentation; I did it using pip. Now, I can just type vyper-serve to have a locally hosted compiler.
The functions mentioned above are designed to cover the whole process, which could be extended even further: letting potential contractors place bids, requiring an authorized person to finalize the delivery, or adding extra conditions (e.g., monitoring not only the temperature but also the acceleration of the package to detect excessive shocks) – but this is case dependent. You should be able to extend the template we provide here to suit your needs.
It’s quite long, but we have to consider everything here. Just to give you an overview, we have to store the addresses of the contract owner and the contractor, the conditions of the delivery, all the measurements and a measurement counter, the delivery start and end times, the time for which the temperature threshold has been exceeded, booleans stating whether the contract has been voided or has ended, the total penalty for a delayed delivery, and the final payment for the contractor. Additionally, we have three structures: Coordinates (with geographical coordinates describing locations), ContractConditions (with all the constraints and the nominal payment), and Measurement (a time-stamped temperature). There is also one event, LogConditions, which is used to check the conditions of the delivery. Finally, we have a constant array with two addresses, which will be used to identify the authorized personnel.
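To make the storage layout more tangible, here is a rough Python mirror of the three structures described above. The field names and types are our assumptions based on the description, not the actual Vyper code:

```python
from dataclasses import dataclass


@dataclass
class Coordinates:
    # geographical location; fixed-point integers, as floats are avoided on-chain
    latitude: int
    longitude: int


@dataclass
class ContractConditions:
    # constraints and nominal payment agreed between buyer and supplier
    payment: int               # nominal payment (e.g., in wei)
    max_temperature: int       # temperature threshold
    max_exposition_time: int   # seconds the threshold may be exceeded
    delivery_deadline: int     # unix timestamp
    pickup: Coordinates
    dropoff: Coordinates


@dataclass
class Measurement:
    # a time-stamped temperature reading
    timestamp: int
    temperature: int
```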
There are also some details that should be explained. Note that some variables are public, and some are not (i.e., they are private). All public variables have getter functions generated automatically, which means anyone can check their values. It’s important to note that all the variables are initialized to 0/False. Finally, the latest version of Vyper supports the decimal type for floating-point operations. Unfortunately, it is not (yet) widely supported by some intermediate libraries, so we stick to fixed-point computations here.
Now we can move to functions.
The constructor function __init__ is fired at the moment of contract deployment, which means we have to provide all its parameters at deployment time. Note the special variable msg.sender – it holds the address of the account that sent the transaction. Once we have the conditions set, we should give a contractor the possibility to check and accept them. The checking function takes advantage of the event we created earlier and shows the conditions as a log. The function accept_conditions simply assigns the address that triggered it to the contractor role.
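The flow of __init__ and accept_conditions can be sketched in plain Python, with msg.sender simulated as an explicit argument. This only mirrors the logic; it is not the Vyper contract itself:

```python
class DeliveryContractSketch:
    def __init__(self, sender, conditions):
        # in Vyper, msg.sender inside __init__ is the deploying account
        self.owner = sender
        self.conditions = conditions
        self.contractor = None  # unset until someone accepts

    def accept_conditions(self, sender):
        # whoever calls this function becomes the contractor
        self.contractor = sender
```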
Once a contractor accepts our conditions, we are all set and can proceed with the delivery. Once the package is picked up, one of the authorized persons should trigger the start_delivery function.
Note that the sender of the transaction must be in the list of valid approvers (i.e., be authorized) in order to start the delivery. Otherwise, the transaction fails, and the delivery cannot be started. Starting the delivery here is limited to setting the adequate variable to block.timestamp, which contains the block creation time. Note that if the function fails or has not been executed, the delivery_start variable is 0.
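A minimal sketch of this authorization check, with placeholder approver addresses standing in for the constant array mentioned earlier:

```python
# placeholder addresses; the real contract keeps these in a constant array
APPROVERS = ["0xApprover1", "0xApprover2"]


def start_delivery(state, sender, block_timestamp):
    # only an authorized approver may start the delivery;
    # in Vyper this assert would revert the whole transaction
    assert sender in APPROVERS, "sender is not authorized"
    assert state.get("delivery_start", 0) == 0, "delivery already started"
    # block.timestamp in Vyper: the creation time of the current block
    state["delivery_start"] = block_timestamp
    return state
```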
Alright, the delivery has started, what’s next? The cherry on the cake – condition monitoring. The code with the functions used here is listed below:
Here we have three functions, and two of them have the decorator @internal (and, so far, all the functions were marked as @external). Why’s that? Internal functions can be called only from another function inside the same contract, while external ones can be called by anyone. So let’s get through them. The first one is a simple utility function returning time difference between two timestamps and is used to increase the readability of the code. The second function is used to check whether the contract should be voided. It calculates the time for which the temperature threshold has been exceeded. If that time is longer than the maximal allowable exposition period, the contract is voided. The status of the variable denoting whether the delivery has ended is set to true. Note that the final payment is not changed here and is set to 0, which means that the contractor gets nothing for such delivery, as the delivered goods should be thrown away.
But to decide whether the conditions were violated, we need some measurements, and that’s why we have the third function here: store_measurements. First, we have three assert statements to ensure that the contract has not ended yet, that the delivery has started, and that the measurements were collected after the delivery began. Then, subsequent measurements are added to the measurement storage we declared earlier. At the end of each execution, the measurement counter is incremented, and the contract voiding conditions are checked.
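The monitoring logic can be sketched in Python as follows. How exactly the contract accumulates the time above the threshold is our interpretation of the description, so treat the details as assumptions:

```python
def timedelta(t1, t2):
    # utility: time difference between two timestamps
    return t2 - t1


class MonitoringSketch:
    def __init__(self, max_temp, max_exposition, delivery_start):
        self.max_temp = max_temp              # temperature threshold
        self.max_exposition = max_exposition  # allowed seconds above threshold
        self.delivery_start = delivery_start
        self.measurements = []                # (timestamp, temperature) pairs
        self.exceeded_time = 0
        self.voided = False
        self.ended = False
        self.final_payment = 0

    def _check_voiding(self):
        # accumulate the time spent above the threshold between readings
        self.exceeded_time = 0
        for (t1, temp1), (t2, _) in zip(self.measurements, self.measurements[1:]):
            if temp1 > self.max_temp:
                self.exceeded_time += timedelta(t1, t2)
        if self.exceeded_time > self.max_exposition:
            # void the contract: delivery ends, contractor gets nothing
            self.voided = True
            self.ended = True

    def store_measurements(self, timestamp, temperature):
        assert not self.ended, "contract has ended"
        assert self.delivery_start > 0, "delivery has not started"
        assert timestamp > self.delivery_start, "measurement predates delivery"
        self.measurements.append((timestamp, temperature))
        self._check_voiding()
```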
Assuming that the conditions of the delivery were as they should be, we need functions to finalize the delivery (saying that the package arrived at the destination point) and calculate the final payment for the contractor.
Again, we have one internal and one external function. The internal one calculates the total penalty for a delayed delivery. If the package arrives on time, the penalty is zero, and the final payment equals the nominal payment included in the delivery conditions. If the delivery is delayed, the penalty is proportional to the full hours of delay, and the final payment is decreased accordingly (though never below zero). The external function is called once the package is delivered; it also checks that the contract has not been voided (just in case). Here we could additionally verify whether the sender is authorized to confirm the delivery.
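The payment rule just described boils down to a few lines. The per-hour penalty rate is modeled here as an explicit argument; how it is stored in the real contract is an assumption:

```python
HOUR = 3600  # seconds in one hour


def calculate_penalty(deadline, delivery_end, penalty_per_hour):
    # the penalty grows with each full hour of delay past the deadline
    if delivery_end <= deadline:
        return 0
    full_hours = (delivery_end - deadline) // HOUR
    return full_hours * penalty_per_hour


def final_payment(nominal_payment, penalty):
    # the payment is decreased by the penalty, but never drops below zero
    return max(nominal_payment - penalty, 0)
```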
That’s it. We have everything we need to set up, monitor, and control the delivery. To make it shine, however, we also need the hardware layer, which will be responsible for both signing the transactions using hardware signature and measuring the temperature.
Note also that every operation on an Ethereum-based network costs gas. We are aware of it, and you should be too. Nevertheless, this shows the advantage of private networks: Ether here has no real value, so we can spend it freely without losing actual money, as our network is configured to have free gas. But let’s not forget that the calculations have to be executed somewhere, so computational complexity is another limitation to keep in mind.
On the left-hand side, under “Deployed contracts”, you’ll see all the external functions and public variables from the contract. By clicking them, you can either call functions or check the values of the variables. Here, you have a sandbox environment for interacting with your smart contract to fine-tune what you need.
Let’s get to the hardware. First of all, we’ll need a smart card that will keep our private keys safe and an NFC reader. As the smart card we’ll use Blockchain Security 2 Go, and uTrust 4701 F as the NFC reader. The card comes with a Python library, which we have used to develop an Ethereum transaction signer, available on GitHub. We’ll use it as a module in our Python script. First, we need to install the dependencies, blocksec2go and ecdsa, using pip. Note that we’ll be building our application on top of what we had in the first part of this article, so I assume that you already have the web3 and python-dotenv packages.
After installing the blocksec2go-ethereum module, you should be able to use your card to sign transactions and send them through web3 to the blockchain. You can try it out using one of the examples from the repository. The problem is that our Hyperledger Besu network implements a PoA consensus, requiring the extraData field to be at most 32 bytes, which is not supported natively by the blocksec2go module. Therefore, on top of it, we have to write our own code encoding the signed transactions, which (saved as security2go.py) you can find here:
To supplement the above, we also need a customized web3 utils module (named web3_utils.py), which you can find here:
As the temperature sensor, we’ll use the AltIMU-10 v5 module (which, in fact, is an IMU module that happens to have a temperature sensor), but any other temperature sensor will be fine. We’ll use the I2C interface to read the temperature, so in order to get it working, we have to enable this interface on the Raspberry Pi first. To do it, we can run the raspi-config command and, in the “interfaces” menu, enable the I2C interface. We’ll also need i2c-tools to see if everything is connected properly, which can be installed with sudo apt install i2c-tools. Once you have it, type i2cdetect -y 1 (if you connected to the I2C interface labeled 1; otherwise substitute “1” accordingly), and you should see the addresses of the I2C-enabled devices.
In our case we have three devices on the AltIMU module: LSM6DS33 (accelerometer and gyroscope) with address 0x6b (1101011b), LIS3MDL (magnetometer) with address 0x1e (0011110b), and LPS25H (barometer) with address 0x5d (1011101b). We’ll use only the first one, but, as you can see, the possibilities are broader than just temperature measurements.
To use I2C from the Python level, we’ll use the pigpio library, which can be installed using sudo apt install pigpio python-pigpio python3-pigpio. Here’s the complete code with a class for managing the AltIMU module:
In the constructor, we have to open the I2C port first. Then, we send the appropriate configuration to the module through the configure function. Finally, we have the function for reading the temperature. The module stores the temperature in two bytes, and the value should be interpreted as two’s complement, so we have to process it accordingly. We then convert it to degrees Celsius by dividing the value by 16 and adding an offset of 25 (taken from the documentation).
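The conversion just described can be isolated as a pure function, with the pigpio reads shown separately. The register addresses 0x20/0x21 are our reading of the LSM6DS33 datasheet, so verify them against your sensor’s documentation:

```python
def raw_to_celsius(lo, hi):
    # combine the two bytes (low byte first) and interpret as two's complement
    raw = (hi << 8) | lo
    if raw >= 0x8000:
        raw -= 0x10000
    # 16 LSB per degree, with 0 corresponding to 25 °C (per the datasheet)
    return raw / 16.0 + 25.0


def read_temperature(pi, handle):
    # pigpio reads; `handle` would come from pi.i2c_open(1, 0x6b)
    lo = pi.i2c_read_byte_data(handle, 0x20)  # OUT_TEMP_L (assumed register)
    hi = pi.i2c_read_byte_data(handle, 0x21)  # OUT_TEMP_H (assumed register)
    return raw_to_celsius(lo, hi)
```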
Remember that you have to have the pigpio daemon running in the background. It can be started by typing sudo pigpiod in the terminal.
Here is what the hardware setup looks like:
Now we have to put all the parts together and see how it works. We’ll extend our API from part I with the contract manager and AltIMU classes. You can find the complete code below:
Note that we have added a new field to the .env file with the environment variables: CONTRACT_ADDRESS, the address at which the contract is deployed. We also have to create a new file, abi.json, containing the Application Binary Interface (ABI) of our contract (which we can copy from Remix).
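Loading these two pieces of configuration can be sketched with the standard library alone. The helper name is ours, and the web3 call in the comment only indicates where they would be used:

```python
import json
import os


def load_contract_config(abi_path="abi.json"):
    # CONTRACT_ADDRESS comes from the .env file (loaded e.g. via python-dotenv)
    address = os.environ["CONTRACT_ADDRESS"]
    with open(abi_path) as f:
        abi = json.load(f)  # the ABI copied from Remix
    # with web3: contract = w3.eth.contract(address=address, abi=abi)
    return address, abi
```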
Once the contract is deployed on our Besu network, the program can be started. First, it waits until a contractor accepts the delivery conditions. When that happens, it starts the delivery (requiring the use of the smart card) and continuously sends subsequent temperature measurements to the contract. Once the contract is finished (which happens outside the program), we will no longer be able to send measurements, and the final payment will be shown along with the information that the delivery has ended. This lets us close the I2C connection with the sensing module.
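The control flow just described can be sketched with the contract and sensor behind simple interfaces. Here `contract` and `sensor` are stand-ins for the web3 contract wrapper and the AltIMU class, not the actual code:

```python
import time


def oracle_loop(contract, sensor, poll_interval=0):
    # wait until a contractor accepts the delivery conditions
    while contract.contractor() is None:
        time.sleep(poll_interval)
    contract.start_delivery()  # would require the smart card in the real setup
    # push temperature readings until the contract ends (voided or finalized)
    while not contract.has_ended():
        contract.store_measurements(int(time.time()), sensor.read_temperature())
        time.sleep(poll_interval)
    sensor.close()  # release the I2C handle once the delivery has ended
    return contract.final_payment()
```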
Right now, all the measurements are configured to be signed by the smart card, but you can easily change this by substituting the transaction signer with the one from part I. However, since we are using hardware-signed transactions to notarize the measurement data, the Raspberry Pi is effectively configured as a data oracle.
We have shown a full stack that can be useful when integrating enterprise blockchain solutions based on Hyperledger Besu with IoT devices. To add some flavor to our development, we have also shown a smart contract designed and developed for a specific use case, into which we’ve injected measurements from a temperature sensor. We have also included a smart card for signing transactions, to be sure that only authorized persons can trigger certain functionality of the smart contract. Nevertheless, this was a very concrete use case and only the tip of the iceberg. There’s much more, and the possibilities are endless. We hope that this tutorial will inspire you to build your own blockchain application.
If you have any questions, feel free to contact us!
Krzysztof Radecki, CEO at rexs.io
Marek Tatara, R&D Projects Coordinator at DAC.digital