Smart Contracts Using Ethereum for the Impatient: Part I


Ethereum is to date the most advanced blockchain platform ever created. In this blog series, I will explain how to deploy your own private Ethereum network and write smart contracts on top of it. This series is for the impatient who already know the basics of blockchain and Ethereum and want to quickly try writing smart contracts. In this first part, we will learn how to set up a private Ethereum network,
while in the second part we will see how to write, deploy, and publish a smart contract on that network with an example.

Note: This blog assumes that you have a basic understanding of blockchains, cryptocurrencies, and smart contracts. If you don’t, I recommend at least watching a few videos on Ethereum and smart contracts on YouTube first.

What is Ethereum?

Ethereum can refer both to a framework for deploying blockchains and to the network created using that framework. The Ethereum blockchain is capable of running the code of arbitrary decentralized applications (also known as dApps). Similar to Bitcoin, miners in the Ethereum network work to earn Ether (the token that fuels the network). Ethereum comes with a Turing-complete execution environment, the Ethereum Virtual Machine (EVM), which is used to execute smart contracts.

Setting up private ethereum network:

In this section we will learn how to set up your own Ethereum network. A private Ethereum network is a brand-new blockchain and should not be confused with the Ethereum main network. To set up a private Ethereum network you need to perform the following steps:

  1. Installing the Geth Ethereum client

    sudo apt-get install software-properties-common
    sudo add-apt-repository -y ppa:ethereum/ethereum
    sudo apt-get update
    sudo apt-get install ethereum

  2. Writing the Genesis block

    A genesis block is the first block of a blockchain. In order to deploy a private blockchain you need to define the configuration for its genesis block. The following is a sample genesis block:

{
  "nonce": "0x0000000000000042",
  "mixhash": "0x0000000000000000000000000000000000000000000000000000000000000000",
  "difficulty": "0x400",
  "alloc": {},
  "coinbase": "0x0000000000000000000000000000000000000000",
  "timestamp": "0x00",
  "parentHash": "0x0000000000000000000000000000000000000000000000000000000000000000",
  "extraData": "0x",
  "gasLimit": "0xffffffff",
  "config": {
    "chainId": 59,
    "homesteadBlock": 0,
    "eip155Block": 0,
    "eip158Block": 0
  }
}


  • mixhash

A 256-bit hash which, combined with the nonce, proves that a sufficient amount of computation has been carried out on this block: the proof-of-work (PoW).

  • nonce

A 64-bit value which, combined with the mixhash, proves that a sufficient amount of computation has been carried out on this block. (This block-header nonce should not be confused with an account nonce, which counts the transactions sent from an address.)

The nonce is the cryptographically secure mining proof-of-work that proves beyond reasonable doubt that a particular amount of computation has been expended in the determination of this token value. (Yellow Paper, 11.5. Mining Proof-of-Work.)

  • difficulty

A scalar value corresponding to the difficulty level applied during the nonce discovery for this block. It defines the mining target, which can be calculated from the previous block’s difficulty level and the timestamp. The higher the difficulty, the more calculations a miner must statistically perform to discover a valid block. This value is used to control the block generation time of a blockchain, keeping the block generation frequency within a target range. On a test network, we keep this value low so we don’t have to wait during tests, since the discovery of a valid block is required to execute a transaction on the blockchain.

  • alloc

Allows you to define a list of pre-funded wallets. This is an Ethereum-specific feature that was used to handle the “Ether pre-sale” period. Since we can mine local Ether quickly, we don’t use this option here.

  • coinbase

The 160-bit address to which all rewards (in Ether) collected from the successful mining of this block have been transferred. It is the sum of the mining reward itself and the contract transaction execution refunds.

  • timestamp

A scalar value equal to the reasonable output of Unix’s time() function at this block’s inception.

  • parentHash

The Keccak 256-bit hash of the entire parent block’s header (including its nonce and mixhash). Pointer to the parent block, thus effectively building the chain of blocks. In the case of the Genesis block, and only in this case, it’s 0.

  • extraData

An optional free space, at most 32 bytes long, to conserve smart things for eternity on the blockchain.

  • gasLimit

A scalar value equal to the current chain-wide limit of gas expenditure per block. It is high in our case to avoid being limited by this threshold during tests. Note: this does not mean we should ignore the gas consumption of our contracts.
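The relation between the difficulty field above and mining can be made concrete. Under Ethash-style proof-of-work, a candidate block is valid when its header hash, read as a 256-bit integer, is at or below a target derived from the difficulty. The following is a simplified sketch of that relation, not geth's actual implementation:

```python
# Illustrative relation between difficulty and the mining target under
# Ethash-style proof-of-work: target = 2**256 // difficulty. A block
# header hash (interpreted as a 256-bit integer) is valid when it is
# at or below the target.

def mining_target(difficulty):
    """Return the Ethash-style target for a given difficulty."""
    return 2**256 // difficulty

# The sample genesis above uses difficulty 0x400 (1024), so roughly
# 1 in 1024 candidate hashes yields a valid block.
genesis_difficulty = 0x400
target = mining_target(genesis_difficulty)
print(hex(target))
```

The higher the difficulty, the smaller the target, and the more hashes a miner must try on average before finding a valid block.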

Detailed information about the genesis block fields can be found in the Ethereum Yellow Paper.
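A malformed genesis file is a common source of geth init failures, so it can save time to sanity-check the file first. The following is a small stand-alone Python script for that purpose; it is a hypothetical helper, not part of geth, and the field list mirrors the sample genesis above:

```python
import json

# Required top-level genesis fields (mirroring the sample above) and
# the subset whose values must be 0x-prefixed hex strings.
REQUIRED = ["nonce", "mixhash", "difficulty", "alloc", "coinbase",
            "timestamp", "parentHash", "extraData", "gasLimit", "config"]
HEX_FIELDS = ["nonce", "mixhash", "difficulty", "coinbase",
              "timestamp", "parentHash", "gasLimit"]

def check_genesis(genesis):
    """Return a list of problems found; an empty list means it looks sane."""
    problems = ["missing field: %s" % f for f in REQUIRED if f not in genesis]
    for f in HEX_FIELDS:
        value = genesis.get(f, "0x0")
        if not value.startswith("0x"):
            problems.append("%s should be 0x-prefixed" % f)
        else:
            try:
                int(value, 16)
            except ValueError:
                problems.append("%s is not valid hex" % f)
    return problems

# Validate an in-memory copy of the sample genesis; in practice you
# would load custom_genesis.json from disk with json.load().
sample = json.loads("""{
  "nonce": "0x0000000000000042",
  "mixhash": "0x0000000000000000000000000000000000000000000000000000000000000000",
  "difficulty": "0x400",
  "alloc": {},
  "coinbase": "0x0000000000000000000000000000000000000000",
  "timestamp": "0x00",
  "parentHash": "0x0000000000000000000000000000000000000000000000000000000000000000",
  "extraData": "0x",
  "gasLimit": "0xffffffff",
  "config": {"chainId": 59, "homesteadBlock": 0, "eip155Block": 0, "eip158Block": 0}
}""")
print(check_genesis(sample))  # -> []
```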

3. Starting an Ethereum (miner) node

To start an Ethereum node, you first need to initialize the blockchain using the genesis block, as follows:

geth --networkid 300 --identity node1 --verbosity 3 --nodiscover --nat none --datadir=./privatenet init ./custom_genesis.json

More information about the options used in the above command can be found in the geth command-line documentation.

The above command initializes the blockchain using the genesis block specified in custom_genesis.json and uses privatenet as the data directory to store blockchain data and the keystore. Once the blockchain has been initialized, you need to create an etherbase, which is simply the default account to which mining rewards are credited. The etherbase account can be created using the command below:

geth --networkid 300 --identity node1 --verbosity 3 --nodiscover --nat none --datadir=./privatenet account new

Now you can start your ethereum miner using below command:

geth --networkid 300 --identity node1 --verbosity 3 --nodiscover --nat none --datadir=./privatenet --mine --ipcpath ./privatenet/geth.ipc

Congratulations, you have successfully deployed your own private Ethereum network. In the next part, we will see how to interact with this private network using the Mist browser.




First Steps with Apache Zeppelin


In this post I will walk you through Apache Zeppelin and explain how it works, along with a sample analysis of CPSE (Central Public Sector Enterprises) data. Apache Zeppelin is an open-source, web-based notebook that enables fast data analytics. It provides a data processing and analytics environment built on the concept of interpreters. It can be used as a multi-purpose notebook that covers all the needs of data analytics, from data ingestion to data visualization and collaboration. It has many built-in interpreters, such as Scala, Python, Apache Spark, Spark SQL, Hive, Markdown, and Shell. It provides a single view of data across a diverse set of data stores, from Hive to streaming data, and can be used for exploratory data analytics.


Apache Zeppelin Architecture

Apache Zeppelin has the following components:

  1. Client: the web-based notebook UI that runs in the browser, which makes it easy for novice users to learn Zeppelin quickly
  2. Server: the Zeppelin server is a web service with which the client communicates over a REST API; it in turn sends processing tasks to the different interpreters
  3. Interpreter: interpreters are the actual data processing engines, such as Apache Spark, Hive, or Cassandra

Using Apache Zeppelin:

To understand how to use Zeppelin, we will take the CPSE dataset as an example.

The following steps need to be performed to create simple analytical reports:

  •  To start Zeppelin, run its start script, which brings up the Zeppelin server.
  •  The above step starts the Zeppelin server and brings up the UI on port 8080. The following figure shows the landing page of Zeppelin:


  • Now let’s move ahead and create a new notebook by clicking “Create new notebook” under the Notebook tab. This will bring up a new, empty notebook as shown below:


  • To start with, we will use the “md” (Markdown) interpreter to create a heading for the first paragraph, as follows:

%md ###First Analytical Chart

  • Now let’s crunch some data using the Apache Spark interpreter and register a temporary table that can then be queried with Spark SQL. The code for this step is shown below.


  • Once the temporary table is created, we can build charts by simply querying it with Spark SQL. So in the next paragraph, switch to the Spark SQL interpreter by typing the “%sql” directive, and let’s run an analytical query relating each enterprise’s paid-up capital to its profit/loss and net worth.


  • Once you run this paragraph, you will see the following:
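Since the Spark code in the screenshots does not reproduce well in text, here is a rough stand-alone sketch of the same idea in plain Python: load a few CPSE-style rows and relate paid-up capital to profit/loss, much as the %sql paragraph would. The column names and figures below are invented for illustration; the real dataset has many more columns.

```python
import csv
import io

# Hypothetical CPSE-style rows: enterprise, paid-up capital, profit/loss.
DATA = """enterprise,paid_up_capital,profit_loss
Alpha Power,1200,340
Beta Coal,800,-120
Gamma Rail,1500,610
"""

rows = list(csv.DictReader(io.StringIO(DATA)))

# Equivalent in spirit to a %sql paragraph such as:
#   SELECT enterprise, paid_up_capital, profit_loss FROM cpse
#   WHERE profit_loss > 0 ORDER BY profit_loss DESC
profitable = sorted(
    (r for r in rows if int(r["profit_loss"]) > 0),
    key=lambda r: int(r["profit_loss"]),
    reverse=True,
)
print([r["enterprise"] for r in profitable])  # -> ['Gamma Rail', 'Alpha Power']
```

In Zeppelin, the result of the equivalent %sql paragraph can be rendered directly as a bar or pie chart from the notebook UI.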



Leveraging the power of Twitter for Internet of Things (IoT)

Today we are witnessing the era of the Internet of Things (IoT) and how it is changing people’s lives. However, there are very few success stories so far, and people are still exploring which platform to use for IoT. In this post we will discuss the challenges with the currently available platforms and how Twitter can be used as a general platform for quickly developing Internet of Things applications. We will also discuss how this platform could be used in the future to establish social networks between the devices connected to the Internet of Things.

Introduction to Internet of Things:

The Internet of Things (IoT) is the network of physical objects or things embedded with electronics, software, sensors, and network connectivity, which enables these objects to collect and exchange data [1]. It goes beyond legacy M2M (machine-to-machine) communication in the sense that it covers different protocols, systems, devices, and so on. According to Gartner, Inc. (a technology research and advisory corporation), there will be nearly 26 billion devices on the Internet of Things by 2020 [2]. Because of this, most IoT vendors are trying to create a generic platform to handle these different needs.

Challenges Involved in IoT Platform:

A.    Interoperability Standards:

There are a number of open and proprietary IoT solutions available, which creates a great deal of confusion over which platform to choose and which will fit the requirements. An IoT platform should be interoperable and use standard protocols, making it easy for devices from different vendors to communicate.

B.    Data Rate:

Data rate plays a crucial role in IoT and is one of the major factors to consider while implementing an IoT platform. Some devices connected in an IoT setting continuously send data at a very high rate, while others only send data on certain events. An IoT platform should be capable of handling both kinds of data rates.

C.    Security

Data security and access control are very important in IoT, as interconnected devices make decisions based on the messages communicated between them. Adding security features to wireless systems requires more overhead in each packet sent, and it also means adding components to the electronics. An IoT platform should have a good authentication mechanism to prevent unauthenticated devices or users from compromising the system, while keeping the overhead on the electronics low. The level of security usually depends upon the application, which may require anything from maintaining consumer privacy to limiting cyber attacks against utilities.

D.    Complexity:

As IoT architectures are complex, the platform should be implemented in such a way that it is simple not only for developers but also for end users and analysts.

Twitter as IoT Platform:

Twitter is the most popular micro-blogging site, where people post short messages of 140 characters called tweets. It has millions of active users who post millions of tweets daily. It allows not only users but also applications to post messages on a user’s behalf using the Twitter API. If these APIs are used as the channel for sensors to communicate, deploying an IoT application becomes very easy and fast: it removes all the complexity involved in building and maintaining an entirely new IoT platform, and the overhead of handling high data volumes and data rates is taken care of by Twitter. Twitter would help the devices in one IoT network communicate not only with each other but also with devices in other networks, as well as with humans, which further increases the power of the entire IoT. The devices involved would make decisions based not only on sensor readings but also on tweets from people as sources of information. This solution also addresses the interoperability problem, as it provides a common platform for all.
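One practical constraint worth noting: since tweets are limited to 140 characters, any sensor payload has to fit within that limit. A small hypothetical helper (the sensor_id and reading fields are invented for illustration) can format and truncate readings before posting:

```python
TWEET_LIMIT = 140  # the tweet character limit mentioned above

def to_tweet(sensor_id, reading):
    """Format a sensor reading as a tweet, truncating if necessary.
    sensor_id and reading are hypothetical fields, for illustration."""
    message = "%s: %s" % (sensor_id, reading)
    return message[:TWEET_LIMIT]

print(to_tweet("P8_14", "HIGH"))  # -> P8_14: HIGH
```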

Sample Prototype:

To test whether this will work, I have created a small IoT application using a BeagleBone. This application continuously monitors a sensor on port “P8_14” and tweets if it detects a HIGH on that port. On the receiver side there is another device listening to the Twitter account to which the sensor tweets, and it prints that tweet. Following is the setup of this system:

In order for the BeagleBone to connect to the internet, I shared my laptop’s internet connection with it (for more information on how to set this up, please refer to this link <TODO:Add github link>). After setting this up, let’s write some code in Python:

Code at sensor:

import time

import Adafruit_BBIO.GPIO as GPIO
from twython import Twython

GPIO.setup("P8_14", GPIO.IN)

USER_KEY = "User key for twitter app"
USER_SECRET = "User secret for twitter app"
ACCESS_TOKEN = "Access token"
ACCESS_TOKEN_SEC = "Access token secret"

client_args = {
    'verify': False
}
twitter = Twython(USER_KEY, USER_SECRET, ACCESS_TOKEN, ACCESS_TOKEN_SEC,
                  client_args=client_args)

while True:
    if GPIO.input("P8_14"):
        twitter.update_status(status="Detected Sensor output")
    time.sleep(1)  # poll roughly once a second to avoid duplicate tweets


Code at Receiver End:

In order for the receiver to continuously listen to Twitter, we need to use the Twitter streaming API. For this, we first create a subclass of “TwythonStreamer” as follows:

from twython import TwythonStreamer

class IoTStreamReader(TwythonStreamer):
    def on_success(self, data):
        print "Received Tweet from Sensor:"
        print data

    def on_error(self, status_code, data):
        print status_code
        self.disconnect()

Now initialize IoTStreamReader and track the sensor’s keyword in the stream as follows:

USER_KEY = "User key for twitter app"
USER_SECRET = "User secret for twitter app"
ACCESS_TOKEN = "Access token"
ACCESS_TOKEN_SEC = "Access token secret"

client_args = {
    'verify': False
}
stream = IoTStreamReader(USER_KEY, USER_SECRET,
                         ACCESS_TOKEN, ACCESS_TOKEN_SEC,
                         client_args=client_args)
stream.statuses.filter(track="Detected Sensor output")
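To see what on_success works with, here is an offline sketch that handles a fabricated tweet payload of the same shape the streaming API delivers (real tweets carry many more fields, and the account name below is hypothetical):

```python
# Offline illustration of handling a tweet payload like the one the
# streaming API would deliver to on_success. The payload is fabricated.
sample_tweet = {
    "text": "Detected Sensor output",
    "user": {"screen_name": "iot_sensor_demo"},  # hypothetical account
}

def handle_tweet(data):
    """Mirror of IoTStreamReader.on_success, returning instead of printing."""
    return "Received Tweet from Sensor: %s" % data["text"]

print(handle_tweet(sample_tweet))  # -> Received Tweet from Sensor: Detected Sensor output
```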

This code is committed to a GitHub repository.

Limitations of this solution:

This platform will not work for use cases where the data is sensitive, as data posted on Twitter is available to everyone.

Also, we have to rely entirely on Twitter to deliver the data.


In this post we have seen how Twitter can be used in IoT, and how it can help not only to connect devices in an IoT network but also to create a social network of humans and sensors.