Sony Arouje

a programmer's log

Expo react-native development in Docker


I spend most of my free time learning to develop applications on different platforms. Recently I have been spending time with Expo, a platform to build react-native apps. Expo is a pretty good platform to kick start your react-native development. One difficulty I always face is upgrading versions; for e.g., sometimes Expo releases multiple updates in a month. When upgrading on my Windows machine there could be issues, either a file lock or something else. These installation issues lead to frustration and firefighting to return to a working state. Recently my friend Sendhil told me how he uses VS Code to develop remotely using containers, and I decided to take a look at it.

I had kept myself away from docker for some time, but decided to try it again. It took me a few minutes to get up and running with a docker image maintained by node. The next step was to install expo-cli and the other dependencies to run my Expo test application. I had to overcome several errors that popped up when running Expo code in a container, and spent hours reading forums and posts to resolve them one by one. Here is the Dockerfile I came up with, which can be used to develop any Expo-based application.

The below workflow holds good for any kind of node, react, react-native, etc. development.

Dockerfile

FROM node:10.16-buster-slim
LABEL version=1.0.0

ENV USERNAME dev
RUN useradd -rm -d /home/dev -s /bin/bash -g root -G sudo -u 1005 ${USERNAME}

EXPOSE 19000
EXPOSE 19001
EXPOSE 19002

RUN apt update && apt install -y \
    git \
    procps

#used by the react native builder to set the ip address; otherwise it
#will use the ip address of the docker container.
ENV REACT_NATIVE_PACKAGER_HOSTNAME="10.0.0.2"

COPY *.sh /
RUN chmod +x /entrypoint.sh \
    && chmod +x /get-source.sh

#https://github.com/nodejs/docker-node/issues/479#issuecomment-319446283
#global npm packages should not be installed as root, so a new user
#is created and used here
USER $USERNAME

#set the npm global location for dev user
ENV NPM_CONFIG_PREFIX="/home/$USERNAME/.npm-global"

RUN mkdir -p ~/src \
    && mkdir ~/.npm-global \
    && npm install expo-cli --global

#append .npm-global to PATH, otherwise globally installed packages
#will not be available in bash
ENV PATH="/home/$USERNAME/.npm-global:/home/$USERNAME/.npm-global/bin:${PATH}"

ENTRYPOINT ["/entrypoint.sh"]
CMD ["--gitRepo","NOTSET","--pat","NOTSET"]
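The entrypoint and get-source scripts live in the git repo linked at the end of this post. To give an idea of what the entrypoint does, here is a minimal sketch, assuming its job is to parse the --gitRepo and --pat arguments, clone the repo, and keep the container alive; the actual scripts in the repo may differ.

#!/bin/bash
#entrypoint.sh - a minimal sketch, not the exact script from the repo
while [[ $# -gt 0 ]]; do
  case "$1" in
    --gitRepo) GIT_REPO="$2"; shift 2 ;;
    --pat)     PAT="$2";      shift 2 ;;
    *)         shift ;;
  esac
done

if [ "$GIT_REPO" != "NOTSET" ]; then
  #embed the PAT in the clone url to authenticate
  git clone "https://${PAT}@${GIT_REPO}" ~/src/"$(basename "$GIT_REPO")"
fi

#keep the container alive so VS Code can attach to it
sleep infinity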

VS Code to develop inside a container

To enable VS Code to develop inside a container, we need to install the Remote Development extension pack. Here is a more detailed write-up from MS.

To enable remote development we need two more files in our source folder.

  • docker-compose.yml
  • devcontainer.json

docker-compose.yml

version: '3.7'

services:
  testexpo:
    environment:
      - REACT_NATIVE_PACKAGER_HOSTNAME=10.0.0.2
    image: sonyarouje/expo-buster:latest
    extra_hosts:
      - "devserver:10.0.0.2"
    command: "--gitRepo sarouje.visualstudio.com/_git/expotest --pat z66cu5tlfasa7mbiqwrjpskia"
    expose:
      - "19000"
      - "19001"
      - "19002"
    ports:
      - "19000:19000"
      - "19001:19001"
      - "19002:19002"
    volumes:
      - myexpo:/home/dev/src
volumes:
  myexpo:

  • REACT_NATIVE_PACKAGER_HOSTNAME: tells the react-native bundler to expose itself on the configured IP; otherwise it uses the docker container’s IP, which is not reachable from your phone.
  • command: specify your git repo and the PAT (personal access token). When you run docker-compose up, the container uses these details to clone your repo into its /home/dev/src directory.
  • volumes: containers are short lived, and removing a container loses your data. For e.g., once the container is up we might install npm packages; if they don’t persist, we have to reinstall them every time the container is recreated. To persist the packages and changes, docker-compose creates a named volume and keeps the files of /home/dev/src in it, so they remain accessible even after a docker restart.

Keep in mind ‘docker-compose down -v’ will remove the volume, and then we need to reinstall all the packages again.
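In day-to-day use the distinction looks like this (standard docker-compose commands):

docker-compose up -d     #create and start the container in the background
docker-compose stop      #stop it; the myexpo volume stays intact
docker-compose down      #remove containers and networks; named volumes survive
docker-compose down -v   #also removes the myexpo volume, so packages are gone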

devcontainer.json

Create a new folder named .devcontainer and inside the new folder create a file named devcontainer.json. Below is the structure of the file.

{
    "name": "expo-test",
    "dockerComposeFile": "../docker-compose.yml",
    "service": "testexpo",
    "workspaceFolder": "/home/dev/src/expotest",
    "extensions": [
        "esbenp.prettier-vscode"
    ],
    "shutdownAction": "stopCompose"
}
  • dockerComposeFile: tells VS Code where to find the docker-compose.yml file.
  • service: the service configured in the docker-compose.yml file (testexpo above).
  • workspaceFolder: once VS Code attaches to the container, it opens this workspace folder.
  • extensions: the extensions to install in the VS Code instance running inside the container.

Workflow

  • Download the latest version of docker.
  • Open powershell/command prompt and run ‘docker pull sonyarouje/expo-buster’.
  • Open your source folder and create docker-compose.yml and .devcontainer/devcontainer.json.
  • Modify docker-compose.yml and provide the git repo, PAT, etc.
  • Open VS Code in the source folder. VS Code will prompt to Reopen in Container; click the Reopen in Container button. Wait for some time and VS Code will launch from the container.
  • Once launched in the container, all your code changes live only in the container. Make sure to push your changes to git before removing the container or its volume.

Advantages of containerized approach

We can spawn a new container with ease and test our code against any new version of the libraries we use, without putting our dev machine at risk. On any break or compilation issue we can destroy that container, go back to the dev container, and proceed with our development; there is no need to restore our dev machine to a working state. If the upgrade succeeds, we can destroy the current dev container and use the new container for development. No more hacking around in our current working setup.

Where is the source?

All the dockerfiles and scripts are pushed to git. Feel free to fork it or send me a pull request in case of any changes. I created two versions of the Dockerfile, one for alpine and one for buster. As of now the stable VS Code release won’t support alpine, but you can always switch to the VS Code Insiders build to use alpine.

The Docker image is published to docker hub and can be pulled using sonyarouje/expo-buster or sonyarouje/expo-buster:3.0.6, where 3.0.6 is the version of expo-cli.


      Written by Sony Arouje

      August 2, 2019 at 7:18 pm

      Posted in .NET

      React-Native library for Azure AD B2C


      The last couple of days I was spending my free time learning Azure Functions and the authentication of those functions. I was concentrating on Azure Active Directory B2C as my authentication provider. Maybe I will write another post detailing how to set up Azure AD B2C and configure Azure Functions.

      I was able to create access tokens and access my functions via postman. Next I wanted to create a react-native mobile app and access the same function from it. I searched for libraries to enable AD B2C login; unfortunately I couldn’t find any. I tried MSAL js, but it will not work with react-native, as MSAL needs localStorage or sessionStorage and is therefore not suitable for the react-native world.

      Fortunately I found a library which does Azure AD login. The flow for Azure AD and AD B2C is almost the same, so I decided to take the Azure AD library and add Azure AD B2C functionality. I trimmed down the Azure AD library and removed the option to store the tokens; caching of the tokens should be handled by the caller. This library performs the login flow and returns the tokens, and it can also refresh the access_token using the refresh-token flow.

      Let’s see how to use this library.

      import React from "react";
      import B2CAuthentication from "../auth-ad-js/ReactNativeADB2C";
      import LoginView from "../auth-ad-js/LoginView";

      const CLIENT_ID = "<provide your client id>";

      export default class LoginScreen extends React.Component {
        static navigationOptions = {
          title: "Login"
        };

        render() {
          const b2cLogin = new B2CAuthentication({
            client_id: CLIENT_ID,
            client_secret: "<key set in application/key>",
            user_flow_policy: "B2C_1_signupsignin",
            token_uri: "https://saroujetmp.b2clogin.com/saroujetmp.onmicrosoft.com/oauth2/v2.0/token",
            authority_host: "https://saroujetmp.b2clogin.com/saroujetmp.onmicrosoft.com/oauth2/v2.0/authorize",
            redirect_uri: "https://functionapp120190131041619.azurewebsites.net/.auth/login/aad/callback",
            prompt: "login",
            scope: ["https://saroujetmp.onmicrosoft.com/api/offline_access", "offline_access"]
          });

          return (
            <LoginView
              context={b2cLogin}
              onSuccess={this.onLoginSuccess.bind(this)}
            />
          );
        }

        onLoginSuccess(credentials) {
          console.log("onLoginSuccess");
          console.log(credentials);
          // use credentials.access_token..
        }
      }

      The parameters passed to B2CAuthentication are the values I configured in Azure AD B2C. If the login succeeds, the onLoginSuccess callback function will receive the tokens.
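      Since caching is left to the caller, one simple option in react-native is to persist the credentials with AsyncStorage after onLoginSuccess fires. Below is a minimal sketch of my own; the storage key and helper names are not part of the library.

      import { AsyncStorage } from "react-native";

      const CRED_KEY = "b2c_credentials"; // hypothetical storage key

      // persist the tokens received in onLoginSuccess
      async function cacheCredentials(credentials) {
        await AsyncStorage.setItem(CRED_KEY, JSON.stringify(credentials));
      }

      // read them back on app start; returns null when nothing is cached
      async function loadCredentials() {
        const raw = await AsyncStorage.getItem(CRED_KEY);
        return raw ? JSON.parse(raw) : null;
      }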

      The library is hosted in git.

      Written by Sony Arouje

      February 7, 2019 at 4:13 pm

      Posted in React


      simdb a simple json db in GO


      Some days ago I decided to learn GO. GO is pretty easy, and I could learn the syntax and semantics in a couple of hours. To completely learn a language I normally write a small app in it, so in my free time I rewrote the expense service originally created in nodejs in GO; it is now live and we are using it. This whole exercise allowed me to learn GO in detail.

      For me GO looks to be a great, simple language with static type checking. It seems I will be using GO rather than nodejs for my future RPi projects. On the RPi I often use a simple json file as a db to store, retrieve, and update execution rules, sensor details, etc. In nodejs I use tingodb, but I couldn’t find something very similar in GO, so I decided to write one. It’s called simdb, a simple json db.

      Using simdb I can persist structs and retrieve, update, or delete them from the json db. The db file created by simdb is a plain json file. Let’s see some of the functions in simdb.

      Create a new instance of db

      driver, err := db.New("customer")

      Insert a new Customer to db

      customer := Customer{
          CustID:  "CUST1",
          Name:    "sarouje",
          Address: "address",
          Contact: Contact{
              Phone: "45533355",
              Email: "someone@gmail.com",
          },
      }
      err = driver.Insert(customer)
      if err != nil {
          panic(err)
      }

      Get a Customer

      var customerFirst Customer
      err = driver.Open(Customer{}).Where("custid", "=", "CUST1").First().AsEntity(&customerFirst)
      if err != nil {
          panic(err)
      }

      Update a customer

      customerFirst.Name = "Sony Arouje"
      err = driver.Update(customerFirst)
      if err != nil {
          panic(err)
      }

      Delete a customer

      toDel := Customer{
          CustID: "CUST1",
      }
      err = driver.Delete(toDel)
      if err != nil {
          panic(err)
      }

      The Update and Delete operations use the ID field of the struct to locate the record they act on.
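      For e.g., the Customer struct used in these snippets implements it like this (the same function appears in the full listing below):

      // ID tells simdb which json field uniquely identifies a Customer;
      // Update and Delete use it to locate the record.
      func (c Customer) ID() (jsonField string, value interface{}) {
          jsonField = "custid"
          value = c.CustID
          return
      }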

       

      Let’s see the full code.

      package main

      import (
          "fmt"

          "github.com/sonyarouje/simdb/db"
      )

      type Customer struct {
          CustID  string `json:"custid"`
          Name    string `json:"name"`
          Address string `json:"address"`
          Contact Contact
      }

      type Contact struct {
          Phone string `json:"phone"`
          Email string `json:"email"`
      }

      //ID any struct that needs to persist should implement this function,
      //defined in the Entity interface.
      func (c Customer) ID() (jsonField string, value interface{}) {
          value = c.CustID
          jsonField = "custid"
          return
      }

      func main() {
          fmt.Println("starting....")

          driver, err := db.New("dbs")
          if err != nil {
              panic(err)
          }

          customer := Customer{
              CustID:  "CUST1",
              Name:    "sarouje",
              Address: "address",
              Contact: Contact{
                  Phone: "45533355",
                  Email: "someone@gmail.com",
              },
          }

          //creates a new Customer file inside the directory passed as the
          //parameter to New(). If the Customer file already exists,
          //the insert operation will add the customer data to the array.
          err = driver.Insert(customer)
          if err != nil {
              panic(err)
          }

          //GET ALL Customers
          //opens the customer json file and filters all the customers with name sarouje.
          //AsEntity takes the address of a Customer array and fills the result into it.
          //we can loop through the customers array and retrieve the data.
          var customers []Customer
          err = driver.Open(Customer{}).Where("name", "=", "sarouje").Get().AsEntity(&customers)
          if err != nil {
              panic(err)
          }
          // fmt.Printf("%#v \n", customers)

          //GET ONE Customer
          //First() will return the first record from the results.
          //AsEntity takes the address of a Customer variable (not an array pointer).
          var customerFirst Customer
          err = driver.Open(Customer{}).Where("custid", "=", "CUST1").First().AsEntity(&customerFirst)
          if err != nil {
              panic(err)
          }

          //Update uses ID() to get the id field/value, finds the record and updates the data.
          customerFirst.Name = "Sony Arouje"
          err = driver.Update(customerFirst)
          if err != nil {
              panic(err)
          }
          driver.Open(Customer{}).Where("custid", "=", "CUST1").First().AsEntity(&customerFirst)
          fmt.Printf("%#v \n", customerFirst)

          // Delete
          toDel := Customer{
              CustID: "CUST1",
          }
          err = driver.Delete(toDel)
          if err != nil {
              panic(err)
          }
      }

      TODO

      The query syntax in simdb is not really great; I need to find a better approach.

       

      Source Code: https://github.com/sonyarouje/simdb

      Written by Sony Arouje

      August 6, 2018 at 2:30 pm

      Posted in GO


      Slack App to track our expenses hosted in a Raspberry Pi


      At 4Mans Land we keep struggling to track our expenses. We normally note the expenses in Whatsapp and later move them to a shared excel sheet. This always has difficulties: there is a lot of noise in whatsapp from ongoing discussions. We also use slack for discussions, so I decided to integrate an expense app and evaluated some. Most of them are paid, and we don’t want to pay for an app at this stage. One night I decided to write an app for slack, and by morning I had finished a basic app that can store expenses in mongodb. This whole experiment started as fun, a way to play with something new and also understand the slack api.

      I wrote the server in nodejs and chose mongodb for persistence. For testing the slack integration I used ngrok to create a local tunnel. I also evaluated localtunnel.me, which is free but very unstable; ngrok is good, but the license costs $5 per month. I started evaluating other options to host my server. Heroku was one alternative, and I had used it several times earlier, but at the moment we wanted to spend less on hosting and infrastructure. At last I decided to host the server at my home; I have a really stable internet connection with more than 80mbps speed and a Gigabit router.

      I added a port forwarding rule at my router and expected to reach the port immediately, but to my surprise it wasn’t working. I had done port forwarding many times earlier and could access things inside my network without any issue, but that was with a different ISP. I called my ISP’s support, and they informed me that without a static IP I can’t do port forwarding; for a static IP I have to pay a very minimal fee, valid for life unless I switch providers. I paid the amount, got my static IP in two days, and port forwarding worked. I also set up dynamic dns at noip.com to give a good name to my IP. With all these settings done, I changed the url in the slack app to the one from noip.com, ran the nodejs server on my dev machine, and fired a command from slack: voila, the server running on my laptop received the command. The server was ready and running on my laptop, but I needed a system that could run 24/7. The cheapest option that came to mind was to host the nodejs server on one of the Raspberry Pis lying on my table, so one night I decided to set up the pi with the latest Raspbian stretch.

      Setting up the Raspberry pi

      I expected to finish the whole setup in less than an hour, as I had done it several times before, but not that day. After installing stretch, apt-get update started throwing a hash sum error, and I reinstalled the os a couple of times only to hit the same error again and again. This error sent me on a wild goose chase, reading one forum after another and trying all the suggestions, but nothing changed. At last I came across a stackoverflow post which fixed the issue and I could update the os, phew.

      Next was installing nodejs. I used the nodejs packages maintained here and issued the ubuntu commands; for e.g., to install nodejs 10, issue the below commands from your raspberry pi:

      curl -sL https://deb.nodesource.com/setup_10.x | sudo -E bash -
      sudo apt-get install -y nodejs

      The next task was installing mongodb. apt-get install mongodb installed an old version of mongodb that is not suitable for the mongoose orm I use in the code, so either I had to use an older version of mongoose or find a way to install a newer mongodb. I chose the latter and went hunting for clues. In the end I came across a post with all the details of installing mongodb on an RPi running jessie, which works in stretch as well; he also provided a link to stretch mongo binaries. I followed the instructions, and at the end mongodb was running on my machine.

      When I tried to connect from Compass to the rpi’s mongodb instance, it wouldn’t connect. Reading through some docs and forums I realized that I have to comment out bind_ip in /etc/mongodb.conf; after commenting out that config I was able to connect from compass.
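      For reference, the change is a one line edit in /etc/mongodb.conf. Commenting out bind_ip makes mongod listen on all interfaces, so it should only be done on a trusted network.

      # /etc/mongodb.conf
      # comment out the line below to accept connections from other machines
      #bind_ip = 127.0.0.1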

      Before going to bed at 2am, I copied the nodejs code to the rpi, did some testing, and all went well. I could post expenses from slack to the service running in my pi, and went to bed with peace of mind.

      What does this app do?

      The main focus of this app is to capture the expense; for now it uses a slack command to forward text to the service. Initially I created a strict pattern to record an expense, for e.g.

      /expense spent 4000 INR on Jill to fetch a pail of water by Jack

      Here ‘spent’ should be the first word, followed by the amount; the description comes after ‘on’, and after ‘by’ comes who paid the amount. The nodejs service parses these and places them in separate fields of a mongodb collection. Then to query the expenses, something like

      /expense query date today

      /expense query date 05/12/2018 – 05/14/2018

      So here ‘spent’ and ‘query’ are keywords; based on the keyword, different handling logic kicks in and completes the task.
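      To give an idea of the strict pattern, here is a rough sketch of how the ‘spent’ sentence could be parsed in nodejs; this is my own illustration, not the actual service code.

      // parse "spent <amount> <currency> on <description> by <payer>"
      function parseExpense(text) {
        const match = text.match(/^spent\s+(\d+)\s+(\w+)\s+on\s+(.+)\s+by\s+(\w+)$/i);
        if (!match) return null;
        const [, amount, currency, description, paidBy] = match;
        return { amount: Number(amount), currency, description, paidBy };
      }

      console.log(parseExpense("spent 4000 INR on Jill to fetch a pail of water by Jack"));
      // { amount: 4000, currency: 'INR', description: 'Jill to fetch a pail of water', paidBy: 'Jack' }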

      I felt the syntax was too restrictive and started analyzing options to process the sentence more naturally. I came across natural, an NLP library for nodejs. Using natural’s BayesClassifier I trained the system to categorize the input text and derive the right keyword, which again invokes the corresponding logic and gets the task done. After the classification training, the system can take inputs like

      /expense Jack paid 4000 by Card for ‘Jill to fetch a pail of water’ on 05/16/2018

      The above text gets classified as spent, and then some code extracts the relevant parts from it. It is not a pure NLP approach: anything inside single quotes is considered the expense description, any date in the text is considered the payment date, and so on. I am still learning NLP; in the future I might achieve a better translation of the text.
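      The training and classification step with natural looks roughly like this; the training sentences below are my own examples, not the ones used in the service.

      const natural = require("natural");
      const classifier = new natural.BayesClassifier();

      // teach the classifier to map free-form sentences to a keyword
      classifier.addDocument("spent 4000 on groceries", "spent");
      classifier.addDocument("Jack paid 500 by card for fuel", "spent");
      classifier.addDocument("show the expenses for today", "query");
      classifier.addDocument("expenses between two dates", "query");
      classifier.train();

      console.log(classifier.classify("Jack paid 4000 by Card for 'Jill to fetch a pail of water'"));
      // -> 'spent'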

      The querying command was modified as shown below

      /expense today

      /expense between 05/12/2018 – 05/14/2018

      It can also query and return expenses in csv format:

      /expense today csv will return the expenses in comma-separated format.

      Slack output

      [Screenshot: slack response showing the newly created expense]

      After each expense is created successfully, it returns the expense with its ID and the actual text passed to the expense command. For e.g., to create the first expense I passed text like

      /expense Jack paid 4000 to ‘Jill to fetch a pail of water’

      Here is the output of querying today’s expenses; issue a command like

      /expense today

      [Screenshot: slack response listing today’s expenses]

       

      In csv format (/expense today csv)

      [Screenshot: today’s expenses returned in csv format]

      We can also delete an expense; only the expenses you created can be deleted.

      /expense delete today: deletes all the expenses you entered today

      /expense delete 116: deletes the expense with id 116

      I had great fun with this experiment and learned a few things.

      Happy coding…

      Written by Sony Arouje

      May 18, 2018 at 5:17 pm

      react, redux and cycle – Introduction


      The last couple of months we were working mostly in react js to develop a flexible platform for one of the products we are working on. I am planning a couple of posts to explain how to start developing in react with redux and cycle. To set up my dev environment I used the react boilerplate created by my friend Sendhil; it is a basic template to jumpstart react js development with linting.

      I am not going into the depths of redux or redux-cycles; this is just an attempt to help with the initial phase of development. If you are not familiar with redux or why we need it, better to learn it from the creator.

      Let’s use the template and incrementally add more features to it. I created a new github repository with some initial code to start with.

      Below are the new npm dependencies added to package.json

      • react-redux: "^5.0.4",
      • redux: "^3.6.0",
      • @cycle/http: "^13.3.0",
      • @cycle/run: "^3.1.0",
      • @cycle/time: "^0.8.0",
      • redux-cycles: "^0.4.1",
      • xstream: "^10.9.0",
      • prop-types: "^15.5.10"

      For any redux application we need to set up a store. I created one as well; see the code below.

      import { applyMiddleware, createStore, compose } from 'redux';
      import { createCycleMiddleware } from 'redux-cycles';
      import { run } from '@cycle/run';
      import { makeHTTPDriver } from '@cycle/http';
      import { timeDriver } from '@cycle/time';
      import logger from 'redux-logger';
      // combined all the reducers in the application
      import reducers from './reducers';
      // combined all the cycles in the application
      import cycles from './cycles';

      // create cycle middleware to attach to the redux store.
      const cycleMiddleware = createCycleMiddleware();
      const { makeActionDriver, makeStateDriver } = cycleMiddleware;

      // we might use multiple middleware, here we use a logger
      // and cycle. We can add more middleware by adding them to
      // the below array.
      const middleWare = [
        logger,
        cycleMiddleware,
      ];

      const initState = {};

      // more about compose here http://redux.js.org/docs/api/compose.html
      let composeEnhancers = compose;

      // adding redux dev tool to visualize the store state.
      // should be enabled only in development.
      if (process.env.NODE_ENV !== 'production') {
        const composeWithDevToolsExtension = window.__REDUX_DEVTOOLS_EXTENSION_COMPOSE__;
        if (typeof composeWithDevToolsExtension === 'function') {
          composeEnhancers = composeWithDevToolsExtension;
        }
      }

      const store = createStore(
        reducers,   // all the available reducers combined
        initState,  // initial state of the reducers
        composeEnhancers(                 // adding store enhancers
          applyMiddleware(...middleWare), // attaching the middleware
        ),
      );

      // by calling cycle's run() we activate the cycles we created;
      // here all the different cycles are combined into one.
      run(cycles, {
        ACTION: makeActionDriver(),
        STATE: makeStateDriver(),
        Time: timeDriver,
        HTTP: makeHTTPDriver(),
      });

      export default store;

      I added some inline comments to the code, so hopefully it is pretty self-explanatory. Once we have the store created, we need to pass it down to the other components so that they can access state or dispatch actions.

      Let’s edit index.jsx and add the below code.

      import React from 'react';
      import { render } from 'react-dom';
      import { Provider } from 'react-redux';
      import store from './store/create-store';
      import App from './App';

      render(
        <Provider store={store}>
          <App />
        </Provider>,
        document.getElementById('root'));
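      With the Provider in place, any component can be wired to the store using react-redux’s connect. Here is a minimal sketch; the count state and INCREMENT action are hypothetical, not part of the repository.

      import React from 'react';
      import { connect } from 'react-redux';

      // a hypothetical component that reads state and dispatches an action
      const Counter = ({ count, increment }) => (
        <button onClick={increment}>Clicked {count} times</button>
      );

      const mapStateToProps = state => ({ count: state.count });
      const mapDispatchToProps = dispatch => ({
        increment: () => dispatch({ type: 'INCREMENT' }),
      });

      export default connect(mapStateToProps, mapDispatchToProps)(Counter);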

       

      Great, we have created a basic react app with redux. In the next post I will cover how to create a reducer and handle side effects using redux-cycles.

       

      Happy coding…

      Written by Sony Arouje

      September 22, 2017 at 6:32 pm

      Posted in JavaScript, React


      RF Communication using nrf24L01 and Nodejs addon


      Recently I started experimenting with radio communication using low cost nrf24L01 modules. These modules are very cheap compared to the XBee modules I used earlier. With these nrf24 modules we can enable wireless communication between Arduinos and a Raspberry pi very effectively and economically. For my experiment I used two nrf24 modules, one connected to an Arduino Uno and the other to a Raspberry pi 1. Here are the pin connection details:

      Seq   NRF24L01   RPi          Arduino Uno
      1     GND        25 (GND)     GND
      2     VCC        17 (3.3v)    3.3v
      3     CE         15           7
      4     CSN        24           8
      5     SCK        23           13
      6     MOSI       19           11
      7     MISO       21           12
      8     IRQ        -            -

       

      For testing the communication I used the RF24Network library, which is very good and has good documentation. It also comes with examples for both Arduino and RPi, so I didn’t write a single line of code; I just used the examples and was able to see the communication working. Initially I had some trouble, but in the end everything worked well and I could see the data coming from the Arduino on the RPi.

      My intention is to use these modules with the RPi and write code in nodejs. Unfortunately there is no nodejs support for this library, so last night I decided to write a nodejs addon for this C/C++ library. I didn’t have any experience writing a nodejs addon, so I spent an hour understanding Nan and creating very simple addons. Then I started writing the addon for RF24Network; this task was much harder than playing with simple hello world addons.

      node-gyp kept failing when it tried to compile the RF24Network modules. In my searches I realized that node-gyp uses the make utility and that I needed to add the C/C++ files of the library. In the end I could compile the node addon. See the binding.gyp file:

      {
        "targets": [
          {
            "target_name": "nrf24Node",
            "sources": [
              "nrf24Node.cc",
              "RF24/RF24.cpp",
              "RF24/utility/RPi/spi.cpp",
              "RF24/utility/RPi/bcm2835.c",
              "RF24/utility/RPi/interrupt.cpp",
              "RF24Network/RF24Network.cpp",
              "RF24Network/Sync.cpp"
            ],
            "include_dirs": [
              "<!(node -e \"require('nan')\")",
              "RF24Network",
              "RF24"
            ],
            "link_settings": {
              "libraries": [
                "-RF24",
                "-RFNetwork"
              ]
            }
          }
        ]
      }
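      With the gyp file in place, the addon compiles with the usual node-gyp commands:

      npm install nan     #the addon depends on nan
      node-gyp configure  #generate the build files from binding.gyp
      node-gyp build      #compile nrf24Node.node into build/Release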

       

      I should say I am just a beginner with node-gyp, and this binding.gyp might need some improvements. Anyway, with this gyp file the compilation succeeded.

      Next was creating the addon file. Here I had to learn more about Nan’s data types and callbacks. I started with simple functions, compiling after each one before moving on to the next. I took more time understanding callbacks, which allow the addon to call javascript callback functions. I also spent a lot of time understanding threading and creating a module that continuously listens for incoming messages and triggers the callback, so that nodejs can process those messages. I used libuv for threading; it seemed easier to understand than Nan’s AsyncWorker.

      I spent that whole night learning, writing, and refactoring the addon, and finished the module by early morning. By then I could write a nodejs app and listen to incoming messages.

      Here is sample node js code that listens for messages and acknowledges them back to the sender.

      var rf24 = require('./build/Release/nrf24Node.node');

      rf24.begin(90, 00);
      rf24.printDetails();
      rf24.write(1, "Ack");

      rf24.readAsync(function(from, data) {
        console.log(from);
        console.log(data);
        rf24.write(from, "Ack");
      });

      process.on('SIGINT', exitHandler);

      function exitHandler() {
        // close the radio before exiting
        rf24.close();
        process.exit();
      }

       

      Here is the complete addon. The code is uploaded to github, along with the steps to compile it and use it in your own nodejs applications.

      #include <nan.h>
      #include <v8.h>
      #include <RF24.h>
      #include <RF24Network.h>
      #include <iostream>
      #include <ctime>
      #include <stdio.h>
      #include <time.h>
      #include <string>

      using namespace Nan;
      using namespace v8;

      RF24 radio(RPI_V2_GPIO_P1_15, BCM2835_SPI_CS0, BCM2835_SPI_SPEED_8MHZ);
      RF24Network network(radio);

      Nan::Callback *cbPeriodic;
      uv_async_t* async;

      // Structure of our payload
      struct payload_t {
        char msg[24];
      };

      struct payload_pi {
        uint16_t fromNode;
        char msg[24];
      };

      //--------------------------------------------------------------------------
      // The functions below are just replicas of RF24Network functions.
      // No need to use these functions in your app.
      NAN_METHOD(BeginRadio) {
        radio.begin();
      }

      NAN_METHOD(BeginNetwork) {
        uint16_t channel = info[0]->Uint32Value();
        uint16_t thisNode = info[1]->Uint32Value();
        network.begin(channel, thisNode);
      }

      NAN_METHOD(Update) {
        network.update();
      }

      NAN_METHOD(Available) {
        v8::Local<v8::Boolean> status = Nan::New(network.available());
        info.GetReturnValue().Set(status);
      }

      NAN_METHOD(Read) {
        payload_t payload;
        RF24NetworkHeader header;
        network.read(header, &payload, sizeof(payload));
        info.GetReturnValue().Set(Nan::New(payload.msg).ToLocalChecked());
      }
      //--------------------------------------------------------------------------------

      NAN_METHOD(Begin) {
        if (info.Length() < 2)
          return Nan::ThrowTypeError("Should pass Channel and Node id");
        uint16_t channel = info[0]->Uint32Value();
        uint16_t thisNode = info[1]->Uint32Value();
        radio.begin();
        delay(5);
        network.begin(channel, thisNode);
      }

      NAN_METHOD(Write) {
        if (info.Length() < 2)
          return Nan::ThrowTypeError("Should pass Receiver Node Id and Message");
        uint16_t otherNode = info[0]->Uint32Value();
        v8::String::Utf8Value message(info[1]->ToString());
        std::string msg = std::string(*message);

        payload_t payload;
        strncpy(payload.msg, msg.c_str(), 24);
        RF24NetworkHeader header(otherNode);
        bool ok = network.write(header, &payload, sizeof(payload));
        info.GetReturnValue().Set(ok);
      }

      // runs on a worker thread: polls the network and forwards every
      // received payload to the main loop through uv_async_send
      void keepListen(void *arg) {
        while (1) {
          network.update();
          while (network.available()) {
            RF24NetworkHeader header;
            payload_t payload;
            network.read(header, &payload, sizeof(payload));

            payload_pi localPayload;
            localPayload.fromNode = header.from_node;
            strncpy(localPayload.msg, payload.msg, 24);

            async->data = (void *) &localPayload;
            uv_async_send(async);
          }
          delay(2000);
        }
      }

      // runs on the main loop: invokes the javascript callback
      void doCallback(uv_async_t *handle) {
        payload_pi* p = (struct payload_pi*)handle->data;
        v8::Handle<v8::Value> argv[2] = {
          Nan::New(p->fromNode),
          Nan::New(p->msg).ToLocalChecked()
        };
        cbPeriodic->Call(2, argv);
      }

      NAN_METHOD(ReadAsync) {
        if (info.Length() <= 0)
          return Nan::ThrowTypeError("Should pass a callback function");
        if (info.Length() > 0 && !info[0]->IsFunction())
          return Nan::ThrowTypeError("Provided callback must be a function");

        cbPeriodic = new Nan::Callback(info[0].As<Function>());
        async = (uv_async_t*)malloc(sizeof(uv_async_t));
        uv_async_init(uv_default_loop(), async, doCallback);

        uv_thread_t id;
        uv_thread_create(&id, keepListen, NULL);
        uv_run(uv_default_loop(), UV_RUN_DEFAULT);
      }

      NAN_METHOD(PrintDetails) {
        radio.printDetails();
      }

      NAN_METHOD(Close) {
        uv_close((uv_handle_t*) async, NULL);
      }

      NAN_MODULE_INIT(Init) {
        Nan::Set(target, New<String>("beginRadio").ToLocalChecked(),
          GetFunction(New<FunctionTemplate>(BeginRadio)).ToLocalChecked());
        Nan::Set(target, New<String>("beginNetwork").ToLocalChecked(),
          GetFunction(New<FunctionTemplate>(BeginNetwork)).ToLocalChecked());
        Nan::Set(target, New<String>("update").ToLocalChecked(),
          GetFunction(New<FunctionTemplate>(Update)).ToLocalChecked());
        Nan::Set(target, New<String>("printDetails").ToLocalChecked(),
          GetFunction(New<FunctionTemplate>(PrintDetails)).ToLocalChecked());
        Nan::Set(target, New<String>("available").ToLocalChecked(),
          GetFunction(New<FunctionTemplate>(Available)).ToLocalChecked());
        Nan::Set(target, New<String>("read").ToLocalChecked(),
          GetFunction(New<FunctionTemplate>(Read)).ToLocalChecked());
        Nan::Set(target, New<String>("readAsync").ToLocalChecked(),
          GetFunction(New<FunctionTemplate>(ReadAsync)).ToLocalChecked());
        Nan::Set(target, New<String>("write").ToLocalChecked(),
          GetFunction(New<FunctionTemplate>(Write)).ToLocalChecked());
        Nan::Set(target, New<String>("close").ToLocalChecked(),
          GetFunction(New<FunctionTemplate>(Close)).ToLocalChecked());
        Nan::Set(target, New<String>("begin").ToLocalChecked(),
          GetFunction(New<FunctionTemplate>(Begin)).ToLocalChecked());
      }

      NODE_MODULE(nrf24Node, Init)

      All the credit goes to the developers of the RF24 and RF24Network libraries; I just created an addon for these great libraries. Along the way I learned a lot and could finish the nodejs addon.

       

      Happy coding…

      Written by Sony Arouje

      February 5, 2017 at 4:57 pm

      My Experience in E-Vaping


      I was a smoker for many years and tried many ways to quit the habit; at one point I tried nicotine gums but ended up chewing the gums and smoking at the same time. Recently I went to England on a business trip, and one of my colleagues there had quit smoking and moved to E-Cigarettes, or E-Vaping. I thought I should give it a try, but I wasn’t sure I could quit smoking. At the hotel where I stayed I met another guy with an E-Vaping device, and when we talked he said he had quit smoking a year ago with the help of this device. That night I decided to buy one and ordered one from Amazon. I didn’t spend much time understanding the technical details of these devices, so I ordered a beginner model named Thorvap, which cost me around 24 GBP, not a big deal, and also bought some E-Liquids. After I came back to India I started using the device, and for the last two weeks I haven’t touched a cigarette.

      I bought the below device

      [Photo: the Thorvap 30W device I bought]

       

      I don’t regret buying this device. In my spare time I started researching E-Vaping and its accessories in more detail. If this post inspires you to buy one, spend some time understanding the different types of systems available in the market before buying.

      What is an E-Vaping device?

      An E-Vaping device produces vapor from a liquid by heating a coil. The liquid is a mix of Propylene Glycol, Vegetable Glycerin, some flavor, and some percentage of Nicotine. Once you have moved off cigarettes, you can reduce the percentage of Nicotine and later use liquids with zero nicotine. Devices come in several variations like 30, 60, or 120 watts; as per my understanding this indicates how much heat the coil can produce and thus how much vapor.

      Parts in E-Vaping device

      There are mainly three parts:

      1. Mod
      2. Tank
      3. Atomizer

      Mod: This part contains the battery and the temperature control system. It’s not very complicated to use: it has an on/off button and buttons to increase or decrease the temperature.

      Tank: This is the top part with the glass; the liquid is filled into this tank. Tanks come in different sizes like 3ml, 6ml, etc.

      Atomizer: This is a very crucial part to decide on before buying. In simple terms, an atomizer contains a coil, cotton, and a screw terminal that screws onto the battery.

      There are two major types of atomizers:

      • RBA (Rebuildable Atomizers)
      • Non RBA

      The coil and cotton deteriorate after some weeks, depending on how you vape and on the sweetness of the liquid; many factors affect their life span.

      With non-RBAs you have to buy these parts, and it is a costly affair; I bought a pack of 5 for 11 GBP. Some say they can be reused by washing in hot water or vodka, but I haven’t tried this personally. As per my understanding, vaping with non-RBAs adds cost.

      RBAs have rebuildable coils, which means we can unscrew the atomizer and change the coil and cotton. These coils can easily be made by ourselves; just be careful to maintain the desired ohms of the coil, otherwise you might damage the mod. Check youtube for plenty of videos on how to do it and how to maintain the ohms of the coil. RBAs are again divided into RDA and RTA.

      RDA (Rebuildable Drip Atomizer): in this system there is no tank to hold the liquid; we need to wet the cotton with liquid. Personally I don’t like it, as I might end up carrying liquid wherever I go. In an RTA (Rebuildable Tank Atomizer), the liquid is filled into a tank; tank capacities come in different variations like 3ml, 6ml, etc.

      After learning more about these systems I decided to buy a second device; a second device is always useful in case the existing one stops working. I ordered a SMOK XCube Mini, a TFV8 Baby tank, and a separate RBA atomizer.

      What Next?

      Buy an e-Vaping device and give it a try; just like me, you might quit smoking. After two weeks I can see a lot of difference in my body. Earlier I used to struggle to breathe after running a kilometer, and now I feel much more comfortable running a km. I think after some time I can reduce the nicotine in the liquid and later stop vaping altogether.

      Research is still ongoing into whether e-Vaping is healthy, but it is much less harmful than normal cigarettes.

      Note: If you reside in Karnataka, the sale of E-Cigarettes and liquids is banned, which I feel is a strange decision by the government. Maybe you can ship it to a nearby state and get it through some friends; where there is a will, there is a way.

      Written by Sony Arouje

      December 9, 2016 at 1:08 am

      Posted in Misc

