Visual Builder Cloud Service – Dynamic Elements

 

I’ve been using VBCS for a while now and it has really evolved over the past nine months. I guess that’s one of the wonderful things about these PaaS offerings from Oracle; we don’t have to wait so long for new features and capabilities.

One thing I wanted to do, but that isn’t directly supported in VBCS yet, is dynamic displays. I’ve done quite a bit of programming in native JavaScript and Oracle JET where I’ve used web sockets to make my graphs and gauges update automatically without the need for a refresh button.

Well, I figured out a way to do this in VBCS.  Now I will admit right away, this is pretty ugly, so if you are a software development purist, please turn off your TV now!


Continue reading “Visual Builder Cloud Service – Dynamic Elements”

Teaching best practices to Design, Build, Secure and Monitor APIs

In this blog, I want to share my experience after having created many APIs using different approaches and technologies. I am going to encapsulate a simple process that will help you construct APIs, starting from scratch with an idea or requirement and carrying it all the way through to a happy consumption.

The best part of APIs is that they are microservices enablers, which means they are not technology prescriptive; in this blog you will see that your APIs can be implemented using any technology or programming language.

I decided to use “Jokes” as the vehicle to explain API construction best practices, mainly because jokes are a simple concept that anyone can relate to, but also because I want you to feel compelled to consume these APIs and, by doing so, get a laugh or two.

My original idea with jokes is to:

  1. Get a random joke.
  2. Translate the joke to any language.
  3. Share the original or the translated joke with a friend via SMS.

This is a high-level view of how our end solution will look:

Continue reading “Teaching best practices to Design, Build, Secure and Monitor APIs”

Teaching How to simplify building NodeJS APIs with Loopback Framework

In this blog, I am going to show you how to get started with the Loopback framework to easily auto-build REST APIs in NodeJS with a persistence layer of your choice, including relational and non-relational databases, e.g. in-memory DB, MongoDB, MySQL, Cassandra, Oracle, etc.

In terms of API design and development, Loopback allows you to work “top-down” or “bottom-up”. I am going to cover both approaches in this blog.

First, we are going to create an API model definition in place as we build the REST APIs; this exercise will give us a Swagger-based API definition. Alternatively, we are going to start from an existing Swagger definition and use it to implement NodeJS REST APIs pointing to a persistence layer of choice (in-memory DB, MongoDB, MySQL, DB2, Oracle, etc.). I personally prefer the “API First/Top Down” approach, as it gives me the option to properly design and test my APIs first and then simply move to the implementation phase, but this ultimately depends on the situation, preferences and requirements. A quick taste of the bottom-up CLI flow is sketched below.
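To give you that quick taste, here is a minimal sketch using the LoopBack 3 CLI; the application and model names are illustrative assumptions for this sketch, not taken from the blog:

    # install the LoopBack CLI globally (assumes Node.js and npm are installed)
    npm install -g loopback-cli

    # scaffold a new application; the CLI prompts for a name and a template
    lb

    # inside the new app, define a model interactively (name, properties, datasource)
    cd jokes-api
    lb model

    # run the app; the auto-generated Swagger explorer is served at /explorer
    node .

By default the scaffolded app listens on http://localhost:3000 and exposes full CRUD REST endpoints for every model you define, which is the “auto-build” part that makes Loopback so convenient.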

Continue reading “Teaching How to simplify building NodeJS APIs with Loopback Framework”

Oracle IoT – Working with Sigfox/Thinxtra Devices

How to use the Thinxtra devices and Sigfox network with the Oracle IoT cloud service.

Introduction

There are lots of activities happening today in the world of IoT (Internet of Things).  The market is growing at a staggering pace.  Oracle, of course, is providing services in this area, mainly to support our many great SaaS applications.  Almost every application can benefit from data coming from devices on machines, automobiles, medical devices, human wearables and such.  However, there are several issues people face:

  • How to work with all the various devices.
  • How to manage the devices and information.
  • How to integrate these with other systems in the enterprise.
  • How to deploy, configure, maintain, version and upgrade devices.

Oracle IoT Cloud service is designed to help with these issues, but it is often overwhelming to get a given device’s data into the IoT cloud initially. A case in point is the wonderful devices from Thinxtra, which use the Sigfox network.

Continue reading “Oracle IoT – Working with Sigfox/Thinxtra Devices”

Teaching How to use Nginx to frontend your backend services with Trusted CA certificates on HTTPS

Nowadays, with the adoption of serverless architectures, microservices are becoming a great way to break down problems into smaller pieces. One situation that is common to find is multiple backend services running on technologies like NodeJS, Python, Go, etc. that need to be accessible via HTTPS. It is possible to enable SSL over HTTPS on each of these internal microservices directly, but a cleaner approach is to use a reverse proxy that front-ends the microservices and provides a single HTTPS access channel, allowing simple internal routing.

In this blog, I am showing how simple it is to create this front end with Nginx, leveraging “Let’s Encrypt” to generate trusted certificates with strong security policies attached, so that our website can score an A+ on cryptographic SSL tests conducted by third-party organizations. A sketch of the routing idea follows.
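For a flavour of that routing idea, here is a minimal, hedged sketch of an Nginx server block that terminates HTTPS and fronts two internal services; the domain, ports, paths and certificate locations are illustrative assumptions, not the exact configuration from the full post:

    # illustrative reverse-proxy sketch - adjust names, ports and paths
    server {
        listen 443 ssl;
        server_name example.com;   # hypothetical domain

        # certificate files as typically laid out by Let's Encrypt (certbot)
        ssl_certificate     /etc/letsencrypt/live/example.com/fullchain.pem;
        ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;

        # route /users to an internal NodeJS microservice
        location /users {
            proxy_pass http://localhost:3000;
        }

        # route /jokes to another internal microservice
        location /jokes {
            proxy_pass http://localhost:4000;
        }
    }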

Continue reading “Teaching How to use Nginx to frontend your backend services with Trusted CA certificates on HTTPS”

ACCS Zero Downtime Updates and Re-Deployments

The May 2017 update for ACCS (17.2.3) brought a cool new feature to ACCS: zero-downtime updates. Previously there was support for performing a rolling restart of an application, where each instance would be brought down and updated in turn, but this only enabled zero-downtime updates if your application ran two or more instances and could satisfy performance requirements with one fewer node. While any production system should be able to satisfy these requirements, many of the utility systems I ran were on a single node, and I had to send out an email blast to avoid being disruptive whenever I wanted to push a quick code fix.

Continue reading “ACCS Zero Downtime Updates and Re-Deployments”

Teaching How to Use Alexa to Take Off your Drone using NodeJS

Recently I was in Auckland, New Zealand running an Integration in Action workshop and I used Alexa to trigger some of my APIs, including some NodeJS APIs that I built to take off a drone. Some people found this interesting and asked me to write this blog to explain in detail how it works… So, here it is, I hope you find it useful.

There are multiple ways in which you can make Alexa call your own APIs. Perhaps the most versatile way is by adding a new skill (see: blog 1, 2). However, I found an even easier way: in keeping with the MVP approach I have been attempting to practice in my day-to-day life, I took the simpler option of simulating a Philips Hue HA bridge. By doing so, Alexa detects a new Home Automation (HA) device in “her” network range and accepts voice commands to easily “turn it on” or “turn it off”, which you can then leverage to call your own APIs.

Continue reading “Teaching How to Use Alexa to Take Off your Drone using NodeJS”

Teaching how to use Bitnami to deploy any Image into Oracle Public Cloud

Recently, I was challenged at work to make my drone take off using a simple voice command via “Alexa”. Given this challenge came from my boss, I decided to happily accept it. I ended up writing some simple NodeJS code that interfaces with my drone and used a series of new Echo skills and a Home Assistant bridge to easily command my drone to take off and follow simple orders, something like “Alexa, please take off my drone and make it back flip twice”.

After some hours and many coffees, I got to the point that I was done and ready to show my working demo. However, I needed to take my NodeJS code and put it somewhere in the cloud to run it. I had a few options, like deploying my code into Oracle Application Container Cloud Service, which runs NodeJS natively over docker containers, but given that I have done this in the past and I was in an “adventurous mode”, I decided to try something new.

I wondered how simple it would be to use Bitnami to spin up a new NodeJS VM in Oracle Public Cloud… Well, I was amazed at how pleasant the experience was, so I decided to capture this excitement in a quick blog for you to try as well.

Continue reading “Teaching how to use Bitnami to deploy any Image into Oracle Public Cloud”

Teaching how to use MongoDB and expose it via NodeJS APIs

 

Hi, we are starting to use different technologies to build complex scenarios around APIs, and it’s becoming common to use popular NoSQL DBs like MongoDB.

For this reason, I decided to build this very easy-to-follow blog that will help you get started with MongoDB and build simple REST APIs using NodeJS via the Express module.

The use case is simple: we are going to build a MongoDB database to store user information that eventually (in a future blog) we are going to use to send SMS and voice call notifications… But for now we are keeping things simple, and we are starting by demystifying the use of MongoDB with NodeJS.

The code that sends the SMS/Voice call notifications in NodeJS via Twilio is out of the scope of this blog, but if you want to use it, you can find it here.

Pre-requisites

 

This blog is about simplicity, so we are going to build a simple HelloWorld sample that starts from scratch and interacts (GET and POST) with a MongoDB via Express APIs.

 

Installing and playing with MongoDB

 

In this section, I am going to show you how to install and get your MongoDB up and running. This is going to be the DB that we are going to use in a future blog to send SMS and voice call notifications to people.

Note: I am using Ubuntu; adjust if using another OS (e.g. yum if using OEL/RHEL):

sudo apt-get install mongodb

  • Validate the installation by running the mongo client – this should connect to the running MongoDB server

  • In the mongo client prompt, create a simple test database called myTestDB:

    use myTestDB


  • The database doesn’t really exist until we add some data. The best thing about MongoDB is that you store JSON payloads in it, which makes it very friendly when interacting with APIs.

     

    In our case, we want to build a notification service and we want to store the contact details of the recipients. Something like:

     

    {"name":"Carlos", "mobile":"+615550505", "msg":"Hello World"}

     

    Note that we are not defining ids; MongoDB will take care of that.

     

  • Still within the mongo client, type the following inserts, each followed by an enter:

     

    db.usercollection.insert({ "name" : "Carlos", "mobile" : "+615550505", "msg" : "Hello World" })

    db.usercollection.insert({ "name" : "Dave", "mobile" : "+616660606", "msg" : "Hello World again" })

    db.usercollection.insert({ "name" : "Serene", "mobile" : "+617770707", "msg" : "Hello World once more" })

     

     

  • Now, simply retrieve the inserted collection:

    db.usercollection.find().pretty()


  • You can also pass a JSON array of payloads, so that you can insert entries into MongoDB in bulk. For example:

     

    users = [{"name":"Callan", "mobile":"+618880808", "msg":"More stuff"},{"name":"Alessia", "mobile":"+612220202", "msg":"Stuff plus stuff"}]

     

    db.usercollection.insert(users);

     

  • Once again, retrieve the inserted collection:

    db.usercollection.find().pretty()


  • You can also use the usercollection object to remove individual entries by _id:

    db.usercollection.remove( {"_id": ObjectId("ID_GOES_HERE")});

  • To completely clean up all your entries, you can simply use:

    db.usercollection.remove({})

     

  • If you only want to use the mongo client to switch to a database and print a specific collection, you can do this:

     

    • From the Linux shell: mongo [db_name]
    • Then, from within the mongo client:

      c = db.[collection_name];

      c.find().pretty()

  • To list all collections in the current db:

    db.getCollectionNames()

  • If you are using a MongoDB that requires authentication, you might need to create specific users under specific databases. To do this, follow these steps:

     

  1. Using the mongo client, log in to the admin db as administrator:

     

    mongo admin --username root --password [GIVEN_PASSWORD]

     

    Note: If using Bitnami, [GIVEN_PASSWORD] is the same as the OS root password.

     

  2. Switch to your MongoDB database:

     

    use [DATABASE]

     

  3. Create the new user(s):

     

    db.createUser(
      {
        user: "USER-GOES-HERE",
        pwd: "PASSWORD-GOES-HERE",
        roles: [ "readWrite", "dbAdmin" ]
      }
    )

     

    Note: Substitute USER-GOES-HERE and PASSWORD-GOES-HERE with your own values.

     

  4. That’s it; you can now use that username and password to authenticate, with read and write access granted on your db. If using monk in NodeJS, as shown below in this blog, make sure to add the credentials to the connection string. A simple way to achieve this is by adding "USER:PASSWD@" before the server name.

     

    For example:

     

    var db = monk(USER + ':' + PASSWD + '@' + MONGODB_SERVER + ':' + MONGODB_PORT + '/' + DB_NAME);

 

 

For more information: https://docs.mongodb.com/manual/reference/mongo-shell/

 

Ok, our MongoDB is behaving well. Let’s run it together with our Express NodeJS code.

 

Building a simple Express JS API that interacts with a MongoDB

 

Now that we have a MongoDB server running, let’s build some simple NodeJS code that exposes simple Express REST APIs to interact with your MongoDB.

I am going to explain below the snippets that require explanation; however, before continuing, make sure that you have downloaded/cloned the NodeJS project found here: https://github.com/barackd222/s2v-iia-nodejs-mongodb-crud-demo

The bits of the NodeJS code that you need to pay special attention to:

  • app.js – This is very standard Express JS code that starts a service listening on a specific port. In terms of MongoDB, notice the following snippets:

    • You need to require the mongodb module, as well as monk.
    • Monk gives you the ability to connect to a MongoDB running somewhere. Think of it as a JDBC connection in the relational database world, if you like.
    • The Express app is configured to add the monk db as part of it, so that it can be accessed within the request variable in each REST API.
    • The rest is just adding the right headers for the requests and starting the listening service.

    A minimal sketch of these pieces is shown below.
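Here is that minimal, hedged sketch of the app.js pieces; the port, database name and the ./mongoDBCrudAPIs require path are illustrative assumptions rather than verbatim code from the repo:

    // app.js - minimal sketch
    var express = require('express');
    var bodyParser = require('body-parser');
    var monk = require('monk');   // monk pulls in the mongodb driver underneath

    // connect to the MongoDB we built earlier (host/port/name are assumptions)
    var db = monk('localhost:27017/myTestDB');

    var app = express();
    app.use(bodyParser.json());   // parse incoming JSON request bodies

    // expose the monk db handle on every request object
    app.use(function (req, res, next) {
      req.db = db;
      next();
    });

    // mount the REST routes defined in mongoDBCrudAPIs.js (hypothetical module shape)
    require('./mongoDBCrudAPIs')(app);

    // start the listening service
    app.listen(3000, function () {
      console.log('Listening on port 3000');
    });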

 

  • mongoDBCrudAPIs.js – This file is where I define my REST APIs. At the time of writing this blog I had just added 2 APIs: one that GETs all users in the MongoDB and one that POSTs a new user.

     

    • GET /users – This is a very typical Express GET API. Inside the body is where we:
      • Retrieve the monk db connection that points to the MongoDB
      • Build the collection object, as we did in the MongoDB console before
      • Run a find-all on the collection object built in the previous step
      • Send back the resulting JSON payload as the API response. Notice that you can iterate and do whatever you want with this result

  • POST /user – Once again, this is a very typical Express POST API that retrieves some JSON body elements from the request. In terms of MongoDB, the bits worth showing are:
    • Retrieve the monk db connection that points to the MongoDB
    • Build the collection object, as we did in the MongoDB console before
    • Use the collection to insert the new user in JSON format
    • Handle any errors; otherwise return a successful response

    Both routes are sketched below.
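A minimal, hedged sketch of both routes follows; the module shape, status codes and field names are assumptions for illustration, so treat the repo linked above as the source of truth:

    // mongoDBCrudAPIs.js - minimal sketch
    module.exports = function (app) {

      // GET /users - return every document in usercollection
      app.get('/users', function (req, res) {
        var db = req.db;                            // monk db attached in app.js
        var collection = db.get('usercollection');  // same collection used in the mongo client
        collection.find({}, function (err, users) {
          if (err) return res.status(500).send(err);
          res.json(users);                          // send the result back as the API response
        });
      });

      // POST /user - insert one user document taken from the JSON request body
      app.post('/user', function (req, res) {
        var db = req.db;
        var collection = db.get('usercollection');
        collection.insert({
          name: req.body.name,
          mobile: req.body.mobile,
          msg: req.body.msg
        }, function (err, user) {
          if (err) return res.status(500).send(err);
          res.json(user);                           // return the inserted document
        });
      });
    };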


  • package.json – Make sure to add the right modules, so that you can run your NodeJS application without having to manually install dependencies (a sample manifest is sketched after this list). In this case I am using:
    • express – To build the REST APIs
    • mongodb – To bring in the internal MongoDB libraries
    • monk – To help establish the MongoDB db connection
    • body-parser – To parse the incoming JSON requests
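For reference, a minimal package.json could look like the sketch below; the name, version ranges and start script are illustrative assumptions, not the exact manifest from the repo:

    {
      "name": "nodejs-mongodb-crud-demo",
      "version": "1.0.0",
      "main": "app.js",
      "scripts": {
        "start": "node app.js"
      },
      "dependencies": {
        "body-parser": "^1.17.0",
        "express": "^4.15.0",
        "mongodb": "^2.2.0",
        "monk": "^4.0.0"
      }
    }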

 

Let’s test our NodeJS Application

If you have been following this blog, at this point you have a few users in your “myTestDB” MongoDB that you entered via the mongo client command line earlier.

  • I am using Postman to call the APIs; you can use your preferred REST test client. Let’s first try to retrieve these users.

  • Great job! Now invoke your POST API to create a new user. In my case, this is the JSON that I am sending:

    {
      "name": "Jason",
      "mobile": "+6115791010101",
      "msg": "Hi. This message comes from POSTMAN"
    }

 

Note: Make sure to set your Content-Type header to application/json.

 

 

  • Now call the GET /users API again and make sure your new user is retrieved.
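If you prefer the command line over Postman, equivalent curl calls might look like this; the port is an assumption carried over from the app.js sketch above:

    # create a new user (POST /user)
    curl -X POST http://localhost:3000/user \
      -H "Content-Type: application/json" \
      -d '{"name":"Jason", "mobile":"+6115791010101", "msg":"Hi. This message comes from POSTMAN"}'

    # retrieve all users and confirm the new entry is present (GET /users)
    curl http://localhost:3000/users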

Congratulations, you just performed some very basic, yet powerful tasks using MongoDB as the backing store and NodeJS to expose interactive REST APIs via Express JS.

 

If you have any comments, feel free to contact any of the authors of solutionsanz.blog – we are here to help you!

Thanks,

Developing with ACCS Caching Services

Try though we might to shift everything to a glorious set of stateless services, there are still plenty of scenarios in which an application needs to store some state, commonly to track information about a user and that user’s session. In these cases, an in-memory cache is a powerful addition to an application. It is nearly indispensable in an elastic, cloud-architected app that is expected to scale horizontally on demand, because it allows information to be shared between application instances and avoids the load balancer complexity of sticky sessions. Application Container Cloud Service added the ability for applications to add a Caching Service binding in the 17.1.1 drop, and in this blog post we will explore how to leverage this capability in a Node.js application.
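To give a flavour of what talking to such a cache can look like from Node.js, here is a minimal, hedged sketch that treats the cache as a simple REST key/value store; the CACHING_INTERNAL_CACHE_URL variable name, the port and the /ccs/{cache}/{key} URL shape are assumptions for illustration only, so rely on the full post for the actual binding details:

    // cache-sketch.js - illustrative only; endpoint shape is an assumption
    var http = require('http');

    // hypothetical host injected by the Caching Service binding
    var cacheHost = process.env.CACHING_INTERNAL_CACHE_URL || 'localhost';

    // store a value under a key in a hypothetical cache named "session"
    function put(key, value, callback) {
      var req = http.request({
        host: cacheHost,
        port: 8080,                       // assumed port
        path: '/ccs/session/' + key,      // assumed URL shape
        method: 'PUT'
      }, function (res) { callback(null, res.statusCode); });
      req.on('error', callback);
      req.end(value);
    }

    // read the value back
    function get(key, callback) {
      http.get({ host: cacheHost, port: 8080, path: '/ccs/session/' + key },
        function (res) {
          var body = '';
          res.on('data', function (chunk) { body += chunk; });
          res.on('end', function () { callback(null, body); });
        }).on('error', callback);
    }

    put('user-42', 'hello', function () {
      get('user-42', function (err, value) { console.log(value); });
    });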

Continue reading “Developing with ACCS Caching Services”