Teaching How to Design and Secure an API with Oracle API Platform

This blog is the second part of an end-to-end exercise. The first part explains the steps to clone a GitHub repository that contains a technology-agnostic Medical Records (MedRec) application, built by us in NodeJS, which exposes REST API endpoints via a Swagger API descriptor running locally on Swagger UI (all included as part of the repository). The previous part of this two-blog series also explains the steps required to run the MedRec NodeJS application in Docker containers, either locally or in the Oracle Public Cloud. For more information about this first part, go here.

Moving to this second part, we are going to cover the following steps:

  1. Create an Apiary account, used to design APIs (API-first approach), and create a new API project using the existing MedRec Swagger API definition (a sample fragment is sketched just after this list).
  2. Spend a little time playing with Apiary to get comfortable in areas such as:
    1. Validating API definitions
    2. Testing API endpoints
    3. Switching between the out-of-the-box Mock Server and the real production MedRec service endpoints.
  3. Log in to Oracle API Platform and configure an API, which includes:
    1. Enforcing security and other policies.
    2. Deploying the API and securing access to on-premises and cloud-based API Gateways.
    3. Publishing APIs into the API Developers Portal.
    4. Linking the API to its Apiary Swagger API definition as a living document.
  4. Log in to the API Developers Portal (API Catalog):
    1. Register a New Application
    2. Understanding the role of API Keys
    3. Reviewing MedRec API Documentation
    4. Registering to consume MedRec APIs
    5. Testing APIs.
  5. Understand API Analytics, consumption, metrics and monitoring dashboards.
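
For reference, a Swagger (OpenAPI 2.0) fragment in the style of the MedRec definition looks roughly like the sketch below; the base path, resource path and response shown here are illustrative placeholders, not the actual MedRec API definition:

```yaml
# Illustrative Swagger 2.0 fragment (paths and fields are placeholders,
# not the real MedRec API definition)
swagger: "2.0"
info:
  title: MedRec API (sample)
  version: "1.0"
basePath: /medrec/api
paths:
  /patients:
    get:
      summary: List registered patients
      produces:
        - application/json
      responses:
        "200":
          description: A list of patients
```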

Continue reading “Teaching How to Design and Secure an API with Oracle API Platform”

Experimenting with Fn project

The first AppDev Made Easy workshop (previously known as the DX Workshop) for this tour started in Perth. We are continually trialing a few different things and, as such, we incorporated the Fn project (https://fnproject.io).

The whole point of the Functions demonstration was to articulate that there are different ways to execute, and that understanding the problem to solve, as well as the values the organisation holds (across both business and IT departments, including developers), is what will determine the technology choice.

For the demo we start from the very beginning.
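
As a taste of how small the starting point is, a minimal Fn function in NodeJS looks something like the sketch below, assuming the Fn CLI and the @fnproject/fdk package are installed (the greeting logic and payload shape are purely illustrative):

```javascript
// func.js - minimal Fn function handler (illustrative only)
const fdk = require('@fnproject/fdk');

// Fn hands the (JSON) request payload to the handler as "input"
fdk.handle(function (input) {
  const name = input && input.name ? input.name : 'world';
  return { message: 'Hello ' + name + ' from Fn!' };
});
```

From there, commands such as fn deploy and fn invoke take care of packaging the function into a container image and running it.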

Continue reading “Experimenting with Fn project”

Teaching best practices to Design, Build, Secure and Monitor APIs

In this blog, I want to share my experience after having created many APIs using different approaches and technologies. I am going to distil a simple process that will help you construct APIs, starting from scratch with an idea or requirement and taking it all the way through to happy consumption.

The best part of APIs is that they are microservices enablers, which implies that they are not technology prescriptive, so in this blog you will see that your APIs can be implemented using any technology or programming language.

I decided to use “Jokes” as the vehicle to explain the APIs construction best practices, mainly because jokes are a simple concept that anyone can relate to, but also because I want you to feel compelled to consume these APIs and by doing so, get a laugh or two.

My original idea with jokes is to:

  1. Get a random joke (a minimal sketch of this step follows the list).
  2. Translate the joke to any language.
  3. Share the original or the translated joke with a friend via SMS.
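
To make step 1 concrete, here is a minimal NodeJS/Express sketch of a random-joke endpoint; the route, port and joke list are made up for illustration and are not the final implementation:

```javascript
// jokes-api.js - illustrative random-joke endpoint (not the final implementation)
const express = require('express');
const app = express();

const jokes = [
  'Why do programmers prefer dark mode? Because light attracts bugs.',
  'There are only 10 kinds of people: those who understand binary and those who do not.'
];

// GET /jokes/random returns one joke picked at random
app.get('/jokes/random', (req, res) => {
  const joke = jokes[Math.floor(Math.random() * jokes.length)];
  res.json({ joke });
});

app.listen(3000, () => console.log('Jokes API listening on port 3000'));
```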

This is a high-level view of what our end solution will look like:

Continue reading “Teaching best practices to Design, Build, Secure and Monitor APIs”

Apiary designed APIs tested using Dredd

APIs are becoming the window to the digital assets of the modern business. Well-documented, well-governed and easy-to-use APIs are key to their successful uptake, longevity and associated business success. Yes, I did say well documented. In this instance I am talking about the documentation required to describe an API’s capabilities in a manner that is meaningful for your ultimate audience, the “API Consumers”; however, it also provides the template from which the API Developer develops their code. In the modern business climate, we probably don’t want to produce War and Peace; we simply want to take a minimum viable approach to our API documentation. But where would we find a capability that simplifies our task as API Designers, captures the design documentation for our APIs, allows us to do some initial testing to validate the usefulness of our design before any code is cut, and also has the documentation ready for consumption by team members and interested parties using a standards-based approach? Where indeed! Look no further than Apiary.io.

Continue reading “Apiary designed APIs tested using Dredd”
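
Once an API description lives in Apiary, Dredd can replay it against a running implementation. As a small taste, here is a hypothetical Dredd hooks file in NodeJS that injects a header before a transaction fires; the transaction name and header are made up for illustration and would come from your own API description:

```javascript
// hooks.js - illustrative Dredd hooks file (transaction name and header are made up)
const hooks = require('hooks');

// Runs just before Dredd sends the request for this transaction
hooks.before('Patients > Patients Collection > List all patients', (transaction) => {
  // e.g. add an API key header the real service expects
  transaction.request.headers['X-App-Key'] = 'dummy-key-for-local-testing';
});
```

Dredd itself is then typically run from the command line against the API description file and the base URL of the running service, with the hook file supplied via the --hookfiles option.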

Oracle IoT – Working with Bosch Devices

How to use the Bosch XDK with the Oracle IoT cloud service.

Introduction

As I continue to work with various IoT vendors to see how they could be used with the Oracle IoT cloud service, I came across one of these nice little Bosch XDK kits.

This is a demo kit to show off the many Bosch sensors available and give people an environment for prototyping, and it comes with a wide range of built-in XDK sensors.

The device has built-in wifi and Bluetooth LE.

My goal was to get this to periodically send sensor data into the IoT cloud service and make it easy for others to do the same.

Continue reading “Oracle IoT – Working with Bosch Devices”

Exploring GitHub Docker Hub and OCCS Part 4

In my previous post in this series I covered linking GitHub and DockerHub and configuring the environment such that a build of a Docker image was triggered on updates to GitHub. In this final post of the series I will take you through the steps to pull the image from Docker Hub into OCCS in order to run the application. It should be noted that the image built on Docker Hub in my example is only the web tier that contains my Node.js project (APIs and SwaggerUI). The MongoDB component of my OCCS Stack is pulled directly from Docker Hub when my Stack containing the Web Tier and Database Tier services is deployed to OCCS.

Continue reading “Exploring GitHub Docker Hub and OCCS Part 4”

Exploring GitHub, DockerHub and OCCS Part 2

In my previous post I detailed how I Dockerised the MedRec app. In this post I will show how I added MongoDB and defined a stack using Docker-Compose.

Add MongoDB layer using Docker-Compose

According to the official Docker documentation:

“Compose is a tool for defining and running multi-container Docker applications. With Compose, you use a Compose file to configure your application’s services. Then, using a single command, you create and start all the services from your configuration. ”

A single command to create and start all the services in a configuration sounded pretty good to me. I was definitely keen on exploring docker-compose.

Add a docker-compose.yml file

Having proved that my web application runs, I now need to address the persistence layer. The Dockerfile described previously contains the steps to create the required runtime platform for my Node app, and it installs the node application and its package dependencies (as specified in the package.json file) by running npm install. However, if I try to do a GET or a PUT, my app will fail, as it won’t find a MongoDB inside my container. I therefore still need a MongoDB somewhere in my environment to hold my application data.

Continue reading “Exploring GitHub, DockerHub and OCCS Part 2”
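
To give a flavour of where this is heading, a minimal docker-compose.yml wiring a Node web tier to a MongoDB service might look something like the sketch below; the service names, ports, image tag and environment variable are illustrative and not necessarily what my final Stack uses:

```yaml
# docker-compose.yml - illustrative sketch of a Node web tier plus MongoDB
version: "2"
services:
  web:
    build: .                  # build the Node app from the Dockerfile in this directory
    ports:
      - "3000:3000"           # expose the app on the host
    environment:
      - MONGO_URL=mongodb://mongodb:27017/medrec   # hypothetical connection string the app would read
    depends_on:
      - mongodb
  mongodb:
    image: mongo:3.4          # official MongoDB image pulled from Docker Hub
    ports:
      - "27017:27017"
```

With a file like this in place, a single docker-compose up builds the web image and starts both containers together.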