It’s just 9 days before the event launches on the Friday night. Even before that launch, we are hosting a series of workshops and webinars in the days leading up to it. Even now we are:
a/ Making sure that we have people – mentors, marketing, product managers, executives – lined up to help where they can.
b/ Making sure that we have ideas, platforms, trials, programs and education material lined up to help where it’s feasible.
c/ Making sure that we help promote, advocate and market the event so those who would benefit know about it and can attend.
All this effort for what outcome?
This says it all. And even though this is about #anomalydetection, #deepfake and #cybersecurity, much of it comes down to data: where the data can be sourced, how it can be analysed, whether it is reliable and whether it can be trusted.
Over the coming days leading up to the event there will be plenty of chatter around it; an easy way to keep up is to follow the event on LinkedIn.
I’ll be writing more about it here as we go and as new content becomes available. If you are interested to know more, or if you want to join a team or showcase a project or product, head to the Hackmakers website https://hackmakers.com/ to learn more and register.
On August 17th, we’ll be announcing the winners of the #BuildWithAI hackathon, live-streamed on YouTube – https://youtu.be/URuB0FtBIJo (note – set your reminder). Cassie Kozyrkov (Chief Decision Scientist, Google), Steve Nouri (Board Member, Hackmakers) and Cherie Ryan (Regional MD of ANZ and VP, Oracle), as well as an all-star judging line-up, will be there.
Before we get to that, let’s rewind, fast-forward and bring together some of the interesting points of the #BuildWithAI hackathon – an event truly global in nature, hosted by Hackmakers (https://hackmakers.com/).
July 24th, 11:45am AEST – I received a calendar alert for the Leader Mentor Zoom session for the #BuildWithAI hackathon. Trying to finish as many of the things I need to get done before I join this call. This will be interesting. Watching the number of competitors in the event’s Slack workspace climb from a hundred users when I first joined to now over 3,500 in the #introductions channel has been a unique experience. I’m thinking about lots of things from past hackathons that I’ve participated in, mentored, sponsored and hosted – how will this one be any different? I’ll just have to wait and see. And, better yet, give the community and the competitors as much as I can in the time we have.
This moment was not the beginning nor the end of this experience. It was somewhere in between. I’ll give you some background.
Over the past couple of weeks, another Viz for Social Good project has been running. For this project, the supporter was Kiron Open Higher Education (https://kiron.ngo/en/) – an organisation that provides a learning platform for refugees and underserved communities in the Middle East.
The project was to put a spotlight on refugees and immigrants and was linked to the virtual refugee conference called Amplify Now (https://virtualrefugeeconference.com/). Submitted projects went into the running to be featured at the conference itself.
In November 2018, I had the privilege of attending the Australian Oracle User Group national conference “#AUSOUG Connect” in Melbourne. My role was to conduct video interviews with as many of the speakers and exhibitors at the conference as possible. Overall: 10 interviews over the course of the day, 90 minutes of raw footage, 34 short clips to share, and plenty of hours of reviewing and post-editing to capture the best parts.
We all know that data is massively valuable to businesses, whether it supports daily transactional activities (Online Transaction Processing – OLTP) or helps with planning, problem solving and decision making (Online Analytical Processing – OLAP). Either way, businesses rely heavily on both to support their most important strategies and activities.
Until recently, companies had to invest heavily in provisioning, securing, patching and running either style of online data processing. In most cases, even with cloud adoption, companies still had to rely on their own skills to make sure that their databases were properly patched, secured, tuned and managed.
However, today there is another option, with the recent announcements Oracle has made around Autonomous Database for both OLAP and OLTP workloads. What this means is that Oracle has taken automation to a totally new level with the assistance of machine learning. The idea is that the database itself is self-sufficient, with a full set of automated activities ranging from patching to securing to optimising. This not only reduces the effort to run data workloads but also removes human error, creating the opportunity not just to keep the lights on, but to focus on crucial business activities around innovation and differentiation.
In a previous blog, I explained how to treat your Infrastructure as Code by using technologies such as Vagrant and Terraform to help automate the provisioning and decommissioning of environments in the cloud. Then I evolved those concepts in another blog, where I explained how to use the Oracle PaaS Service Manager (PSM) CLI to provision Oracle PaaS services in the cloud.
In this blog, I am going to put both concepts together and show how simply you can automate the provisioning of Oracle Integration Cloud with Terraform and PSM CLI combined.
To provision a new PaaS environment, I first create a “Build Server” in the cloud – or, as my boss calls it, a “cockpit” – that brings all the required bells and whistles (e.g. Terraform, PSM CLI, Git, etc.) to provision PaaS environments. All the tooling it requires is added as part of its bootstrap process. To create the “Build Server” in the first place, I am using Vagrant + Terraform as well, simply because I need a common place to start and these tools greatly simplify my life. This way I can also treat my “Build Server” as infrastructure as code: I can easily get rid of it after I have built my target PaaS environments, and save a few bucks on the cloud consumption model.
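As a rough illustration, here is a minimal sketch of the kind of bootstrap script the “Build Server” could run on first boot. The Terraform version, download URLs, PSM endpoint and credential variables are assumptions for the sketch – substitute the values for your own identity domain and region:

```bash
#!/bin/bash
# bootstrap.sh - minimal Build Server bootstrap sketch (versions/URLs are assumptions)
set -euo pipefail

# Git (to clone the provisioning scripts later), plus unzip and Python for PSM CLI.
sudo yum install -y git unzip python3

# Terraform - the pinned version here is illustrative; pick a current one.
TF_VERSION=0.11.7
curl -fsSL -o /tmp/terraform.zip \
  "https://releases.hashicorp.com/terraform/${TF_VERSION}/terraform_${TF_VERSION}_linux_amd64.zip"
sudo unzip -o /tmp/terraform.zip -d /usr/local/bin

# PSM CLI - downloaded from the PaaS Service Manager REST endpoint.
# OPC_USER, OPC_PASS and IDENTITY_DOMAIN are placeholders you must export first.
curl -fsSL -u "${OPC_USER}:${OPC_PASS}" \
  -H "X-ID-TENANT-NAME:${IDENTITY_DOMAIN}" \
  "https://psm.us.oraclecloud.com/paas/core/api/v1.1/cli/${IDENTITY_DOMAIN}/client" \
  -o /tmp/psmcli.zip
sudo pip3 install -U /tmp/psmcli.zip

terraform --version && psm help   # quick smoke test
```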
Once the “Build Server” is up, I simply git clone a repository that contains my scripts to provision other PaaS environments, set up my environment variables and type “terraform apply”. Yes, it is as simple as that!
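In practice the whole flow looks something like this – the repository URL and the variable names are placeholders for whatever your own scripts expect:

```bash
# Clone the provisioning scripts (placeholder repository URL).
git clone https://github.com/your-org/paas-provisioning.git
cd paas-provisioning

# Credentials the scripts expect - names are illustrative only.
export TF_VAR_opc_user="me@example.com"
export TF_VAR_opc_password="********"
export TF_VAR_identity_domain="myIdentityDomain"

terraform init    # download required providers/modules
terraform apply   # provision the target PaaS environment
```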
In this blog, I am going to get you started with the Oracle PaaS Service Manager (PSM) CLI – a great tool to manage, through its APIs, any Oracle PaaS service or stack: provisioning, scaling, patching, backup, restore, start, stop, etc.
It also has the concept of a Stack (multiple PaaS services), which means that you can very easily provision and manage full stacks, such as Oracle Integration Cloud (OIC), which combines multiple PaaS solutions underneath, e.g. ICS, PCS, VBCS, DBCS, etc.
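To give a feel for it, a typical first session looks roughly like this. `psm setup` is the entry point for configuring the CLI; the service type (dbcs) and instance name below are just examples:

```bash
# One-off configuration: prompts for username, password,
# identity domain, region and output format.
psm setup

# List the operations available for a given service type.
psm dbcs help

# List existing instances of that service type.
psm dbcs services

# Check the status of a single instance (name is illustrative).
psm dbcs service -s MyDemoDB
```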
For this, we are going to use a pre-cooked Vagrant Box/VM that I prepared for you, so that you don’t have to worry about installing software and can move as quickly as possible to the meat and potatoes.
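Getting the box up is the usual Vagrant workflow; the repository URL below is a placeholder for wherever the box definition lives:

```bash
# Fetch the Vagrant box definition (placeholder URL).
git clone https://github.com/your-org/psm-cli-box.git
cd psm-cli-box

vagrant up    # provision the VM with PSM CLI pre-installed
vagrant ssh   # log in and start using psm straight away
```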
This is a graphical view of what we are going to do:
I have been integrating applications for the last 15+ years, and almost every integration is a new challenge. It really doesn’t matter what technology or standards you use or have used in the past; the reality is that integrations bring a new set of challenges each time. In the past few years, pre-built adapters have in some ways simplified these challenges, but with hybrid architectures a new set of challenges comes into the picture, for example connectivity, security, simplicity, message reliability, etc.
I was very excited when I learned how Oracle Cloud solves this problem: something called Oracle ICS (Integration Cloud Service) Connectivity Agents, which are installed close to the application being integrated, for example on-premise or in IaaS. Then, using ICS in a browser, you can introspect the backend application as if it were adjacent to ICS; in fact, with this solution ICS is completely oblivious to the location of the actual backend application. The Connectivity Agent connects internally to the backend applications and communicates with ICS by pushing messages out via Oracle Messaging Cloud Service.
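For reference, installing a connectivity agent goes roughly like this; the installer file name, the profile keys and the ICS URL are from memory and should be treated as assumptions to check against your ICS version’s documentation:

```bash
# Unpack the agent installer downloaded from the ICS console
# (the file name varies by version - an assumption here).
unzip ics_conn_agent_installer.zip -d ics-agent && cd ics-agent

# Point the installer at your ICS instance (placeholder values).
cat > InstallerProfile.cfg <<'EOF'
ICS_URL=https://myics-mydomain.integration.us2.oraclecloud.com:443
ICS_USER=agentuser@example.com
ICS_PASSWORD=********
EOF

# Start the agent; it registers itself with ICS and then relays
# messages between ICS and the on-premise application.
java -jar connectivityagent.jar
```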
After provisioning an Oracle SOA Cloud Service and its associated Oracle Database 12c Pluggable Database Cloud Service instance, I then needed to connect to the database in the Oracle Public Cloud in order to run some scripts for my demo tables. Obviously, mixing customer data into the SOA Metadata Store (MDS) database is not best practice, but it was fine for my demo purposes.
In order to run the SQL scripts I had, I was presented with (at least) a couple of choices.
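One straightforward option, sketched below, is to tunnel the listener port over SSH to the database node and run the scripts with SQL*Plus. The IP address, SSH key, credentials and service name are all placeholders for your own DBCS instance:

```bash
# Open an SSH tunnel from local port 1521 to the DB listener
# (IP, key and OS user are placeholders for your DBCS instance).
ssh -i ~/.ssh/dbcs_key -N -L 1521:localhost:1521 opc@129.0.0.10 &

# Run the demo-table scripts through the tunnel with SQL*Plus
# (credentials and the PDB1 service name are illustrative).
sqlplus demo_user/demo_pass@//localhost:1521/PDB1 @create_demo_tables.sql
```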