One of the recent additions to Oracle Cloud Infrastructure (OCI) is IAM Domains. New OCI tenancies are provisioned with IAM Domains and, at the time of writing, tenancies with IDCS instances are being migrated to IAM Domains.
I originally built Peek to provide a visual representation of effective user permissions inside an OCI tenancy and assist with performing user access reviews. Excessive permissions and IAM misconfigurations are a common issue in cloud environments and can lead to privilege escalation and/or unauthorised access to resources and data.
At the time of writing, the latest release of the OCI CLI supports interacting with IAM Domain resources, so I have created a version of Peek that works with IAM Domains.
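For readers who want to explore the same data themselves, here is a minimal sketch (not Peek itself) of listing the IAM Domains in a tenancy with the OCI Python SDK; it assumes a working ~/.oci/config profile and a tenancy provisioned with IAM Domains.

```python
# Minimal sketch: list the IAM Domains in the tenancy's root compartment
# using the OCI Python SDK (assumes a valid ~/.oci/config profile).
import oci

config = oci.config.from_file()
identity = oci.identity.IdentityClient(config)

# Identity domains are listed per compartment; the tenancy OCID is the root compartment.
domains = identity.list_domains(compartment_id=config["tenancy"]).data
for domain in domains:
    print(domain.display_name, domain.url)
```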
A step-by-step guide to discovering how to provision and build a business process with OCI Process Automation.
OCI Process Automation (OPA for short) is an Oracle-managed OCI PaaS service that helps customers build their business processes based on structured or unstructured models. It is a great fit for managing business processes, allowing business users to build their own implementations without coding, just using a web browser and drag-and-drop capabilities… what we usually call a “no code” environment.
The goal of this article is to explain, step by step, how we can quickly test the features included in OPA… starting from my experience with the tool.
To simplify the explanation, I will describe a “happy path” process… in my example, building a business process that pretty much everyone loves: Vacation Request Approvals 🙂
If you’re running workloads in Oracle Cloud Infrastructure (OCI) then it’s likely you’ll be familiar with Virtual Cloud Network (VCN) resources such as Subnets, Route Tables, Gateways, etc. These software-defined components allow you to build networks in OCI to deploy and run your workloads.
Oracle has documentation that explains VCN access and security features, which include Security Rules, Security Zones, Local and Network Firewalls, and IAM policies. Security rules are implemented using Security Lists and Network Security Groups (NSGs) and are a foundational element of every VCN and Subnet that you create. They define what traffic is allowed in and out of your subnets and which hosts can talk to one another. When you create a subnet, a Security List is automatically created with some default rules:
Default Security List Ingress Rules / Default Security List Egress Rules
When it comes to implementing network access controls, you can use Security Lists, Network Security Groups or both. They are virtual firewall features that control traffic at the packet level. I’ll be covering Network Security Group reviews in a later post as I want to focus on Security Lists, specifically how you can easily review and validate rules to ensure they align with your workload, organisational, security and compliance requirements.
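To give a sense of what an automated review can look like, here is a rough sketch using the OCI Python SDK that lists the Security Lists in a VCN and flags any ingress rule open to the world; the compartment and VCN OCIDs are placeholders and the check is deliberately simplistic.

```python
# Sketch: flag Security List ingress rules that allow traffic from anywhere.
# Replace the placeholder OCIDs with your own compartment and VCN.
import oci

config = oci.config.from_file()
network = oci.core.VirtualNetworkClient(config)

security_lists = network.list_security_lists(
    compartment_id="ocid1.compartment.oc1..example",
    vcn_id="ocid1.vcn.oc1..example",
).data

for sl in security_lists:
    for rule in sl.ingress_security_rules:
        if rule.source == "0.0.0.0/0":
            print(f"{sl.display_name}: ingress from 0.0.0.0/0 (protocol {rule.protocol})")
```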
In my previous post I explained how you can use Let’s Encrypt and Oracle Cloud Infrastructure (OCI) serverless functions to obtain a publicly signed SSL certificate, and automatically manage its renewal lifecycle. The solution works as expected; I have a Let’s Encrypt certificate for my website automatically renewing 30 days before expiry. If you haven’t read my previous post I’d recommend taking a look before following the setup outlined below as it covers how the solution works, and some prerequisites.
With multiple workloads running in various OCI regions, I started thinking about a more elegant way to provision certificates across multiple regions. Certificates stored in the Certificates service are only available to resources in the same region, which would have required a function to be deployed in each region, for each SSL certificate required.
I’ve since updated the solution to address this requirement. It is now possible to provision certificates across multiple OCI regions using a single OCI Function application. I’ve also taken the opportunity to implement other features such as:
Loading a list of certificates you want to manage from a JSON file stored in Object Storage (see the sketch after this list).
Adding support for wildcard SSL certificates.
Adding support for Subject Alternative Names (SAN) in addition to the CN name.
Adding support for the use of DNS zones and Vaults that reside in different regions to the OCI Function.
Adding support to specify which vault and region to use for a given certificate, so workloads with strict cryptographic key material requirements can still benefit from this solution.
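As an illustration of the first item above, here is a hedged sketch of loading such a certificate list from Object Storage with the OCI Python SDK; the bucket name, object name and JSON fields are hypothetical and not necessarily the solution's actual schema.

```python
# Sketch: read a (hypothetical) certificate list from Object Storage.
import json
import oci

config = oci.config.from_file()  # inside a Function you'd use a resource principal signer
object_storage = oci.object_storage.ObjectStorageClient(config)
namespace = object_storage.get_namespace().data

obj = object_storage.get_object(
    namespace_name=namespace,
    bucket_name="certificate-config",   # hypothetical bucket
    object_name="certificates.json",    # hypothetical object
)
certificates = json.loads(obj.data.content.decode("utf-8"))

# Hypothetical structure:
# [{"common_name": "www.example.com",
#   "alternative_names": ["example.com"],
#   "dns_zone_region": "uk-london-1",
#   "vault_region": "eu-frankfurt-1"}]
for cert in certificates:
    print(cert["common_name"])
```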
If you’ve already followed the instructions from my previous post, the solution will continue to work as described. The only limitation being that it’ll only work for a single certificate. By following the steps below you can easily upgrade to issuing multiple certificates. If you haven’t set anything up yet that’s also fine as I’ll be covering the full install again here.
Let’s Encrypt made its debut back in late 2015. It is a free Certificate Authority provided by the Internet Security Research Group. The goal was to support the adoption of SSL / TLS to ensure the privacy of information sent over the public Internet. Let’s Encrypt is now serving over 2.5M certificates per day.
If you’re reading this it’s likely you’ve had to deal with SSL certificates before. It’s also likely some of you will have investigated an outage, only to find that an SSL certificate expired somewhere that no one knew about. Certificate discovery, management, and renewal can be time consuming and not much fun.
Cloud providers have made this job easier with the introduction of certificate services that are able to issue public Domain Validation (DV) certificates. Oracle Cloud Infrastructure (OCI) currently allows you to create private Certificate Authorities (CAs), private Certificates, and private Certificate Authority bundles. Private certificate resources are used to secure communication across a private network, where certificates can be installed and trusted to enable secure communication.
But what about publicly signed certs for users connecting over the Internet? Using a private OCI certificate will result in a “certificate not trusted” error in your web browser; this is where Let’s Encrypt comes in. I’m going to show you how to run a completely automated serverless Let’s Encrypt solution in your OCI tenancy to install and automatically renew certificates that show as trusted in your web browser.
If you are running an Oracle E-Business Suite (EBS) application today, you can now auto-discover all related resources in OCI Stack Monitoring. Stack Monitoring collects metrics specific to your EBS resources, lets you correlate across the EBS application and infrastructure stack, and enables proactive alerting.
Components that will be auto-discovered include:
Concurrent Processing Node
Workflow Manager
WebLogic
Forms
Today, the Stack Monitoring service supports EBS version 12.1 and 12.2 deployments hosted on OCI, on-premise, or in a third-party cloud (e.g. AWS, Azure).
In this example, I will show you how to configure Stack Monitoring for EBS version 12.2.
Oracle Cloud Agent (OCA) – This agent is deployed by default if you provision hosts via the OCI Compute Service. OCA has extensions and plugins which can be used to enable other features native to OCI Compute Services.
Management Agent (OMA) – This is a standalone agent that you can deploy to hosts or VMs:
Hosts on OCI that do not have OCA installed, e.g. OCI Database Services (Oracle Base VM/BM, ExaCS)
On-premise hosts
Third-party clouds (AWS, Azure, etc.)
…
Please see the current O&M support we have for each agent:
OCI Agent | Logging Analytics | Stack Monitoring | Database Management | Operations Insights | Target
Oracle Cloud Agent (OCA) | Yes | Yes | – | Yes | OCI Compute VM / BM Host
Oracle Management Agent (OMA) | Yes | Yes | Yes | Yes | Other VM Host (including on-premise and 3rd party cloud)
OMA Agent Install
In a previous post, I provided steps on how to install the Oracle Management Agent.
OCA Agent Install
For this post, let me show you how easy it is to enable the O&M services for Oracle Cloud Agent (OCA).
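The console is the simplest place to do this (the Oracle Cloud Agent tab on the compute instance), but for completeness here is a sketch of enabling an OCA plugin with the OCI Python SDK; the instance OCID is a placeholder and the plugin name shown is an assumption based on the Management Agent plugin that backs these O&M services.

```python
# Sketch: enable an Oracle Cloud Agent plugin on an existing compute instance.
# The instance OCID is a placeholder; plugin names may vary by image/region.
import oci

config = oci.config.from_file()
compute = oci.core.ComputeClient(config)

compute.update_instance(
    instance_id="ocid1.instance.oc1..example",
    update_instance_details=oci.core.models.UpdateInstanceDetails(
        agent_config=oci.core.models.UpdateInstanceAgentConfigDetails(
            plugins_config=[
                oci.core.models.InstanceAgentPluginConfigDetails(
                    name="Management Agent",   # assumed plugin name for the O&M services
                    desired_state="ENABLED",
                )
            ]
        )
    ),
)
```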
HTTPS is essential as it protects the privacy of our data over the Internet. W3’s 2022 report shows nearly 80% of all websites use HTTPS as their default web protocol, up 6% on the previous year.
Getting started with HTTP/TLS is fairly straightforward. Obtain a CA signed certificate, configure it on your web servers and reverse proxy load balancers and you’re good to go. But how do you ensure your configuration stays up-to-date with current industry standards?
CyberSecurity is an arms race. As hardware and software evolves, so do the tools and techniques created to exploit them. This fierce race largely drives the innovation that we see in the industry today.
How does this relate to TLS? Since the inception of SSLv1 by Netscape in the 90s there have been many revisions: SSLv2, SSLv3, TLSv1.0, TLSv1.1 and TLSv1.2, with the current version being TLSv1.3. TLSv1.0 and TLSv1.1 were deprecated in 2021, and new versions have been released approximately every 5 years. Given the rate at which exploits are discovered, these release cycles will also need to keep pace.
For organisations this poses a number of interesting challenges, because you can only control which TLS versions you support. If your website or API is public, it’s likely you have no control over the connecting client, or which TLS versions it’s able to use.
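One small, practical step is simply knowing which protocol versions your endpoints currently accept. Here is a quick sketch using only the Python standard library that probes a host for each TLS version; the hostname is a placeholder, and the results also depend on the OpenSSL build the script runs against.

```python
# Probe which TLS protocol versions a public endpoint will negotiate.
import socket
import ssl

HOST, PORT = "www.example.com", 443  # placeholder host

for version in (ssl.TLSVersion.TLSv1, ssl.TLSVersion.TLSv1_1,
                ssl.TLSVersion.TLSv1_2, ssl.TLSVersion.TLSv1_3):
    context = ssl.create_default_context()
    context.minimum_version = version
    context.maximum_version = version
    try:
        with socket.create_connection((HOST, PORT), timeout=5) as sock:
            with context.wrap_socket(sock, server_hostname=HOST) as tls:
                print(f"{version.name}: accepted ({tls.version()})")
    except (ssl.SSLError, OSError):
        print(f"{version.name}: rejected or unavailable")
```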
Functions in Oracle Cloud Infrastructure are great. As a serverless execution environment with pre-built logging, metrics, etc., it allows developers to simply focus on their code and not worry about all of the supporting infrastructure, while still providing a lot of flexibility through the use of container primitives. As great as Functions are, they are reactive: they can only be invoked, and can’t natively be configured to execute spontaneously or on a schedule. Often this won’t matter, as Functions will be invoked directly or indirectly by users, or in response to events, but sometimes you simply need a bit of code to run periodically.
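For context, this is what the reactive model looks like from the outside: something has to make an invoke call. A hedged sketch with the OCI Python SDK is below; the function OCID and payload are placeholders, and how you trigger this call on a schedule is exactly the gap being discussed.

```python
# Sketch: invoke a deployed OCI Function via the Python SDK.
# Something still has to run this code periodically -- that's the gap.
import oci

config = oci.config.from_file()
functions_mgmt = oci.functions.FunctionsManagementClient(config)

fn = functions_mgmt.get_function("ocid1.fnfunc.oc1..example").data  # placeholder OCID
invoke_client = oci.functions.FunctionsInvokeClient(
    config, service_endpoint=fn.invoke_endpoint
)

response = invoke_client.invoke_function(
    function_id=fn.id,
    invoke_function_body='{"hello": "world"}',  # placeholder payload
)
print(response.data.text)
```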
Everyone is aware of the relevance of continuous integration and continuous delivery, which is nowadays the mantra of DevOps practices.
Oracle Integration is obviously part of this end-to-end development lifecycle, as it connects legacy applications (usually deployed on-premise) with SaaS applications, often provided by Oracle Cloud or hosted on other cloud providers.
It doesn’t matter where the applications or the integrations reside; the continuous delivery of new integration processes and versions needs to be handled by a smart, automated tool able to reduce the gap between the different development teams.
Developers, who own building new services, and IT operators, who are tasked with deploying new code versions to the different environments, need to converge on a single tool that simplifies complex procedures; they are two sides of the same coin.
The common need is to keep all environments aligned with the latest implementations, with everything monitored and tracked to support audit and compliance activities; this is a must once a project becomes critical and relevant at the enterprise level.
Oracle Integration (OIC), as you know, includes Visual Builder Cloud Service, which provides open-source, standards-based tooling to develop, collaborate on, and deploy applications within Oracle Cloud.
For exactly this reason, it’s easy to use Visual Builder Studio, the built-in tool that allows developers to manage the software lifecycle and automate development.
Oracle VB Studio natively supports Oracle Integration artifacts, so we can leverage it to easily promote our integration flows from one environment to another, for example moving our integration projects from development to test once we have completed the new implementation and are ready to test it.
The same path can be used for promoting projects from test to production, or from production to a DR environment, the latter probably running in a different OCI region.
Working with the current implementation you can:
Export integration flows
Import integration flows
Delete integration flows
The picture below shows the options we have when working with Oracle Visual Builder Studio and OIC.
Below is an example of a pipeline that you can easily configure to automate the export/import procedure, defining in cascade all the steps (“jobs”) that perform the required actions; the one shown here is just for demo purposes. The procedure is explained step by step later, in case you want to reproduce it for your own purposes.
In order to export our assets from the development environment, for example, it’s enough to configure the source and target OIC instances as environments.
How do we configure our OIC environments?
This is a straightforward operation working with VB Studio, as shown below:
We can create all the connections we need to configure the tool properly.
Once we have configured our instances, we need to build our “pipeline” to automate the procedure when needed.
Each pipeline can include all the “jobs” we need (in the previous screenshot we used two different jobs, “select your OIC project” and “import OIC project”) to build the right chain among the different available jobs.
To create a job, select the Build link in the left panel of Visual Builder Studio, then create a new job.
Each job has some options and parameters to configure, as the screenshot below shows:
Select the “Parameters” tab to configure the string parameter:
The “Default Value” is the version of the integration flow on our OIC instance to be selected and moved to the new instance. Of course, this value can be changed when we run the build, to set the right integration flow version.
Now it’s time to select the “Steps” tab to identify the OIC instance from which we want to export our integration flow.
If needed, we can also include the asserter recordings by simply ticking the box. In this case we are moving (exporting/importing) the integration flow named “ECHO”, working with its *.iar file once it has been exported.
Now you can click the “After Build” tab and configure it as described below. The *.iar extension is the default extension of an integration flow when you download it.
Click save and that’s all. Our first job is properly configured now.
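Purely for reference, the export this first job performs is conceptually the same as calling the OIC REST API integration archive endpoint yourself. Here is a hedged sketch; the instance URL, integration identifier, version and token are all placeholder assumptions, and VB Studio handles this step for you.

```python
# Illustrative sketch: export an integration flow as an .iar via the OIC REST API.
# Host, identifier, version and token are placeholders.
import requests

OIC_HOST = "https://dev-oic-instance.integration.ocp.oraclecloud.com"  # hypothetical
INTEGRATION = "ECHO"       # integration identifier used in this example
VERSION = "01.00.0000"     # hypothetical version (the job's string parameter)
TOKEN = "<oauth-access-token>"

resp = requests.get(
    f"{OIC_HOST}/ic/api/integration/v1/integrations/{INTEGRATION}%7C{VERSION}/archive",
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=60,
)
resp.raise_for_status()

# Save the exported flow as an .iar archive, as the VB Studio job does.
with open(f"{INTEGRATION}_{VERSION}.iar", "wb") as f:
    f.write(resp.content)
```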
To proceed we are now ready to configure the second job (“import OIC project”).
In this case, the first step is to configure the “Before Build” tab as shown below, adding a “Copy Artifacts” option.
Now, as we did with the first job, we can configure the target OIC instance, in this case for the import action.
We can also check the “activate integration” option so that our integration flow will be imported and activated, ready to be invoked by applications.
Also, in this case, we can now save our configuration.
Once these operations have been completed, we are ready to test our pipeline by selecting the start button on the right side of the web page, as shown below.
If our “build” is properly configured, we will see the “green flag” for our jobs once we run it.
Furthermore, we can drill down into the execution to look at the log information in case something goes wrong, with the option to download the log file for further analysis or to share it with other people or applications.
From the Visual Builder Studio home page we can also get statistics and information about previous executions, to track the activities performed on the different resources we have.
This is a great way to manage our environments and keep the lifecycle of our projects and their deployments under control.
For further information, look at the really interesting content already published here: