Oracle recently introduced a Web Application Firewall (WAF) to further enhance and secure Oracle Cloud Infrastructure offerings. The Oracle Cloud Infrastructure WAF is based on Oracle Zenedge and Oracle Dyn technologies. It inspects all traffic destined for your web application origin, identifying and blocking malicious traffic. The WAF offers the following tools, which can be used on any website, regardless of where it is hosted:
Origin management
Bot management
Access control
Over 250 robust protection rules that include the OWASP rulesets to protect against SQL injection, cross-site scripting, HTML injection, and more
In this post, I configure a set of access control WAF policies for a website. Access control defines explicit actions for requests that meet conditions based on URI, request headers, client IP address, or countries and regions.
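To make the access control idea concrete, here is a minimal sketch using the OCI Python SDK's WAAS (WAF) client. The client and model names are my assumption of the SDK surface and should be verified against the SDK documentation for your version; the policy OCID, path, and country code are placeholders.

```python
import oci

# A minimal sketch, assuming the OCI Python SDK's WAAS (WAF) client and
# model names; verify against the SDK docs for your version.
config = oci.config.from_file()  # reads ~/.oci/config
waas = oci.waas.WaasClient(config)

# Block requests to /admin coming from a given country ("XX" is a placeholder).
block_rule = oci.waas.models.AccessRule(
    name="block-admin-from-selected-country",
    action="BLOCK",
    criteria=[
        oci.waas.models.AccessRuleCriteria(condition="URL_STARTS_WITH", value="/admin"),
        oci.waas.models.AccessRuleCriteria(condition="COUNTRY_IS", value="XX"),
    ],
)

# Access rules are supplied as a list on the WAF policy (OCID is a placeholder).
waas.update_access_rules(
    waas_policy_id="ocid1.waaspolicy.oc1..example",
    access_rules=[block_rule],
)
```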
I just came across this great article by Ali Mukadam on autoscaling OCI instances using instance configurations, instance pools, and autoscaling policies, and finally using Kubernetes (OKE) to generate load.
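For a rough feel of what such an autoscaling policy looks like programmatically, here is a hedged sketch using the OCI Python SDK's autoscaling module. All OCIDs are placeholders, and the model names are assumptions to check against the current SDK documentation.

```python
import oci

# A hedged sketch of a threshold autoscaling policy for an existing instance
# pool; OCIDs are placeholders and model names should be verified.
config = oci.config.from_file()
autoscaling = oci.autoscaling.AutoScalingClient(config)

details = oci.autoscaling.models.CreateAutoScalingConfigurationDetails(
    compartment_id="ocid1.compartment.oc1..example",
    resource=oci.autoscaling.models.InstancePoolTargetResourceDetails(
        instance_pool_id="ocid1.instancepool.oc1..example"),
    policies=[oci.autoscaling.models.CreateThresholdPolicyDetails(
        capacity=oci.autoscaling.models.Capacity(initial=2, min=1, max=5),
        rules=[
            # Scale out by one instance when CPU exceeds 70%.
            oci.autoscaling.models.CreateConditionDetails(
                action=oci.autoscaling.models.Action(type="CHANGE_COUNT_BY", value=1),
                metric=oci.autoscaling.models.Metric(
                    metric_type="CPU_UTILIZATION",
                    threshold=oci.autoscaling.models.Threshold(operator="GT", value=70))),
            # Scale in by one instance when CPU drops below 30%.
            oci.autoscaling.models.CreateConditionDetails(
                action=oci.autoscaling.models.Action(type="CHANGE_COUNT_BY", value=-1),
                metric=oci.autoscaling.models.Metric(
                    metric_type="CPU_UTILIZATION",
                    threshold=oci.autoscaling.models.Threshold(operator="LT", value=30))),
        ])],
)
autoscaling.create_auto_scaling_configuration(details)
```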
You have probably heard that Oracle Autonomous Database (ADB) leverages machine learning to automate traditional, infrastructure-related database administration tasks such as security, backups, and patching.
No matter how well designed your database infrastructure is, performance issues relating to the application or the external components that make up the application ecosystem can still have an impact on end-user response time or availability.
I was reflecting recently on how IT tools and productivity aids often allow us to make a mess real quickly. There are often some underlying basics that need to be considered before using the productivity tool in order to get a sustainable outcome. As the old adage goes … a fool with a tool is still a fool!
I just read a great blog posted by my colleague Ali Mukadam. He has been spending some time exploring a number of interesting technologies including the Oracle Container Engine. For those unaware, the Oracle Cloud Infrastructure Container Engine for Kubernetes (OKE) is a fully-managed, scalable, and highly available service that you can use to deploy your containerized applications to the cloud. I have played with this a little and as a technology geek I really love it.
Oracle does provide a Quick Create option to help you get functional quickly; however, you will often need to consider the wider IT landscape, such as where this service fits into your overall network topology, in order to decide how you want to lay out your network. To that end, Ali has developed a toolkit to automate the provisioning of OKE on Oracle Cloud Infrastructure.
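The toolkit drives the whole topology, but to give a feel for the underlying provisioning step, here is a minimal sketch of creating an OKE cluster with the OCI Python SDK, assuming an existing VCN and load balancer subnets. The OCIDs, Kubernetes version, and model names below are placeholders and assumptions to verify against the SDK documentation.

```python
import oci

# For illustration only: a minimal sketch of creating an OKE cluster,
# assuming an existing VCN and two load balancer subnets. All OCIDs and
# the Kubernetes version are placeholders.
config = oci.config.from_file()
ce = oci.container_engine.ContainerEngineClient(config)

ce.create_cluster(oci.container_engine.models.CreateClusterDetails(
    name="demo-oke",
    compartment_id="ocid1.compartment.oc1..example",
    vcn_id="ocid1.vcn.oc1..example",
    kubernetes_version="v1.14.8",  # placeholder; pick a supported version
    options=oci.container_engine.models.ClusterCreateOptions(
        service_lb_subnet_ids=["ocid1.subnet.oc1..lb1", "ocid1.subnet.oc1..lb2"]),
))
```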
I have already blogged about two different integration patterns between Service Cloud and Eloqua.
The first integration pattern used a standard object (e.g. the Contact object), documented in this blog.
The second integration pattern used a custom object (e.g. the Degree object), documented in this blog.
Now, in this blog, I am going to cover another integration pattern, where bulk data is imported from Service Cloud to Eloqua using a Service Cloud ROQL statement.
In both my previous blogs, data was exchanged in real time, one transaction at a time; in this blog, data will be transmitted from Service Cloud to Eloqua in bulk for the Opportunity business object.
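To give a feel for the ROQL side of this pattern, here is a hedged sketch of the kind of bulk extract involved, issued directly against Service Cloud's Connect REST API queryResults resource. The host, credentials, API version, and the Opportunity fields are placeholders to adapt to your own site and schema.

```python
import requests

# A hedged sketch of a ROQL bulk extract via the Connect REST API;
# host, credentials, version, and fields below are placeholders.
SITE = "https://yoursite.custhelp.com/services/rest/connect/v1.4"
roql = ("SELECT Opportunity.ID, Opportunity.Name, Opportunity.LookupName "
        "FROM Opportunity WHERE Opportunity.UpdatedTime > '2019-01-01T00:00:00Z'")

resp = requests.get(
    f"{SITE}/queryResults",
    params={"query": roql},
    auth=("api_user", "api_password"),  # placeholder credentials
)
resp.raise_for_status()

# Each row is a list of column values for one Opportunity record.
for row in resp.json()["items"][0]["rows"]:
    print(row)
```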
Recently, I have been working on different use-case scenarios for Service Cloud to Eloqua integration using OIC, so I thought I would publish this blog to cover all those scenarios.
This is the first in the series and covers standard business Contact object data replication; there will be two more blogs, one covering custom object replication and another covering importing data in bulk from Service Cloud to Eloqua.
Before I start showing the steps for replicating Contact business object data from Service Cloud to Eloqua, I need to emphasise the importance of the Service Cloud Adapter and the Eloqua Adapter.
Recently I built a facial recognition mobile app using Oracle Visual Builder, having set up the facial recognition APIs using TensorFlow, taking some inspiration from FaceNet. The app does the following: it records a video of your face and sends it to an API that generates various images and classifies them based on the label we provide at runtime. It then invokes another API that trains the machine learning model to update the dataset with the new images and label provided. These two APIs build a facial recognition database. Once I have this, I can capture a face and compare it with the dataset captured earlier in my facial recognition database to determine whether the face exists in our system.
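To illustrate the comparison step, here is a minimal sketch of how FaceNet-style matching works: each face is reduced to an embedding vector, and a probe face matches an enrolled one when the distance between their embeddings falls below a threshold. The toy, randomly generated database below stands in for real model output; the names and threshold are illustrative.

```python
import numpy as np

def find_match(query, database, threshold=0.8):
    """Return the label of the closest enrolled embedding, or None.

    FaceNet-style models map a face image to a vector; two faces match
    when the Euclidean distance between their vectors is small.
    """
    best_label, best_dist = None, float("inf")
    for label, emb in database.items():
        dist = np.linalg.norm(query - emb)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label if best_dist < threshold else None

# Toy usage: random 128-d vectors stand in for real model embeddings.
rng = np.random.default_rng(0)
db = {"alice": rng.normal(size=128), "bob": rng.normal(size=128)}
probe = db["alice"] + rng.normal(scale=0.01, size=128)  # near-duplicate of alice
print(find_match(probe, db))  # -> "alice"
```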
Hashed Timelock Agreements, or Contracts, have emerged as an important concept in the cryptocurrency space for performing transactions across ledgers, and I feel they could be a valid mechanism for handling the issue of performing verifiable cross-channel transactions in Hyperledger in some use cases. The basic concept of a Hashed Timelock Agreement (HTLA) is that it allows for a conditional transaction (which I have deemed a ‘proposal’) with a cryptographic challenge that ensures it can only be completed by a pre-defined party. This can be chained through multiple intermediaries, which enables two organisations that do not share a channel to interact, and transactions to be confirmed across channels.
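A tiny sketch of the challenge at the heart of an HTLA: the proposal can only be completed by revealing the preimage of an agreed hash before a deadline, after which it lapses and the funds become refundable. The function and names below are illustrative, not Hyperledger APIs.

```python
import hashlib
import time

def can_complete(preimage: bytes, hashlock: bytes, deadline: float) -> bool:
    """Check the two HTLA conditions: timelock not expired, hashlock satisfied."""
    if time.time() > deadline:
        return False  # timelock expired: the proposal lapses
    return hashlib.sha256(preimage).digest() == hashlock  # challenge met

# The intended recipient knows the secret; only its hash is published
# with the proposal, so intermediaries cannot complete it themselves.
secret = b"only the intended party knows this"
lock = hashlib.sha256(secret).digest()
print(can_complete(secret, lock, deadline=time.time() + 3600))  # -> True
```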