Oracle Blogs | Shay Shmeltzer's Blog

Announcing PeopleSoft Cloud Manager Support for Oracle Cloud Infrastructure


Oracle released PeopleSoft Cloud Manager in 2017 featuring in-depth automation to help accelerate adoption of Oracle Cloud (Classic) as an efficient deployment platform for PeopleSoft customers. With the excitement generated around Oracle Cloud Infrastructure (OCI)--a cloud designed for the enterprise customer--several customers and partners have been looking forward to taking advantage of the enhanced OCI with PeopleSoft Cloud Manager.  Oracle is pleased to announce Cloud Manager’s support for OCI beginning with today’s release of PeopleSoft Cloud Manager Version 6.

So, what is new and exciting in PeopleSoft Cloud Manager Version 6?  For the first time, there are two images provided: one for OCI, and the other for OCI Classic.  The Cloud Manager Image 6 for OCI supports a number of OCI features, including Regions, Virtual Cloud Networks, Subnets, Compute, and DB System platforms.  With this image, instances will be provisioned on VM shapes.  Customers can lift and shift PeopleSoft environments from on-premises to OCI using the same approach they used with OCI Classic.

For PeopleSoft Cloud Manager on OCI Classic, we have enabled support for the lift and shift of on-premises databases encrypted with Oracle Transparent Data Encryption (TDE).  TDE offers another level of data security that customers are looking for as data is migrated to the cloud.  A ‘Clone to template’ option is also available for encrypted databases. 

The lift utility requires a few parameters for TDE so that the encrypted database may be packaged and lifted to the cloud.

During the shift process, the same parameters are required to deploy the lifted database.

Customers have also requested an enhancement to support non-Unicode databases for PeopleSoft environments.  PeopleSoft Cloud Manager Version 6 supports lift and shift of environments that use non-Unicode Databases.  Unlike image 5, a conversion of the on-premises database to Unicode is no longer required prior to using Cloud Manager’s Lift and Shift automation.

To get your hands on the new Cloud Manager images, go to the Oracle Marketplace and look for either the OCI-Classic image or the OCI image…or try both!   Be sure to review the documentation and additional important information mentioned in the Marketplace listings.

We are excited to combine the automation of provisioning and maintenance that PeopleSoft Cloud Manager provides with the robust benefits of Oracle Cloud Infrastructure.  Combining support for OCI with the additional features of non-Unicode databases and TDE encrypted databases, we expect all customers to benefit from the latest Cloud Manager image, using whichever Oracle Cloud is right for you. 

Stay tuned for additional information and more Cloud Manager features.  Now, off to the next image!

 


7 Machine Learning Best Practices


Netflix’s famous algorithm challenge awarded a million dollars to the best algorithm for predicting user ratings for films. But did you know that the winning algorithm was never implemented into a functional model?

Netflix reported that the results of the algorithm just didn’t seem to justify the engineering effort needed to bring them to a production environment. That’s one of the big problems with machine learning.

At your company, you can create the most elegant machine learning model anyone has ever seen. It just won’t matter if you never deploy and operationalize it. That's no easy feat, which is why we're presenting you with seven machine learning best practices.

Download your free ebook, "Demystifying Machine Learning"

At the most recent Data and Analytics Summit, we caught up with Charlie Berger, Senior Director of Product Management for Data Mining and Advanced Analytics, to find out more. This article is based on what he had to say. 

Putting your model into practice might take longer than you think. A TDWI report found that 28% of respondents took three to five months to put their model into operational use. And almost 15% needed longer than nine months.

Graph on Machine Learning Operational Use

So what can you do to start deploying your machine learning faster?

We’ve laid out our tips here:

1. Don’t Forget to Actually Get Started

In the following points, we’re going to give you a list of different ways to ensure your machine learning models are used in the best way. But we’re starting out with the most important point of all.

The truth is that at this point in machine learning, many people never get started at all. This happens for many reasons. The technology is complicated, the buy-in perhaps isn’t there, or people are just trying too hard to get everything e-x-a-c-t-l-y right. So here’s Charlie’s recommendation:

Get started, even if you know that you’ll have to rebuild the model once a month. The learning you gain from this will be invaluable.

2. Start with a Business Problem Statement and Establish the Right Success Metrics

Starting with a business problem is a common machine learning best practice. But it’s common precisely because it’s so essential and yet many people de-prioritize it.

Think about this quote, “If I had an hour to solve a problem, I’d spend 55 minutes thinking about the problem and 5 minutes thinking about solutions.”

Now be sure that you’re applying it to your machine learning scenarios. Below, we have a list of poorly defined problem statements and examples of ways to define them in a more specific way.

Machine Learning Problem Statements

Think about what your definition of profitability is. For example, we recently talked to a nationwide chain of fast-casual restaurants that wanted to look at increasing their soft drink sales. In that case, we had to carefully consider the implications of defining the basket. Is the transaction a single meal, or six meals for a family? This matters because it affects how you will display the results. You’ll have to think about how to approach the problem and ultimately operationalize it.

Beyond establishing success metrics, you need to establish the right ones. Metrics will help you track progress, but does improving the metric actually improve the end user experience? For example, your traditional accuracy measures might encompass precision and squared error. But if you’re trying to create a model that optimizes pricing for airlines, those don’t matter if your cost per purchase and overall purchases aren’t going up.

3. Don’t Move Your Data – Move the Algorithms

The Achilles heel in predictive modeling is that it’s a 2-step process. First you build the model, generally on sample data that can run in numbers ranging from the hundreds to the millions. And then, once the predictive model is built, data scientists have to apply it. However, much of that data resides in a database somewhere.

Let’s say you want data on all of the people in the US. There are roughly 330 million people in the US—where does that data reside? Probably in a database somewhere.

Where does your predictive model reside?

What usually happens is that people will take all of their data out of the database so they can run their equations with their model. Then they’ll have to import the results back into the database to make those predictions. And that process takes hours and hours and days and days, thus reducing the efficacy of the models you’ve built.

However, running your equations inside the database has significant advantages. Running the equations through the kernel of the database takes a few seconds, versus the hours it would take to export your data. The database can do all of your math and build the model inside it, too. This means one world for the data scientist and the database administrator.

By keeping your data within your database, Hadoop, or object storage, you can build and score models within the database and use R packages with data-parallel invocations. This allows you to eliminate data duplication and separate analytical servers (by not moving data), and lets you prepare data, build models, and score them in just hours.
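To make this concrete, here's a minimal sketch of in-database scoring (my own example, not from the original talk). Assume a logistic regression model was trained elsewhere and its coefficients are pushed into a SQL expression, so every row of a hypothetical customers table is scored inside the database kernel instead of being exported:

-- Hypothetical table and coefficients: score churn risk in the database
-- instead of exporting rows to an external scoring engine.
SELECT customer_id,
       1 / (1 + EXP(-( -2.3
                       + 0.04  * tenure_months
                       + 1.10  * support_calls
                       - 0.002 * monthly_spend ))) AS churn_probability
FROM customers;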

4. Assemble the Right Data

As James Taylor and Neil Raden wrote in Smart Enough Systems, cataloging everything you have and deciding what data is important is the wrong way to go about things. The right way is to work backward from the solution, define the problem explicitly, and map out the data needed to populate the investigation and models.

And then, it’s time for some collaboration with other teams.

Machine Learning Collaboration Teams

Here’s where you can potentially start to get bogged down. So we will refer to point number 1, which says, “Don’t forget to actually get started.” At the same time, assembling the right data is very important to your success.

To figure out the right data to use to populate your investigation and models, you will want to talk to people in three major areas: the business domain, information technology, and data analysis.

Business domain—these are the people who know the business.

  • Marketing and sales
  • Customer service
  • Operations

Information technology—the people who have access to data.

  • Database administrators

Data analysts—the people who know the data.

  • Statisticians
  • Data miners
  • Data scientists

You need the active participation of all three groups. Without it, you’ll get comments like:

  • These leads are no good
  • That data is old
  • This model isn’t accurate enough
  • Why didn’t you use this data?

You’ve heard it all before.

5. Create New Derived Variables

You may think, I have all this data already at my fingertips. What more do I need?

But creating new derived variables can help you gain much more insightful information. For example, you might be trying to predict the amount of newspapers and magazines sold the next day. Here’s the information you already have:

  • Brick-and-mortar store or kiosk
  • Sell lottery tickets?
  • Amount of the current lottery prize

Sure, you can make a guess based on that information. But if you’re able to first compare the amount of the current lottery prize against the typical prize amounts, and then compare that derived variable against the variables you already have, you’ll have a much more accurate answer.
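As a sketch of what that derived variable might look like in practice (the daily_sales table and its columns are hypothetical, not from the original article), you could compute it right in the query that assembles your modeling data:

-- Derived variable: today's prize relative to this store's typical prize.
SELECT store_id,
       sale_date,
       is_kiosk,
       sells_lottery,
       current_prize,
       current_prize / NULLIF(AVG(current_prize) OVER (PARTITION BY store_id), 0)
         AS prize_vs_typical
FROM daily_sales;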

6. Consider the Issues and Test Before Launch

Ideally, you should be able to A/B test two or more models when you start out. Not only will you learn which model performs better, but you’ll also feel more confident knowing that you’re doing it right.

But going further than thorough testing, you should also have a plan in place for when things go wrong - for example, if your metrics start dropping. Several things go into this. You’ll need an alert of some sort to ensure that a drop can be looked into ASAP. And when a VP comes into your office wanting to know what happened, you’re going to have to explain it to someone who likely doesn’t have an engineering background.

Then of course, there are the issues you need to plan for before launch. Complying with regulations is one of them. For example, let’s say you’re applying for an auto loan and are denied credit. Under the new regulations of GDPR, you have the right to know why. Of course, one of the problems with machine learning is that it can seem like a black box and even the engineers/data scientists can’t say why certain decisions have been made. However, certain companies will help you by ensuring your algorithms will give a prediction detail.

7. Deploy and Automate Enterprise-Wide

Once you deploy, it’s best to go beyond the data analyst or data scientist.

What we mean by that is, always, always think about how you can distribute predictions and actionable insights throughout the enterprise. It’s where the data is and when it’s available that makes it valuable; not the fact that it exists. You don’t want to be the one sitting in the ivory tower, occasionally sprinkling insights. You want to be everywhere, with everyone asking for more insights—in short, you want to make sure you’re indispensable and extremely valuable.

Given that we all only have so much time, it’s easiest if you can automate this. Create dashboards. Incorporate these insights into enterprise applications. See if you can become a part of customer touch points, like an ATM recognizing that a customer regularly withdraws $100 every Friday night and likes $500 after every payday.

Conclusion

Here are the core ingredients of good machine learning. You need good data, or you’re nowhere. You need to put it somewhere, like a database or object storage. You need deep knowledge of the data and what to do with it, whether that’s creating new derived variables or picking the right algorithms to make use of them. Then you need to actually put them to work, get great insights, and spread them across the organization.

The hardest part of this is launching your machine learning project. We hope that by creating this article, we’ve helped you out with the steps to success. If you have any other questions or you’d like to see our machine learning software, feel free to contact us.

You can also refer back to some of the articles we’ve created on machine learning best practices and challenges concerning that. Or, download your free ebook, "Demystifying Machine Learning."

 

Why Now Is the Time for ERP in the Cloud


“The movement to cloud is an inevitable destination; this is how computing will evolve over the next several years.” So said Oracle CEO Mark Hurd at Oracle OpenWorld 2017. Based on the results of new research, that inevitability is here, now.

In our first ERP Trends Report, we surveyed more than 400 finance and IT leaders. We found that 76% of respondents said they either have plans for ERP in the cloud or have made the move already. They are recognizing that waiting puts them at a disadvantage; the time to make the move is now.

The majority of respondents cited economic factors as the reason they made the leap, and it’s easy to see why: Nucleus Research recently published a report showing that cloud delivers 3.2x the return on investment (ROI) of on-premises systems, while the total cost of ownership (TCO) is 52% lower.

But even more surprising were the benefits realized once our survey respondents got to the cloud. An astonishing 81% cited “Staying current on technology” as the main benefit of moving to cloud ERP. With a regular cadence of innovation delivered by the cloud, it is easier for companies to quickly incorporate game-changing technologies into everyday business processes—technologies like artificial intelligence, machine learning, the Internet of Things (IoT), blockchain and more. In the cloud, the risk of running their businesses on obsolete technology drops to zero. It’s the last upgrade they will ever need.

“One of the key value propositions in engaging with Oracle and implementing the cloud solutions has been the value of keeping current with technology and technological developments,” said Mick Murray, CFO of Blue Shield of California. “In addition to robotics, we’re looking at machine learning and artificial intelligence, and how do we apply that across the enterprise.”

As new capabilities are rolled out, cloud subscribers like Blue Shield can take advantage of them immediately. This gives them the agility to be both responsive and predictive. Uncertainty is the new normal in business and managing amid uncertainty is a must. It’s no longer enough to be quick-to-change; competitive companies must also have reliable insight into how potential future scenarios could impact performance.

So, what does that mean in terms of daily operations? Basically, it means people using knowledge to make good decisions in a fast, productive, and highly automated manner at all levels of the business. Cloud systems provide the data integration and ongoing technology refresh to incorporate best practices and technology advances.

The cloud also makes it easier to integrate external sources of valuable, contextual knowledge that helps improve the accuracy of data models. This is important considering the scope of threats to sustainable operations for businesses with large, global footprints. Political, environmental, and economic factors across multiple regions could impact business, such as limited travel capabilities slowing down delivery of key supplies.

Business uncertainty is everywhere, and organizations must be able to say, “What is our plan if X happens? What is our plan if X, Y, and Z happen, but W doesn’t?” And this insight must come quickly. Business moves too fast for reports to take days to compile.

ERP Replacement Effort Is Not What It Used to Be

One final stone on the scale in favor of ERP cloud is that migrating does not have to be painful. Don’t let memories of past onsite replacements haunt you. With the right products and the right expertise behind them, cloud migrations happen quickly, cause minimal business disruption, and don’t require intense user training.

For example, Blue Shield of California had set aside $600,000 for change management during its adoption of the cloud; in the end, they barely spent anything. Change adoption, they reported, happened quickly and seamlessly.

Considering the benefits for cost savings, elimination of technology obsolescence, and ease of adopting emerging technologies, it is becoming harder to justify waiting on a migration to cloud ERP. Disruption is not an issue, and the long-term cost savings are substantial. Most importantly, modernizing ERP is an opportunity to modernize the business and embed an ever-refreshing technology infrastructure that enables higher performance on multiple levels.

 

Event Hub Cloud Service. Hello world


A while back I wrote a blog about the Oracle Reference Architecture and the concepts of Schema on Read and Schema on Write. Schema on Read is well suited for a Data Lake, which may ingest any data as it is, without any transformation, and preserve it for a long period of time.

At the same time you have two types of data - streaming and batch. Batch data could be log files or RDBMS archives. Streaming data could be IoT, sensors, or GoldenGate replication logs.

Apache Kafka is a very popular engine for acquiring streaming data. It has multiple advantages, like scalability, fault tolerance, and high throughput. Unfortunately, Kafka is hard to manage. Fortunately, the cloud simplifies many routine operations. Oracle has three options for deploying Kafka in the cloud:

1) Use Big Data Cloud Service, where you get a full Cloudera cluster and can deploy Apache Kafka as part of CDH.

2) Event Hub Cloud Service Dedicated. Here you have to specify server shapes and some other parameters, but the rest is done by the cloud automagically.

3) Event Hub Cloud Service. This service is fully managed by Oracle; you don't even need to specify any compute shapes. The only things to do are tell it for how long you need to store data in the topic and how many partitions you need (partitions = performance).

Today, I'm going to tell you about the last option, the fully managed cloud service.

It's really easy to provision: just log in to your Cloud account and choose the "Event Hub" Cloud service.

After this, open the service console:

Next, click on "Create service":

Put in some parameters - the two key ones are Retention period and Number of partitions. The first defines how long you will store messages; the second defines the performance for read and write operations.

Then click Next:

Confirm and wait a while (usually not more than a few minutes):

After a short while, you will be able to see the provisioned service:

 

 

Hello world flow.

Today I want to show a "Hello world" flow: how to produce (write) and consume (read) a message with Event Hub Cloud Service.

The flow is (step by step):

1) Obtain OAuth token

2) Produce message to a topic

3) Create consumer group

4) Subscribe to topic

5) Consume message

Now I'm going to show each step in some detail.

OAuth and Authentication token (Step 1)

For dealing with Event Hub Cloud Service you have to be familiar with the concepts of OAuth and OpenID. If you are not familiar with them, you can watch the short video or go through this step by step tutorial.

In a couple of words, OAuth is a token-based authorization method for restricting access to some resources (the token tells what I may access).

One of the main ideas is to decouple the User (a real human - the Resource Owner) from the Application (the Client). The real human knows the login and password, but the Client (Application) will not use them every time it needs to reach the Resource Server (which has some info or content). Instead, the Application obtains an Authorization token once and uses it for working with the Resource Server. This is brief; here you may find a more detailed explanation of what OAuth is.

Obtain Token for Event Hub Cloud Service client.

As you can understand, to get access to the Resource Server (read: Event Hub messages) you need to obtain an authorization token from the Authorization Server (read: IDCS). Here, I'd like to show the step-by-step flow for obtaining this token. I will start from the end and show the command (REST call) which you have to run to get the token:

#!/bin/bash
curl -k -X POST -u "$CLIENT_ID:$CLIENT_SECRET" \
  -d "grant_type=password&username=$THEUSERNAME&password=$THEPASSWORD&scope=$THESCOPE" \
  "$IDCS_URL/oauth2/v1/token" \
  -o access_token.json

As you can see, many parameters are required to obtain an OAuth token.

Let's take a look at where you can get them. Go to the service and click on the topic which you want to work with; there you will find the IDCS Application - click on it:

After clicking on it, you will be redirected to the IDCS Application page. Most of the credentials can be found here. Click on Configuration:

On this page you will right away find the Client ID and Client Secret (think of them like a login and password):

 

Look down and find the section called Resources:

Click on it,

and you will find another two variables which you need for the OAuth token - Scope and Primary Audience.

One more required parameter - the IDCS_URL - you may find in your browser:

You now have almost everything you need, except the login and password. This means your Oracle Cloud login and password (what you use when logging into http://myservices.us.oraclecloud.com):

Now you have all the required credentials and you are ready to write a script which will automate all this:

#!/bin/bash
export CLIENT_ID=7EA06D3A99D944A5ADCE6C64CCF5C2AC_APPID
export CLIENT_SECRET=0380f967-98d4-45e9-8f9a-45100f4638b2
export THEUSERNAME=john.dunbar
export THEPASSWORD=MyPassword
export SCOPE=/idcs-1d6cc7dae45b40a1b9ef42c7608b9afe-oehtest
export PRIMARY_AUDIENCE=https://7EA06D3A99D944A5ADCE6C64CCF5C2AC.uscom-central-1.oraclecloud.com:443
export THESCOPE=$PRIMARY_AUDIENCE$SCOPE
export IDCS_URL=https://idcs-1d6cc7dae45b40a1b9ef42c7608b9afe.identity.oraclecloud.com

curl -k -X POST -u "$CLIENT_ID:$CLIENT_SECRET" \
  -d "grant_type=password&username=$THEUSERNAME&password=$THEPASSWORD&scope=$THESCOPE" \
  "$IDCS_URL/oauth2/v1/token" \
  -o access_token.json

After running this script, you will have a new file called access_token.json. The access_token field in it is what you need:

$ cat access_token.json {"access_token":"eyJ4NXQjUzI1NiI6InVUMy1YczRNZVZUZFhGbXFQX19GMFJsYmtoQjdCbXJBc3FtV2V4U2NQM3MiLCJ4NXQiOiJhQ25HQUpFSFdZdU9tQWhUMWR1dmFBVmpmd0UiLCJraWQiOiJTSUdOSU5HX0tFWSIsImFsZyI6IlJTMjU2In0.eyJ1c2VyX3R6IjoiQW1lcmljYVwvQ2hpY2FnbyIsInN1YiI6ImpvaG4uZHVuYmFyIiwidXNlcl9sb2NhbGUiOiJlbiIsInVzZXJfZGlzcGxheW5hbWUiOiJKb2huIER1bmJhciIsInVzZXIudGVuYW50Lm5hbWUiOiJpZGNzLTFkNmNjN2RhZTQ1YjQwYTFiOWVmNDJjNzYwOGI5YWZlIiwic3ViX21hcHBpbmdhdHRyIjoidXNlck5hbWUiLCJpc3MiOiJodHRwczpcL1wvaWRlbnRpdHkub3JhY2xlY2xvdWQuY29tXC8iLCJ0b2tfdHlwZSI6IkFUIiwidXNlcl90ZW5hbnRuYW1lIjoiaWRjcy0xZDZjYzdkYWU0NWI0MGExYjllZjQyYzc2MDhiOWFmZSIsImNsaWVudF9pZCI6IjdFQTA2RDNBOTlEOTQ0QTVBRENFNkM2NENDRjVDMkFDX0FQUElEIiwiYXVkIjpbInVybjpvcGM6bGJhYXM6bG9naWNhbGd1aWQ9N0VBMDZEM0E5OUQ5NDRBNUFEQ0U2QzY0Q0NGNUMyQUMiLCJodHRwczpcL1wvN0VBMDZEM0E5OUQ5NDRBNUFEQ0U2QzY0Q0NGNUMyQUMudXNjb20tY2VudHJhbC0xLm9yYWNsZWNsb3VkLmNvbTo0NDMiXSwidXNlcl9pZCI6IjM1Yzk2YWUyNTZjOTRhNTQ5ZWU0NWUyMDJjZThlY2IxIiwic3ViX3R5cGUiOiJ1c2VyIiwic2NvcGUiOiJcL2lkY3MtMWQ2Y2M3ZGFlNDViNDBhMWI5ZWY0MmM3NjA4YjlhZmUtb2VodGVzdCIsImNsaWVudF90ZW5hbnRuYW1lIjoiaWRjcy0xZDZjYzdkYWU0NWI0MGExYjllZjQyYzc2MDhiOWFmZSIsInVzZXJfbGFuZyI6ImVuIiwiZXhwIjoxNTI3Mjk5NjUyLCJpYXQiOjE1MjY2OTQ4NTIsImNsaWVudF9ndWlkIjoiZGVjN2E4ZGRhM2I4NDA1MDgzMjE4NWQ1MzZkNDdjYTAiLCJjbGllbnRfbmFtZSI6Ik9FSENTX29laHRlc3QiLCJ0ZW5hbnQiOiJpZGNzLTFkNmNjN2RhZTQ1YjQwYTFiOWVmNDJjNzYwOGI5YWZlIiwianRpIjoiMDkwYWI4ZGYtNjA0NC00OWRlLWFjMTEtOGE5ODIzYTEyNjI5In0.aNDRIM5Gv_fx8EZ54u4AXVNG9B_F8MuyXjQR-vdyHDyRFxTefwlR3gRsnpf0GwHPSJfZb56wEwOVLraRXz1vPHc7Gzk97tdYZ-Mrv7NjoLoxqQj-uGxwAvU3m8_T3ilHthvQ4t9tXPB5o7xPII-BoWa-CF4QC8480ThrBwbl1emTDtEpR9-4z4mm1Ps-rJ9L3BItGXWzNZ6PiNdVbuxCQaboWMQXJM9bSgTmWbAYURwqoyeD9gMw2JkwgNMSmljRnJ_yGRv5KAsaRguqyV-x-lyE9PyW9SiG4rM47t-lY-okMxzchDm8nco84J5XlpKp98kMcg65Ql5Y3TVYGNhTEg","token_type":"Bearer","expires_in":604800}

Create a Linux variable for it:

#!/bin/bash
export TOKEN=`cat access_token.json | jq .access_token | sed 's/\"//g'`

Well, now we have the Authorization token and may work with our Resource Server (Event Hub Cloud Service).

Note: you may also check the documentation about how to obtain an OAuth token.

Produce Messages (Write data) to Kafka (Step 2)

The first thing that we may want to do is produce messages (write data to a Kafka cluster). To make scripting easier, it's also better to use environment variables for common resources. For this example, I'd recommend parametrizing the topic's endpoint, the topic name, the type of content to be accepted, and the content type. The content type is completely up to the developer, but you have to consume (read) the same format as you produce (write). The key parameter to define is the REST endpoint. Go to PSM, click on the topic name, and copy everything up to "restproxy":

Also, you will need the topic name, which you can take from the same window:

Now we can write a simple script to produce one message to Kafka:

#!/bin/bash
export OEHCS_ENDPOINT=https://oehtest-gse00014957.uscom-central-1.oraclecloud.com:443/restproxy
export TOPIC_NAME=idcs-1d6cc7dae45b40a1b9ef42c7608b9afe-oehtest
export CONTENT_TYPE=application/vnd.kafka.json.v2+json

curl -X POST \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: $CONTENT_TYPE" \
  --data '{"records":[{"value":{"foo":"bar"}}]}' \
  $OEHCS_ENDPOINT/topics/$TOPIC_NAME

If everything is fine, the Linux console will return something like:

{"offsets":[{"partition":1,"offset":8,"error_code":null,"error":null}],"key_schema_id":null,"value_schema_id":null}

Create Consumer Group (Step 3)

The first step to read data from OEHCS is to create a consumer group. We will reuse environment variables from the previous step, but just in case, I'll include them in this script:

#!/bin/bash
export OEHCS_ENDPOINT=https://oehtest-gse00014957.uscom-central-1.oraclecloud.com:443/restproxy
export CONTENT_TYPE=application/vnd.kafka.json.v2+json
export TOPIC_NAME=idcs-1d6cc7dae45b40a1b9ef42c7608b9afe-oehtest

curl -X POST \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: $CONTENT_TYPE" \
  --data '{"format": "json", "auto.offset.reset": "earliest"}' \
  $OEHCS_ENDPOINT/consumers/oehcs-consumer-group \
  -o consumer_group.json

This script will generate an output file containing the variables that we will need to consume messages.

Subscribe to a topic (Step 4)

Now you are ready to subscribe to this topic (export the environment variables again if you didn't do so before):

#!/bin/bash
export BASE_URI=`cat consumer_group.json | jq .base_uri | sed 's/\"//g'`
export TOPIC_NAME=idcs-1d6cc7dae45b40a1b9ef42c7608b9afe-oehtest

curl -X POST \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: $CONTENT_TYPE" \
  -d "{\"topics\": [\"$TOPIC_NAME\"]}" \
  $BASE_URI/subscription

If everything is fine, this request will not return anything.

Consume (Read) messages (Step 5)

Finally, we approach the last step - consuming messages.

And again, it's quite a simple curl request:

#!/bin/bash
export BASE_URI=`cat consumer_group.json | jq .base_uri | sed 's/\"//g'`
export H_ACCEPT=application/vnd.kafka.json.v2+json

curl -X GET \
  -H "Authorization: Bearer $TOKEN" \
  -H "Accept: $H_ACCEPT" \
  $BASE_URI/records

If everything works like it's supposed to, you will have output like:

[{"topic":"idcs-1d6cc7dae45b40a1b9ef42c7608b9afe-oehtest","key":null,"value":{"foo":"bar"},"partition":1,"offset":17}]

Conclusion

Today we saw how easy it is to create a fully managed Kafka topic in Event Hub Cloud Service, and we also made our first steps with it - writing and reading a message. Kafka is a really popular message bus engine, but it's hard to manage. The cloud simplifies this and allows customers to concentrate on the development of their applications.

Here I also want to give some useful links:

1) If you are not familiar with the REST API, I'd recommend you go through this blog

2) There is an online tool which helps to validate your curl requests

3) Here you can find some useful examples of producing and consuming messages

4) If you are not familiar with OAuth, here is a nice tutorial which shows an end-to-end example

Why are Universal Cloud Credit and Bring Your Own License a great opportunity for Oracle Partners?


Oracle has simplified buying and consuming PaaS and IaaS cloud services. Customers can now purchase Universal Cloud Credits. These universal cloud credits can be spent on any IaaS or PaaS service. Partners can start a PoC or project, e.g. with Application Container Cloud Service, and add additional services when required, e.g. Chatbot Cloud Service. The customer can use the universal cloud credits for any available or even upcoming IaaS and PaaS services.

Thousands of customers use Oracle Fusion Middleware and Databases today. With Bring Your Own License they can easily move workloads to the cloud. As they already own the license, the customer pays only a small uplift for the service portion of PaaS. This is a major opportunity for Oracle partners to offer services to these customers.

To learn more about Universal Cloud Credits and Bring Your Own License, attend the free on-demand training here.

 

Developer Partner Community

For regular information, become a member of the Developer Partner Community: please visit http://www.oracle.com/partners/goto/wls-emea (OPN account required). If you need support with your account please contact the Oracle Partner Business Center.


Technorati Tags: PaaS,Cloud,Middleware Update,WebLogic, WebLogic Community,Oracle,OPN,Jürgen Kress

Oracle API Platform Cloud Service Overview by Rolando Carrasco


This is a series of videos showcasing the usage of Oracle API Platform Cloud Service:

  • Oracle API Platform Cloud Services - API Design: the first video of the series.
  • Oracle API Platform Cloud Services - API Management Part 1 of 2: the second video of the series, showing the usage of the brand new Oracle API Platform CS. This is part one of API Management.
  • Oracle API Platform Cloud Services - API Management Part 2: the third video of the series. Here we will see the second part of the API Management functionality, focused on Documentation.
  • Oracle API Platform CS - How to Create an App: the fourth video of the series. In this video you will learn how to create an application.
  • Oracle API Platform Cloud Services - API Usage: the fifth video of the series. In this video I showcase how you will interact with the APIs that are deployed in APIPCS.

 

PaaS Partner Community

For regular information on business process management and integration, become a member of the SOA & BPM Partner Community. For registration please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account please contact the Oracle Partner Business Center.


Technorati Tags: SOA Community,Oracle SOA,Oracle BPM,OPN,Jürgen Kress

Advanced Code Search for Git in Oracle Developer Cloud Service


One of the new features introduced in a recent monthly update of Oracle Developer Cloud Service is the advanced code search box you can see at the top right when you look at your Git repositories. This is separate from the regular project-artifact search that the box performs in the other sections of DevCS.

search screen

This search functionality is language aware, supporting a variety of languages including Java, JavaScript, HTML and CSS. It scans and indexes your code to understand its structure. DevCS can then do context aware searches for objects in your code, providing you autosuggest and even supporting camelCasing in the search box.

In the short video below I show you how this works. I start by importing code from a random GitHub project into DevCS, and then I perform a search and show you how to find the files, lines of code, and revision references for your search term. You'll also see how code navigation works in the browser.

For more information about this capability have a look at the documentation here.

 

Sending Emails from Visual Builder Cloud Service


Sending emails as a result of some changes to data is a common requirement we've been hearing from customers of Oracle Visual Builder Cloud Service (VBCS). In previous versions we already added a function that allows you to fire up the client email software from your app. In the new version we rolled out a couple of weeks ago we added a new function that can manage the whole email processing inside VBCS - on the backend/server/cloud side.

When you define a new trigger on a business object, you now have the option to add a "send email notification" step to your logic flow.

When you are defining this, you'll be able to pick or define a template for your email. The email template can have parameters to increase reusability. These parameters can then have their value set from expressions that can include values of fields from your objects.

Here is a complete demo video that shows you how to add an email notification to an event on your business object:


A Guide to Attending Oracle OpenWorld and JavaOne for Free


This title seems like clickbait, but I'll try and show you that this is actually possible (especially if you act fast).

A pass to both conferences is not a cheap item, and with training budgets shrinking you might be running into problems getting approval to expense this to your company. But here are a couple of tricks you can use to get free access to OOW/JavaOne. Some would get you a full pass and some will get you into specific areas and sessions.

Full Pass

Be a Speaker

Well, we are a bit late for the call for papers at this point, but you might still be able to become a co-speaker if you have a good customer story that shows how you are using the product/technology a talk is about. Oracle is always looking to feature customer stories in our sessions - it is always better when a customer shares their real-life success. Look up the content catalog and locate sessions on products that you are successfully implementing - ping the speaker and they might want to add you as a co-speaker.

Apply for the Awards

We are late for getting in this year, but keep an eye on this for next year. There is an Oracle Excellence Award where winners get a free pass to OOW. Details here. There is also a Duke's Choice Award where you get passes to JavaOne - Details.

JavaOne Special Discount

Right now you can get 50% off your JavaOne ticket price if you use this code DJFS2017. Register here https://www.oracle.com/javaone/register.html (oops - turns out that promotion ended last week)

Other Passes

So a full pass might be hard to get for free, but you can get in on a big chunk of the action for free using:

The Discoverer pass

A Discoverer pass lets you into the exhibit halls and various keynotes. Getting into the exhibit hall is great if you want to meet product managers and learn about the latest versions (and upcoming versions too). Most of the PMs are going to do shifts at the Oracle Demoground in their product pods - so hang out there and you'll be able to chat with us.

Usually a Discoverer pass is $75 - but since you are reading this blog, you can get one for free - use the code CDFR2017. Register here.

Oracle Code

Want to get into some technical sessions? This year we are bringing Oracle Code back to San Francisco during OOW - and you can get into this part of the conference for free. Register here:

https://developer.oracle.com/code/sanfrancisco-oct-2017

See you at the conference in a couple of weeks.

 

 

 

Creating Custom Search/Query Pages with Visual Builder


There is built-in functionality in Visual Builder Cloud Service (VBCS) that will let you attach an advanced filter to any table, giving you the ability to define complex searches.

However, sometimes people want to create their own custom query pages. For example, the default filter for a field searches for a letter in any place in the field - and maybe you just want to search by the first letter. Or maybe you want to display a nice selection list for filtering. There is a little trick that will let you achieve this, and I show it in the video:

The basic steps involve creating a new edit page with a new dummy business object. Then you add a table that is based on the real business object (the one with the data), and hook up the query of this table to the values of fields from the dummy business object. Then you add a button to do a "fetch all" on the table, and you are done.

At the end of the video you'll also see how to hook this page into your main menu.

Introduction to Liquibase and Managing Your Database Source Code


In previous posts I showed how you can manage the lifecycle of your SQL scripts with the help of Oracle Developer Cloud Service (DevCS) as part of an overall Oracle DB DevOps solution. I wanted to add one more utility that might act as an alternative or addition to managing SQL scripts - Liquibase.

Liquibase logo

Liquibase is an open source solution for managing revisions of your database schema scripts. It works across various types of databases and supports various file formats for defining the DB structure. The feature that is probably most attractive in Liquibase is its ability to roll changes back and forward from a specific point - saving you from the need to know which change/script was the last one you ran on a specific DB instance.

Liquibase uses scripts - referred to as "changesets" - to manage the changes you make to your DB. The changeset files can be in various formats including XML, JSON, YAML, and SQL. In the examples below I'm using the XML format.

As you continue to change and enhance your DB structure through the development lifecycle, you'll add more changesets. A master file lists all the changeset files (or the directories where they are). In parallel, Liquibase tracks in your database which changesets have already run.

When you issue a liquibase update command, Liquibase looks at the current state of your DB and identifies which changes have already happened. Then it runs the rest of the changes - getting you to the latest revision of the structure you are defining.

By integrating Liquibase into your overall code version management system and continuous integration platform, you can sync up your database versions with your app versions. In my case this would of course mean integration with Oracle Developer Cloud Service (DevCS) - which you get for free with the Oracle Database Cloud Service. In the video below I show a flow that covers:

  • Tracking my DBA tasks in the issue system
  • Modifying a local MySQL DB with Liquibase (doing forward and backward rolls)
  • Adding a change set defining a new table
  • Committing to Git
  • Automatic build implementing the changes in Oracle Database Cloud Service
  • Automatic testing with UT/PLSQL

Here is a quick 10 minute demo:

For those who want to try and replicate this, here are some resources:

A changeset that creates a "department" table with three columns:
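(The original post embedded this changeset as an image, which isn't preserved here. The following is a hypothetical reconstruction of what such a changeset could look like - the column names are illustrative:)

<databaseChangeLog
    xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog
                        http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.5.xsd">
  <changeSet id="1" author="shay">
    <!-- creates the department table with three columns -->
    <createTable tableName="department">
      <column name="id" type="int"/>
      <column name="name" type="varchar(50)"/>
      <column name="location" type="varchar(50)"/>
    </createTable>
  </changeSet>
</databaseChangeLog>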

A changeset that creates a PL/SQL function, package, and procedure. Note that the changeset specifies dbms="oracle", which means this script will only run when we are connected to an Oracle DB:

create or replace function betwnstr( a_string varchar2, a_start_pos integer, a_end_pos integer ) return varchar2
is
begin
  return substr( a_string, a_start_pos, a_end_pos - a_start_pos + 1 );
end;

create or replace package test_betwnstr as
  -- %suite(Between string function)
  -- %test(Returns substring from start position to end position)
  procedure basic_usage;
end;

create or replace package body test_betwnstr as
  procedure basic_usage is
  begin
    ut.expect( betwnstr( '1234567', 2, 5 ) ).to_equal('2345');
  end;
end;

A changeset that adds a record to a table. It includes a rollback tag that defines how to roll back this insert:

delete from department where id=20
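(Again, the original changeset XML was shown as an image; a hypothetical sketch of the insert changeset with its rollback tag might look like this - the inserted values are illustrative:)

<changeSet id="3" author="shay">
  <insert tableName="department">
    <column name="id" value="20"/>
    <column name="name" value="Sales"/>
  </insert>
  <!-- tells Liquibase how to undo this changeset on a rollback -->
  <rollback>
    delete from department where id=20
  </rollback>
</changeSet>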

 

A few tips about my DevCS project and build setup.

1. For the sake of simplicity, I loaded the Liquibase and JDBC jar files into my Git repository - this makes it easy for my build steps to find the files and execute them. I'm guessing you could also use Maven to host those.

2. I use a password parameter for my build so I don't need to hardcode the password, adding a bit of security to my build. Reference the parameter in your build with a $ sign - $password

3. Want to learn more about test automation with ut/PLSQL - check out this blog entry.

 

 

Using Flyway to Manage Oracle DB Versions in the Cloud


This is another entry in my series about managing database scripts/schema as part of agile development. In the past I showed how to use simple SQL and Liquibase to manage schema creation/population scripts, and today I'll show you how to use Flyway.

Flyway logo

Flyway is a free open source solution for managing "database migrations" - basically helping you keep multiple databases in sync by tracking and applying changes to the schema structure and data.

Flyway uses simple SQL scripts - which means you can use DB-specific syntax - and tracks their execution in the database through a table it maintains. It is very easy to get started with, and it has only 6 commands that you need to be familiar with.

The main command is "migrate", which will check your database status and then run all the newer scripts that have yet to be run on that instance.

Flyway uses a directory structure that contains a sql folder where you'll host all your SQL scripts. It uses a naming convention (that can be adjusted) where you start the file name with a version number (V1, V1.1, V2.1) followed by two underscores and a description - so something like V1__Create_Emp_Table will show up as "Create Emp Table" when you issue the "info" command to find out the status of a database and which scripts have already run. By the way, the info command will also show you which new scripts are pending to be run on a specific database instance.
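To illustrate the convention (the file names and SQL below are my own hypothetical example, not from the post), a sql folder might look like this:

sql/
  V1__Create_Emp_Table.sql
  V1.1__Add_Salary_Column.sql
  V2.1__Populate_Emp.sql

-- V1__Create_Emp_Table.sql: plain, DB-specific SQL (Oracle syntax here)
create table emp (
  id   number primary key,
  name varchar2(50)
);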

In the video below I show how to configure and use Flyway, and how to integrate it into an automatic DevOps process leveraging Oracle Developer Cloud Service. (including task tracking, Git version management of the source, and build execution of the scripts).

Flyway can integrate with various build frameworks (Ant, Maven, Gradle, etc.), but since many DB folks are not familiar with those, I chose to use simple command lines in my demo to invoke Flyway. On my laptop and local MySQL DB I just used the Flyway command line utility. However, Flyway is not installed by default on the DevCS servers, so I did a little trick:

Flyway is a Java program, so I uploaded the Flyway directory, along with the needed jars for Flyway and the JDBC driver, into my DevCS Git repository. Then I looked at the script for invoking the command line, found the Java command it used, and copied it into a regular shell command in my build:

java -cp lib/flyway-commandline-4.2.0.jar:lib/flyway-core-4.2.0.jar org.flywaydb.commandline.Main info -user=fw -password=$Password -url=jdbc:oracle:thin:@ipaddress:1521/servicename

The $Password refers to a build parameter which is encrypted.

The directory structure and files in my Git are shown in this image:

directory structure

 

Exporting and Importing Data from Visual Builder Cloud Service - with REST Calls


Visual Builder Cloud Service (VBCS) makes it very easy to create custom objects to store your data. A frequent request we get is for a way to load and export data from these business objects. As John blogged, we added a feature to support doing this through the command line - John's blog shows you the basic options for the command line.

I recently needed to do this for a customer, and thought I'd share some tips that helped me get the functionality working properly - in case others need some help skipping bumps in the road.

Here is a demo showing both import and export and how to get them to work.

Exporting Data

Export is quite simple - you use a GET operation on a REST service. The command line for calling this using curl will look like this:

curl -u user:password https://yourserver/design/ExpImp/1.0/resources/datamgr/export > exp.zip

The result is a streamed zip file, so I just added > exp.zip at the end of the command. The zip file will contain a CSV file for each object in your application.

Don't forget to replace the user and password, your VBCS server name, and the name of the app you are using (ExpImp in my case) with your own values.

Importing Data

Having the exported CSV file makes it easy to build a CSV file for upload - in the demo I just replaced and added values in that file. Next you'll use a similar curl command to call a POST method. It will look like this:

curl -X POST -u user:password https://yourserver/design/ExpImp/1.0/resources/datamgr/import/Employee?filename=Employee.csv -H "Origin:https://yourserver" -H "Content-Type:text/csv" -T Employee.csv -v

A few things to note:

You need to specify which object you want to import into (Employee, after the /import/ in the command above), and you also need to provide a filename parameter that tells VBCS which file to import.

In the current release you need to work around a CORS security limitation - this is why we are adding a header (with the -H option) that indicates that we are sending this from the same server as the one we are running on. In an upcoming version this won't be needed.

We use the -T option to attach the csv file to our call.

Note that you should enable the "Enable basic authentication for business object REST APIs" security option for the application (under Application Settings->Security).

Using Import in Production Apps

In the samples above we imported and exported into an application that is still being developed - this is why we used the /design/ in our REST path.

If you want to execute things on an application that you have published, then replace the /design/ with /deployment/.

One special note about live applications: before you import data into them you'll need to lock them. You can do this from the home page of VBCS, using the drop-down menu on the application.

 


Introduction to Oracle Developer Cloud Service Issue Tracking REST Interfaces


The task tracking system in Oracle Developer Cloud Service (DevCS) helps your team manage your development priorities and process. DevCS offers a simple web interface for working with the system. However, in some cases you might want to build your own interfaces to interact with the issues. For example, you might want to build a system for end users to report bugs in your app, and you don't want to give them direct access to the DevCS web interface. In the August 17 update of DevCS we introduced a set of REST services that will let you build a custom interface that interacts with our issues repository.

The official documentation for the DevCS REST services is here.

I wanted to share some tips to help you get this going in your project. The results are in this short video demo, and the details are below.

Figuring Out The End Points

The documentation gives you the basic endpoints you should be calling, but it took me a little bit of time to figure out the full URL of an endpoint. It turns out the URL is composed in the following way:

https://server/org-id/rest/org-id+project-id/issues/v2/issues

The first parts (server/org-id) are quite easy to get - just copy them from the URL of your project when you look at it in your browser.

The org-id+project-id part is something you can get by looking at the details of your maven repository URL - see the image below - what you are looking for is the part before the /maven/ at the end:

Note that in some projects this will also include a numeric value appended to the project name. Something like developer-oracletemplates_db-oss-devops_20266.

In the video sample below the result URL for the REST that returns the list of issues currently in the system ended up being:

https://myserver/developer-oracletemplates/rest/developer-oracletemplates_adf1221/issues/v2/issues
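So, for example, a hypothetical curl call that lists the issues at that endpoint (curl will prompt for the password) could look like:

curl -u myuser https://myserver/developer-oracletemplates/rest/developer-oracletemplates_adf1221/issues/v2/issues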

Creating New Issues

One of the useful services is the /issues/v2/issues/create-form service. It returns a json file that you can edit to specify information about a new task that you want to create.

Note that the file starts with: {"createIssue":{"links":.... Before you use the file to insert a new issue, you'll need to remove the {"createIssue": at the start and the corresponding } at the end of the file. Only then can you use it to submit the POST operation that creates an issue.

In the video I used the following command to create the issue in the DevCS:

curl -X POST -u shay@oracle.com https://myserver/developer-oracletemplates/rest/developer-oracletemplates_adf1221/issues/v2/issues/ -d@issue.json -H 'Content-type:application/json'

(the -d allows you to specify the name of the file with the new issue, and the -H specifies the content format).

Now that you have access to the information you can create new systems on top of it using your favorite development tool. At the end of the video you can see a simple issue system I built with Oracle Visual Builder Cloud Service - more on that in a future blog entry.

 

Conditional Navigation based on Queries in Oracle Visual Builder Cloud Service


A couple of threads on the Oracle Visual Builder Cloud Service forum asked about writing code in VBCS buttons that compares values entered on a page to data in business objects, and performs conditional navigation based on the values. In a past blog I showed the code needed for querying VBCS objects from the UI, but another sample never hurts, so here is another demo...

For this demo I'm going to show how to do it in a login flow - assuming you have a business object that keeps usernames and passwords, and you want to develop a page where a user types a user/pass combination and you need to verify that this is indeed a valid combination that exist in the business object.

(In reality, if you want to do user authentication in VBCS you should use the built-in security frameworks and not code it this way. I'm just using this as an example.)

Here is a quick video of the working app - with pointers to the components detailed below.

The first thing you'll do is create the business object that hosts the user/pass combination - note that in the video since "user" is a reserved word - the ID for the field is actually "user_" - which is what we'll use in our code later on.

 

Next you'll want to create a new page where people can insert a user/pass combination. To do that, create a new page of type "Create" - this page type requires you to associate it with a business object, so create a new business object for it. We won't actually keep data in this new business object. In the video and the code, this business object is called "query".

Now design your page and add the user and pass fields - creating parallel fields in the query business object (quser and qpass in the video). You can then remove the "Save" button that won't be used, and instead add a "validate" button.

For this new button we'll define a new custom action that will contain custom JavaScript code. Custom code should return either a success state - using resolve(); - or failure - using reject();

Based on the success or failure you can define the next action in the flow - in our case we are showing either a success or error message:

success flow

Now let's look at the custom JavaScript code:

require(['operation/js/api/Conditions', 'operation/js/api/Operator'], function (Conditions, Operator) {
  var eo = Abcs.Entities().findById('Users');
  var passid = eo.getProperty('pass');
  var userid = eo.getProperty('user_');
  var condition = Conditions.AND(
    Conditions.SIMPLE(passid, Operator.EQUALS, $QueryEntityDetailArchetypeRecord.getValue('qpass')),
    Conditions.SIMPLE(userid, Operator.EQUALS, $QueryEntityDetailArchetypeRecord.getValue('quser'))
  );
  var operation = Abcs.Operations().read({
    entity: eo,
    condition: condition });
  operation.perform().then(
    function (operationResult) {
      if (operationResult.isSuccess()) {
        operationResult.getData().forEach(function (oneRecord) {
          resolve("ok");
        });
      }
      reject("none");
    }).catch(function (operationResult) {
      if (operationResult.isFailure()) {
        // Insert code you want to perform if fetching of records failed
        alert('didnt worked');
        reject("error");
      }
    });
});

Explaining the code:

  • Lines 2-4 - getting pointers to the business object and the fields in it, using their field IDs.
  • Lines 5-8 - defining an AND condition that references the values of the fields on the page.
  • Lines 9-11 - defining the operation to read data with the condition from the business object.
  • Line 12 - executing the read operation.
  • Lines 14-18 - checking if a record has been returned, and if one has, returning success - there was a user/pass combination matching the condition.
  • Line 19 - otherwise we return with a failure.

One recommendation: while coding JavaScript, use a good code editor that highlights matching open/close brackets - it will save you a lot of time.

For more on the VBCS JavaScript API that you can use for accessing business components see the doc.


Date Calculations and Queries with Oracle Visual Builder Cloud Service


It's very easy to define a field in a custom object in Oracle Visual Builder Cloud Service to store a date, but when it comes to doing calculations and queries based on this date, you'll find that you need to resort to a little bit of JavaScript.

Here are a couple of useful things to know if you are trying to do that.

Calculating Age (or time passed from a date in years)

Let's assume you are storing information about employees and one of the pieces of information you have is their date of birth - the Birthday field in the image below.

How do you show their actual age in years on a page?

You can define a calculated field in your business object - and have VBCS use the "calculate value with formula" as the source for this field.

Your formula would be something like:

(new Date() -new Date($birthdate) )/ (60*60*24*1000*365)

You are calculating the difference between today's date and the birthday field, and since the answer is in milliseconds, you convert it to years by dividing by the number of milliseconds in a year.

Note that as you type in your formula the dialog shows you the results of the formula below the formula field - quite useful to verify that you are doing it right.
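If you want to sanity-check the formula outside VBCS, here's a small snippet (with a made-up birthdate) you can paste into any JavaScript console:

// difference in milliseconds, divided by the milliseconds in a year
var birthdate = '2010-05-01'; // hypothetical value
var age = (new Date() - new Date(birthdate)) / (60 * 60 * 24 * 1000 * 365);
console.log(Math.floor(age)); // whole years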

Now your page can show the age of your employees:

Filtering Based on Date

What if you wanted to limit the records shown in the table above to only show employees of a specific age?

The tricky part is that you'll need to do the calculation against the birthday field and not against the age field. The age field is not actually stored anywhere - rather it is calculated on the fly.

Let's take the table shown above and assume we want to limit it to show employees who are younger than 9 years old. To do that we'll add a query condition to our table to check that the birthday is later than the date of (today - 9 years).

The calculation of the date 9 years ago will be with a formula like this:

new Date($current_date-9*365*24*60*60*1000)

Now your table only shows the younger employees.

Want to have a more dynamic way to define the query criteria - you can adopt the approach I showed in the blog about Creating Custom Search/Query Pages with Visual Builder along with the techniques shown here.

One last note - since not every year has 365 days, the milliseconds conversion is not completely accurate, but it is quite close.

Using an "On Field Value Changes" Event in Oracle Visual Builder Cloud Service


[Update May 2018 - With the new VBCS there is a better approach for this - see this blog]

This entry is based on previous entries from John and Shray that deal with the same topic and provide the same type of solution. John's entry was created before VBCS provided the UI id for components, and Shray's entry deals with a more complex scenario that also involves fetching new data. So I figured I'd write my version here - mostly for my own future reference in case I need to do this again.

The goal is to show how you can modify the UI shown in a VBCS page in response to data changes in fields - for example, how to hide or show a field based on the value of another field.

To do this, you need to hook into the HTML lifecycle of your VBCS page and subscribe to events in the UI. Then you code the changes you want to happen. Your gateway into manipulating/extending the HTML lifecycle in VBCS is the custom component available in the VBCS component palette. It provides a way to add your own HTML+JavaScript into an existing page.

The video below shows you the process (along with a couple of small mistakes along the way):

The basic steps to follow:

Find out the ID of the business object field whose value changes you want to listen to. You'll also need the ID of the UI component you want to manipulate - this is shown as the last piece of info in the property inspector when you click on a component.

Once you have those, you'll add a custom component to your page and look up the observable that relates to the business object used in the page. This can be picked up from the "Generated Page Model (read-only)" section of the custom component, and it will look something like: EmpEntityDetailArchetype

Next you are going to add a listener to your custom component model. Add it after these lines:

// the page view model
this.pageViewModel = params.root;

your code would look similar to this:

this._listener = this.pageViewModel.Observables.EmpEntityDetailArchetype.item.ref2Job.currentIDSingle.subscribe(
  function (value) {
    // show the currency field only when the job with id "2" is selected
    if (value === "2") {
      $("#pair-currency-32717").show();
    } else {
      $("#pair-currency-32717").hide();
    }
  });

// clean up the subscription when the component is disposed
CustomComponentViewModel.prototype.dispose = function () {
  this._listener.dispose();
};

Where you will replace the following:

  • EmpEntityDetailArchetype should be replaced with the observable for your page model.
  • ref2Job should be replaced with the id of the business object column whose value you are monitoring.
  • pair-currency-32717 should be replaced with the id of the UI component you want to modify (in our case, to show/hide it).

You can of course do more than just show/hide a field with this approach.

Why Are You So Quiet?


You might have noticed that this blog didn't post new entries in the past couple of months, and you might have wondered why.

Well, the answer is that I've been publishing content on some other related blogs around the Oracle blogosphere.

If you want to read those have a look at my author page here:

https://blogs.oracle.com/author/shay-shmeltzer

As you'll see we have new versions of both Visual Builder Cloud Service and Developer Cloud Service - both with extensive updates to functionality.

Working with and learning those new versions, and producing some demos, is another reason I haven't been very active here lately.

That being said, now that both are out there - you are going to see more blogs coming from me.

But as mentioned at the top - these might be published in other blogs too.

So to keep up to date you might want to subscribe to this feed:

https://blogs.oracle.com/author/shay-shmeltzer/rss

See you around,

Shay

Leveraging "On Field Value Changes" Event in Visual Builder Cloud Service - Redone


With the new Oracle Visual Builder Cloud Service (VBCS) released this month, some of my past how-tos that show tricks in VBCS are no longer valid or needed.

The direct access we now provide to both REST services and the UI components in your application makes things that used to require code or hacking much simpler.

Here is one example - reacting to value change events on fields and modifying other UI components based on them.

Input components have a "value" event that you can hook into and provide an action chain that will be executed when the value changes.

In the video below you see for example how I can use a value selected in a drop down list to control whether other components on the page are shown or hidden.

To do this, you define a page variable that you change in the "value" event. You can then rely on that page variable to control another component's behavior.

As you can see - no coding needed - just drag and drop your way to create the functionality.


Reflecting Changes in Business Objects in UI Tables with Visual Builder


While the quick start wizards in Visual Builder Cloud Service (VBCS) make it very easy to create tables and other UI components and bind them to business objects, it is good to understand what is going on behind the scenes, and what the wizards actually do. Knowing this will help you achieve things that we still don't have wizards for.

For example - let's suppose you created a business object and then created a UI table that shows the fields from that business object in your page, probably using the "Add Data" quick start wizard. But then you remember that you need one more column in your business object; after you add it to the BO, you'll notice it is not automatically shown in the UI. That makes sense - we don't want to automatically show every field in a BO in the UI.

But how do you add this new column to the UI?

The table's Add Data wizard will be disabled at this point - so is your only option to drop and recreate the UI table? Of course not!

 

If you look into the table properties, you'll see it is based on a page-level ServiceDataProvider (SDP for short) variable. This is a special type of object that the wizards create to represent collections. If you look at the variable, you'll see that it returns data using a specific type. Note that the type is defined at the flow level - if you look at the type definition, you'll see where the fields that make up the object are defined.


It is very easy to add a new field here and modify the type to include the new column you added to the BO. Just make sure you use the column's id - and not its title - when you define the new field in the items array.
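To make this concrete, here is a hypothetical sketch of what such a flow-level type might look like after adding a new salary field - the type name and field ids below are made up, and the exact shape in your application will differ:

{
  "getallEmployeesResponse": {
    "items": [
      {
        "id": "number",
        "name": "string",
        "hireDate": "string",
        "salary": "number"
      }
    ]
  }
}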

Now back in the UI, you can easily modify the code of the table to add one more column hooked up to this new field in the SDP that is based on the type.
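As an illustration only - assuming your table renders its columns through the oj-table columns attribute, and with made-up variable and field names - the added column could look like this:

<oj-table data="[[ $page.variables.employeeListSDP ]]"
          columns='[{"headerText": "Name", "field": "name"},
                    {"headerText": "Hire Date", "field": "hireDate"},
                    {"headerText": "Salary", "field": "salary"}]'>
</oj-table>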

Sounds complex? It really isn't - here is a 3-minute video showing the whole thing end to end:

As you see - a little understanding of the way VBCS works makes it easy to go beyond the wizards and achieve anything.


