In this episode, I’ll share some of the ideas and techniques that I have seen at customer sites.
Integration services are so important to the business that you must find a way to keep SAP PI/PO running at all times. One of the key things to consider is how to distribute your system and its workloads, and how to support upgrades, patches, and configuration changes as easily as possible.
In the podcast, I mention several setups. They can be combined to suit what you want to achieve and what works best for your organization.
- One installation with multiple instances or server nodes. This is SAP’s way of scaling the system. They are all linked, but you can restart single servers or instances and distribute work among the servers. You cannot specify where an integration should run.
- Multiple productive PI systems to handle the different flows. You can then patch one while the others are running, though it requires some extra maintenance.
- Decentral Adapter Engines. All systems are connected to one central system, which allows you to distribute the workload while keeping configuration in one place.
- PreProd failover. Here you use your preproduction system for a hot-hot failover. It does require some extra configuration of the scenarios on the PreProd system, but it gives you some benefits if you want to switch to it.
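To make the last setup concrete, here is a minimal sketch of the failover idea: try the productive endpoint first, and fall back to the preproduction system if it is unreachable. This is illustrative only, not an SAP API; the endpoint URLs and the `send()` callable are hypothetical stand-ins for whatever transport your scenarios use.

```python
# Sketch of hot-hot failover between a productive and a preprod endpoint.
# All names and URLs here are invented for illustration.

def send_with_failover(message, send, endpoints):
    """Try each endpoint in priority order; return the first success."""
    errors = []
    for url in endpoints:
        try:
            return send(url, message)    # e.g. an HTTP POST in real life
        except Exception as exc:         # treat any failure as "try the next one"
            errors.append((url, exc))
    raise RuntimeError(f"all endpoints failed: {errors}")

# Stubbed transport for the example: the primary is down, preprod answers.
def flaky_send(url, message):
    if url.startswith("https://prod."):
        raise ConnectionError("primary PI system unreachable")
    return f"accepted by {url}"

result = send_with_failover(
    {"order": 42},
    flaky_send,
    ["https://prod.example.com/sap/xi", "https://preprod.example.com/sap/xi"],
)
print(result)  # accepted by https://preprod.example.com/sap/xi
```

In practice the switch would also involve the extra scenario configuration mentioned above; the sketch only shows the routing decision.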
If you have another idea, please post a comment below so others can learn from it.
I’m excited to welcome DJ Adams to The Integration Podcast this week. DJ has been working with SAP software for more than 30 years. He was an SAP Mentor for many years but has since retired to become an SAP Mentor alumnus. DJ has shared a lot about SAP Cloud Platform Workflow, and he has written a series of blog posts on the subject where he uses the platform to decide which beers he needs to drink next. With my background in SAP PI systems and BPM, and the gradual transition to the cloud now underway, DJ’s experience is hugely valuable for my work.
DJ first started working on R/2 on IBM mainframes in the 1980s and has been working with SAP software ever since. He is fascinated by the cloud platform and has also worked with systems outside the SAP world, including Google App Engine. He never worked much with PI, but the workflow service is very closely related to SAP BPM.
Getting started with Workflow is really easy. It’s available in a trial account on the Cloud Platform, where users can enable it along with the Portal Service and the Full-Stack Web IDE. You then create your workflow definitions, and you need nothing other than a web browser to access those services.
The workflow service is about orchestrating services and tasks across applications, but also across systems, organizations, and individuals. There is a service task building block that effectively gives you the ability to make HTTP calls. You also have script tasks that let you manipulate the context data of the workflow instance, and there is a mail task. Then there is the most important step type in a workflow definition: the user task. User tasks are assigned to a person so that they can interact with that workflow instance.
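The step types above can be sketched as a toy workflow executor. To be clear, this is not the SAP Workflow definition format (which is BPMN-based and edited graphically); it only illustrates how service, script, and user tasks each read from and write to a shared instance context.

```python
# Toy illustration of the step types discussed above; the dictionary
# structure and field names are invented, not the SAP definition format.

def run_workflow(steps, context):
    for step in steps:
        kind = step["type"]
        if kind == "service":        # would be an HTTP call in the real service
            context[step["into"]] = step["call"](context)
        elif kind == "script":       # manipulates the instance's context data
            step["script"](context)
        elif kind == "user":         # would suspend until a user acts
            context[step["into"]] = step["decision"]
    return context

ctx = run_workflow(
    [
        {"type": "service", "into": "price", "call": lambda c: 100},
        {"type": "script", "script": lambda c: c.update(total=c["price"] * 2)},
        {"type": "user", "into": "approved", "decision": True},
    ],
    {},
)
print(ctx)  # {'price': 100, 'total': 200, 'approved': True}
```

The real service adds the important part this sketch omits: a user task suspends the instance until someone processes it in their inbox.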
You don’t need to code very much when you’re first getting started. But for any meaningful workflow you will need to do some coding to create user task interfaces for the SAP Fiori My Inbox app, which is a master-detail style app. The user task components you build are injected into the component container in the My Inbox app when tasks are selected for processing. You need to know some SAPUI5 to do that, but the team plans to release a forms-based facility for creating user task interfaces in the future.
So my vision is to create an integrated development platform that makes it much easier for developers to monitor what is being delivered. We want to support the developer all the way from an incident or problem, to creating a ticket for it, logging the changes made for the ticket, and then figuring out what to test. In this release, we offer a way to look at your repository objects and compare them to see what has changed. This enables you to assign a ticket to each change and use it in your change management. We also enable you to look at a message mapping and show all the ICOs that use it. This is the key component in giving you the ability to monitor changes.
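The comparison idea can be sketched like this: fingerprint each repository object in two snapshots, list the ones whose content differs, and attach a ticket to each change. All names, object structures, and ticket IDs below are invented for illustration; the real tool works on actual PI repository objects.

```python
# Hypothetical sketch of change detection between two repository snapshots.
import hashlib
import json

def fingerprint(obj):
    """Stable hash of an object's serialized form (None for a missing object)."""
    return hashlib.sha256(json.dumps(obj, sort_keys=True).encode()).hexdigest()

def changed_objects(before, after):
    """Return the names of objects whose content differs between snapshots."""
    return sorted(
        name for name in before.keys() | after.keys()
        if fingerprint(before.get(name)) != fingerprint(after.get(name))
    )

before = {"MM_Order": {"target": "IDoc"}, "MM_Invoice": {"target": "XML"}}
after  = {"MM_Order": {"target": "IDoc"}, "MM_Invoice": {"target": "JSON"}}

# Assign a (made-up) ticket number to each detected change.
tickets = {name: f"CHG-{i + 1}" for i, name in enumerate(changed_objects(before, after))}
print(tickets)  # {'MM_Invoice': 'CHG-1'}
```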
The other great feature we added is the ability to test SAP Cloud Platform Integration (CPI, aka HCI). We can now do regression tests of HTTP scenarios; for other scenarios, you need to resend the same messages. In 2.3 we are giving you the option to also test other scenarios and to select the locations you want to compare. We made this progress in the two weeks since the trace functionality was released for cloud integration. The trace allows us to get messages from all steps of the processing just by changing the log level of the flow. It was quite lucky that it was released now, so we did not have to make a lot of modifications to the flows to get correct logging messages.
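The regression idea boils down to: replay a recorded input, then compare the new output with the recorded baseline at the locations you chose to check. The sketch below shows that comparison loop only; the scenario stub, field names, and payloads are invented, and this is not the actual tool’s API.

```python
# Illustrative regression check: re-run a scenario and report which of the
# compared locations diverge from the recorded baseline.

def regression_check(run_scenario, recorded_input, baseline, locations):
    """Return {location: (expected, actual)} for every mismatch found."""
    actual = run_scenario(recorded_input)
    return {
        loc: (baseline.get(loc), actual.get(loc))
        for loc in locations
        if baseline.get(loc) != actual.get(loc)
    }

# Stubbed scenario: a mapping step now upper-cases the city name.
def scenario(payload):
    return {"orderId": payload["orderId"], "city": payload["city"].upper()}

diff = regression_check(
    scenario,
    {"orderId": 7, "city": "Odense"},
    {"orderId": 7, "city": "Odense"},   # baseline captured from an earlier run
    locations=["orderId", "city"],
)
print(diff)  # {'city': ('Odense', 'ODENSE')}
```

With the trace enabled, the same comparison can be made at each processing step rather than only on the final output.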
Have a look at the release note blog to read all about the features: the cloud testing and the change management of SAP PI.