
How to implement multi-level create deep


This blog explains how to implement a multi-level deep create (a create_deep_entity with more than one level of nesting).

 

Prerequisite: please make sure you are familiar with the basic deep-create implementation in Gateway:

Step by Step development for CREATE_DEEP_ENTITY operation

 

 

For the multi-level deep create, we assume the following scenario:

Scenario.PNG

 

Here, FirstSon and SecondSon are first-level children of Father; FirstGrandson is a first-level child of FirstSon but a second-level descendant of Father.

 

How do we implement this deep-create scenario?

 

 

1. Create entities/associations in SEGW which map to this relationship.

     Four entities are created: Father, FirstSon, SecondSon and FirstGrandson. Each entity has three properties: Key, Property1 and Property2.

     Three associations are created: "Father to FirstSon", "Father to SecondSon" and "FirstSon to FirstGrandson".

Data Model.PNG

 

2. Generate and register the OData service.

     I assume you are familiar with these general Gateway steps.

 

3. Implement the create_deep_entity method.

     First create a multi-level deep structure that can hold the nested data from the input; then use io_data_provider to read the input into it.

     Here I just show a simple code snippet; the point is to get the nested data at runtime.

Code.PNG
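Since the screenshot above is hard to copy from, here is a minimal sketch of the redefined method. The MPC type names (zcl_zdeep_mpc=>ts_father and friends) are assumptions based on the data model above; use whatever names your generated MPC class actually provides, and make sure the extra table components match the navigation property names.

METHOD /iwbep/if_mgw_appl_srv_runtime~create_deep_entity.

  " Table type for the second-level children (no further nesting below them)
  TYPES tt_grandson TYPE STANDARD TABLE OF zcl_zdeep_mpc=>ts_firstgrandson
        WITH DEFAULT KEY.

  " FirstSon plus its nested FirstGrandson table
  TYPES: BEGIN OF ty_firstson.
           INCLUDE TYPE zcl_zdeep_mpc=>ts_firstson.
  TYPES:   firstgrandson TYPE tt_grandson,
         END OF ty_firstson.
  TYPES tt_firstson  TYPE STANDARD TABLE OF ty_firstson WITH DEFAULT KEY.
  TYPES tt_secondson TYPE STANDARD TABLE OF zcl_zdeep_mpc=>ts_secondson
        WITH DEFAULT KEY.

  " Father plus both first-level child tables
  TYPES: BEGIN OF ty_father.
           INCLUDE TYPE zcl_zdeep_mpc=>ts_father.
  TYPES:   firstson  TYPE tt_firstson,
           secondson TYPE tt_secondson,
         END OF ty_father.

  DATA ls_deep TYPE ty_father.

  " Deserialize the whole nested payload in one call; component names must
  " match the navigation property names of the model.
  io_data_provider->read_entry_data( IMPORTING es_data = ls_deep ).

  " ls_deep-firstson[ 1 ]-firstgrandson now contains the second-level rows.
ENDMETHOD.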

 

4. Test in a REST client

     I first tried to test the multi-level deep create in the traditional Gateway Client using an XML payload. The nested data was transferred into our create_deep_entity method, but the second-level nested data landed in the wrong position. I therefore strongly suggest using JSON as the payload format. Since the Gateway Client does not support JSON well, I recommend a REST client. (In a REST client you have to fetch a CSRF token first and then send the POST.)
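A minimal sketch of the two calls (the service path is illustrative; the x-csrf-token handling is standard Gateway behavior):

GET  /sap/opu/odata/sap/ZDEEP_SRV/FatherSet HTTP/1.1
x-csrf-token: Fetch

POST /sap/opu/odata/sap/ZDEEP_SRV/FatherSet HTTP/1.1
x-csrf-token: <token returned by the GET>
Content-Type: application/json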

    

Payload:

 

{
    "Key": "Father-Key",
    "Property1": "Father-Property-1",
    "Property2": "Father-Property-2",
    "FirstSon": [
        {
            "Key": "FirstSon-Key-1",
            "Property1": "Firstson-Property-1",
            "Property2": "Firstson-Property-2",
            "FirstGrandson": [
                {
                    "Key": "GrandSon-Key-1",
                    "Property1": "GrandSon-Property-1",
                    "Property2": "GrandSon-Property-2"
                }
            ]
        },
        {
            "Key": "FirstSon-Key-2",
            "Property1": "Firstson-Property-3",
            "Property2": "Firstson-Property-4",
            "FirstGrandson": [
                {
                    "Key": "GrandSon-Key-2",
                    "Property1": "GrandSon-Property-3",
                    "Property2": "GrandSon-Property-4"
                },
                {
                    "Key": "GrandSon-Key-3",
                    "Property1": "GrandSon-Property-5",
                    "Property2": "GrandSon-Property-6"
                }
            ]
        }
    ],
    "SecondSon": [
        {
            "Key": "SecondSon-Key-1",
            "Property1": "SecondSon-Property-1",
            "Property2": "SecondSon-Property-2"
        },
        {
            "Key": "SecondSon-Key-2",
            "Property1": "SecondSon-Property-3",
            "Property2": "SecondSon-Property-4"
        },
        {
            "Key": "SecondSon-Key-3",
            "Property1": "SecondSon-Property-5",
            "Property2": "SecondSon-Property-6"
        }
    ]
}

 

 

5. Check whether the multi-level nested data is mapped to the right position at runtime.

   

Father-level data:

Father.PNG

 

Son-level data (deep to father):

FirstSon.PNG Second.PNG

 

Grandson-level data (deep to FirstSon):

     FirstSon[1] has one entry; FirstSon[2] has two entries, as the payload specifies.

Grandson1.PNG Grandson2.PNG

 

Now that the multi-level nested data arrives in the right position, we can process it however we need.



Hope it helps!


How to do the data mining on the Gateway statistics using CDS view and AMDP?


Recently I got a very interesting requirement from the management team: do a POC based on the Gateway statistics to address the following business questions.

 

  1. What are the top 10 apps accessed within a certain period?
  2. Who are the top 10 users within a certain period?
  3. During which hour of the day do the most users access the system?
  4. On which day of the week do the most users access the system?
  5. What are the most common search criteria across all the services?
  6. Based on the functions defined in the app, how many times was each business function executed?
  7. What are the average response time and maximum payload of the HTTP requests?


I did some research on this topic and would like to share it with this community.

Agenda.jpg


Step 1: Find the source tables. The service header and description data are stored in tables /IWFND/I_MED_SRH and /IWFND/I_MED_SRT, which can be maintained via transaction /IWFND/MAINT_SERVICE.

   The Gateway usage statistics are stored in table /IWFND/SU_STATS, which can be viewed via transaction /IWFND/STATS.
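Before building the CDS views, you can sanity-check the raw statistics with a quick query (a minimal sketch; the fields are the same ones the CDS view below selects):

SELECT service_name, userid, timestampl
  FROM /iwfnd/su_stats
  INTO TABLE @DATA(lt_stats)
  UP TO 10 ROWS.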


Step 2: CDS view building

CDS View 1:

@AbapCatalog.sqlViewName: 'V_CDS_SERV'
define view /nsl/cdsv_gw_service
  ( service_name, service_description )
  as select from /iwfnd/i_med_srh as srh
  association [0..1] to /iwfnd/i_med_srt as srt
    on srh.srv_identifier = srt.srv_identifier and
       srh.is_active      = srt.is_active
{
    srh.service_name,
    srt.description
}
where
   srt.language  = 'E' and
   srt.is_active = 'A'

 

CDS View 2:

@AbapCatalog.sqlViewName: 'V_CDS_STATS'
define view cdsv_stats_basic
  ( namespace, service_name, userid, timestampl, service_description,
    operation, entity_type, expand_string, request_address )
  as select from /iwfnd/su_stats as stats
  association [1..1] to /nsl/cdsv_gw_service as service
    on stats.service_name = service.service_name
{
  stats.namespace,
  stats.service_name,
  stats.userid,
  stats.timestampl,
  service[1: inner].service_description,
  stats.operation,
  stats.entity_type,
  stats.expand_string,
  stats.request_address
}
where
  stats.namespace = '/SAP/'

Note that the association targets the CDS entity /nsl/cdsv_gw_service from view 1, not its generated SQL view V_CDS_SERV.

 

Step 3: ABAP Managed Database Procedure

Create a structure - Overview

Overview.jpg
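The components of the structure /XXX/GW_SU_BASICS_S can be inferred from the AMDP signature below; as a rough sketch (the concrete types are assumptions):

TYPES: BEGIN OF ty_gw_su_basics,        " mirrors /XXX/GW_SU_BASICS_S
         user_num      TYPE i,          " number of distinct users
         apps_num      TYPE i,          " number of distinct services
         apps_per_user TYPE p LENGTH 8 DECIMALS 2,
         top_user      TYPE uname,      " most active user
         top_app       TYPE string,     " most used service
         hot_hour      TYPE string,     " busiest hour of the day
         hot_day       TYPE string,     " busiest day of the week
       END OF ty_gw_su_basics.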

AMDP Class Definition:

AMDP snippet:

class /XXX/cl_gw_su_stats definition
  public
  final
  create public.

  public section.
    interfaces: if_amdp_marker_hdb.

    methods:
      get_su_stats_total
        importing
          value(iv_client)        type symandt
          value(iv_start_time)    type timestampl
          value(iv_end_time)      type timestampl
        exporting
          value(ev_user_num)      type /XXX/gw_su_basics_s-user_num
          value(ev_apps_num)      type /XXX/gw_su_basics_s-apps_num
          value(ev_apps_per_user) type /XXX/gw_su_basics_s-apps_per_user
          value(ev_top_user)      type /XXX/gw_su_basics_s-top_user
          value(ev_top_app)       type /XXX/gw_su_basics_s-top_app
          value(ev_hot_hour)      type /XXX/gw_su_basics_s-hot_hour
          value(ev_hot_day)       type /XXX/gw_su_basics_s-hot_day.

  protected section.
  private section.
endclass.

method get_su_stats_total by database procedure
                          for hdb language sqlscript
                          options read-only
                          using /XXX/v_cds_stats.

  DECLARE v_servicename INT;
  DECLARE v_user INT;

  /* get the total number of users */
  select count(distinct userid) into ev_user_num from "/XXX/V_CDS_STATS"
    where mandt = :iv_client and timestampl between :iv_start_time and :iv_end_time;

  /* get the total number of services */
  select count(distinct service_name) into ev_apps_num from "/XXX/V_CDS_STATS"
    where mandt = :iv_client and timestampl between :iv_start_time and :iv_end_time;

  /* get the apps per user */
  ev_apps_per_user := :ev_apps_num / :ev_user_num;

  /* get the top user name */
  select top 1 userid,
         count(service_name) as service_name into ev_top_user, v_servicename
    from "/XXX/V_CDS_STATS"
    where mandt = :iv_client and timestampl between :iv_start_time and :iv_end_time
    group by userid
    order by service_name desc;

  /* get the top app name */
  select top 1 service_name,
         count(userid) as userid into ev_top_app, v_user
    from "/XXX/V_CDS_STATS"
    where mandt = :iv_client and timestampl between :iv_start_time and :iv_end_time
    group by service_name
    order by userid desc;

  /* which day of the week do the users log in the most */
  select top 1 to_char( utctolocal( to_timestamp( timestampl ), 'UTC+8' ), 'DAY' ) as date,
         count(userid) as userid into ev_hot_day, v_user
    from "/XXX/V_CDS_STATS"
    where mandt = :iv_client and timestampl between :iv_start_time and :iv_end_time
    group by to_char( utctolocal( to_timestamp( timestampl ), 'UTC+8' ), 'DAY' )
    order by userid desc;

  /* which hour of the day do the users log in the most */
  select top 1 hour( to_time( utctolocal( to_timestamp( timestampl ), 'UTC+8' ) ) ) as hour,
         count(userid) as userid into ev_hot_hour, v_user
    from "/XXX/V_CDS_STATS"
    where mandt = :iv_client and timestampl between :iv_start_time and :iv_end_time
    group by hour( to_time( utctolocal( to_timestamp( timestampl ), 'UTC+8' ) ) )
    order by userid desc;

endmethod.

Gateway service snippet:

    data: ls_entity            like line of et_entityset,
          lv_start_time_feeder type datum,
          lv_end_time_feeder   type datum,
          lv_start_time        type timestampl,
          lv_end_time          type timestampl.

    loop at it_filter_select_options into data(ls_filter_select_option).
      case ls_filter_select_option-property.
        when 'SelectionDate'.
          loop at ls_filter_select_option-select_options into data(ls_select_option).
            if ls_select_option-low is initial.
              lv_start_time_feeder = sy-datum.
            else.
              lv_start_time_feeder = ls_select_option-low.
            endif.

            if ls_select_option-high is initial.
              lv_end_time_feeder = sy-datum.
            else.
              lv_end_time_feeder = ls_select_option-high.
            endif.
          endloop.
        when others.
          raise exception type /iwbep/cx_mgw_busi_exception
            exporting
              textid = /iwbep/cx_mgw_busi_exception=>filter_not_supported.
      endcase.
    endloop.

    if sy-subrc <> 0.
      raise exception type /iwbep/cx_mgw_busi_exception
        exporting
          textid            = /iwbep/cx_mgw_busi_exception=>business_error_unlimited
          message_unlimited = |Filter is required|.
    endif.

    convert date lv_start_time_feeder time sy-uzeit into time stamp lv_start_time time zone sy-zonlo.
    convert date lv_end_time_feeder   time sy-uzeit into time stamp lv_end_time   time zone sy-zonlo.

    data(lo_overview) = new /XXX/cl_gw_su_stats( ).
    try.
        lo_overview->get_su_stats_total(
          exporting
            iv_client        = sy-mandt
            iv_start_time    = lv_start_time
            iv_end_time      = lv_end_time
          importing
            ev_user_num      = ls_entity-user_num
            ev_apps_num      = ls_entity-apps_num
            ev_apps_per_user = ls_entity-apps_per_user
            ev_top_user      = ls_entity-top_user
            ev_top_app       = ls_entity-top_app
            ev_hot_day       = ls_entity-hot_day
            ev_hot_hour      = ls_entity-hot_hour ).
      catch cx_amdp_execution_failed into data(lv_error).
        " error handling omitted in this snippet
    endtry.
    append ls_entity to et_entityset.

 

Last step: test the service in the Gateway Client.

Test.jpg

Problems with multi-origin in SAP NetWeaver 740 SP13 ... and how to solve them


Today I came across a problem that can occur if you use the multi-origin feature of SAP Gateway and have upgraded to SAP NetWeaver 740 SP13.

 

Strictly speaking, it would also occur if you have upgraded only the software component SAP_GWFND to Support Package level SAPK-74013INSAPGWFND.

 

A request such as /sap/opu/odata/sap/gbapp_poapproval;mo/WorkflowTaskCollection in this case does not return any data.

 

On the other hand, $count would still return a result when running the following request in the SAP Gateway Client: /sap/opu/odata/sap/gbapp_poapproval;mo/WorkflowTaskCollection/$count.

As a result of this strange behavior, SAP Fiori applications such as My Inbox would not work correctly.

Fortunately there is a solution.

Simply apply SAP Note 2250491 - "OData request does not return entity set data in case of multi-origin composition" - to your system.

 

Best regards,

Andre

How to show file name when calling GET_STREAM


This document is about how to enhance our OData service for file download.

Currently, based on the training material, we usually redefine the GET_STREAM method and pass the MIME type and stream value to the result.

ls_stream-mime_type = output-return-mime_type.
ls_stream-value     = output-return-document_stream.
copy_data_to_ref( EXPORTING is_data = ls_stream
                  CHANGING  cr_data = er_stream ).

This works; however, the problem is that if you trigger the service in Chrome, you only get a file named 'entity'.

 

1.png

Problem

  • What if the customer has already stored the file name in ECM or in the database? They want to see the real name.
  • What if the customer doesn't want a direct download, but wants to preview the file before downloading?

Solution

The solution is to set a header via the HTTP framework. I attached the code below.

DATA ls_lheader TYPE ihttpnvp.
" If the file name may contain special characters, escape it first:
" lv_filename = escape( val = lv_filename format = cl_abap_format=>e_url ).
ls_lheader-name  = 'Content-Disposition'.
ls_lheader-value = 'attachment; filename="Mobile.pdf";'.
set_header( is_header = ls_lheader ).

ls_stream-mime_type = output-return-mime_type.
ls_stream-value     = output-return-document_stream.
copy_data_to_ref( EXPORTING is_data = ls_stream
                  CHANGING  cr_data = er_stream ).
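Putting it together, the redefined method looks roughly like this (a sketch: lv_filename, lv_mime_type and lv_file_content stand for whatever your service reads from ECM or the database):

METHOD /iwbep/if_mgw_appl_srv_runtime~get_stream.

  DATA: ls_stream  TYPE ty_s_media_resource,  " stream structure from the DPC base class
        ls_lheader TYPE ihttpnvp.

  " ... read lv_filename, lv_mime_type and lv_file_content for the requested key ...

  " Tell the browser the real file name
  ls_lheader-name  = 'Content-Disposition'.
  ls_lheader-value = |attachment; filename="{ lv_filename }";|.
  set_header( is_header = ls_lheader ).

  ls_stream-mime_type = lv_mime_type.
  ls_stream-value     = lv_file_content.

  copy_data_to_ref( EXPORTING is_data = ls_stream
                    CHANGING  cr_data = er_stream ).
ENDMETHOD.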

Let’s test the result now:

2.PNG

If the customer wants to preview the document instead of downloading it directly, change the code as below.

ls_lheader-value = 'inline; filename="Mobile.pdf";'.

Let’s test the result:

A new page opens for the preview; you can right-click and choose to save. The file name 'Mobile.pdf' comes up.

3.PNG

How to configure Fiori News App by using SICF service and RSS2.0


This document is a step-by-step guide to configuring the News app in Fiori. In all, there are three steps:

  1. Configure the Fiori tile
  2. Create a table to store the news information
  3. Create the SICF service and provide the RSS 2.0 format

 

Step one: Configure the Fiori tile

Click on News Tile

1.png

Fill in the SICF service name in the Feed field. Article Refresh Interval is how often the News app refreshes; 15 minutes is a reasonable value.

SICF service name: znews (which will be created later)

1.png

 

Step two: Create a table to store the news information

This table should contain all the essential information needed for a news item.

1.png
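The handler code below reads the following columns from ZNEWSCONTENT; as a sketch of the table (the concrete types are assumptions):

" columns of ZNEWSCONTENT, inferred from the handler below
TYPES: BEGIN OF ty_znewscontent,
         guid        TYPE string,  " unique item ID
         title       TYPE string,  " news headline
         description TYPE string,  " news text
         imagelink   TYPE string,  " URL of the image shown on the tile
         contentlink TYPE string,  " URL of the full article
       END OF ty_znewscontent.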

 

Step three: Create the SICF service and provide the RSS 2.0 format

For the News app itself we need to provide an RSS document.

Please refer to an XML RSS tutorial for the principles of RSS 2.0.

We create a SICF service called znews to generate this kind of RSS document (this is the service name you bound in step one).

Create a class called ZCL_NEWS_SERVICE. Add IF_HTTP_EXTENSION on the Interfaces tab, put the code below into IF_HTTP_EXTENSION~HANDLE_REQUEST, then check and activate the class.

1.png

 


method IF_HTTP_EXTENSION~HANDLE_REQUEST.

    DATA action TYPE string.
    DATA xmlcstr TYPE string.
    DATA newscontent TYPE TABLE OF znewscontent.
    DATA newscontent_l LIKE LINE OF newscontent.
    DATA guidstr TYPE string.
    DATA titlestr TYPE string.
    DATA descriptionstr TYPE c LENGTH 400.
    DATA imagestr TYPE string.
    DATA linkstr TYPE string.

    action = server->request->get_form_field( name = 'type' ).
    CASE action.
      WHEN 'news'.
        SELECT * FROM znewscontent INTO TABLE newscontent.
        xmlcstr = '<?xml version="1.0" encoding="utf-8"?><rss version="2.0"><channel>'.
        LOOP AT newscontent INTO newscontent_l.
          CLEAR: guidstr, titlestr, descriptionstr, imagestr, linkstr.
          CONCATENATE xmlcstr '<item>' INTO xmlcstr.
          guidstr        = newscontent_l-guid.
          titlestr       = newscontent_l-title.
          descriptionstr = newscontent_l-description.
          imagestr       = newscontent_l-imagelink.
          linkstr        = newscontent_l-contentlink.
          CONCATENATE xmlcstr '<guid>' guidstr '</guid>'
                              '<title>' titlestr '</title>'
                              '<description>' descriptionstr '</description>'
                              '<image>' imagestr '</image>'
                              '<link>' linkstr '</link>' INTO xmlcstr.
          CONCATENATE xmlcstr '</item>' INTO xmlcstr.
        ENDLOOP.
        CONCATENATE xmlcstr '</channel></rss>' INTO xmlcstr.

        server->response->set_header_field(
          name  = 'Content-Type'
          value = 'application/xml' ).

        server->response->set_header_field(
          name  = 'accept-origin'
          value = '*' ).

        server->response->set_cdata( data = xmlcstr ).
    ENDCASE.

  endmethod.
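For reference, the document generated by this handler looks roughly like this (the values come from the ZNEWSCONTENT rows):

<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0">
  <channel>
    <item>
      <guid>...</guid>
      <title>...</title>
      <description>...</description>
      <image>...</image>
      <link>...</link>
    </item>
  </channel>
</rss>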

 

Run transaction SICF.

In the path default_host/sap/bc, right-click on bc and choose New Sub-Element. Fill in your SICF service name.

1.png

Fill in a description and the handler list (the class ZCL_NEWS_SERVICE created above), leaving the other settings as default.

1.png

Check and activate this service.

You can test the SICF service now: right-click on the service name and click Test Service.

1.png

Also, if you have configured the Fiori Launchpad correctly, you can see the real News app.

1.png

2.PNG

Error - ABAP Dictionary element not found


If you have been creating and updating Gateway services for some time, you might have encountered an issue where you transported your objects/changes to the productive system, but the changes did not show up in the metadata and therefore your update did not work.

 

The change could be as simple as introducing a new field, renaming a field, or changing its properties. The way it is supposed to work is that every time you change the structure of your app via SEGW or directly in your model provider class, the objects are regenerated; once you kick off a call to the service's metadata, the framework checks the time stamps and refreshes the metadata based on the model definition.

 

Sometimes that doesn't work so well, and the result is the infamous error /IWBEP/CM_MGW_RT106 (message class /IWBEP/CM_MGW_RT, message no. 106). It reads "ABAP Dictionary element 'ZCLXXX_MDP=>FieldXX' not found.", where the field name ZCLXXX_MDP=>FieldXX points to the root cause. You can look this up in the error log on the gateway system under transaction /IWFND/ERROR_LOG.

error.png

The reference documentation has the solution: simply clear the cache in BOTH the gateway and the backend system.


Usually the landscape admins set up jobs for cache maintenance. You can also trigger it directly using transaction /IWFND/CACHE_CLEANUP on the gateway system or /IWBEP/CACHE_CLEANUP in the backend system, either to fully clear the cache or to clear it for a specific model.

 

The error message class also points to where (in which system) the problem is. In my quoted example it is the backend system, indicated by IWBEP.

 

Hope this blog is helpful to you!

FIELD WORK FORCE MANAGEMENT


Executive Summary: Crave InfoTech introduces an SAP-certified Asset Tracking & Field Workforce Management solution.

This solution can help you improve your field workforce productivity, emergency planning and response, scheduling of jobs and, of course, customer satisfaction.

It is a solution for your retiring workforce that helps track jobs, vehicles, employees and assets.

 

As an electric or water utility professional, you recognize the value of good data. When you link that data to a geographic location on a map, you can visualize the big picture, which gives you a powerful decision-making tool.

Asset Tracking & FWMS provides you with an efficient platform for tracking movable and immovable assets, data management, planning and analysis, field workforce automation and situational awareness.

Asset Tracking can take you to places that are beyond sight, such as underground fittings, pipelines, poles, switches and more.

 

 

Bread-crumb trails, alerts, internal messaging and geo-fencing are a few of the eye-catching features; downloadable routes for mobile users, route-deviation alerts and a powerful algorithm that routes to the nearest job using a directions API are among the intuitive features.

For more information visit:  www.Cravetracker.com



Asset Tracking.png

Capture.PNG

 

 

 

 

Asset tracking

  • Asset tracking is a cloud-based solution for tracking jobs, immovable assets, vehicles and people in the field in real time using Google & ESRI maps.
  • Job creation, job assignment and job execution are made much easier and more transparent using C-ASTRA (a solution to keep track of field work).
  • Vehicle theft, fuel theft, misguided routes and long halts are no longer a problem; C-ASTRA is a single solution to strengthen your field force.
  • Asset Tracking provides eye-catching features like bread-crumb trails (vehicle history), alerts (misguided route, vehicle maintenance, geo-fence entry & exit and much more) and internal messaging.

Best surveillance system for your immovable and movable assets.

Vehicle Tracking:

  • Track your vehicles in real time on a map using a GPS device.
  • Disable the ignition remotely using SMS commands.
  • Calculate mileage, maximum idling and kilometers driven for all vehicles.
  • Control speed using alerts to the driver; get acceleration and hard-braking data for each trip.
  • Get the history of each trip.

Employee Tracking:

  • Track your marketing professionals in real time during working hours.
  • Monitor the client visits of your marketing professionals.
  • Helps employees claim on accident policies in case of a mishap during working hours.

Asset Inspection & Inventory

  • This app captures the physical condition of an asset with current location details and sends it to the GIS server.
  • Easy to use, saves time and helps to locate assets easily.

Mobile device Management:

  • With hundreds of users being tracked in the field, managing the mobile user base is quite difficult; the MDM solution Afaria helps serve big customers.
  • Helps to deploy mobile apps easily at a bigger scale.
  • Apply policies to minimize use of the data plan while roaming.
  • Helps to locate a stolen mobile device.
  • Troubleshoot if no data is received.

Interface with SAP and Other Applications:

  • Interface with SAP using SAP NetWeaver Gateway.
  • Use of web services and Online Data Proxy to interface with GIS, to download jobs and assign work centers (drivers to vehicles) from GIS.
  • Can run independently.

GPS Device (Cal Amp LMU)

  • CalAmp GPS devices are small and easy to install, with accurate results; the devices are built in such a way that even remote and daunting areas do not affect their accuracy.
  • CalAmp devices are easy to configure and troubleshoot remotely using SMS commands.
  • Minimal use of the data plan, as they use hexadecimal codes to transfer data.
  • Inbuilt memory to store data when there is no signal to send it.

 

 

Visit SAP Store

https://store.sap.com/sap/cpa/ui/resources/store/html/SolutionDetails.html?pcntry=US&sap-language=EN&pid=0000012059



Asset Tracking & Inspection is listed as a top-five app on the SAP Store

http://www.news-sap.com/5-top-apps-sap-store/


 


 

 

Field Service Manager (Using SAP Mobile Platform):

This application is used for field workforce management; it is developed using SAP Mobile Platform as middleware and SAP (IS-U) as the backend, for Android and MDT. The application manages service order completion, inventory movements (truck-to-truck transfers), meter reading entry, meter installation, removal and exchanges, turn on, turn off, shut-off for non-payment, customer complaint investigation,

disconnections and re-connections, permitting, maintenance orders, measuring points and payment collection, substation measuring points, hydrant maintenance, leak readings, pole inspection, and pending and completed jobs. It routes jobs with shortest-distance calculation using Google Maps and voice navigation. The business process logic and data are encapsulated in innovative Mobile Business Objects (MBOs) using SAP SyncBO and BAPI wrappers.

 

C-FSM (Crave Field Service Manager) for Utilities enables platform-independent processing of maintenance and service processes using mobile end devices. As a full offline/online mobile application, C-FSM provides ready-made scenarios that help field forces and service technicians perform their daily activities at customer sites and within plants. The strong integration into Enterprise Asset Management (EAM) and SAP Utilities (IS-U/CCS or SAP ECC Industry Extension Utilities) allows the field forces to work on internal orders (within your own technical installations) as well as at the customer site. The solution integrates the customer data from SAP Utilities and SAP EAM in one mobile application.

Visit Android app Store: https://play.google.com/store/apps/details?id=com.crave.mam&hl=en


 

 

 

PLAN YOUR FIELD WORKFORCE TODAY WITH AN INTELLIGENT GEOGRAPHIC INFORMATION SYSTEM.

 

Visit : www.craveinfotech.com

http://www.news-sap.com/5-top-apps-sap-store/

File Attachment in Material Document (MIGO) using SAP Gateway


Business Case: The business requirement is to attach a document while doing the goods receipt for a material. The attachments can be reference documents, images or any other documents related to the goods receipt of the material. To meet this requirement SAP provides the Generic Object Services (GOS) toolbar.

 

 

Nowadays, with mobile applications in use, there is a requirement to create an attachment for an existing material document through NetWeaver Gateway. We can upload any document (PDF/DOC/JPG/XLS), and through the OData service the document will be linked to the corresponding material document.

 

Below are the steps required to create such an OData service in SAP NetWeaver Gateway.

 

Step-1: Create a project for Attachment using SEGW transaction.

 

Step-2: Create an entity type and entity set. Remember that the entity type should be defined as a Media type.

 

1.jpg

 

Step-3: Create a property for the Entity type.

 

2.jpg

Step-4: Generate the project. It should create all the backend classes for MPC, MPC_EXT, DPC and DPC_EXT.

              Now go to the DPC_EXT class and redefine the method /IWBEP/IF_MGW_APPL_SRV_RUNTIME~CREATE_STREAM.

              Inside the method, write the code below to convert the XSTRING received from the OData service into binary format and then upload the binary data into SAP.

 

METHOD /iwbep/if_mgw_appl_srv_runtime~create_stream.

  DATA: lt_objhead    TYPE STANDARD TABLE OF soli,
        lt_xdata      TYPE solix_tab,
        lt_data       TYPE soli_tab,
        lt_content    TYPE solix_tab,
        ls_folmem_k   TYPE sofmk,
        ls_note       TYPE borident,
        ls_object     TYPE borident,
        ls_obj_id     TYPE soodk,
        ls_fol_id     TYPE soodk,
        ls_obj_data   TYPE sood1,
        ls_xdata      TYPE solix,
        lv_ep_note    TYPE borident-objkey,
        lv_extension  TYPE c LENGTH 4,
        lv_mblnr      TYPE mblnr,
        lv_mjahr      TYPE mjahr,
        lv_objkey     TYPE char70,
        lv_tmp_fn     TYPE string,
        lv_file_des   TYPE so_obj_des,
        lv_offset     TYPE i,
        lv_size       TYPE i,
        lv_temp_len   TYPE i,
        lv_offset_old TYPE i.

  CONSTANTS: lc_hex_null TYPE x LENGTH 1 VALUE '20',
             lc_x        TYPE c VALUE 'X'.

* Convert the XSTRING to binary
  CALL FUNCTION 'SCMS_XSTRING_TO_BINARY'
    EXPORTING
      buffer          = is_media_resource-value
      append_to_table = lc_x
    TABLES
      binary_tab      = lt_content.

* Get the folder ID
  CALL FUNCTION 'SO_FOLDER_ROOT_ID_GET'
    EXPORTING
      region                = 'B'
    IMPORTING
      folder_id             = ls_fol_id
    EXCEPTIONS
      communication_failure = 1
      owner_not_exist       = 2
      system_failure        = 3
      x_error               = 4
      OTHERS                = 5.

* Get the document number, year and file name from the SLUG
  SPLIT iv_slug AT '/' INTO lv_mblnr lv_mjahr lv_file_des.

* Get the file extension
  SPLIT lv_file_des AT '.' INTO lv_tmp_fn lv_extension.

  CONCATENATE lv_mblnr lv_mjahr INTO lv_objkey.
  ls_object-objkey = lv_objkey.

* For goods movements the BOR object type is BUS2017
  ls_object-objtype    = 'BUS2017'.
  ls_obj_data-objsns   = 'F'.
  ls_obj_data-objla    = sy-langu.
  ls_obj_data-objdes   = lv_file_des.
  ls_obj_data-file_ext = lv_extension.
  TRANSLATE ls_obj_data-file_ext TO UPPER CASE.

* Calculate the length and split the content into 255-byte lines
  lv_offset = 0.
  lv_size   = xstrlen( is_media_resource-value ).
  ls_obj_data-objlen = lv_size.

  WHILE lv_offset <= lv_size.
    lv_offset_old = lv_offset.
    lv_offset     = lv_offset + 255.
    IF lv_offset > lv_size.
      lv_temp_len = xstrlen( is_media_resource-value+lv_offset_old ).
      CLEAR ls_xdata-line WITH lc_hex_null IN BYTE MODE.
      ls_xdata-line = is_media_resource-value+lv_offset_old(lv_temp_len).
    ELSE.
      ls_xdata-line = is_media_resource-value+lv_offset_old(255).
    ENDIF.
    APPEND ls_xdata TO lt_xdata.
  ENDWHILE.

* Convert hex data to text data
  CALL FUNCTION 'SO_SOLIXTAB_TO_SOLITAB'
    EXPORTING
      ip_solixtab = lt_xdata
    IMPORTING
      ep_solitab  = lt_data.

* Insert the document
  CALL FUNCTION 'SO_OBJECT_INSERT'
    EXPORTING
      folder_id                  = ls_fol_id
      object_type                = 'EXT'
      object_hd_change           = ls_obj_data
    IMPORTING
      object_id                  = ls_obj_id
    TABLES
      objhead                    = lt_objhead
      objcont                    = lt_data
    EXCEPTIONS
      active_user_not_exist      = 1
      communication_failure      = 2
      component_not_available    = 3
      dl_name_exist              = 4
      folder_not_exist           = 5
      folder_no_authorization    = 6
      object_type_not_exist      = 7
      operation_no_authorization = 8
      owner_not_exist            = 9
      parameter_error            = 10
      substitute_not_active      = 11
      substitute_not_defined     = 12
      system_failure             = 13
      x_error                    = 14
      OTHERS                     = 15.

  IF sy-subrc = 0 AND ls_object-objkey IS NOT INITIAL.
    ls_folmem_k-foltp = ls_fol_id-objtp.
    ls_folmem_k-folyr = ls_fol_id-objyr.
    ls_folmem_k-folno = ls_fol_id-objno.

    ls_folmem_k-doctp = ls_obj_id-objtp.
    ls_folmem_k-docyr = ls_obj_id-objyr.
    ls_folmem_k-docno = ls_obj_id-objno.

    lv_ep_note = ls_folmem_k.

    ls_note-objtype = 'MESSAGE'.
    ls_note-objkey  = lv_ep_note.

* Link the inserted object to the material document
    CALL FUNCTION 'BINARY_RELATION_CREATE_COMMIT'
      EXPORTING
        obj_rolea      = ls_object
        obj_roleb      = ls_note
        relationtype   = 'ATTA'
      EXCEPTIONS
        no_model       = 1
        internal_error = 2
        unknown        = 3
        OTHERS         = 4.

    CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
      EXPORTING
        wait = 'X'.
    IF sy-subrc EQ 0.
      COMMIT WORK.
    ENDIF.
  ENDIF.

ENDMETHOD.

 

Step-5: Now go to /IWFND/MAINT_SERVICE, select the service, and open the Gateway Client to add a document. In the SLUG header, pass the material document number, year and file name separated by '/'.
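A request could look roughly like this (service name, entity set and document number are illustrative):

POST /sap/opu/odata/sap/ZMM_ATTACH_SRV/AttachmentSet
slug: 5000000123/2016/GoodsReceipt.pdf
Content-Type: application/pdf

<binary file content as the request body>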

 

3.jpg

 

Now click on Execute. While executing, please keep in mind that the HTTP method POST must be selected.

 

Step-6: After executing the OData service, go to transaction MIGO and display the material document for the given number and year.

             You will be able to see the attachment in the Services for Object (GOS) list.

 

I hope this will be helpful to many of you.

 

Regards,

Nitin


Action packed first 3 days at TechEd 2015


Here is a recap of the first 3 days at SAP TechEd 2015.

 

On the first day's evening, Steve Lucas shone a light on the digital economy in his keynote session. He pointed out that today all economies are going digital (even agriculture): to survive, you must make the digital shift. He described how SAP is moving from a traditional enterprise company to a cloud company: the SAP you knew was analytics, databases, HANA; the SAP you know now is analytics, S/4, HANA; and the SAP you should know is Cloud for Analytics, S/4, HANA, Vora. Check out the recorded session here.


SteveKeynote.png

 

The second day started with an energizing keynote session by SAP Executive Board Member Bernd Leukert. He said, "You have the ability to manage your company in a completely different way", and that SAP has reinvented the analytics business. Leukert called Cloud for Analytics "the newest member of our SAP family" before a demonstration featuring a fictitious retail chain with a few underperforming stores. The tool single-handedly and quickly identified them, represented them geographically and even evaluated their competition, revealing that sales at rival stores were stealing market share. He also stressed SAP HANA Cloud Platform as the platform to build business applications, with a demo that showed how HCP can help an enterprise create a web page to conduct business in another country. There was much more, and you can see the recorded video here.

 

Keynote2.jpg

Keynote2_1.JPG

 

After this energetic start of the day, there was a hands-on session (INT269) on SAP API Management given by Harsh Jagadeesan and Peter NG. They talked about how this solution provides simple, scalable, secure and unified access for APIs based on open standards such as REST, OData and OAuth2. There were some hands-on exercises as well to get an idea of the different features of SAP API Management.

 

HandsOn2.JPG

 

Throughout the day, our staff at the POD (on the show floor in the Explore zone, Hall C Level 2) - Michael Hill, Elijah Martinez and Sonali Desai - was busy interacting with interested customers and showing them demos related to Integration and Orchestration products (including Process Integration, Process Orchestration, HANA Cloud Integration, Gateway and API Management). You can watch our demos for Gateway here and check out the API Management demos here.

 

Img1.jpg

Img2.jpg

Img3.jpg

On day 3, there was an expert networking session with Paul J. Modderman from Mindset Consulting on 'Innovate with SAP Gateway and API Management'. He talked about how API Management can onboard developers quickly and how you can use SAP Gateway to convert your RFCs/BAPIs into SOAP and OData/REST services. Here are some pics from that networking session:

 

IMG_3191.JPG

Expert2.jpg

There was another expert networking session with Sandeep Murusupalli, Solution Architect from Apigee, and Peter NG, Senior Product Manager at SAP Labs, LLC. It was a Q&A session on API Management. They showed a live demo of asset repairs using a smartwatch and discussed the critical role API Management plays in this demo. Here are some pics of the same:

 

Expert3.jpg

There was another session given by Stephan Herbert, VP P&I Technology I/O Gateway, and Lisa Hoshida from Intel, on 'Enable your business with SAP Gateway'. They discussed an Intel case study and how SAP Gateway can empower your workforce by moving critical, real-time data from an SAP back end to any mobile device.

 

INT105.jpg

Overall it was a great experience. Customers were delighted to see Integration and Orchestration Product portfolio offering and were interested in learning more about the products and how these products can fit in their use cases.


Gateway Goggles


Imagine this: your friend is a plant maintenance supervisor. His colleagues running the plant fill out hand-written forms identifying broken down equipment. Half those papers he loses somewhere in his disaster of a desk. The other half he has to mash out on his desktop keyboard over lunch. His job is fixing problems, but he's spending more time pushing papers.

 

Enter you. You're a phenomenally skilled SAP developer, known throughout the company for creative solutions. You hand your buddy an iPhone. On it runs an SAPUI5 app that lets him take a picture of the paper as soon as it's handed to him. The app interprets the things written on the paper and puts them in the appropriate fields on his screen. He gives it a once-over and hits "Submit" - ready to actually enjoy a distraction-free lunch break.

 

You are a hero to your friend and he tells everyone how awesome you are. The CEO gets wind of this innovation and makes you CTO. You retire wealthy and start that surf shop on the beach you always dreamed of.


I'm going to show you how to give your Gateway system “eyes” that can interpret content in a photo. In a ridiculously easy way. No promises on the surf shop, though.

 

Google recently made their Cloud Vision API available for anyone to try. I love when the big software companies give me things to play with. It means I can try out wild and crazy ideas from the comfort of my keyboard. So as soon as I could, I took some free time to tinker with the Vision API and mash it up with SAP Gateway.

 

I present here a simple prototype for using these two tools in tandem. There are about a billion ways this could be useful, so I hope my little slice of code helps someone along the way.

 

I’ll show you how to use Gateway to request Google Vision API processing. I picked a couple Vision abilities that I find awesome, but the API is capable of more.


Without further ado - let’s get started!


Setup


Before you write any code, you’ll need:

  • An SAP Gateway system. If you’re reading this blog and don’t know what that is, then I apologize because you’re probably really bored.
  • Configuration to allow that system to HTTP POST to an external internet API. See here for setting up STRUST to allow that.
  • A Google account, with the Cloud Vision API enabled. Be warned: if you use it more than 1,000 times a month, it’s not free. Just make sure it takes you less than 1,000 tries to get it right.
  • An API key set up in the Google account. I suggest using the browser API key for prototyping, and service accounts for productive use. Getting an API key is covered in the Google getting started guide.

 

Once you have the above configured, it's time to get cracking on the code.

 

Show Me The Code Already

Now that we have things ready to roll, fire up your Gateway system and go to t-code SEGW. I set up a very simple entity that will just hold the description of what Google thinks an image is. Just 3 fields:


Screen Shot 2016-02-26 at 4.56.55 PM.png


Make sure to flag that entity as a "Media" entity:


Screen Shot 2016-02-26 at 4.57.38 PM.png


And that’s it for our bare-bones service definition. You could get a lot more architecturally crazy and set up a bunch of entities and fields to capture every single thing that comes out of the Google API - but I just wanted to get it up and running to see what I could get.


Only two steps left in the setup: coding the model enhancement necessary for media entities and coding the CREATE_STREAM method for processing the file data.


First, the model enhancement. Navigate to the *MPC_EXT class in your project and do a redefinition of the DEFINE method. This code should get you what you need. It’s so short that it’s basically self-explanatory:



  METHOD define.
    super->define( ).

    DATA: lo_entity   TYPE REF TO /iwbep/if_mgw_odata_entity_typ,
          lo_property TYPE REF TO /iwbep/if_mgw_odata_property.

    lo_entity = model->get_entity_type( iv_entity_name = 'VisionDemo' ).
    IF lo_entity IS BOUND.
      lo_property = lo_entity->get_property( iv_property_name = 'ContentType' ).
      lo_property->set_as_content_type( ).
    ENDIF.
  ENDMETHOD.


The model is now ready to support media stuff (in our case pictures) coming in. The other side of the equation is to prepare the request to be sent to Google for processing. We’ll do that in the CREATE_STREAM method of the *DPC_EXT class that the SEGW project generated. Same deal as before, do a redefine of that method and put in the following code:



  METHOD /iwbep/if_mgw_appl_srv_runtime~create_stream.

    TYPES: BEGIN OF feature,
             type        TYPE string,
             max_results TYPE i,
           END OF feature.
    TYPES: features TYPE STANDARD TABLE OF feature WITH DEFAULT KEY.
    TYPES: BEGIN OF image,
             content TYPE string,
           END OF image.
    TYPES: BEGIN OF request,
             image    TYPE image,
             features TYPE features,
           END OF request.
    TYPES: requests TYPE STANDARD TABLE OF request WITH DEFAULT KEY.
    TYPES: BEGIN OF overall_request,
             requests TYPE requests,
           END OF overall_request.

    DATA: overall_request  TYPE overall_request,
          requests         TYPE TABLE OF request,
          request          TYPE request,
          feature          TYPE feature,
          lv_b64_content   TYPE string,
          lo_http_client   TYPE REF TO if_http_client,
          lv_response_data TYPE string,
          lv_url           TYPE string,
          lv_request_json  TYPE string,
          lo_descr         TYPE REF TO cl_abap_structdescr,
          lv_start         TYPE i,
          lv_end           TYPE i,
          lv_total_chars   TYPE i,
          ls_visiondemo    TYPE zcl_zgoogle_vision_mpc=>ts_visiondemo,
          lv_end_marker    TYPE string.

    " Google expects the picture data to be base64 encoded
    CALL FUNCTION 'SCMS_BASE64_ENCODE_STR'
      EXPORTING
        input  = is_media_resource-value
      IMPORTING
        output = lv_b64_content.

    lv_url = 'https://vision.googleapis.com/v1/images:annotate?key=GET_YOUR_OWN_KEY'.

    request-image-content = lv_b64_content.
    feature-type          = iv_slug.
    feature-max_results   = 1.
    APPEND feature TO request-features.
    APPEND request TO requests.
    overall_request-requests = requests.

    lo_descr ?= cl_abap_typedescr=>describe_by_data( overall_request ).
    lv_request_json = /ui2/cl_json=>dump( data        = overall_request
                                          type_descr  = lo_descr
                                          pretty_name = abap_true ).

    cl_http_client=>create_by_url(
      EXPORTING
        url    = lv_url
      IMPORTING
        client = lo_http_client ).

    lo_http_client->request->set_method( method = 'POST' ).
    lo_http_client->request->set_content_type( content_type = 'application/json' ).
    lo_http_client->request->append_cdata2( EXPORTING data = lv_request_json ).
    lo_http_client->send( ).
    lo_http_client->receive( ).
    lv_response_data = lo_http_client->response->get_cdata( ).

    " Crude extraction of the "description" value from the JSON response
    IF iv_slug = 'LOGO_DETECTION'.
      lv_end_marker = '"score":'.
    ELSE.
      lv_end_marker = '"boundingPoly":'.
    ENDIF.
    SEARCH lv_response_data FOR '"description":'.
    lv_start = sy-fdpos + 16.
    SEARCH lv_response_data FOR lv_end_marker.
    lv_end = sy-fdpos.
    lv_total_chars = lv_end - lv_start.

    ls_visiondemo-id          = 1.
    ls_visiondemo-description = lv_response_data+lv_start(lv_total_chars).

    copy_data_to_ref( EXPORTING is_data = ls_visiondemo
                      CHANGING  cr_data = er_entity ).
  ENDMETHOD.


Note the following about this code snippet:

  • I’m using the IV_SLUG parameter to control what kind of request (logo or text detection) I’m making to Google. This means using the “slug” header in an HTTP request, which I’ll show you below.
  • Google expects picture data to be base64 encoded, so the FM SCMS_BASE64_ENCODE_STR handles that for us.
  • Get your own API key - the string at the end of my URL will not work for you. Replace GET_YOUR_OWN_KEY with your actual key.
  • There are a number of ways to handle JSON type data in ABAP. I used the /ui2/cl_json method purely out of simplicity for a demo. For a more robust solution see how to use JSON with the XML parsing tools.
  • There is basically no error handling here. That’s the great thing about prototyping.
  • I know the way I pull the description out of the response is a total hack.
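For reference, the JSON body this code produces follows the Vision API's images:annotate request format (content truncated here; /ui2/cl_json's pretty_name option renders MAX_RESULTS as maxResults):

{
  "requests": [
    {
      "image": { "content": "<base64-encoded image data>" },
      "features": [ { "type": "TEXT_DETECTION", "maxResults": 1 } ]
    }
  ]
}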

 

 

Try It Out

The easiest way to try this out is through the Gateway Client (/iwfnd/gw_client). Here’s how:


Navigate to /iwfnd/gw_client on your Gateway system. Enter the request parameters as seen here (assuming you’ve named things the same that I have):


gw_client setup.PNG


The two possible values I verified for “slug” are TEXT_DETECTION and LOGO_DETECTION - though the API supports many more than that.


Next, put a picture in the body of the request by clicking “Add File”. If you choose TEXT_DETECTION as the slug, then make sure your image actually has text. Here’s what the result looks like if I put in a picture of my business card. Look at the “Description” field in the right hand text (and note that Google automatically puts in newline characters if there are line breaks in the picture):


result from business card.PNG



And check it out if I put in a logo with the LOGO_DETECTION slug parameter (“Skunk Works” is the right answer for this picture):


logo detection result.PNG


Wrap It Up, Paul

So I’ve proved out that I can use the Google Cloud Vision API in conjunction with SAP Gateway - but I haven’t really done anything truly useful. However, I have some really exciting ideas for this and can’t wait to continue using these mega-powerful cloud services! I hope that my small example helps someone else dream big and make something amazing.

ODATA SERVICE FOR PURCHASE ORDER using RFC


Let's discuss in detail the steps to build an OData service for purchase orders using Gateway.

Step 1. Open transaction SEGW and create a project as shown below.

1.png

Give the details as shown below.

2.png

Step 2. Create the first entity by importing an RFC interface. For this right-click on Data Model and choose Import -> RFC/BOR Interface.

3.png

Step 3. Enter the following values in the wizard and then choose next:

 

Entity Type Name: PurchaseOrder
Target System: Local
Data Source Type: Remote Function Calls
Data Source Name: bapi_po_getdetail

4.png

Step 4. Expand the PO_HEADER node and select the following fields:
PO_NUMBER, COMPANYCODE, DOC_CAT, DOC_TYPE, STATUS, VENDOR, PURCH_ORG, PUR_GROUP. Choose Next.



5.png

Step 5. In the first line, PO_NUMBER, select the field Is Key and choose Finish:

6.png

Step 6. Create the second entity, again by importing an RFC interface. Right-click Data Model and choose Import -> RFC/BOR Interface.

7.png

Step 7. Enter the following values in the wizard and choose next:

 

Entity Type Name: PurchaseOrderItem
Target System: Local
Data Source Type: Remote Function Calls
Data Source Name: BAPI_PO_GETITEMS

8.png

Step 8. Expand the PO_ITEMS node and select the following fields:
po_item, material, pur_mat, mat_grp, net_price, price_unit, disp_quan. Choose Next.

     9.png

10.png

Step 9.Now our project has 2 entities – one for the Purchase Order and one for the Purchase Order Item. As a next step we create entity-sets out of these entities. Expand the node Data Model and double-click Entity Sets:

Name: PurchaseOrderSet - Entity Type Name: PurchaseOrder
Name: PurchaseOrderItemSet - Entity Type Name: PurchaseOrderItem

11.png

Step 10. Now the basic definition of the model is done. As a next step we can generate the necessary runtime artifacts.

a.     Click on the Generate pushbutton:

b.     Leave the default values and choose Enter:

12.png

Please note the Technical Service Name ZPURCHASEORDER_SRV is equal to the External Service Name required to consume this service later on. 

c. Choose Local Object.

d. The runtime objects have now been generated successfully.

Step 11. Now we can register and activate the service.

a. Double-click Service Maintenance

13.png

b. Select system EC7 and click on the Register button. Please note that the entries listed here depend on the System Alias configuration you have done in the SAP NetWeaver Gateway Implementation Guide (IMG). In a locally deployed environment (backend and hub components deployed on the same box) you might also find "LOCAL" with the RFC destination "NONE" here.

14.png

c. Confirm the warning message displayed in the popup: click Yes.

d. Press F4 to select the system alias. Select LOCAL from the input help.

e. Confirm the Select System Alias popup: click OK.

f. Leave the default values, enter $tmp as the package and choose Enter:

15.png

The External Service Name is defaulted with the Technical Service Name from the Generation Step

g. Verify that the service has been registered and activated successfully:

16.png

Step 12. Now we can run our service for the first time. Please note that we've only maintained the basic model data so far; as a consequence we can access the metadata of the service, but no business data yet.

a. Open a new window and start transaction /IWFND/GW_CLIENT.

b. Enter the URI /sap/opu/odata/sap/ZPURCHASEORDER_SRV/$metadata and choose Execute.

17.png

Step 13: ZPURCHASEORDER_SRV is the External Service Name that was registered before.

We have created a Service Builder project with two entities and two entity sets. We have generated the runtime artifacts and registered and activated our OData service.

Step 14. Now we will map the data provider to bring life into our OData service.

(i) Get the PO header data.

  • We will start with the Read method for the PurchaseOrderSet entity set. Expand the node Service Implementation -> PurchaseOrderSet, right-click GetEntity (Read) and select Map to Data Source:

18.png

  • In the Map to Data Source window, enter the following values and choose Enter:

Target System: Local
Data Source Type: Remote Function Call
Data Source Name: bapi_po_getdetail

19.png

  • Map the output parameters by dragging the fields from the model on the right side. Also create an input parameter for passing the PO number to the RFC. Choose Enter:

The mapping will look like this:

20.png

Then Save.

We are done with getting the PO header details.

(ii) Now we need to get the line items for a given purchase order number.

For this, we will implement the corresponding operation of the entity set PurchaseOrderItemSet.

  • We will start with the Query method for the PurchaseOrderItemSet entity set.
  • Expand the node Service Implementation -> PurchaseOrderItemSet, right-click GetEntitySet (Query) and select Map to Data Source:

21.png

  • Provide the data source name BAPI_PO_GETITEMS and the data source type Remote Function Call.
  • Map the output parameters by dragging the fields from the model on the right side. Also create an input parameter for passing the PO number to the RFC. Choose Enter:

22.png

  • Regenerate the artifacts.

Testing the Service:

Select the registered service and click on Gateway Client.

23.png

Sample Case 1: To get the PO header data, enter

/sap/opu/odata/SAP/ZPURCHASEORDER_SRV/PurchaseOrderSet('3000000004') and click on Execute.

24.png

25.png

Sample Case 2: To get the PO item data, enter /sap/opu/odata/SAP/ZPURCHASEORDER_SRV/PurchaseOrderItemSet?$filter=PoNumber eq '3000000004' and click on Execute.

26.png

27.png

Ever wanted to know why SAP uses OData?

SAP Gateway and OData in a nutshell


The following video on YouTube provides a nice and comprehensive high-level overview of SAP Gateway and OData.

 

So if you want to explain to somebody what SAP Gateway and OData are in just 1:45 minutes, you can share this link.

 

SAP Gateway and OData - YouTube

 

Best Regards,

Andre

Consume Odata Service in ABAP CL_HTTP_CLIENT->CREATE_BY_DESTINATION


This blog would not have been possible but for the awesome community that is SCN. I would like to thank every contributor ever for helping me out in my hours of need!

 

I had tremendous help in navigating through my current requirement thanks to the below blog posts.

How to consume an OData service using OData Services Consumption and Integration (OSCI)

Thank you, Andre Fischer.

 

Consuming an External RESTful Web Service with ABAP in Gateway

And Paul J. Modderman

 

Both of these blogs helped me understand the intricacies of consuming an OData service.

This blog can be considered an extension of the blog by Paul J. Modderman.

 

We can consume an OData service by using the method CREATE_BY_URL of the class CL_HTTP_CLIENT, but when authentication is involved this method is ill suited.

 

The CREATE_BY_DESTINATION method, however, enables us to store the credentials in a more standard and secure fashion.

 

The Requirement:

I needed to access an OData service that was exposed by HANA. We had to trigger this service from the ECC system and process the result.

The user would be logging in to ECC directly and not via the portal, so the login credentials have to come from an RFC destination.

 

The Process:

Step 1:

We have to create the RFC connection in SM59 as follows.

 

  1. Go to transaction SM59.
  2. Click on the create new connection button.
  3. Provide a description.
  4. Choose connection type G.
  5. Enter the host name and port number (an OData service URL generally contains ".com" followed by ":" and a port number):
    1. The part up to the ".com", without the "http://", is the host name.
    2. The part after the ":" is the port number.
  6. Enter any proxy details if a proxy is being used (we were not using one in my case).
  7. Go to the Logon & Security tab:
    1. Choose the Basic Authentication radio button.
    2. Enter the logon credentials.
  8. Save and click on connection test.
  9. If all settings are correct, you should see a webpage relevant to the host you entered in the Response Body/Response Text tabs.

 

Step 2.

 

Now that we have created the RFC connection, we proceed to creating the HTTP client.

 

To create the client, we use the following code (originally attached as CL_HTTP_CLIENT.txt).

 

 

DATA: lo_http_client TYPE REF TO if_http_client,
      lv_destination TYPE rfcdest, "name of the RFC destination created in SM59
      lv_uri         TYPE string,  "path and query string of the service
      lv_service     TYPE string,
      lv_result      TYPE string.

"xml variables
DATA: lo_ixml            TYPE REF TO if_ixml,
      lo_streamfactory   TYPE REF TO if_ixml_stream_factory,
      lo_istream         TYPE REF TO if_ixml_istream,
      lo_document        TYPE REF TO if_ixml_document,
      lo_parser          TYPE REF TO if_ixml_parser,
      lo_weather_element TYPE REF TO if_ixml_element,
      lo_weather_nodes   TYPE REF TO if_ixml_node_list,
      lo_curr_node       TYPE REF TO if_ixml_node,
      lv_value           TYPE string,
      lv_node_length     TYPE i,
      lv_node_index      TYPE i,
      lv_node_name       TYPE string,
      lv_node_value      TYPE string.

************************************************************************
* lv_destination is the name of the RFC destination we created in SM59
************************************************************************
CALL METHOD cl_http_client=>create_by_destination
  EXPORTING
    destination              = lv_destination
  IMPORTING
    client                   = lo_http_client
  EXCEPTIONS
    argument_not_found       = 1
    destination_not_found    = 2
    destination_no_authority = 3
    plugin_not_active        = 4
    internal_error           = 5
    OTHERS                   = 6.
IF sy-subrc <> 0.
* MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
*            WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
ENDIF.

*************************************************************************************************
* We need to build the URI - the part of the OData service URL that comes after the port number.
* This includes the path and query string for the service being called on the host.
* lv_uri holds the path and query string.
*************************************************************************************************
CALL METHOD cl_http_utility=>set_request_uri
  EXPORTING
    request = lo_http_client->request
    uri     = lv_uri.

"Send the request before receiving the response
lo_http_client->send(
  EXCEPTIONS
    http_communication_failure = 1
    http_invalid_state         = 2
    http_processing_failed     = 3 ).

lo_http_client->receive(
  EXCEPTIONS
    http_communication_failure = 1
    http_invalid_state         = 2
    http_processing_failed     = 3 ).

**************************************************
* Making sense of the result - parsing the XML
**************************************************
lv_result = lo_http_client->response->get_cdata( ).

lo_ixml = cl_ixml=>create( ).
lo_streamfactory = lo_ixml->create_stream_factory( ).
lo_istream = lo_streamfactory->create_istream_string( lv_result ).
lo_document = lo_ixml->create_document( ).
lo_parser = lo_ixml->create_parser(
              stream_factory = lo_streamfactory
              istream        = lo_istream
              document       = lo_document ).

"This actually makes the XML document navigable
lo_parser->parse( ).

DATA: lv_name TYPE string.

"Navigate the XML to the nodes we want to process
*lo_weather_element = lo_document->get_root_element( ).
lv_name = 'content'.
lo_weather_element = lo_document->find_from_name( lv_name ).
lo_weather_nodes = lo_weather_element->get_children( ).

"Move through the nodes and assign appropriate values to export
lv_node_length = lo_weather_nodes->get_length( ).
lv_node_index = 0.

WHILE lv_node_index < lv_node_length.
  lo_curr_node = lo_weather_nodes->get_item( lv_node_index ).
  lv_node_name = lo_curr_node->get_name( ).
  lv_node_value = lo_curr_node->get_value( ).
  ADD 1 TO lv_node_index.
ENDWHILE.



Hope this helps! Let me know if I can clarify further.


Peace!!!!


Code snippet for adding annotations to your Gateway Service


Now that annotations are making UI5 development easier through Smart controls, it is important to learn how to add these annotations to your service. SEGW does not yet allow you to add most annotations. Until SEGW provides that feature natively, here is how you can do it in code.

 

Step 1. Go to your MPC_EXT class.

 

Step 2. Redefine the DEFINE method.

 

Step 3. Add the following code:

 

    DATA: lo_entity_type TYPE REF TO /iwbep/if_mgw_odata_entity_typ,
          lo_property    TYPE REF TO /iwbep/if_mgw_odata_property,
          lo_annotation  TYPE REF TO /iwbep/if_mgw_odata_annotation.

    super->define( ). "Ensure you call the parent metadata

    lo_entity_type = model->get_entity_type( iv_entity_name = 'EmpDetail' ). "Your entity name

    lo_property = lo_entity_type->get_property( iv_property_name = 'DateOfHire' ). "Property inside your entity

    lo_annotation = lo_property->/iwbep/if_mgw_odata_annotatabl~create_annotation( /iwbep/if_mgw_med_odata_types=>gc_sap_namespace ). "SAP's annotation namespace

    lo_annotation->add( iv_key = 'display-format' iv_value = 'Date' ). "The specific annotation you want to add

 

This will result in:

Annotation.PNG
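
Concretely, the property in the service's $metadata document then carries the annotation as an attribute, roughly like this sketch (the Edm type shown is an assumption; it depends on how the property is modeled):

<Property Name="DateOfHire" Type="Edm.DateTime" sap:display-format="Date"/>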

Real life example of upgrading a 3-tier NetWeaver Gateway landscape


Introduction

 

This blog shares our experience of upgrading a 3-tier NetWeaver Gateway landscape, pointing out the challenges we faced and how we were able to solve them.

 

Source Landscape

 

Existing 3-tier landscape with NetWeaver Gateway 2.00 SP 07 (NetWeaver 7.31 SP 09) in a central hub deployment model. Applications are connecting to a CRM 7.0 EHP1 system with NetWeaver Gateway 2.00 SP 07 backend components.

 

Target Landscape

 

Existing 3-tier landscape with SAP Gateway 7.40 SP 13 in a central hub deployment model. No changes to backend systems.

 

Upgrade Strategy

 

Execute an in-place upgrade, starting with sandbox systems, which are recent copies of the respective production systems.

 

Initial Plan

 

According to SAP Note 1830198, systems can be upgraded independently: upgrading Gateway doesn't require upgrading the backend components of the connected backend systems, assuming sufficient SP levels exist in both the Gateway and the backend systems. In our case we met the SP level requirements, so the plan was not to upgrade the backend components.

 

Challenges

 

As soon as we had upgraded the sandbox, we realized that our existing Gateway services no longer worked. More specifically, none of the services leveraging filter functionality worked. In addition, currency handling that used to work no longer did.

 

Troubleshooting

 

Debugging the filter problem, we found that the passing of filter values was broken by the upgrade: values were being truncated. Looking into it in detail, we found SAP Note 2205402, which we applied on both the Gateway system and the backend system, as instructed by the note. This, however, wasn't sufficient. Since the corrections are only partly contained in 7.40 SP 13, we also had to implement SAP Notes 2232883 and 2241188 on the Gateway system. Even that wasn't sufficient; we also had to implement SAP Note 2245413 in the backend system.

 

Applying the SAP Notes fixed the issues with filter functionality. The issue with currencies is explained in SAP Note 2028852; we chose to change the applications in order to avoid the decimal problems described in that note.

 

New Plan

 

In order to apply the SAP notes required to fix the issues with filtering, we had to also update the backend components to 2.00 SP 11. The new plan is to execute the in-place upgrade of NetWeaver Gateway and update the backend components.

 

Conclusion

 

I'm sure breaking compatibility or interoperability wasn't SAP's intention, but it happened. I have contacted SAP Gateway Product Management but haven't yet been given an official explanation. In our case, a simple technical upgrade of NetWeaver Gateway turned into a full-fledged upgrade project.

 

Takeaways

 

Take everything with a grain of salt; even official information can't always be trusted. Test and validate everything yourself, preferably in a sandbox environment.

 

We are currently executing the new plan in our development landscape. I will update this blog should we run into other issues.

Slacking Off (1 of 3)


(This is part 1 of a 3 part series. See Part 2 and Part 3 for the whole story.)

 

Did you ever have a late night instant message conversation that went something like this:

 

Screen Shot 2016-03-18 at 12.55.42 PM.png

 

It’s no fun to be in that conversation. You know you’re stuck sitting in front of a screen for at least the next 10 minutes. And since it’s your work laptop you know that the corporate internet police won’t even let you browse reddit for cat pictures while you wait for VPN and SAP GUI to load up. More so, you know that whatever this person is yelling about is probably not your fault.

 

I’ve been there, trust me.

 

What if your conversation could look like this, instead:

 

Screen Shot 2016-03-18 at 12.59.54 PM.png

 

Did you notice Commander Data interject in that exchange? More on that later.

 

As nerds, our jobs often involve performing routine technical tasks for people who use our systems. Maybe you reset a lot of passwords, check the status of integrations, or respond to a support inbox. You probably have loads of different communication tools at your disposal. Chat, email, carrier pigeons…whatever gets the job done. If someone needs your help they'll generally find a way to get in front of you. Digitally or otherwise.

 

One of the coolest communication tools I’ve worked with in the last couple years is Slack. It’s got individual conversations, group chats, categories, and anything you’d expect from a team chat tool. It’s quickly overtaken email as my preferred method of talking with colleagues.

 

Except it’s way more than chat. Slack allows you to use pre-built integrations to look at your Jira tasks, GitHub commits, and thousands of other things. What’s even better: you can make your own integrations that interact with its web API. Which makes it the perfect thing to plug into your SAP Gateway to make use of the REST APIs you’ve already created for other purposes.

 

In my next couple posts, I’ll show you how to make exactly what I did above using (nearly) free tools.

 

Slack Setup

If you're not using Slack already, you can get a free account. It's very simple and straightforward. Once you've got an account, follow these steps to set up the Slack end of this chain:

 

  • I set this up as a Slash Command. That's where the "/ask-sap" piece comes from in my chat transcript above. Go here to set a new one up for yourself.
  • On the next screen, choose the command that you want to use, starting with a '/' character. You can use /ask-sap if you want to stay perfectly within the tutorial, since these custom commands are for your own Slack team only.
  • Click the "Add integration" button.
  • Screen Shot 2016-03-18 at 1.36.36 AM.png
  • On the next page, pay close attention to the "Outgoing Data" and the "Token" sections. You may want to copy or screenshot them for reference later.
  • The only thing that absolutely has to be filled in is the URL field. This has to be an https destination that you own or have control of, and it has to be an endpoint programmable by you. Post 2 of 3 in this series will show you how to set that up in Google App Engine - but you could realistically do it anywhere you have the rights to set up programs to run on a web server.
  • Click "Add integration" at the bottom of the page, filling out whatever else you want to along the way. I suggest at least showing your command in the autocomplete list.

 

What you just did sets things up so that Slack will respond to any message that starts with "/ask-sap" by sending an HTTP POST to the URL you provided in the settings. The format of the POST will match the "Outgoing Data" section you saw during setup. For this demo, the most important pieces are the token and text fields.
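
As a rough sketch, the form-encoded body of that POST looks something like the following (all values here are illustrative placeholders, not real Slack data):

token=<your-slack-token>
team_id=T0001
channel_name=general
user_name=picard
command=/ask-sap
text=test RFC MY_DEST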

 

That's it! You now have a Slash Command available in any of your Slack channels. It won't do anything yet, but that's what we'll set up in the next section.

 

On to Part 2!

Slacking Off (2 of 3)


(This is Part 2 of a 3 part series. See Part 1 and Part 3 for the whole story.)


In Part 1, we got Slack up and running with a Slash Command that will send an HTTP POST to a specified URL endpoint. In this part, I'll show you how to set up a basic Google App Engine web server in Python to respond to this HTTP POST and format a request for SAP Gateway. From Gateway, we'll output the data the request asks for and send it back to Slack. I won't cover all the features of App Engine exhaustively - this is an SAP blog, after all - but I'll provide sample code, links to how-tos, and some tricks I learned along the way. The amazing thing is that a super basic implementation is only about 40 lines of Python code!

 

 

Setup

  • You'll need a Google account (if you have a Gmail address, you're good to go). I like using an IDE like Eclipse with PyDev installed, but if you are a complete notepad.exe ninja then go for it.
  • You'll need to secure a domain name for yourself, or have rights to one. Google, again, has an easy way to do this.
  • You'll also need to get SSL set up for that domain, which you can do for 90 days free at Comodo.
  • Once you have the cert, you can apply it to your Google Domain like this.

 

Now you're ready to code! The easiest way to set up a project for App Engine is to do the play-at-home 5-minute version. This will get you a project set up, the right deployment tools installed, and a project folder ready to go. Try it out and test it a few times.

 

Once you're comfortable with how that works, you can simply replace the code files with code I'll provide below. Note that there are several places in the code where I've put some angle brackets with comments - this is where you'll need to fill in your own solution details. My meager programmer salary won't cover a giant hosting bill because everyone copies my domain/settings and sends all their messages through my server.

 

First, replace the contents of your app.yaml file with this code:

 

application: <your-google-app-id>
version: 1
runtime: python27
api_version: 1
threadsafe: true

handlers:
- url: /.*
  script: main.app

 

 

Very straightforward, not much to comment on here. Just remember to replace the app-id section at the top.

 

Next, create a file called main.py (or replace the contents of the existing one) with this code:

 

import webapp2
import json
from google.appengine.api import urlfetch


class SlackDemo(webapp2.RequestHandler):
    def post(self):
        sap_url = '<your-sap-gateway>/ZSLACK_DEMO_SRV/RfcDestinationSet'
        json_suffix = '?$format=json'
        authorization = 'Basic <your-basic-credentials>'
        slack_token = '<your-slack-token>'
        request_token = self.request.get('token')

        # Reject requests that don't carry our slash command's token
        if slack_token != request_token:
            self.response.headers['Content-Type'] = 'text/plain'
            self.response.write('Invalid token.')
            return

        text = self.request.get('text')
        details = {}
        response_text = ''  # initialize up front so both branches below can append to it

        if text.find('shout') > -1:
            details['response_type'] = 'in_channel'

        if text.find('test') > -1:
            rfc_destination = text.split()[-1]
            request_url = sap_url + "('" + rfc_destination + "')" + json_suffix
            headers = {}
            headers['Authorization'] = authorization
            response_tmp = urlfetch.fetch(url=request_url,
                                          headers=headers,
                                          method=urlfetch.GET)
            response_info = json.loads(response_tmp.content)
            response_text += 'Sensor sweep indicates the following:\n'
            response_text += response_info['d']['Destination'] + ' - '
            response_text += response_info['d']['ConnectionStatus'] + ' - '
            response_text += str(response_info['d']['ConnectionTime']) + ' ms response'
        else:
            response_text += "I'm sorry, Captain, but my neural nets can't process your command."

        details['text'] = response_text
        json_response = json.dumps(details)
        self.response.headers['Content-Type'] = 'application/json'
        self.response.write(json_response)


app = webapp2.WSGIApplication([
    ('/slackdemo', SlackDemo),
], debug=True)

 

 

I'll do a little explaining here.

  • We'll set up ZSLACK_DEMO_SRV in the next post, part 3.
  • To use Basic authentication, you'll need to take some credentials with access to your SAP Gateway and turn them into base64-encoded characters. One easy way is to bring up the Chrome javascript console (ctrl-shift-j), type "btoa('USERNAME:PASSWORD')", and take the resulting string. Obviously use a real user and password here.
  • Take the slack_token value from the screen where you set up your Slack slash command in part 1.
  • The app configuration at the bottom means you should configure Slack to send its commands to https://<your-domain>/slackdemo. Change that to whatever you like.
  • We treat the word 'shout' as an instruction to post the result to the whole chat window. Otherwise the command responds only to the person who sent it, and others won't see it. A sketch of the JSON we send back to Slack follows this list.
  • We look for the word 'test' as the key to actually invoke our functionality. If we don't find it, Commander Data responds with his polite apology.
  • We find the name of the RFC destination by splitting the command into words and taking the last word. Python has a nice little syntax for lists where index [-1] is the last element; text.split()[-1] does this for us.
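
For reference, the JSON this handler writes back to Slack looks roughly like this (values illustrative):

{
    "response_type": "in_channel",
    "text": "Sensor sweep indicates the following:\nMY_DEST - OK - 42 ms response"
}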

 

Build the project and deploy it to the web site you're using. Now we're ready to create the Gateway service that will do the simple RFC test that Commander Data did in part 1.

 

Off to part 3!

Slacking Off (3 of 3)


(This is Part 3 of a 3 part series. See Part 1 and Part 2 for the whole story.)


In the last 2 posts we paved the way to get some data out of SAP from Slack. First, we set up Slack to send out a request when a user enters a Slash Command. Then, Google App Engine handles that request and forwards it to Gateway. Now Gateway needs to respond back to Google with the RFC connection test that the Slack user asked for.


Here's a simple OData service setup that will test an RFC connection on the ABAP system. My intention is to inspire you to build other cool solutions - I'm just setting this up quick-n-dirty style to explain the concepts. Take this and make something that works for you!


Go to SEGW and create a service. I called mine ZSLACK_DEMO. Here's an example setup of the fields for an entity called RfcDestination:


segw for slack service.PNG


Then code up the RFCDESTINATIONSE_GET_ENTITY method in the generated class ZCL_ZSLACK_DEMO_DPC_EXT (assuming you kept the same names I used). Make sure you generate the project first, and then do the redefinition process for the method I mentioned. Here's a great document on setting up class-based Gateway services that goes more in-depth.


Here's a simple implementation of an RFC ping method that matches up with the service we created.


   METHOD rfcdestinationse_get_entity.
     DATA: lv_start   TYPE i,
           lv_end     TYPE i,
           lo_ex      TYPE REF TO cx_root,
           lv_rfcdest TYPE rfcdest,
           ls_key_tab LIKE LINE OF it_key_tab.

     "Read the requested destination name from the entity key
     READ TABLE it_key_tab INTO ls_key_tab WITH KEY name = 'Destination'.
     IF sy-subrc IS INITIAL.
       lv_rfcdest = ls_key_tab-value.
     ENDIF.

     er_entity-destination = lv_rfcdest.

     TRY.
       "Time the ping; GET RUN TIME returns microseconds
       GET RUN TIME FIELD lv_start.
       CALL FUNCTION 'RFC_PING' DESTINATION lv_rfcdest
         EXCEPTIONS
           system_failure        = 1
           communication_failure = 2
           OTHERS                = 99.
       GET RUN TIME FIELD lv_end.

       IF sy-subrc IS INITIAL.
         er_entity-connection_status = 'OK'.
         er_entity-connection_time = ( lv_end - lv_start ) / 1000. "milliseconds
       ELSE.
         "Fetch the RFC error text for the failed ping
         CALL FUNCTION 'TH_ERR_GET'
           IMPORTING
             error = er_entity-connection_status.
       ENDIF.

     CATCH cx_root INTO lo_ex.
       er_entity-connection_status = lo_ex->get_text( ).
     ENDTRY.
   ENDMETHOD.


Maybe not production quality, but ready to do the trick. For a good connection, it will give you an OK ConnectionStatus and a number in milliseconds for the response time. For a bad connection, it will respond with the RFC error in the ConnectionStatus field. Our Google App Engine web server receives this and plugs it into a text response to Slack. When Slack receives the response, it puts the text into the chat window for the user who requested it.
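
For reference, the OData JSON response that the App Engine code from part 2 parses looks roughly like this (values illustrative, using the $format=json request):

{
    "d": {
        "Destination": "MY_DEST",
        "ConnectionStatus": "OK",
        "ConnectionTime": 42
    }
}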


Wrapping Up

Assuming all the pieces of the chain have been done correctly, you can activate your slash command. Try it with something like "/ask-sap shout out a test of RFC <your_destination>". If you're all set, a response from SAP will shortly appear in the chat window.


This was a very simple prototype implementation - but there are so many things you could do! I'll leave you with a brain dump of ideas to inspire you beyond my work.

  • Set up a batch program that looks for new work items for people and sends them a Slack message. Wouldn't it be cool to get a digest of everything I need to approve in one private chat window every morning? Check out Incoming Webhooks for more possibilities here.
  • Incoming webhooks would even be a simple way to enable some basic real-time notifications. User-exits or enhancement points could be created with knowledge of the webhooks and respond right away to almost any ABAP event (see the sketch after this list).
  • Plug some of your master data creation processes (assuming they're simple enough) into a command. Imagine "/sap-create new business partner PAUL MODDERMAN <other field> <other field>".
  • Slack is cloud-based and doesn't require any complex VPN setup. The Slack app on your smartphone would be an easy way to enable your processes for quick response.
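
As a taste of the incoming-webhook idea, here is a minimal ABAP sketch that posts a message to a Slack incoming webhook. The webhook URL is a placeholder you would get from Slack when creating the webhook, and error handling is omitted:

DATA lo_client TYPE REF TO if_http_client.

"Incoming webhooks accept a JSON POST with a 'text' field
cl_http_client=>create_by_url(
  EXPORTING
    url    = 'https://hooks.slack.com/services/<your-webhook-path>'
  IMPORTING
    client = lo_client ).

lo_client->request->set_method( if_http_request=>co_request_method_post ).
lo_client->request->set_content_type( 'application/json' ).
lo_client->request->set_cdata( '{"text": "You have a new work item waiting for approval."}' ).

lo_client->send( ).
lo_client->receive( ).
lo_client->close( ).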

 

Go make cool stuff!
