Service Virtualization

  • 1.  How to deal with APIs with large numbers of array elements containing structures with variable arrays (of structures), with optional fields

    Posted Jan 06, 2016 10:17 AM

    I thought I had asked this question more directly, but I can't find it. This applies to both REST/JSON and SOAP-based protocols.

     

    In many of our APIs, there is an array of structures (think someone's address). The structures have optional fields (like second and third street address lines). The minimum fields are zip and countrycode, but an element could have 10 fields, or 2, or any combination.

    The API supports 1 to 50 of these structures in the array.
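
    For example (field names made up, just to show the shape), two elements of the same address array might look like this:

        {
          "addresses": [
            { "zip": "10001", "countrycode": "US" },
            {
              "street1": "123 Main St",
              "street2": "Suite 400",
              "city": "Springfield",
              "zip": "62701",
              "countrycode": "US"
            }
          ]
        }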

     

    In another part of the request, there is an array of requested shipping services, which also have optional fields; each structure has 4 fields at minimum and 8 at maximum. No two array elements have to have the exact same layout.

     

    One of the features of these APIs is to 'validate/cleanse' addresses: if you didn't provide the street number, we find it and add it, or the 4-digit zip extension, or the city name, or... (so we want to use the input data as part/all of the output).

     

    Now, to record or build a service from RR pairs... but DT will flatten the structure into individual fields, with a naming convention based on each field's location in the structure and in the array.

     

    Because VSI matching uses all the field names (the signature) and the content (exact, following the field matching rules), there must be a signature for each possible combination of fields in order to cover all possible message contents

    (forget about matching the exact data; assume we use 'anything' for the match algorithm on fields, or just use the signature).

     

    Because of the size of the data packet, the number of fields gets into the many hundreds quickly.

     

    Because these APIs are under development and structure changes are frequent, this large text blob in the VSI for each transaction gets difficult to manage.

     

    How do we do this without writing code? (I can write my own match scripts if I turn off all the fields in the VSI transaction, and I can write our own matching logic and copy logic, which I have done for my REST/JSON simulator.)

     

    But the power of DT was that I didn't HAVE to write code to make this work. I just haven't figured out how to do it.



  • 2.  Re: How to deal with APIs with large numbers of array elements containing structures with variable arrays (of structures), with optional fields

    Posted Jan 06, 2016 01:50 PM

    Hi Sam,

     

    Maybe you can use the Request Data Manager data protocol handler to help you with this.

    https://docops.ca.com/display/DTS90/Request+Data+Manager+Data+Protocol

     

    Using this DPH you can add actions to Keep only the arguments that you need (it will ignore the others), or to Delete the optional arguments.

     

    Another option: after the VSI is created, you can use the "Mass Change" icon to perform a mass change of request arguments:

    https://docops.ca.com/display/DTS90/Transactions+Tab+for+Stateless+Transactions

     

    As an example, you can change the comparison operator to "anything" in all transactions that match a specific name or value.

     

    Hope it helps.

     

    Thank you,

    Heloisa



  • 3.  Re: How to deal with APIs with large numbers of array elements containing structures with variable arrays (of structures), with optional fields

    Posted Jan 06, 2016 07:41 PM

    Thanks, but neither of these really helps.

     

    The optional data matters when it's supplied, and doesn't when it's not supplied. But the two (hundred) variations would/should generate different matches, which would produce different results.



  • 4.  Re: How to deal with APIs with large numbers of array elements containing structures with variable arrays (of structures), with optional fields

    Posted Jan 14, 2016 09:48 AM

    Daniel_Bingham or Chris_Kraus - can you help with this question?



  • 5.  Re: How to deal with APIs with large numbers of array elements containing structures with variable arrays (of structures), with optional fields

    Posted Jan 21, 2016 05:49 PM

    If I understand your scenario correctly, it sounds like there is probably some consistent data in addition to the variable data?  A trivial example might be if there is a userID field that is always there and then the "addresses" array that varies.  You probably have more to your example than that, but hopefully that gives the basic idea.

     

    If my assumption about that is correct, you have what amounts to a few distinct usages for the input (request) data.  Namely:

    1. Some of it is necessary for matching and provides a unique "signature" for a type of transaction (i.e. all VerifyUserDetails calls have some basic sub-set of fields that are consistent, even if the address fields vary).

    2. Some of it helps determine the structure of the response (i.e. if three "addresses" came in the request, three should go out in the response)

    3. Some of it helps populate the response (i.e. the zip code in address 1 in the request needs to go into the zip code field in address 1 in the response)

     

    And, in particular, I suspect that you have this variability in the purpose of your input data specifically because the data you see at playback isn't the same as the data you captured during recording.  i.e. you may have recorded 5 "example" VerifyUserDetail transactions, but you actually have 200 users in your test environment and want the VS to "work" for any of them you want to use in a given test.

     

    This is important because it represents a philosophical difference between the historical guiding principles for the product and your current usage.  Historically the product was designed under the assumption that you would record the exact scenario(s) that you want to play back, and then you'd just play them back, verbatim, N number of times.  Obviously, if we see ALL of the permutations during recording, then these problems go away.  We already have all the various "signatures" that will happen along with their data, magic strings, etc and you can just go straight to playback.  Further, the assumption has been that these Virtual Services are "throw away" in the sense that if your scenarios change and the data we captured before no longer applies, you'll simply throw away that VS and record a new one for the updated regression suite.

     

    I mention that for historical perspective, not to imply that one way is "right" or "wrong".  We could probably have a very enjoyable and long debate about that if we wanted.  But that wouldn't help you right now.

     

    So the short answer is that you'll have to do some level of scripting because of that history.  We didn't build any features into the product to specifically help with what you are doing.  But perhaps by attempting to be a bit clever (reader gets to decide if I was successful in being clever), we could minimize how much scripting you need to do.  And/or make it such that maintaining that scripting is as easy as possible.

     

    Here is what I would do:

    1. Use SOAP or XML DPH to create arguments out of all the XML elements.

    2. Use the Request Data Copier DPH to copy all of the arguments into TestExec properties

    3. Use Request Data Manager to "keep" only the items that uniquely identify the signature (i.e. #1 above)

     

    Then, once the service is created, open the VSM and add a script step before the respond step.  In that script I would (a rough sketch follows the list below):

    1. Check to see if the transaction in flight is one of these "variable" ones.  If so:

    2. Count the number of "addresses" in the incoming request and

    3. Dynamically build a response chunk for those "addresses" including adding in any property references that are appropriate (i.e those "magic strings" you saved into testExec with Request Data Copier).

    4. Save that chunk of text into a testExec property with a known name (i.e. 'dynamicResponseChunk').
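
    To make that concrete, here is a rough sketch of what that script step could look like (BeanShell / Java syntax).  It assumes the Request Data Copier DPH has already copied every request argument into testExec properties, and it assumes a flattened naming pattern like "address_1_zip", "address_2_street2", etc. -- your DPH will produce its own names, so adjust accordingly.  The XML element names are also just placeholders.

        // Sketch only: the "address_N_*" property names and the XML element names are
        // assumptions; adjust them to whatever your DPH actually produces.
        StringBuilder chunk = new StringBuilder();
        int i = 1;

        // zip is mandatory in every address element, so use it to count how many addresses arrived
        while (testExec.getStateString("address_" + i + "_zip", null) != null) {
            chunk.append("<address>");
            chunk.append("<zip>{{address_" + i + "_zip}}</zip>");
            chunk.append("<countrycode>{{address_" + i + "_countrycode}}</countrycode>");

            // optional field: reference it if it was supplied, otherwise fall back to dummy data
            if (testExec.getStateString("address_" + i + "_street2", null) != null) {
                chunk.append("<street2>{{address_" + i + "_street2}}</street2>");
            } else {
                chunk.append("<street2>1 Example Way</street2>");
            }

            chunk.append("</address>");
            i++;
        }

        // the Respond step resolves the {{...}} references above when the response is built (see below)
        testExec.setStateValue("dynamicResponseChunk", chunk.toString());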

     

    Then, in the VSI, find the transaction in question and edit the response to put a reference to "{{dynamicResponseChunk}}" in the place where that dynamically generated chunk should go.
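
    For illustration (the element names are invented; only the property reference matters), the edited response body might end up looking like:

        <VerifyUserDetailsResponse>
           <status>OK</status>
           <addresses>
              {{dynamicResponseChunk}}
           </addresses>
        </VerifyUserDetailsResponse>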

     

    The "hardest" part to get both right and "maintainable" is going to be the script step.  Specifically, how to dynamically build that chunk and copy in the appropriate property references.  I do think it is possible to do dynamically and generically since you know how many addresses there are and the properties are named in a consistent manner with an appropriate index.  You should be able to check if a property exists with the expected name, if so, add a reference, if not, insert dummy data, a string generator pattern, or whatever.

     

    Once you've done those steps, the Respond step will (automatically) correctly resolve all the property references, string patterns, etc and build a complete, "correct" response to send back to the caller.

     

    I hope that helps.  And if you have specific ideas on how we could enhance the product to make this use case easier to accomplish, we'd love to have those in the "Ideas" section.

     

    Daniel



  • 6.  Re: How to deal with APIs with large numbers of array elements containing structures with variable arrays (of structures), with optional fields

    Posted Jan 25, 2016 09:23 AM

    Thanks.. I understand the DT objectives better now; as you've noted, they aren't the same as ours.

     

    I was trying to use the product functionality without resorting to scripting. I have a scripting solution that works, similar to but more powerful than the Data Driven solution,

    but like all locally developed solutions, long-term maintenance becomes an issue.

     

    Some design constraints:

     

    1. the API user knows nothing about DT, but does know Excel - similar to DT Data Driven

    2. the service knows nothing about the message content/design - it's all just text

    3. the person who configures the service operation only knows their data and a few (data mapping) rules.

    4. the person who configures the service operation has 100% control over the level of matching (including headers, parameters (in the URL or as query parms), and data), at any level, field or structure.

    5. we want to re-use this service unmodified, for many arbitrarily selected applications (why make 2 if 1 will work!)

    6. the service will do all the work of adjusting files to get the right info needed (YAML -> JSON for JSON Schema, or WSDL to XSD for XML Schema), so they don't have to.

     

    I think I have submitted Ideas on two key blockers:

    1. when dealing with array data, the algorithm that flattens the data uses different names for the 1st array element depending on whether there is 1 structure or more than 1, which makes it really hard to write generic code.

    2. the JSON schema checker step cannot take property syntax ({{}}) to identify the file/path to use for THIS message. Our developers have many different message formats based on the URL, all in the same service;

        using schema checking significantly reduces the workload of exact data matching (they can decide that if the message matches the schema, which supports optional parms nested 99 deep, then whatever it is is OK - see the example below).
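
    For example (reusing the hypothetical address fields from earlier in the thread), a minimal JSON Schema for one address element that tolerates the optional fields could look like:

        {
          "type": "object",
          "required": ["zip", "countrycode"],
          "properties": {
            "zip":         { "type": "string" },
            "countrycode": { "type": "string" },
            "street1":     { "type": "string" },
            "street2":     { "type": "string" },
            "city":        { "type": "string" }
          }
        }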



  • 7.  Re: How to deal with APIs with large numbers of array elements containing structures with variable arrays (of structures), with optional fields

    Posted Jan 28, 2016 04:56 PM

    Re-recording is the best answer, rather than trying to recreate the business logic in a script. Dynamically generated responses need a lot of maintenance (identifying the API that needs this customization is itself time consuming), which deviates from the purpose of using this tool in the first place.

     

    In better words, quoting Daniel:

     

    "This is important because it represents a philosophical difference between the historical guiding principles for the product and your current usage.  Historically the product was designed under the assumption that you would record the exact scenario(s) that you want to play back, and then you'd just play them back, verbatim, N number of times.  Obviously, if we see ALL of the permutations during recording, then these problems go away.  We already have all the various "signatures" that will happen along with their data, magic strings, etc and you can just go straight to playback.  Further, the assumption has been that these Virtual Services are "throw away" in the sense that if your scenarios change and the data we captured before no longer applies, you'll simply throw away that VS and record a new one for the updated regression suite."



  • 8.  Re: How to deal with APIs with large numbers of array elements containing structures with variable arrays (of structures), with optional fields

    Posted Jan 28, 2016 05:35 PM

    Thanks, but I cannot record: the application doesn't exist yet.