Testing Services provides a platform for QA professionals to discuss and gain insights into the business value delivered by testing, the best practices and processes that drive it, and the emerging technologies that will shape the future of the profession.

November 20, 2020

Making Enterprise Test Automation a Possibility using UiPath

Energizing the core of enterprise business with AI-driven automation is the key to success for all progressive businesses. Combining the power of Robotic Process Automation (RPA) and related cognitive capabilities is giving a new direction to enterprise test automation, and UiPath has been at the forefront of this journey. The new UiPath Test Suite (the advanced IDE StudioPro, Test Manager, and Test Robots) provides a platform for all enterprise testing needs, be it software testing or testing the RPA workflows themselves. Its current success gives us confidence that the future of enterprise test automation will be driven by RPA-based platforms.


August 26, 2020

Graph Analytics - The Science of Network Analysis

Gartner predicts that the adoption of graph analytics and graph databases will grow at 100% annually through 2022, continuously accelerating decision making and enabling more complex and adaptive data science. By 2023, graph technologies will facilitate rapid decision making in 30% of organizations worldwide.


In this blog, we will look at what graph analytics is, the types of graph analysis, and a few use cases.


A. What is Graph Analytics?


A graph represents relationships between entities. Graphs are made up of nodes or vertices (the entities) and edges or links (the relationships). Graphs provide a scalable and flexible platform for finding connections between data points or for analyzing data based on the strength of those relationships. Graph analytics is the methodology of analyzing relationships among entities to uncover insights that are difficult to surface with other techniques. This analysis is powered by graph algorithms running on graph databases. A minimal example follows the list below.


·         Graph analysis reveals how trends or data entities are related to each other


·         Graph models determine connectedness across data points to identify the nodes that generate the most activity or the largest sphere of influence


·         Graph databases provide expanded capabilities for storing, querying, and analyzing graph models


·         Generating dynamic graphs instead of static relational schemas helps uncover deeper insights
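
To make the node-and-edge terminology concrete, below is a minimal sketch using the open-source networkx Python library; the entities and relationship names are hypothetical.

```python
# A minimal sketch: entities as nodes, relationships as edges, using the
# open-source networkx library (pip install networkx).
import networkx as nx

g = nx.Graph()

# Nodes (entities) -- hypothetical customers and an account
g.add_node("Alice", kind="customer")
g.add_node("Bob", kind="customer")
g.add_node("ACC-001", kind="account")

# Edges (relationships) connecting the entities
g.add_edge("Alice", "ACC-001", relation="owns")
g.add_edge("Bob", "ACC-001", relation="authorized_on")

print(g.number_of_nodes(), g.number_of_edges())  # 3 2
print(list(g.neighbors("ACC-001")))              # ['Alice', 'Bob']
```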



B. Types of Graph Analysis


·         Path Analysis: Determines the relationship and the shortest path between two nodes. Typical applications include route optimization in supply chain and logistics (see the sketch after this list)


·         Connectivity Analysis: Analyzes the strength or weakness of the connection between two nodes; can be applied to identify weak points in networks


·         Centrality Analysis: Analyzes how important a node is to the connectivity of the network; can be used to identify the most influential people in a social network or to find high-traffic web pages


·         Community Detection: Uses the distance and/or density of relationships to identify communities and detect behavior patterns
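
Here is a toy illustration of path analysis and centrality analysis using networkx; the network and the distances are made up for the example.

```python
# Path and centrality analysis on a hypothetical logistics network.
import networkx as nx

g = nx.Graph()
g.add_weighted_edges_from([
    ("Depot", "CityA", 4), ("Depot", "CityB", 2),
    ("CityB", "CityA", 1), ("CityA", "CityC", 5),
    ("CityB", "CityC", 8), ("CityC", "Customer", 3),
])

# Path analysis: cheapest route from the depot to the customer
print(nx.shortest_path(g, "Depot", "Customer", weight="weight"))
# ['Depot', 'CityB', 'CityA', 'CityC', 'Customer']

# Centrality analysis: which nodes sit on the most shortest paths?
print(nx.betweenness_centrality(g, weight="weight"))
```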



C. Relevant Use Cases


·         Detecting financial crimes like money laundering or payments to prohibited entities

·         Identifying fraud in banking transactions, fraudulent insurance claims, and suspicious activities in telecommunications

·         Supporting regulatory compliance by tracking sensitive data lineage through enterprise systems

·         Optimizing airline routes, supply distribution chains, and logistics with graph algorithms

·         Supporting research in Life Sciences: medical investigation, vaccine development, disease pathology

·         Identifying social influencers and online communities in social media

·         Optimizing recommendation engines in eCommerce platforms through collaborative filtering, resulting in more personalized recommendations


Leading Graph Database Tools


Amazon Neptune, Neo4j, ArangoDB, DataStax, OrientDB, Titan


Role of Data Testing in Graph Analytics


Data validation becomes a vital cog in the successful implementation of graph processing across domains and industries. Data, both at rest and streaming, is extracted from traditional, cloud, and multi-cloud systems, then integrated and loaded into schema-free graph databases. Robust data validation techniques ensure data integrity and no data loss as data moves through these disparate systems. Analytics models built on top of graph databases also need to be validated to develop human-machine trust, and thorough data visualization testing needs to be conducted to ensure analysts make decisions based on accurate visualizations. A minimal reconciliation sketch follows.
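
As one example of such a validation, below is a minimal no-data-loss check, assuming a Neo4j target and the official Python driver (pip install neo4j); the node label, URI, credentials, and expected count are hypothetical placeholders.

```python
# Reconcile the source row count against the loaded node count.
from neo4j import GraphDatabase

EXPECTED_CUSTOMERS = 125_000  # row count captured from the source system

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "secret"))
with driver.session() as session:
    # Count the nodes loaded for one entity type
    record = session.run("MATCH (c:Customer) RETURN count(c) AS cnt").single()
    loaded = record["cnt"]
driver.close()

# No-data-loss check: loaded nodes must match the source rows
assert loaded == EXPECTED_CUSTOMERS, f"Data loss: {EXPECTED_CUSTOMERS - loaded} records missing"
```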


Conclusion


As data and devices become increasingly interconnected and sophisticated, it is indispensable to uncover relationships within data and derive insights from them, supporting analysts in making faster decisions. Graphs are suited to this because of their exceptional capability for revealing patterns in connected data. As organizations and enterprises continue leveraging the power of big data and analytics, the unique capabilities of graphs are essential to today's wants and tomorrow's triumphs.


May 19, 2020

Data Fabric - The Futuristic Data Management Solution

Global research and advisory firm Gartner has identified the top data and analytics trends for 2020, which will have significant transformative potential in the next two to five years, and Data Fabric is one of the most prominent among them. With the enormous growth of both structured and unstructured data from smartphones, IoT devices, and digital channels, there is a need to process large amounts of data, mine it, analyze it, and make it accessible. Data Fabric is an approach to managing and understanding large amounts of data as it traverses cloud systems.

In this blog, we will look at what a Data Fabric is, the Data Fabric stack, and a few use cases.

A. What is Data Fabric?

Data Fabric is a unified architecture, together with the set of data services running on it, that helps organizations manage their data across on-premise, cloud, and hybrid cloud systems. It is a single, unified platform for data integration that simplifies data management across platforms to accelerate digital transformation.

 

§  Connects to platforms using pre-packaged functions and connections

§  Integrates and manages data from on-premise and cloud environments

§  Support for batch and real-time data streams

§  Data Quality, Data Enrichment and Data Governance capabilities

§  Support for API development and integration

 

B. The Data Fabric Stack includes the following layers (a toy sketch of the data flow follows the list):


·        Data Collection & Storage: Ingest and integrate data, events, and APIs from any source, on-premise and in the cloud

·        Data Services: Manage several services at this layer including data governance, data protection, data quality and adherence to compliance standards

·        Transformation Layer: Involves cleaning and enrichment of batch and real-time data to enable informed decisions

·        Analytics/Sharing Layer: Realize data value by making it available internally and externally via self-service capabilities, analytic portals, and APIs
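
Below is a purely conceptual sketch of the four layers as Python functions; every function, rule, and record is hypothetical and stands in for real platform capabilities.

```python
# Toy end-to-end flow through the Data Fabric stack.

def collect(sources):
    """Data Collection & Storage: ingest records from any source."""
    return [rec for src in sources for rec in src]

def apply_services(records):
    """Data Services: enforce a quality rule (drop records missing an id)."""
    return [r for r in records if r.get("id") is not None]

def transform(records):
    """Transformation: clean and enrich each record."""
    return [{**r, "name": r.get("name", "").strip().title()} for r in records]

def share(records):
    """Analytics/Sharing: expose the curated data, e.g. via an API."""
    return {"count": len(records), "data": records}

on_prem = [{"id": 1, "name": " alice "}, {"id": None, "name": "bad row"}]
cloud = [{"id": 2, "name": "BOB"}]
print(share(transform(apply_services(collect([on_prem, cloud])))))
```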


C. Successful Use Cases

«  A leading global pizza company, with both delivery and carry-out operations, uses data fabric to maintain its competitive advantage. It allows pizza to be ordered from a plethora of devices including TVs, smartwatches, and smart cars, resulting in 25TB of data from 100,000 structured and unstructured data sources. Using data fabric, the company gathered and analyzed data from its POS systems, multiple supply chain centers, and digital channels including text messages, Twitter, and Amazon Echo

«  A leading pharma company applied AI to weed identification, enabling farmers to apply the exact solution needed to kill each weed species. It developed an app that used machine learning and artificial intelligence to match the photos farmers uploaded. This resulted in better choice of seed variety, better application of crop protection products, and optimal harvest timing

«  A leading insurance company uses data fabric to store and analyze claims data: claim reports, incident data, police reports, claim history, claim details, counterparty details, etc. This has helped settle claims faster, make policies more compelling, and price them competitively

Conclusion

In a world where technology is changing everyday lives, digital transformation tops the strategic agenda of most organizations and their leaders. To succeed in a digital transformation journey, data is the lifeline that enables new customer touch points, innovative business propositions, and optimized operations. Data fabric helps businesses achieve these by offering connectors for hybrid systems, advanced data integration capabilities, and analytical capabilities. Demand for data fabric will only get stronger as organizations embrace emerging technologies and new trends to stay competitive, stay relevant, and maintain their business edge.

May 13, 2020

'By failing to prepare, you are preparing to Fail' - Data Analytics and its role in dealing with Covid-19

For the past few months, the world has been fighting a battle to contain a little-known virus with no cure or vaccine in sight, one that has had a far-reaching impact on the world's healthcare and economic stability. Informed decisions are the need of the hour, and those decisions draw on data generated from feeds across the globe and across sectors. COVID-19 has brought to the fore the importance of Big Data Analytics.


September 16, 2019

UI vs UX: Revisiting the age-old debate

With technological advancement reaching the common man's hand in the 21st century, everybody expects to experience technology without spending much thought or time. Mobile and web consumers nowadays expect quick, consistent navigation and a seamless experience. Hence the growing emphasis on professional UI/UX design in software applications.

While we realize the immense importance of a visually appealing and user-friendly application experience, UI and UX are terms that are generally used interchangeably in the software world. As a matter of fact, the terms are closely linked in the software design landscape.

UI is not UX

By definition, UI or User Interface is the graphical layout of an application with which a user interacts. This includes buttons, input controls, screen layout, and every micro-interaction. UI designers create the look and feel of an application's user interface.

[Figure: UI as one element within the broader UX]

UX or User Experience determines how easy or difficult it is to interact with the user interface elements of the application. This is the main reason people generally confuse UI and UX and use the terms interchangeably.

Granted, it is completely fine to use UI/UX together in software design, wherein UX designers are also concerned with the application UI to ensure smooth navigation and a seamless application experience. However, it should be understood that UI is just one of the salient elements of UX, as shown in the figure. Designers work on both user interface and user experience design to build a customer-friendly application.

When it comes to application testing, UX/UI are mostly covered during the user acceptance testing phase of the SDLC. While teams do realize the importance of testing the UI early (along with functional testing) to avoid defects percolating into later tests, usability testing (UX and UI together) is generally scheduled after or alongside integration tests to accommodate application agility. As a result, teams end up doing highly expensive rework due to last-minute customer feedback on supposedly less important non-functional aspects like user interface and experience.

UI testing: Test early, test often

Validating seamless user experience may seem more relevant toward the end of the application testing process; however, validating the UI against interface, design, and navigation requirements needs to be taken up much earlier.

With the rise in customer-centric business requirements, it is prudent that UI tests be planned early and run repeatedly until all functional and non-functional requirements are met. UI automation scripts come in handy when planning repetitive tests like performance, load, and device/browser compatibility. Early performance or compatibility testing gives capacity planners and infrastructure architects early warning of potential scalability problems in the architecture. UI layout and navigation may be volatile during the early stages of development, so teams must carefully isolate application UI from functionality to enable independent tests and better results. Automation scripts should be used where functional or UI requirements are stable. QA techniques in early forms can be applied to usability or design testing even before the UI is integrated with functionality. Automated regression tests should be run as often as possible through the course of development, not just as part of final QA activities or just before system integration. A minimal early smoke-test sketch follows.
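
As an illustration, below is a minimal early UI smoke test using the Selenium Python bindings (pip install selenium); the URL and element locators are hypothetical.

```python
# Early UI checks that can run long before user acceptance testing.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    driver.get("https://app.example.com/login")

    # Basic layout and navigation validations
    assert driver.title, "Page title missing"
    assert driver.find_element(By.ID, "username").is_displayed()
    assert driver.find_element(By.ID, "password").is_displayed()
    assert driver.find_element(By.ID, "login-btn").is_enabled()
finally:
    driver.quit()
```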

A volatile UI may be a ticking time bomb toward the end of the application lifecycle, one that can adversely affect the user experience the application offers. It is thus wise to stub out non-functional testing, especially of the UI, for early defect detection and to avoid rework after integration tests.

Happy testing!

September 3, 2019

Winning the Test Automation game

Enough has been said about writing better tests, optimizing automation scripts, or planning test cycles. Teams can choose from a plethora of test accelerators available in the market, depending on the features and automation maturity offered. However, with frequent changes to application behavior and business requirements (and, in turn, to release planning, test maintenance, and test criteria), the selected automation tools are unable to keep pace as testing workflows change over time.

Hence it is prudent to consider maintainability during the engineering cycle of the automation solution[i]. Having said that, it is equally important for testing teams to lay out a viable plan with realistic automation goals and to accommodate incremental automation.

Lay out an Automation Roadmap

Project teams often run on the 'automate whatever possible' mantra and therefore end up addressing only the most pertinent challenges, settling for minimal automation and ignoring the potential troublemakers. Automated tests may work wonders for progression cycles; however, once teams get into regression, they start realizing the side effects of not having proper test maintenance in place. While they come up with corrective measures to improve regression planning, reuse automation tests, or even correct them, the overhead is tough to crack.

The major flaw lies in the haphazard adoption of automation in pockets, wherein the tools in use may address only a few aspects of the process workflow. The rest is carried out either manually or using other tools. Teams generally take the help of macros, client-side stored procedures, or scripts to make the different tools and manual processes work together. The lack of end-to-end workflow support leads to issues like flaky tests and unmanaged automation, and hence depleting ROI (see the earlier post, "Test Automation Debacles").

It is thus essential to plan test automation meticulously, with an incremental roadmap and test traceability as top priorities. Provisioning for incremental automation, such as new tools to address complex application features or the latest technologies for better coverage, will help testing teams achieve better results over time.

Pick the right ingredients

While the benefits of test automation are proven, with a variety of mature scripting tools available, testing teams are still struggling with test maintenance debacles. Correcting the approach may help automation teams engineer a sustainable framework, but to realize the best possible results, testing teams must use the right ingredients in the first place.

During the test planning phase, testing teams tend to focus more on tracking application changes to ensure the best possible test coverage. Meanwhile, automation test cases keep piling up for lack of regular optimization, with zero traceability and usability. Leveraging the relationships between test assets (test data, automation scripts) from previous cycles and application objects, functions, and their properties can augment automation planning and the self-healing of test scripts. A calculated impact analysis of application changes on the test scripts thus helps testing teams reuse, optimize, and write better automation tests. A minimal traceability sketch follows.
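
Below is a minimal traceability sketch: a mapping from application objects to the test scripts that touch them, so a change can be traced to the impacted tests. The object and script names are hypothetical.

```python
# Toy impact analysis over a test-asset trace matrix.
trace_matrix = {
    "LoginPage.submitButton": ["TC_Login_01", "TC_Session_03"],
    "CartPage.checkoutLink":  ["TC_Cart_02", "TC_Payment_01"],
    "CartPage.priceLabel":    ["TC_Cart_02"],
}

def impacted_tests(changed_objects):
    """Return the set of test scripts affected by the changed objects."""
    return {tc for obj in changed_objects for tc in trace_matrix.get(obj, [])}

# A release note says the cart page was reworked:
print(impacted_tests(["CartPage.checkoutLink", "CartPage.priceLabel"]))
# {'TC_Cart_02', 'TC_Payment_01'}
```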

Additionally, test data preparation in pockets also leads to unmanaged automation workflows. It is high time due diligence was put into preparing test data through appropriate channels. Automation tools for test data preparation and workflow management can be leveraged for test data traceability and consistency, especially in SIT environments.

Putting it all together

While the responsibility for providing a viable test automation solution lies with the automation engineering team, it is the implementation team that can make the best of the available offerings and make the test automation process a success.



[i] https://www.infosysblogs.com/testing-services/2019/05/building_a_sustainable_test_au.html

July 14, 2019

Building a Sustainable Test Automation Solution

Test automation is an essential part of QA processes in the software testing industry. Once a mere optimization tool supporting manual testing, test automation has now become a primary driver in QA. However, successful test automation is much more than just writing code to de-manualize a step-wise process.

During a digital transformation journey, everything may seem pretty straightforward while using automation frameworks and scripts. In reality, this automation success is short-term. Less than a year into the implementation, many teams get pulled into a vicious circle of automation maintenance. Issues like flaky test results, changes in the expected behavior of the system, and environment/infrastructure changes diminish your ROI from a test automation framework[i]. It is thus essential to realize that the success of automation solutioning, especially in the software testing landscape, is more about avoiding mistakes than just getting it right!

The need of the hour is to look beyond the surface and come up with a futuristic, self-healing, and sustainable test automation solution bedecked with best practices and technology. There is no checklist for 'right automation', but 'quick solutions' are certainly expensive, or nearly impossible, to maintain. Consider the following tips while embarking on your test automation journey. It may not be a cakewalk, but it will definitely pay off in long-term maintainability.


  1. Simplify
    Testing requirements are as vast as application development. Automation tests are expected to keep pace with application complexity as features mature. However, flexibility in the system should not be so syntactically complex that it bogs down the user. An ideal solution should truly serve the testing goals while being fluid enough to handle real-world testing complexities.
  2. Modularity & Reusability
    The testing approach or type may differ from project to project, depending on factors like application type and lifecycle process. Automation components must be tailored so that they are decoupled from this landscape diversity. Such modules can then be reused as common automation assets across multiple projects, easing test maintainability and enhancing traceability.
  3. Handle the dynamic nature of the application
    Identifying frequent changes and dynamic elements in an application are the two major challenges in real-world test automation. Most test automation frameworks are unable to identify these dynamics, so test planning becomes ineffective and may let defects through. Hence, a provision for identifying application changes and updates to dynamic objects and properties is vital for effective test automation.
  4. Centralize test services wherever possible
    Testing as a Service (TaaS) is an outsourcing model: a one-stop solution for dynamic test assets and on-demand test tools and environments, requiring minimal domain knowledge and solution expertise.
    It is recommended to offer software testing as a service over the cloud, especially in projects involving extensive automation and short execution cycles. Components can be used on the fly, per subscription, on centralized infrastructure.
  5. Domain-flavored automation
    While testing an application, a tester must think like an end user. Especially in Banking, Financial Services and Insurance (BFSI) and telecom domains, it is essential to know the working procedures and domain vocabulary to write and execute tests better. Similarly, for an automation solution, a distinct edge in domain knowledge is vital to ensure maximum coverage of the functional and non-functional aspects of the application under test.
  6. Build intelligent automation
    Automation is not a one-time solution but a process. A smart automation solution should ideally be self-learning and adaptive. Blending AI/ML into QA automation helps induce traceability in progression tests and self-healing in regression tests (a bare-bones self-healing sketch follows this list). Alternatively, for less sophisticated continuous automation delivery, code-less test automation can be explored.
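
Below is a bare-bones illustration of a self-healing locator, assuming the Selenium Python bindings: try the primary locator first, then fall back to alternates and log the drift so scripts can be updated. The locators themselves are hypothetical.

```python
# Minimal self-healing element lookup with ordered fallback locators.
from selenium.webdriver.common.by import By
from selenium.common.exceptions import NoSuchElementException

def find_with_healing(driver, locators):
    """locators: ordered list of (By, value) pairs, primary first."""
    for i, (by, value) in enumerate(locators):
        try:
            element = driver.find_element(by, value)
            if i > 0:  # a fallback matched, so the UI has drifted
                print(f"Healed: primary locator failed, matched {by}={value}")
            return element
        except NoSuchElementException:
            continue
    raise NoSuchElementException(f"No locator matched: {locators}")

# Usage:
# submit = find_with_healing(driver, [
#     (By.ID, "submit-btn"),
#     (By.NAME, "submit"),
#     (By.XPATH, "//button[text()='Submit']"),
# ])
```
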
To sustain test automation, an appropriate framework with the right mix of infrastructure and technology is vital, but it is equally essential to streamline testing processes at the practice level. Success in test automation requires immaculate planning and design work. Remember, automation in testing is not just a fancy UI that performs test steps; it should aim at a solution with long-term maintainability and traceability. Hence, a sustainable automation solution that suits real-world testing problems is a must!

June 19, 2019

Test Automation Debacles

In the era of digital transformation and growing competition, organizations are joining the automation bandwagon without a second thought. In software testing especially, automation has acquired an important place in addressing the needs of agile and continuous testing processes. While the benefits of test automation are well proven, with plenty of mature scripting frameworks available in the market, the death knell for the automation journey comes when testing teams start to struggle with test maintenance failures.

The Mayhem

We studied a few QA projects closely, right from test planning to execution, only to realize the damage that unmanaged test automation can do to automation ROI.

Based on process maturity and the application under test, testing teams use various tools and frameworks, implementing automation at varied levels. There are tools and techniques that offer automation in pockets and enable teams to realize instant benefits. However, shortsighted test automation cannot keep up with the pace at which application features evolve. With teams going agile, short test cycles bedecked with amateur testing practices add to the debacles, lowering the automation test maturity curve while the application under test continues to evolve at exponential speed. The gap between automation tests and application features therefore keeps increasing at an alarming rate over time, as shown in Figure 1. With frequent changes in the application, poorly written tests, and unmanaged 'quick' automation test cycles, teams get caught in a vicious cycle of maintenance costs and declining test coverage. This upsurge in test maintenance leads to regression defects, diminishing the ROI from automation.

[Figure 1: The widening gap between automation tests and evolving application features, and its impact on automation ROI]

As a matter of fact, project teams land in a worse situation with test maintenance during automation cycles than during manual ones! We observed teams spending almost 200 times longer fixing test scripts than they spend on manual tests. That explains the steep decline in automation ROI shown in Figure 1.

The mistake lies in..?

We have blamed automation practices and application dynamics enough. Test automation experts are already streamlining test automation and devising ways to leverage application changes to plan tests better. Did it really help? I don't think so! It may slow the damage, but the irreversible destruction caused by haphazard test automation is realized sooner or later. The primary cause lies in expectations. Testing teams do realize that end-to-end automation is an incremental process and that 100% automation may not be achievable, due to which end users' expectations of test automation solutions stoop to a minimal level. That's where the problem starts. It's the automation engineering that needs to be addressed!

We may standardize the tools or processes a testing team adheres to, wherein test automation is implemented in pockets and benefits are realized momentarily. However, teams must realize that a tool's benefits are only as good as the features it offers and the automation coverage it achieves, which may turn into a long-term nightmare if no test maintenance is available. Hence it is now necessary to look beyond test automation itself and address the gaps in how automation tools and accelerators are engineered, to offer better and sustainable automation.

You may find my recommendations on picking the right ingredients and building a sustainable test automation solution in upcoming blogs.

May 17, 2019

QA Paradigms in Open Banking

Open Banking started as a regulation in British banking circles, and now countries around the world are racing to adopt it. Australia makes its first move towards Open Banking later this year, in July. The European Union is adopting PSD2 along the lines of Open Banking. Countries implementing Open Banking are being watched intently by those planning to adopt similar standards, such as Israel, Canada, Hong Kong, Japan, and Singapore. Everyone is waiting to see the outcome of the Open Banking imperatives. What is Open Banking? Why should the IT world take notice? And what would be the implications of Open Banking for the software testing world? I am going to take a stab at these in the next few paragraphs.

To start with, Open Banking is a directive by the UK's Competition and Markets Authority which mandates that all banks expose their customers' data via open APIs to third-party providers like competitor banks and FinTechs, with the express consent of the customer. What started as a regulation now broadly refers to the unbundling of banking services and the enabling of customer data access for partners outside the incumbent banking system, with express consent. Open Banking has created avenues for FinTechs and challenger banks to use technology that leverages customer data to help secure loans, provide a level playing field to pick and choose, help with payments, and more. Open Banking has truly enabled FinTech firms to compete with large banks by helping them design more customer-friendly products, and it has provided much-needed competition between banks to deliver more value to the customer. Until recently, if another bank or FinTech wanted access to a customer's financial data, either the customer would have to fill in the data fields manually, or the bank/FinTech would obtain the customer's login credentials and scrape the incumbent bank's pages to get the required data: not a best practice in terms of cyber security, and a rather crude way to garner data. Now, Open Banking has made it very convenient for customers to expose their data via open APIs. Additionally, it has empowered customers to switch banks easily, and it has constructed a level playing field where FinTech firms can leverage data and technology to come up with creative solutions against the larger banks. Overall, Open Banking has increased competition and innovation while adding value to the end customer.

The European Union implemented its own Open Banking regulation known as PSD2, an abbreviation for the Second Payment Services Directive. PSD2 is a regulatory directive applicable to European Union markets, with technical standards developed by the European Banking Authority (EBA). PSD2 requires banks to grant customers the right to choose their payment partners, and it was conceived with the intent of making payments easier in terms of innovation and use. There were a few salient differences between the UK's Open Banking regulations and PSD2, but in November 2017 the Competition and Markets Authority mandated that Open Banking be compliant with all PSD2 directives. Open Banking will now cover all payment products that are part of PSD2, such as credit cards, debit cards, and e-wallets. Both PSD2 and the Open Banking regulations have evolved to complement each other, increasing the scope of financial products under Open Banking.

Payments will also be simplified via Open Banking. For example, on an ecommerce site today, a typical payment passes through various intermediaries: the merchant, the payment gateway, card associations like Visa or MasterCard, the issuing bank, and the acquiring bank. With Open Banking, online retailers can conduct payment transactions directly with your bank, without any intermediaries. Again, this benefits the end customer, as the surcharges demanded by these intermediaries are eliminated.

Which brings us to the ultimate question of this blog: how will Open Banking affect the IT industry, especially software testing? Interoperability through common standards is one of the keystone objectives of Open Banking. To achieve this, banks will have to build open APIs which comply with regulatory standards, security protocols, safe data transfer, and all the directives. Open Banking creates a plethora of opportunities: regulatory testing; penetration and security testing to make sure all the security protocols are in place and thwart cyber criminals; performance testing for when many customers try to access or transfer data at the same time; API testing; accessibility testing; consent testing; and Strong Customer Authentication testing. A minimal API-level test sketch follows.
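
As a flavor of what such API testing might look like, below is a minimal check loosely modeled on an Open Banking account-information endpoint; the URL, token, headers, and expected payload fields are hypothetical placeholders.

```python
# Sketch of an Open Banking-style API test using the requests library.
import requests

BASE = "https://api.examplebank.com/open-banking/v3.1"
headers = {
    # Token obtained only with the customer's express consent
    "Authorization": "Bearer <access-token>",
    "x-fapi-financial-id": "EXAMPLE-FIN-ID",
}

resp = requests.get(f"{BASE}/aisp/accounts", headers=headers, timeout=10)

# Assertions a QA suite might make against the API contract
assert resp.status_code == 200, resp.text
body = resp.json()
assert "Data" in body and "Account" in body["Data"], "Unexpected payload shape"
for account in body["Data"]["Account"]:
    assert "AccountId" in account, "Mandatory field missing"
```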

Open Banking has the potential to grow into a niche QA area where domain experts with testing skills work to ensure the APIs and platforms perform optimally. QA experts who are well versed in the Open Banking landscape will be very much in demand, and since all financial institutions operating in Europe and the UK must conform to Open Banking standards, all of them will require support in this area. There are 9,000+ financial institutions in Europe, and all of them will have to comply with Open Banking/PSD2, which translates into immense QA opportunities.

This write-up serves as a generic introduction to Open Banking and the opportunities in store. My next post on Open Banking will look at more granular details of how it will affect QA in various industries like Retail!

Reference:
https://bankingthefuture.com/a-primer-on-open-banking/
https://www.openbanking.org.uk/wp-content/uploads/What-Is-Open-Banking-Guide.pdf
https://www.paymentscardsandmobile.com/psd2-explained-payment-services-directive-created/
https://www.starlingbank.com/blog/explaining-psd2-without-tlas-tough/


March 31, 2019

RPA Performance Testing

In today's rapidly changing technology landscape, new groundbreaking trends are emerging every day. Some of the key trends driving financial services industry imperatives are:

1. Robotic Process Automation (RPA)
2. AI and digital assistants such as chatbots
3. Blockchain
4. Big Data

RPA has created a lot of buzz in the industry, and organizations are reaping immense benefits by implementing it. As per McKinsey, "110-140 million FTEs could be replaced by automation tools and software by 2020". RPA implementation has necessitated strong testing support, because failures can be very expensive in the later stages of development. One of the challenges organizations face is the identification of bottlenecks and hotspots. As per the IBM World Testing Report, 65% of organizations face challenges related to performance testing.


While organizations are reaping RPA benefits, it is equally important to ensure the performance of RPA processes is up to the mark and meets the 3S mantra: speed, scalability, and stability.

Before delving deeper into RPA performance testing challenges and solutions, let's understand the typical RPA landscape.

  • RPA landscape

[Figure: Typical RPA landscape and its integrations]

As seen in the diagram above, RPA possesses immense capability for integration with a varied landscape. It can easily be integrated with legacy, web-based, API-based, mainframe, and many other applications. Bots also promote reuse by "exposing" their learnings to a shared library which can be used by other bots. RPA interacts with different systems via screen scraping, emails, OCR, APIs, etc., replicating user actions.

  • Performance Testing areas

Having understood the landscape, let's focus on the key elements that performance testing should cover.

1. Capacity-related issues when concurrent jobs are scheduled by robots

2. Tasks completed in a given time per bot

3. Licensing and bot utilization -

  • Licenses - monitor the total number of acquired robot licenses
  • Robot utilization vs. capacity - monitor the percentage of acquired robot licenses that are utilized in production

4. Hourly/daily variability in robot usage

5. Elastic scalability - dynamically scaling hundreds of robots up and down to ensure RPA meets user demand

6. Complete ecosystem performance - along with RPA processes, we need to focus on each application in the ecosystem


  • Challenges faced

Now that we understand the focus areas, let's look at the inherent performance testing challenges RPA faces:

1. Dissimilar technologies: As seen in the RPA landscape, each application under RPA execution may belong to a different technology. We need to ensure each component meets performance targets both in isolation and in the end-to-end ecosystem.

2. Performance testing tool availability: The diverse landscape adds complexity, and no single performance testing tool can support the varied needs of the ecosystem. For RPA systems there are no record-and-playback mechanisms available, while for RPA backend systems we have to explore appropriate COTS/commercial tools based on protocol support, via POCs, knowledge sharing, etc.

3. Test environment: The performance testing environment may not be an exact production replica due to cost or other factors. We need to plan a realistic workload that caters to the scaled-down version and any other dependencies, to achieve the desired results within the ecosystem.

4. Monitoring solutions: Similar to the tool availability challenge, only a narrow set of monitoring solutions exists to monitor the platform, detect performance issues, and perform bottleneck analysis. We have to explore COTS/open source tools to cover the varied technology landscape.

5. Continuous delivery pipeline: Current RPA solutions are mostly commercial, and RPA engineers have no open source options available due to proprietary binary file formats. This should change down the line as RPA adopts open source standards; the Infosys AssistEdge RPA Community Edition is certainly a revolutionary step in this direction.

6. Unavailability of RPA backend/interacting systems: Since the complete RPA ecosystem is a complex one, there is a chance that one of the interfacing systems may behave poorly or be temporarily down.

How do we overcome these challenges? What strategy do we adopt? The solution lies in sociability testing.

  • Sociability Performance Testing

Sociability testing focuses on the core RPA process together with any systems interacting with RPA. Refer to the diagram below.

[Figure: Sociability performance testing across the RPA process and its interacting systems]

  • Key aspects to look at -

1. Tools and technology - The tools used will vary and can be a combination of open source and COTS systems. We need to assess the complete technology landscape and consider two separate areas here: RPA vs. other IT systems.
For RPA there is no specific performance testing tool, but we can collect critical stats by observing the monitoring console, e.g. process run time, number of records processed, computing units used, license usage, etc. So the monitoring console is currently our best bet for fine-tuning RPA processes.
For other IT systems, we can explore the use of open source tools such as JMeter, or COTS tools such as Micro Focus Performance Center, NeoLoad, etc. (a sample non-GUI JMeter invocation is sketched below).
The key is end-to-end ecosystem testing, to ensure accurate stats and stable systems.
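
For the backend IT systems, a typical non-GUI JMeter run can be driven from a pipeline. Below is a sketch of launching it from Python; the test plan and output paths are hypothetical, and jmeter is assumed to be on the PATH.

```python
# Drive a non-GUI JMeter run against a downstream system.
import subprocess

result = subprocess.run(
    [
        "jmeter",
        "-n",                     # non-GUI mode
        "-t", "rpa_backend.jmx",  # test plan exercising the backend system
        "-l", "results.jtl",      # raw sample log
        "-e", "-o", "report",     # generate the HTML dashboard report
    ],
    check=True,  # raise if JMeter exits non-zero
)
print("JMeter finished with exit code", result.returncode)
```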

2. Utilizing strong APM - An APM tool such as Dynatrace or AppDynamics will need to be installed to get detailed system metrics and transaction response times on downstream IT systems. APM tools can help in baselining transactions, and can be used to monitor the infrastructure on which the RPA platform is hosted as well as the backend/interfacing systems.

3. Test data - For setting up test data, you can look at RPA itself to create the required data; that is, the system under test is leveraged for test data automation as well.

4. Service virtualization - Service virtualization using tools such as CA Service Virtualization or Parasoft Virtualize can help emulate the behavior of the various interacting components. It may not be possible to leverage this solution in all situations, but it should help cut down the testing cycle wherever possible.

5. Establishing a CoE - A performance testing CoE will play a crucial role, as multiple teams are involved in end-to-end testing. Establishing proper processes and governance models will ensure testing is done in minimal time and at lower cost.

To summarize, RPA is itself an automation process and largely scriptless, so scripting it again using another automation tool may not work. Monitoring is therefore our focus area, along with workload formation for testing in pre-production. It is like a batch run where the workload is initiated by RPA itself, but other tools are used to monitor performance.