Testing Services provides a platform for QA professionals to discuss and gain insights into the business value delivered by testing, the best practices and processes that drive it, and the emerging technologies that will shape the future of the profession.

September 16, 2019

UI vs UX: Revisiting the age-old debate

With technology now in everyone's hands in the 21st century, users expect to accomplish tasks with minimal time and effort. Mobile and web consumers expect quick, consistent navigation and a seamless experience; hence the growing emphasis on professional UI/UX design in software applications.

While we recognize the immense importance of a visually appealing and user-friendly application experience, UI and UX are terms that are often used interchangeably in the software world. The terms are indeed closely linked in the software design landscape, but they are not the same.

UI is not UX

By definition, UI, or User Interface, is the graphical layout of an application with which a user interacts. This includes buttons, input controls, screen layout and every micro-interaction. UI designers create the look and feel of an application's interface.

[Figure: UI.png]

UX, or User Experience, determines how easy or difficult it is to interact with the user interface elements of the application. This is the main reason people often confuse UI and UX and use the terms interchangeably.

Granted, it is perfectly fine to use UI/UX together in software design, since UX designers are also concerned with the application UI to ensure smooth navigation and a seamless application experience. However, it should be understood that UI is just one of the salient elements of UX, as shown in the figure. Designers work on both user interface and user experience design to build a customer-friendly application.

When it comes to application testing, UX/UI are mostly covered during the user acceptance testing phase of the SDLC. While teams do realize the importance of early UI testing (along with functional testing) to keep defects from percolating into later tests, usability testing (UX and UI combined) is generally scheduled after or alongside integration tests to accommodate application agility. As a result, teams often end up doing highly expensive rework driven by last-minute customer feedback on supposedly less important non-functional aspects such as user interface and experience.

UI testing: Test early, test often

Validating seamless user experience may seem more relevant toward the end of the application testing process; however, validating the UI against interface, design and navigation requirements needs to be taken up much earlier.

With the rise in customer-centric business requirements, it is prudent to plan UI tests early and repeat them until all functional and non-functional requirements are met. UI automation scripts come in handy for repetitive tests such as performance, load and device/browser compatibility. Early performance or compatibility testing gives capacity planners and infrastructure architects advance warning of potential scalability problems in the architecture. Since UI layout and navigation may be volatile in the early stages of development, teams must carefully isolate the application UI from its functionality to enable independent tests and better results. Automation scripts should be used where functional or UI requirements are stable, and early QA techniques can be applied to usability or design testing even before the UI is integrated with functionality. Automated regression tests should be run as often as possible throughout the cycle, not just as part of final QA or just before system integration.
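One way to isolate the UI from functionality, as suggested above, is the page-object pattern: keep locators in one place so early layout checks can run even before features are wired up. The sketch below is illustrative only; the page, its locators and the stub driver are hypothetical, and a real suite would use a WebDriver such as Selenium instead of the stub.

```python
# Sketch of isolating UI structure from functionality so UI tests can run
# independently and early. PaymentsPage and its locators are hypothetical;
# a real suite would pass in a Selenium/Appium driver instead of this stub.

class StubDriver:
    """Minimal stand-in for a real WebDriver, good enough for layout checks."""
    def __init__(self, rendered_elements):
        self.rendered = set(rendered_elements)

    def is_present(self, locator):
        return locator in self.rendered


class PaymentsPage:
    # UI concerns (locators) live here, separate from functional test logic,
    # so a volatile layout only forces changes in this one class.
    LOCATORS = {"amount_field": "#amount", "pay_button": "#pay", "nav_home": "#nav-home"}

    def __init__(self, driver):
        self.driver = driver

    def missing_elements(self):
        return sorted(name for name, loc in self.LOCATORS.items()
                      if not self.driver.is_present(loc))


def test_payments_layout():
    driver = StubDriver({"#amount", "#pay", "#nav-home"})
    assert PaymentsPage(driver).missing_elements() == []
```

Because the layout check depends only on the page object, it can be repeated on every build without touching the functional tests.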

A volatile UI may be a ticking time bomb toward the end of the application lifecycle, adversely affecting the user experience the application offers. It is therefore wise to stub out non-functional testing, especially UI testing, for early defect detection and to avoid rework after integration tests.

Happy testing!

September 3, 2019

Winning the Test Automation game

Enough has been said about writing better tests, optimizing automation scripts and planning test cycles. Teams can choose from a plethora of test accelerators in the market, depending on the features and automation maturity offered. However, with frequent changes to application behavior and business requirements, such as release planning, test maintenance and test criteria, the selected automation tools are often unable to keep pace with the rate at which testing workflows change over time.

Hence it is prudent to consider maintainability during the engineering cycle of the automation solution[i]. Having said that, it is equally important for testing teams to lay out a viable plan with realistic automation goals that also accommodates incremental automation.

Lay out an Automation Roadmap

Project teams often live by the 'automate whatever is possible' mantra. They end up addressing only the most pertinent challenges and settle for minimalistic automation, ignoring the potential troublemakers. Automated tests may work wonders for progression cycles; however, once teams get into regression, they start to realize the side effects of not having proper test maintenance in place. While they come up with corrective measures to improve regression planning, reuse automation tests or even fix them, the overhead is tough to crack.

The major flaw lies in the haphazard adoption of automation in pockets, wherein the tools in use address only a few aspects of the process workflow. The rest is carried out either manually or using other tools. Teams generally rely on macros, client-side stored procedures or scripts to make the different tools and manual processes work together. The lack of end-to-end workflow support leads to issues like flaky tests and unmanaged automation, and hence depleting ROI (see Test Automation Debacles).

It is thus essential to plan test automation meticulously, with an incremental roadmap and test traceability as top priorities. Provisioning for incremental automation, such as adding new tools to address complex application features or leveraging the latest technologies for better coverage, helps testing teams achieve better results over time.

Pick the right ingredients

While the benefits of test automation are proven, with a variety of mature scripting tools available, testing teams still struggle with test maintenance debacles. Correcting the approach may help automation teams engineer a sustainable framework, but to realize the best possible results, testing teams must use the right ingredients in the first place.

During the test planning phase, testing teams tend to focus more on accommodating application changes to ensure the best possible test coverage. Automation test cases keep piling up for lack of regular optimization, with little traceability or usability. Leveraging the relationships between test assets (test data, automation scripts) from previous cycles and application objects, functions and their properties can augment automation planning and enable self-healing of test scripts. A calculated impact analysis of application changes on the test scripts thus helps testing teams reuse, optimize and write better automation tests.
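The self-healing idea above can be sketched very simply: when a script's primary locator no longer matches, fall back to other recorded properties of the same application object from the previous cycle's assets. This is an illustrative sketch, not any particular tool's API; the property names, healing order and sample data are all invented.

```python
# Illustrative "self-healing" lookup: if the primary locator fails, try other
# recorded properties of the target object from the last cycle's repository.

def resolve_element(page_elements, recorded):
    """page_elements: current page as {locator: element_id};
    recorded: known properties of the target object from past test assets."""
    for prop in ("id", "name", "label"):          # healing order is a design choice
        locator = recorded.get(prop)
        if locator in page_elements:
            return page_elements[locator], prop
    return None, None

page = {"loginBtn2": "e17", "Sign in": "e17"}     # the id changed since last release
recorded = {"id": "loginBtn", "name": "loginBtn2", "label": "Sign in"}

element, healed_by = resolve_element(page, recorded)
# element is found via the fallback "name" property, so the script can be
# auto-repaired and the repair logged as input to impact analysis
```

Logging which property healed each lookup is what feeds the impact analysis mentioned above.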

Additionally, preparing test data in pockets also leads to unmanaged automation workflows. It is high time due diligence was put into preparing test data for application testing through appropriate channels. Automation tools for test data preparation and workflow management can be leveraged for test data traceability and consistency, especially in SIT environments.

Putting it all together

Though the responsibility for providing a viable test automation solution lies with the automation engineering team, it is the implementation team that can make the best of the available offerings and make the test automation process a success.



[i] https://www.infosysblogs.com/testing-services/2019/05/building_a_sustainable_test_au.html

July 14, 2019

Building a Sustainable Test Automation Solution

Test automation is an essential part of QA processes in the software testing industry. Once a mere optimization tool supporting manual testing, test automation has now become a primary driver of QA. However, successful test automation is much more than just writing code to de-manualize a step-by-step process.

During a digital transformation journey, everything may seem straightforward while using automation frameworks and scripts. In reality, this automation success is often short-lived. Less than a year into implementation, many teams get pulled into a vicious circle of automation maintenance. Issues like flaky test results, changes in the expected behavior of the system and environment/infrastructure changes diminish the ROI from a test automation framework[i]. It is thus essential to realize that the success of automation solutioning, especially in the software testing landscape, is more about avoiding mistakes than just getting it right!

The need of the hour is to look beyond the surface and come up with a future-proof, self-healing and sustainable test automation solution built on best practices and technology. There is no checklist for 'right automation', but 'quick solutions' are certainly expensive, if not nearly impossible, to maintain. Consider the following tips while embarking on your test automation journey. It may not be a cakewalk, but it will pay off in long-term maintainability.


  1. Simplify
    Testing requirements are as vast as application development. Automation tests are expected to match the pace of application complexity as features mature. However, flexibility in the system should not come with syntactic complexity that might bog down the user. An ideal solution truly serves the testing goals while remaining fluid enough to handle real-world testing complexities.
  2. Modularity & Reusability
    Testing approach and type may differ from project to project, depending on factors like application type and lifecycle process. Automation components must be tailored so that they are decoupled from this landscape diversity. These modules can then be reused as common automation assets across multiple projects. Such a system eases test maintainability and enhances traceability.
  3. Handle the dynamic nature of the application
    Identifying frequent changes and dynamic elements in an application are the two major challenges in real-world test automation. Most test automation frameworks are unable to identify these dynamics, so test planning becomes ineffective and may let defects slip through. Hence, a provision for identifying application changes and updating dynamic objects and their properties is vital for effective test automation.

  4. Centralize test services wherever possible
    Testing as a Service (TaaS) is an outsourcing model: a one-stop solution for dynamic test assets and on-demand test tools and environments that requires minimal domain knowledge and solution expertise.
    It is recommended to offer software testing as a service over the cloud, especially in projects that involve extensive automation and short execution cycles. The components can be used on the fly, per subscription, on centralized infrastructure.
  5. Domain-flavored automation
    While testing an application, a tester must think like an end user. Especially in the Banking, Financial Services and Insurance (BFSI) and telecom domains, it is essential to know the working procedures and domain vocabulary to write and execute tests well. Similarly, for an automation solution, a distinct edge in domain knowledge is vital to ensure maximum coverage of the functional and non-functional aspects of an application under test.

  6. Build intelligent automation
    Automation is not a one-time solution but a process. A smart automation solution should ideally be self-learning and adaptive. Blending AI/ML into QA automation helps induce traceability in progression tests and self-healing in regression tests. Alternatively, for less sophisticated continuous automation delivery, codeless test automation can be explored.
To sustain test automation, an appropriate framework with the right mix of infrastructure and technology is vital, and it is equally essential to streamline testing processes at the practice level. Success in test automation requires immaculate planning and design work. Remember, automation in testing is not just a fancy UI that performs test steps; it should be aimed at building a solution with long-term maintainability and traceability. Hence, a sustainable automation solution that suits real-world testing problems is a must!

June 19, 2019

Test Automation Debacles

In an era of digital transformation and intensifying competition, organizations are joining the automation bandwagon without a second thought. In software testing especially, automation has acquired an important place in addressing the needs of agile and continuous testing processes. While the benefits of test automation are well proven, with plenty of mature scripting frameworks available in the market, the death knell for the automation journey sounds when testing teams start struggling with test maintenance failures.

The Mayhem

We studied a few QA projects closely, from test planning through execution, only to realize the damage that unmanaged test automation can do to automation ROI.

Based on process maturity and the application under test, testing teams use various tools and frameworks, implementing automation at varied levels. Some tools and techniques offer automation in pockets and let teams realize instant benefits. However, short-sighted test automation cannot keep up with the pace at which application features evolve. With teams going agile, short test cycles combined with immature testing practices add to the debacles, lowering the maturity of the automation tests even as the application under test continues to evolve at exponential speed. The gap between automation tests and application features therefore keeps widening at an alarming rate, as shown in Figure 1. With frequent application changes, poorly written tests and unmanaged 'quick' automation test cycles, teams get caught in a vicious cycle of rising maintenance costs and declining test coverage. This upsurge in test maintenance leads to regression defects, diminishing the ROI from automation.

[Figure 1: TestAutomationdebacles.png]

As a matter of fact, project teams land in a worse situation with test maintenance during automation cycles than during manual ones! We observed teams spending almost 200 times longer fixing test scripts than they spend maintaining manual tests. That explains the steep decline in automation ROI shown in Figure 1.

So where does the mistake lie?

We have blamed automation practices and application dynamics enough. Test automation experts are already looking into streamlining test automation and devising ways to leverage application changes to plan tests better. Did it really help? I don't think so! It may slow the damage, but the irreversible destruction caused by haphazard test automation is realized sooner or later. The primary cause lies in expectations. Testing teams realize that end-to-end automation is an incremental process and that 100% automation may or may not be achievable, so end users' expectations of test automation solutions stoop to a minimal level. That is where the problem starts. It is the automation engineering that needs to be addressed!

We may standardize tools or processes for a testing team to adhere to, wherein test automation is implemented in pockets and benefits are realized momentarily. However, teams must realize that a tool's benefits are only as good as the features it offers and the automation coverage it provides, which may turn into a nightmare in the long term with no test maintenance available. It is therefore necessary to look beyond test automation and address the gaps in how automation tools and accelerators are engineered, to offer better and more sustainable automation.

You will find my recommendations on picking the right ingredients and building a sustainable test automation solution in upcoming posts.

May 17, 2019

QA Paradigms in Open Banking

Open Banking started as a regulation in British banking circles, and countries around the world are now racing to adopt it. Australia makes its first move towards Open Banking in July this year, and the European Union is adopting PSD2 along the lines of Open Banking. Countries implementing Open Banking are being watched intently by those planning to adopt the standards, such as Israel, Canada, Hong Kong, Japan and Singapore; everyone is waiting to see the outcome of the Open Banking imperatives. What is Open Banking? Why should the IT world take notice? And what are the implications of Open Banking for the software testing world? I will take a stab at these questions in the next few paragraphs.

To start with, Open Banking is a directive by the UK's Competition and Markets Authority which mandates that all banks expose their customers' data via open APIs to third-party providers, such as competitor banks and FinTechs, with the express consent of the customer. What started as a regulation now broadly refers to the unbundling of banking services and enabling access to customers' data by partners outside the incumbent banking system, with express consent. Open Banking has created avenues for FinTechs and challenger banks to use technology that leverages customer data to help secure loans, provide a level playing field to pick and choose, help with payments, and so on. It has truly enabled FinTech firms to compete with large banks by helping them design more customer-friendly products, and it has provided much-needed competition between banks to deliver more value to the customer.

Until recently, if another bank or FinTech wanted access to a customer's financial data, either the customer had to fill in the data fields manually, or the bank/FinTech would obtain the customer's login credentials and scrape the incumbent bank's pages to get the required data. This is poor practice in terms of cybersecurity and a rather crude way to gather data. Open Banking has made it very convenient for customers to expose their data via open APIs. Additionally, it has empowered customers to switch banks easily, and it has constructed a level playing field where FinTech firms can leverage data and technology to come up with creative solutions against the larger banks. Overall, Open Banking has increased competition and innovation while adding value to the end customer.

The European Union implemented its own Open Banking regulation, known as PSD2, short for the Second Payment Services Directive. PSD2 is an EU regulatory directive, with technical standards developed by the European Banking Authority (EBA), applicable to European Union markets. PSD2 requires banks to grant customers the right to choose their payment partners, and it was conceived with the intent of making payments easier in terms of innovation and use. There were a few salient differences between the UK's Open Banking regulations and PSD2, but in November 2017 the Competition and Markets Authority mandated that Open Banking be compliant with all PSD2 directives. Open Banking now covers all payment products that are part of PSD2, such as credit cards, debit cards and e-wallets. The two regulations have evolved to complement each other by increasing the scope of financial products under Open Banking.

Payments will also be simplified by Open Banking. Currently, a typical payment on an e-commerce site passes through various intermediaries: the merchant, a payment gateway, card associations like Visa or MasterCard, the issuing bank and the acquiring bank. With Open Banking, online retailers can conduct payment transactions directly with your bank, without intermediaries. Again, this benefits the end customer, as the surcharges demanded by these intermediaries are eliminated.

Which brings us to the ultimate question of this blog: how will Open Banking affect the IT industry, and software testing in particular? Interoperability through common standards is one of the keystone objectives of Open Banking. To achieve it, banks will have to build open APIs that comply with regulatory standards and security protocols, ensure safe data transfer and satisfy all the directives. Open Banking thus creates a plethora of opportunities: regulatory testing; penetration and security testing to make sure all the security protocols are in place and cybercriminals are thwarted; performance testing for when many customers access or transfer data at the same time; API testing; accessibility testing; consent testing; and Strong Customer Authentication (SCA) testing.
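One small slice of the API testing mentioned above is a contract check on the responses a bank's open API returns. The sketch below validates a hypothetical, heavily simplified /accounts payload; the field names are an illustrative subset only, and real conformance testing would validate against the published Open Banking API specifications.

```python
# Hedged sketch of an Open Banking API contract check on a hypothetical,
# simplified /accounts response. Field names are an illustrative subset;
# real suites validate against the published Open Banking specifications.

REQUIRED_ACCOUNT_FIELDS = {"AccountId", "Currency", "AccountType"}

def contract_errors(response):
    """Return a list of contract violations found in the response payload."""
    if "Data" not in response or "Account" not in response.get("Data", {}):
        return ["missing Data.Account envelope"]
    errors = []
    for i, account in enumerate(response["Data"]["Account"]):
        missing = REQUIRED_ACCOUNT_FIELDS - account.keys()
        if missing:
            errors.append(f"account[{i}] missing {sorted(missing)}")
    return errors

sample = {"Data": {"Account": [{"AccountId": "22289", "Currency": "GBP",
                                "AccountType": "Personal"}]}}
assert contract_errors(sample) == []
```

Checks like this run on every build of the API, so a breaking change to the contract is caught long before a third-party provider sees it.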

Open Banking has the potential to grow into a niche QA area where domain experts with testing skills work on ensuring the APIs and platforms perform optimally. QA experts who are well versed in the Open Banking landscape will be very much in demand: since all financial institutions operating in Europe and the UK must conform to Open Banking standards, all of them will require support in this area. There are more than 9,000 financial institutions in Europe, all of which will have to comply with Open Banking/PSD2, and that translates into immense QA opportunities.

This write-up serves as a generic introduction to Open Banking and the opportunities in store. My next post on Open Banking will look in more granular detail at how it will affect QA in industries like retail!

References:
https://bankingthefuture.com/a-primer-on-open-banking/
https://www.openbanking.org.uk/wp-content/uploads/What-Is-Open-Banking-Guide.pdf
https://www.paymentscardsandmobile.com/psd2-explained-payment-services-directive-created/
https://www.starlingbank.com/blog/explaining-psd2-without-tlas-tough/


March 31, 2019

RPA Performance Testing

In today's rapidly changing technology landscape, new groundbreaking trends emerge every day. Some of the key trends driving financial services industry imperatives today are:

1. Robotic Process Automation (RPA)
2. AI and digital assistants such as chatbots
3. Blockchain
4. Big Data

RPA has created a lot of buzz in the industry, and organizations are reaping immense benefits from implementing it. As per McKinsey, "110-140 million FTEs could be replaced by automation tools and software by 2020". RPA implementations need strong testing support, because failures can be very expensive in the later stages of development. One of the challenges organizations face is the identification of bottlenecks and hotspots: as per the IBM World Testing Report, 65% of organizations face challenges related to performance testing.


While organizations are reaping RPA benefits, it is equally important to ensure the performance of RPA processes is up to the mark and meets the 3S mantra: speed, scalability and stability.

Before delving deeper into RPA performance testing challenges and solutions, let's understand the typical RPA landscape.

  • RPA landscape

[Diagram: RPAComponent_new.png]

As seen in the diagram above, RPA possesses immense capability for integration with a varied landscape. It can easily be integrated with legacy, web-based, API-based, mainframe and many other applications. Bots also promote reuse by "exposing" their learnings to a shared library that other bots can use. RPA interacts with the different systems via screen scraping, emails, OCR, APIs, etc., replicating user actions.

  • Performance Testing areas

Having understood the landscape, let's focus on the key elements that performance testing should cover:

1. Capacity-related issues when concurrent jobs are scheduled by robots

2. Tasks completed per bot in a given time

3. Licensing and bot utilization:

  • Licenses: monitor the total number of acquired robot licenses
  • Robot utilization vs. capacity: monitor the percentage of acquired robot licenses that are utilized in production

4. Hourly/daily variability in robot usage

5. Elastic scalability: dynamically scaling hundreds of robots up and down to ensure RPA meets user demand

6. Complete ecosystem performance: along with the RPA processes, we need to focus on each application in the ecosystem
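As a back-of-the-envelope illustration of the licensing and utilization focus areas above, the sketch below derives the headline numbers from per-bot stats; the field names and figures are invented, and a real deployment would pull these from the RPA monitoring console.

```python
# Illustrative utilization math for the focus areas above; field names and
# sample numbers are made up, not from any specific RPA product.

def utilization_report(licenses_owned, stats):
    """stats: per-bot dicts with tasks completed and busy minutes in an hour."""
    active = len(stats)
    busy = sum(s["busy_minutes"] for s in stats)
    return {
        # share of acquired licenses actually in use in production
        "license_utilization_pct": round(100.0 * active / licenses_owned, 1),
        # throughput per bot in the observed hour
        "tasks_per_bot_hour": round(sum(s["tasks"] for s in stats) / active, 1),
        # input for tracking hourly/daily variability in robot usage
        "avg_busy_pct": round(100.0 * busy / (active * 60), 1),
    }

hourly = [{"tasks": 42, "busy_minutes": 51}, {"tasks": 38, "busy_minutes": 47},
          {"tasks": 12, "busy_minutes": 18}]
report = utilization_report(licenses_owned=5, stats=hourly)
# only 3 of the 5 licensed bots were active -> 60% license utilization
```

Tracked hour by hour, these numbers drive both the variability analysis and elastic scaling decisions.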


  • Challenges faced

Now that we understand the focus areas, let's look at the inherent performance testing challenges for RPA:

1. Dissimilar technologies: As seen in the RPA landscape, each application under RPA execution may belong to a different technology. We need to ensure that each component meets its performance requirements both in isolation and in the end-to-end ecosystem.

2. Performance testing tool availability: The diverse landscape adds complexity; no single performance testing tool can support the varied needs of the ecosystem. For RPA systems there are no record-and-playback mechanisms available, while for RPA backend systems we have to explore appropriate COTS/commercial tools based on protocol support, via POCs, knowledge sharing, etc.

3. Test environment: The performance testing environment may not be an exact production replica, due to cost or other factors. We need to plan a realistic workload that caters to the scaled-down version, and account for any other dependencies, to achieve the desired results within the ecosystem.

4. Monitoring solutions: Similar to the tool availability challenge, only a narrow set of monitoring solutions exists to monitor the platform, detect performance issues and analyze bottlenecks. We have to explore COTS/open source tools to cover the varied technology landscape.

5. Continuous delivery pipeline: Current RPA solutions are mostly commercial, and RPA engineers have few open source options due to proprietary binary file formats. This should change down the line as RPA adopts open source standards; the Infosys AssistEdge RPA community edition is certainly a step in that direction.

6. Unavailability of RPA backend/interacting systems: Since the complete RPA ecosystem is complex, one of the interfacing systems may be behaving poorly or be temporarily down.

How do we overcome these challenges? What strategy do we adopt? The solution lies in sociability testing.

  • Sociability Performance Testing

A sociability test focuses on the core RPA process together with any systems interacting with RPA. Refer to the diagram below.

[Diagram: RPAComponent_new - Solution.png]

  • Key aspects to look at

1. Tools and technology: The tools used will vary and can be a combination of open source and COTS systems. We need to assess the complete technology landscape and consider two separate areas here: RPA vs. the other IT systems.
For RPA there is no specific performance testing tool, but we can collect critical stats by observing the monitoring console, e.g. process run time, number of records processed, computing units used, license usage, etc. The monitoring console is therefore currently our best bet for fine-tuning RPA processes.
For the other IT systems, we can explore open source tools such as JMeter, or COTS tools such as Micro Focus Performance Center, NeoLoad, etc.
The key is end-to-end ecosystem testing, to ensure accurate stats and stable systems.

2. Utilizing a strong APM: An APM tool such as Dynatrace or AppDynamics should be installed to collect detailed system metrics and transaction response times on the downstream IT systems, and it can help in baselining transactions. It can also monitor the infrastructure on which the RPA is hosted, as well as the backend/interfacing systems.

3. Test data: For setting up test data, you can look to RPA itself to create the required data, i.e. the system under test is leveraged for automation as well.

4. Service virtualization: Service virtualization, using tools such as CA Service Virtualization or Parasoft Virtualize, can help emulate the behavior of the various interacting components. It may not be possible to leverage this solution in every situation, but it should help cut down the testing cycle wherever possible.

5. Establishing a CoE: A performance testing CoE plays a crucial role, since multiple teams are involved in end-to-end testing. Establishing proper processes and governance models ensures testing is done in minimal time and at lower cost.

To summarize: RPA is itself an automation process and is scriptless, so scripting it with yet another automation tool may not work. Monitoring is therefore our focus area, along with forming a realistic workload to test in pre-production. It is like a batch run, where the workload is initiated by RPA itself but other tools are used to monitor performance.

March 28, 2019

Service Virtualization using Mock Server

THE BEGINNING

Service Virtualization is a technique for integrating a mock server into a test suite to remove dependencies on real backend systems or external party systems from the test environment. It is an ideal solution for Test-Driven Development (TDD) and Behavior-Driven Development (BDD) teams who want to test the application and its API services quickly to find the major problems.

Service Virtualization is best suited to microservices-based, service-oriented and cloud-based architectures, and it is an important component of the DevOps toolkit.

Problem Statement

Consider a compact view of a microservices-based architecture in which the application communicates with real backend systems through a number of API calls to receive output responses; for instance, in banking applications, important REST API calls such as accounts, payments and transactions.

Here is a list of problems with this kind of application infrastructure:

- There is no dedicated environment for automation, UAT or performance testing. The environment is shared between all the teams, which causes delays.

- The environment is frequently down due to deployment releases and server configuration issues.

- As the data differs between automation testing and performance testing, test data setup is also a big challenge for teams.

- Tests are brittle rather than robust, meaning there is no reusability, and full test coverage cannot be achieved due to environmental issues.

Implementation of Service Virtualization

In a recent assignment, our QA team was struggling with test coverage issues in automation testing, unexpected environmental issues, performance-related issues and many more problems with the real backend systems, because these systems belonged to third-party vendors and were not accessible to our teams.

These problems left our teams blocked and unable to perform testing operations, causing unexpected delays in production releases that impacted the project schedule and delivery.

To overcome these challenges, we created and implemented a mock server virtualization solution.

Proposed Solution
a. Introduction of the solution

- This is a WireMock server, or virtual-service-based, environment model: one way of resolving the dependencies and issues. Using virtual services or mocks allows you to decouple testing from the real backend systems and give each testing team an independent environment, which resolves the problems described above.
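In our project the mocks were built with WireMock; as a language-neutral illustration of the same idea, here is a minimal Python sketch in which canned responses, keyed by method and path, stand in for the inaccessible third-party backend. The endpoints and payloads are hypothetical.

```python
# Framework-free sketch of the mock-server idea (real projects would use
# WireMock or similar). Canned responses keyed by method+path stand in for
# the third-party backend that testing teams cannot reach.

import json

STUBS = {
    ("GET", "/accounts/123"): (200, {"accountId": "123", "balance": 250.75}),
    ("POST", "/payments"):    (201, {"status": "ACCEPTED"}),
}

def handle(method, path):
    """Dispatch a request to a canned response; 404 for anything unstubbed."""
    status, body = STUBS.get((method, path), (404, {"error": "no stub"}))
    return status, json.dumps(body)

status, body = handle("GET", "/accounts/123")
# Testers control the responses, so edge cases (errors, odd payloads) can be
# simulated by simply adding stubs instead of waiting on the real vendor.
```

Because each team owns its own stub set, test data setup becomes a matter of editing these mappings rather than coordinating with a shared environment.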

b. Application of the solution

Here are the advantages of using service virtualization over the traditional approach:

- Test coverage improved to nearly 100% and unexpected environment issues were avoided, so test quality improved.

- All QA teams use a similar environment, independent of the other teams.

- Test data per the business requirements is easy to create and handle in an optimized way.

- Test development is robust, with far fewer issues.

- There are no environment deployment or configuration issues.

- The service virtualization model is more agile than traditional models.

- Little or no cost is involved in developing and implementing mock server virtualization.

- It is flexible enough to fit any kind of application architecture.

- It empowers testers to act like developers by manipulating output responses according to their needs.

- It is a very quick solution to the issues related to real backend environments.

- In our experience, this approach reduced manual effort and time by 90%; it is a very effective solution for the business.

Future Direction / Long-Term Focus

- Service virtualization, especially for large software projects, can dramatically reduce cost.

- Enhancing the practical reusability of virtual services reduces future development effort.

- Apply these testing practices to other business needs, such as cloud-based and service-oriented architectures.

Results / Conclusion

We believe this approach will help teams accomplish various upcoming engagements and produce remarkable results.



March 25, 2019

Role of Artificial Intelligence in Performance testing and Engineering


A typical performance test starts with analyzing the application UI and creating the test scripts. Users then hit the application server, and the load testing tools generate dashboards indicating response time, throughput, CPU utilization, memory utilization, and so on.

In the era of AI (Artificial Intelligence) powered software, performance engineers should be able to answer questions during the early stages of application design, such as: What should we expect once the application is in production? Where are the potential bottlenecks? How do we tune application parameters to maximize performance?

Critical applications need a mature approach to performance testing and monitoring. AI is the intelligent part of the performance testing process; it acts as its brain. Routine tasks like test design, scripting, and implementation can be handled by AI so that test engineers can focus on the creative side of software testing.

One reasonable use case for AI in performance testing (PT) is codeless automation scripting. Writing performance scripts using natural language processing (NLP) can make the scripting task far easier: the system learns from the data given to it without being explicitly programmed. Below are aspects of a solution empowered by AI-ML (Artificial Intelligence - Machine Learning) in performance testing:

  • A testing environment developed using ML will have advanced capabilities in terms of self-healing and intuitive dashboarding; using deep learning algorithms, corrections can be handled automatically.
  • Test flows are recorded and can be tested using data, with no coding required in most scenarios.
  • Reusable functions and objects can be generated and grouped using semi-supervised learning. Scenarios are flow-based, so the implementation is transparent to the user.

Yet another use case is performance test modelling. AI's pattern-recognition strength can extract relevant patterns during load testing, which is very useful for modelling the performance process. The model embodies the algorithms being used, and AI learns from the given data; its ability to anticipate future load problems helps create a performance test model efficiently, deal with large amounts of data, and predict system failures. Once the system data is analyzed, a performance test model can be created based on the observed system behavior.
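As a toy illustration of learning a performance model from load-test data, the sketch below fits a straight-line trend of response time versus concurrent users and extrapolates it to a heavier load. The numbers are invented, and a real ML model would capture far more than a linear trend; this only shows the "learn from past runs, anticipate future load" idea.

```python
# Minimal sketch of data-driven performance modelling: fit a linear
# trend of response time vs. concurrent users from past load tests,
# then extrapolate to a future load level.

def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - a * mean_x
    return a, b

# Observed load-test samples: (concurrent users, avg response time in ms).
# These values are illustrative, not from a real system.
users = [50, 100, 200, 400]
resp_ms = [120, 180, 300, 540]

a, b = fit_line(users, resp_ms)
predicted_800 = a * 800 + b  # anticipate response time at 800 users
print(round(predicted_800))  # 1020
```

If the predicted value breaches the target response time, the model has flagged a future bottleneck before anyone runs the 800-user test.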

Another area is SLA design. SLAs should be measurable, attainable, simple, realistic, and time-bound, but most SLAs are not designed like this; that is a basic limitation of human-powered systems. Once AI takes on the role, the situation changes: it can track all the affected areas, feed the monitoring system with fine granularity, analyze the complexity of the system, and suggest an appropriate SLA. For example, for an application of roughly 1,000 lines of code, the SLA might be set at 500 milliseconds. AI can detect working trends in a system directly, and as system performance changes, the SLA can be fine-tuned in real time.
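The real-time fine-tuning idea can be sketched very simply: recompute the response-time SLA from a rolling window of observed latencies so the target tracks actual system behaviour. The window size, the 95th-percentile choice, and the 20% headroom factor below are all illustrative assumptions, not a prescribed method.

```python
# Minimal sketch of SLA fine-tuning: derive the SLA from a rolling
# 95th percentile of observed latencies plus some headroom.

def p95(samples):
    """Crude 95th percentile (nearest-rank) of a list of samples."""
    ordered = sorted(samples)
    idx = max(0, int(round(0.95 * len(ordered))) - 1)
    return ordered[idx]

def tune_sla(latencies_ms, window=10, headroom=1.2):
    """Return an SLA (ms) from the most recent window of observations."""
    recent = latencies_ms[-window:]
    return p95(recent) * headroom

# Illustrative latency observations (ms), including one spike.
observed = [180, 190, 200, 210, 450, 205, 195, 220, 210, 200, 215, 230]
print(round(tune_sla(observed)))  # 540
```

Note how the spike at 450 ms pulls the suggested SLA up; a production system would also want anomaly filtering so that one-off outliers do not loosen the target permanently.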

Monitoring tools like Dynatrace and AppDynamics have introduced AI into their systems, which helps identify bottlenecks across multiple application tiers in the early stages of software development. They can analyze the application and predict performance defects at the code level. Many open-source tools like WebPageTest, GTmetrix, and YSlow pinpoint specific problems, such as server request issues, and help engineers solve them quickly. Automation tools like Test.ai are useful for gathering your application's performance metrics as well.

The role of AI in every phase of performance testing and engineering has proved very beneficial and is the future of the discipline. Using AI in performance testing will make tasks like scripting and monitoring highly impactful and deliver real-time results very quickly. I believe that, in the future, AI's role in performance testing will be a game changer!

December 7, 2018

Embrace the Future

The unfolding of cloud computing, the introduction of enterprise-level integration patterns, and the rise of microservices have not only disrupted the existing nomenclature but also made us rethink the way we do integration. While pioneering companies are embracing microservices to tackle their complex enterprise architectures, one aspect is still open for exploration: data validation and data warehousing.

Yes, it is true that many organizations are consciously embracing the concept of data services around their data lakes, either for master data management or for analytical purposes (simple data reads), but very little thought has gone into using the full flavor of microservice-driven architecture in areas like data integration, data quality and validation, and metadata management.

If we travel back a few years in time, the idea of SoC (separation of concerns) was ignored due to the need for heavy lifting of data and the availability of integration tools that were tightly coupled with each other. These tools were an instant hit, as they wrapped up the complexities of managing job failures, producing reports, and so on, but they could not fully tackle the learning curve, the complexity involved, and, most importantly, the need to adapt to frequent changes.

The basic principle of microservices is to break a complex application down into multiple self-contained services that connect to each other to achieve complex functionality. Given that the above use cases are always complex in nature, microservices could be a great way to automate data validation and design our future data warehouses.
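One self-contained service in such a pipeline might do nothing but validate records against a schema and hand the results to the next service. The sketch below shows that single responsibility in isolation; the schema, field names, and records are hypothetical examples.

```python
# Minimal sketch of one self-contained data-validation service in a
# microservice pipeline: it does one job (check records against a
# schema) and returns results another service could consume.

# Hypothetical schema: field name -> expected Python type.
SCHEMA = {
    "customer_id": str,
    "balance": float,
    "country": str,
}

def validate_record(record, schema=SCHEMA):
    """Return a list of validation errors (empty list means valid)."""
    errors = []
    for field, expected_type in schema.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            errors.append(f"bad type for {field}: {type(record[field]).__name__}")
    return errors

# Illustrative input batch, as might arrive from an ingestion service.
records = [
    {"customer_id": "C-1", "balance": 10.5, "country": "IN"},
    {"customer_id": "C-2", "balance": "oops"},
]
results = [validate_record(r) for r in records]
print(results[0])  # []
print(results[1])  # ['bad type for balance: str', 'missing field: country']
```

Because the service owns only the schema and the check, it can be redeployed when validation rules change without touching ingestion or warehousing, which is the separation of concerns the post argues for.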

September 28, 2018

Chatbot- A digital assistant in Banking Industry

 


Financial institutions across the globe are assessing the viability of deploying chatbots for varied objectives, with consumers of the information ranging from end customers to CXOs. FS firms are testing various approaches to proactively deliver insights to customers based on their transactional history and digital profile, for example:

  • Suggesting investment options depending on savings bank balance and risk profile
  • Giving market related news and impact on portfolios
  • Recommending ways to use reward points of credit cards

Firms are experimenting with authenticating customers based on voice samples from natural conversation to help complete transactions quickly. Chatbots have proved able to deliver predictive insights to CXOs across all key areas, such as sales insights, partner performance, fraud prevention, risk management, customer profitability and risk analysis, regulatory reporting, market information and benchmarking, customer lifecycle management, survey insights, net promoter score and customer feedback analysis across channels, internal employee management, and so on. For example, a CXO might ask:

Which branch in Singapore has had the maximum business since the beginning of the day?

Can you check from the RBI documents of the last 5 years whether I can host my card disaster recovery system in Australia?

What is the on-ground feedback for the eKYC offering launched 2 months back?

Growth in digital payments post-demonetization in India is unsatisfactory; please schedule a meeting with the digital team.


Chatbots are changing the face of the communication interface by adopting artificial intelligence. They bring huge change and simplify the overall banking experience: each customer is served with a highly personalized approach, with 24x7 support, query resolution, KYC updates, and information on new schemes and services around the clock. It is predicted that "by 2022, 40% of customer-facing employees and government workers will consult an AI virtual support agent daily for decision or process support".

By using the chatbot channel for communication with customers, banks can achieve a higher market value without annoying the customer. AI is tremendously empowering banking institutions, and retail banking customers further, by taking large amounts of data and making it easily accessible to individual account holders at any time, in the form of a chat interface.

Chatbots in their current form have reached a certain level of maturity, but they are developed for specific tasks and are unable to suitably handle specialized queries requiring knowledge outside their functional domain. Their capabilities and features still have to be enhanced to create a completely different experience for banking customers, combining knowledge across all relevant segments and providing better insights to the user. This will give rise to a new conversational banking in which results are delivered instantly through real-time conversations, facilitating better decision making. Below are some key factors to consider:

  • Significantly drive customer loyalty by adding a new dimension to the power of the 'personal touch' and massively enhancing customer delight.
  • Create a cognitive financial institution: developing cognitive capabilities and deeply customized offerings is key to moving to a higher level of conversational banking.
  • Analyze and experiment with integrating other upcoming technologies, partnering with technology giants to leverage innovative technologies.

One study reveals that the majority of consumers prefer getting speedy information that resolves their immediate requirements, much as a traditional search engine does. In some scenarios, they would rather talk to a person than deal with any form of artificial intelligence (AI), according to a new study from STARTEK. Why? Because a person can:

  • Support customers by anticipating and acting on their needs,
  • Reassure customers about their choices,
  • Identify and resolve confusing or complex situations, and
  • Build relationships.

However, researchers also found that customers don't need a steady diet of personal contact. While 85% of customers prefer talking to a customer service representative most of the time, especially when the issue is personal or complicated, they are fine with digital channels for more routine issues. In fact, almost a quarter of customers said they initiated their most recent service contacts through email and chat. Care is what matters most to customers.