Testing Services provides a platform for QA professionals to discuss and gain insights into the business value delivered by testing, the best practices and processes that drive it, and the emerging technologies that will shape the future of the profession.

December 17, 2020

Content Validation and Accessibility Testing

Content validation is a type of static testing that ensures the quality and accuracy of your content. It involves validating the data present in a document, web page, or email against the data present in an application, file, or database.


Accessibility testing is the practice of making web and mobile apps usable by as many people as possible. This includes making content accessible to those with disabilities such as vision impairment, hearing loss, and other physical or cognitive conditions.


There are 285 million visually impaired people across the globe, of whom 39 million are blind; impairments include color blindness, total blindness, cataracts, and blurred vision. A further 466 million people live with disabling hearing loss. (*Source: World Health Organization.) Mental health and the intellectual spectrum should also be kept in mind, as content should be accessible to all people, with their varying processing capabilities.


Since a lot of content is conveyed in the form of documents, we need to ensure that the content curated in a document is validated and accessible. We cannot eliminate errors entirely, but we can reduce them by implementing content validation techniques.


In this blog, we cover the requirements, challenges, and key focus areas for content validation and accessibility across PDF, Word, email, web, and mobile channels.


Content Validation

PDF Content Validation

In the current scenario, there is a heavy dependency on PDF as an end-user deliverable. PDF is a file format used to present and exchange information offline; it is designed for human readers, not for machine processing. Automating PDF content testing is therefore a challenge: business entities need to be extracted from the PDF file and compared against the target.
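As a rough sketch of this extract-and-compare step (assuming the raw text has already been pulled out of the PDF with a library such as pdfminer, which is out of scope here), business entities can be located with patterns and checked against a source-of-truth record. The entity names, patterns, and sample values below are hypothetical:

```python
import re

# Hypothetical patterns for two business entities; real documents need
# patterns agreed with the business, since PDF layouts vary widely.
ENTITY_PATTERNS = {
    "invoice_no": re.compile(r"Invoice\s*#?\s*(\w+)"),
    "total": re.compile(r"Total\s*:?\s*\$?([\d,]+\.\d{2})"),
}

def extract_entities(pdf_text):
    """Pull business entities out of text already extracted from a PDF."""
    found = {}
    for name, pattern in ENTITY_PATTERNS.items():
        match = pattern.search(pdf_text)
        if match:
            found[name] = match.group(1)
    return found

def validate_against_source(pdf_text, source_record):
    """Compare extracted entities with the source-of-truth record."""
    extracted = extract_entities(pdf_text)
    return {name: extracted.get(name) == expected
            for name, expected in source_record.items()}

text = "Invoice # INV1001 ... Total: $1,250.00"
result = validate_against_source(
    text, {"invoice_no": "INV1001", "total": "1,250.00"})
# result -> {"invoice_no": True, "total": True}
```

Image-based documents would need an OCR pass before this step, which is one of the challenges listed below.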

Challenges:

  • Predefining the structure of a PDF against continuously varying business demands
  • Lack of automation in content validation, forcing manual validation of large numbers of documents
  • Difficulty in extracting data from image-based documents
  • Identifying business entities in image-based documents


PDF Validation Workflow:


Word Content Validation

Microsoft Word is the most popular word processing application in the world. We may need to verify, store, or perform validations on the data in a Word file. A correctly spelled but invalid value can make a document unusable. Macros, scripting, and similar techniques can validate the data to some extent.
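A minimal sketch of such scripted validation (assuming the field values have already been read out of the Word file, e.g. with a library like python-docx; the field names and rules here are hypothetical):

```python
import re
from datetime import datetime

def is_valid_date(value):
    """True only for real calendar dates in dd-Mon-yyyy form."""
    try:
        datetime.strptime(value, "%d-%b-%Y")
        return True
    except ValueError:
        return False

# Hypothetical per-field rules; real rules come from business requirements.
RULES = {
    "policy_no": lambda v: re.fullmatch(r"P\d{6}", v) is not None,
    "issue_date": is_valid_date,
}

def validate_fields(fields):
    """Return the names of fields whose values break a rule.

    A correctly spelled but invalid value (e.g. a non-existent date)
    is exactly what these checks are meant to catch.
    """
    return [name for name, value in fields.items()
            if name in RULES and not RULES[name](value)]

errors = validate_fields({"policy_no": "P123456", "issue_date": "31-Feb-2020"})
# "31-Feb-2020" is well-formed text but not a real date, so it is flagged
```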


Challenge:

  • Identifying correlated data as well as business entities for textual data validation in a Word document


Email Content Validation

Like any other content validation, email content validation is a very important part of testing, as important official data is exchanged over email. Verifying email content is critical to avoiding miscommunication.


Checkpoints to be considered for email validation:

  • Testing for HTTP error codes and for non-working or broken links present in the email
  • Missing images due to script-related issues
  • Spell checks, web standard checks, etc.
  • Correct navigation and easy-to-read paragraphs; page loading should also be validated through performance checks
  • Copyright and trademark symbols should be carefully placed and checked
  • Testing the subject line of the email
  • Checking the template and basic design of the email, as it is the email's main attraction
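Two of the checkpoints above, broken links and missing images, can be partly automated. The sketch below collects link targets and images lacking a `src`/`alt` from email HTML using the standard library; actually probing each link for HTTP error codes (e.g. with a HEAD request) is left out, and the sample snippet is invented:

```python
from html.parser import HTMLParser

class EmailAuditor(HTMLParser):
    """Collect links to probe for HTTP errors and images missing alt text."""
    def __init__(self):
        super().__init__()
        self.links = []
        self.images_missing_alt = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and attrs.get("href"):
            self.links.append(attrs["href"])
        if tag == "img" and not attrs.get("alt"):
            self.images_missing_alt.append(attrs.get("src", "?"))

auditor = EmailAuditor()
auditor.feed('<a href="https://example.com">offer</a><img src="logo.png">')
# auditor.links would then be requested one by one to catch broken URLs;
# auditor.images_missing_alt doubles as an accessibility finding.
```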


Challenge:

  • Manual testing for email content validation


Web Content Validation

Web content validation includes extracting data from webpages and validating it against a target such as a database or a file. It can also be a webpage-to-webpage comparison, which comes up during website migrations: QA teams compare and verify the data against the old website and check for broken links, static content, dynamic content, images, videos, etc.
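The webpage-to-webpage comparison can be sketched as a diff over content extracted from both sites. Representing each page as a {label: text} mapping is an assumption for illustration; the scraping itself (e.g. via a browser driver) is out of scope:

```python
def compare_pages(old_page, new_page):
    """Diff content extracted from the old site and the migrated site."""
    return {
        "missing": sorted(set(old_page) - set(new_page)),
        "added": sorted(set(new_page) - set(old_page)),
        "changed": sorted(k for k in set(old_page) & set(new_page)
                          if old_page[k] != new_page[k]),
    }

# Hypothetical extracted content from the two versions of a page
old = {"title": "Our Products", "footer": "© 2019 Acme"}
new = {"title": "Our Products", "footer": "© 2020 Acme", "banner": "Sale"}
report = compare_pages(old, new)
# report -> {"missing": [], "added": ["banner"], "changed": ["footer"]}
```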

With the use of modern content management solutions for building websites and mobile apps, enterprises look to provide a good digital experience to customers via the front end while managing marketing content and digital assets on the back end.


Challenge:

  • Managing content across online mediums while simultaneously keeping it in sync with in-store mediums


Content Accessibility

PDF Accessibility

The portable document format is the most used format across the world, so it should be ensured that documents are accessible to people who are blind, have low vision, are deaf or hard of hearing, or have other cognitive impairments.

To make the document accessible for people who rely on assistive technologies, the content should be structured and tagged properly.

Web Content Accessibility Guidelines (WCAG) 2.0 and PDF/UA are common accessibility standards. PDF/A is an ISO-standardized version of the Portable Document Format (PDF) specialized for the archiving and long-term preservation of electronic documents.


Challenges:

  • PDF readers available in the market do provide an option to check the accessibility of a document, but users must opt for the pro version to use it.
  • Batch processing of documents is another challenge; a tool that can process multiple documents simultaneously is required.

Word Accessibility

Microsoft Word, like the entire Microsoft Office suite, provides a built-in Accessibility Checker tool that helps identify potential accessibility concerns. The Accessibility Checker task pane shows:

  1. Accessibility Errors
  2. Warnings
  3. Tips to repair the errors

Additional information for specific issues is available at the bottom of the task pane.


Challenge:

The Accessibility Checker tool does not guarantee a fully accessible document or comprehensive usability for individuals with disabilities. Some accessibility issues must still be found and addressed manually.


Email Accessibility

Email campaigns hold great value today. How impactful can an email campaign be if it cannot be read and understood by people with disabilities?

Emails should be designed with accessibility best practices in mind:

  • They should support screen-reading devices
  • Images should carry alternate text that a screen reader or voice assistant can read out
  • Visual content should be checked with color blindness in mind
  • They should be accessible on mobile and IoT devices

Web Accessibility

Web accessibility means that content on the web, in the form of websites, tools, and technologies, is designed and developed so that people with disabilities can access it. People should be able to perceive, understand, navigate, interact with, and contribute to the web.

Web accessibility encompasses all disabilities that affect access to the Web, including:

  • Auditory
  • Cognitive
  • Neurological
  • Physical
  • Speech
  • Visual

The Web Content Accessibility Guidelines (WCAG) 2.0 are based on four principles: Perceivable, Operable, Understandable, and Robust. Conformance is further differentiated into three levels:

Level A - The most basic web accessibility features

Level AA - Deals with the biggest and most common barriers for disabled users

Level AAA - The highest and most complex level of web accessibility
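One concrete, automatable WCAG 2.0 check is color contrast: Level AA requires a contrast ratio of at least 4.5:1 for normal text. The sketch below implements the WCAG 2.0 relative-luminance and contrast-ratio formulas for sRGB colors given as 0-255 integers:

```python
def relative_luminance(rgb):
    """WCAG 2.0 relative luminance of an sRGB colour (0-255 per channel)."""
    def linearize(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio per WCAG 2.0, from 1:1 (no contrast) up to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# Level AA: at least 4.5:1 for normal text, 3:1 for large text.
ratio = contrast_ratio((0, 0, 0), (255, 255, 255))
# black on white gives the maximum ratio of 21:1
```

A checker built on this can walk a page's computed styles and flag every text/background pair below the required threshold.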

Challenges faced:

  • Making the content accessible to a wider section of the audience
  • Checking conformance with accessibility standards
  • Getting repair recommendations for the content, along with insights


Content on Mobile Devices

With the heavy content consumption on mobile devices these days and the increased usage of smart IoT devices, making content accessible has become mandatory. There is also a need to keep content uniform across devices: laptop, desktop, smartphone, tablet, smart watch, etc.


Some of the pre-requisites for improving the mobile accessibility include:

  1. Responsive design - Content should fit the mobile layout and adjust to the device's screen orientation
  2. Transcripts - Alternative text for pictorial content
  3. User input - Making it easier for users to interact with the content and participate actively
  4. Aesthetics - Color contrast, visibility, font type and style, font color, etc. should be customizable per the user's needs
  5. External device support - Content should be designed and edited so that external aids such as screen readers or voice assistants can help users with disabilities access it


Conclusion

Content validation is a wide topic spanning multiple channels. Each area has its own set of challenges, and the approach to automating content testing varies from one area to another. Natural language processing is a good choice when validating textual content.

The need for web content validation has grown because managing content across online mediums while keeping it in sync with in-store mediums has become a tough task. Instead of validating content manually, automation can be expanded into content validation, as many areas of it are still handled manually by testing teams.

Technology should assist in interpreting the content, be it in tabular, image, audio, or video format, and it should be made accessible.

Managing content well is the right thing to do. As an added advantage to any business, content validation and accessibility will enhance the user experience and bring in more customers, trust, and credibility.



Authors:

Deepak Ruchandani- Senior Associate Consultant

Divya Vettukattu Valappil- Technology Lead




November 20, 2020

Making Enterprise Test Automation a Possibility using UiPath

Energizing the core of enterprise business with AI-driven automation is the key to success for all progressive business houses. Combining the power of Robotic Process Automation (RPA) and related cognitive capabilities is giving a new direction to enterprise test automation, and UiPath has been at the forefront of this journey. The new UiPath Test Suite (the advanced StudioPro IDE, Test Manager, and Test Robots) provides a platform for all enterprise testing needs, be it software testing or even testing RPA itself. Its current success gives us confidence that the future of enterprise test automation will be driven by RPA-based platforms.


August 26, 2020

Graph Analytics - The Science of Network Analysis

Gartner predicts that the adoption of graph analytics and graph databases will grow multifold at 100% annually through 2022 to continuously accelerate decision making and enable more complex and adaptive data science. By 2023, graph technologies will facilitate rapid decision making in 30% of organizations worldwide.


Through this blog, we will understand What is Graph Analytics, Types of Graph Analytics, and a few Use Cases.


A. What is Graph Analytics


A graph represents relationships between entities. Graphs are made up of nodes or vertices (entities) and edges or links (relationships). Graphs provide a scalable and flexible platform for finding connections between data nodes or analyzing data derived from the strength of relationships. Graph analytics is the methodology of analyzing relationships among entities to uncover insights that are difficult to visualize with other techniques. This analysis is supported by graph algorithms powered by graph databases.


  • Graph analysis shows how several trends or data entities are related to each other
  • Graph models determine connectedness across data points, identifying the nodes that generate the most activity or the largest sphere of influence
  • Graph databases provide expanded capabilities for storing, calibrating, and analyzing graph models
  • Generating dynamic graphs instead of static relational schemas helps uncover deeper insights



B. Types of Graph Analytics


  • Path Analysis: Determines the relationship and the shortest path between two nodes. Typical applications include route optimization in supply chain and logistics
  • Connectivity Analysis: Analyzes the strength or weakness of the connection between two nodes; can be applied to identify weaknesses in networks
  • Centrality Analysis: Analyzes how important a node is to the connectivity of the network; can be used to identify the most influential people in a social media network or to find high-traffic web pages
  • Community Detection: Uses the distance and/or density of relationships to identify communities and behavior patterns
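Path analysis, the first type above, can be illustrated with a plain breadth-first search over an unweighted graph; a graph database would do this with a built-in shortest-path query instead. The node names below are made up to echo the supply-chain example:

```python
from collections import deque

def shortest_path(edges, start, goal):
    """BFS shortest path between two nodes of an undirected, unweighted graph."""
    graph = {}
    for a, b in edges:
        graph.setdefault(a, []).append(b)
        graph.setdefault(b, []).append(a)
    queue = deque([[start]])   # queue of partial paths
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for neighbour in graph.get(path[-1], []):
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append(path + [neighbour])
    return None  # goal unreachable

edges = [("depot", "hub1"), ("hub1", "store"), ("depot", "hub2"),
         ("hub2", "hub3"), ("hub3", "store")]
route = shortest_path(edges, "depot", "store")
# route -> ["depot", "hub1", "store"], the two-hop route beating the three-hop one
```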



C. Relevant Use Cases

  • Detecting financial crimes like money laundering or payments to prohibited entities
  • Identifying fraud in banking transactions, fraudulent insurance claims, and suspicious activities in telecommunications
  • Regulatory compliance: tracking sensitive data lineage through enterprise systems
  • Optimizing airline routes, supply distribution chains, and logistics with graph algorithms
  • Supporting studies in life sciences: medical investigation, vaccine development, disease pathology
  • Social media: identifying social influencers and online communities
  • Optimizing recommendation engines on eCommerce platforms through collaborative filtering, resulting in more personalized recommendations


Leading Graph Database Tools


Amazon Neptune, Neo4j, ArangoDB, DataStax, OrientDB, Titan


Role of Data Testing in Graph Analytics


Data validation is a vital cog in the successful implementation of graph processing across domains and industries. Data (at rest and streaming) from traditional, cloud, and multi-cloud systems is extracted, integrated, and loaded into schema-free graph databases. Robust data validation techniques ensure data integrity and no data loss as data moves through disparate systems. Analytics models built on top of graph databases also need to be validated to develop human-machine trust, and thorough data visualization testing needs to be conducted to ensure analysts can make decisions based on accurate visualizations.
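A basic form of this load validation is reconciling what was extracted from the source with what landed in the graph. The sketch below compares record keys and counts; in a real pipeline the source rows would come from the upstream system and the loaded nodes from a graph query (e.g. Cypher), both omitted here, and the records are invented:

```python
def reconcile_load(source_rows, loaded_nodes, key="id"):
    """Integrity check after loading source records into a graph DB.

    Both inputs are lists of dicts keyed by a common identifier.
    Flags count mismatches, records lost in flight, and unexpected nodes.
    """
    src_keys = {row[key] for row in source_rows}
    dst_keys = {node[key] for node in loaded_nodes}
    return {
        "count_match": len(source_rows) == len(loaded_nodes),
        "missing_in_graph": sorted(src_keys - dst_keys),
        "unexpected_in_graph": sorted(dst_keys - src_keys),
    }

source = [{"id": 1}, {"id": 2}, {"id": 3}]
loaded = [{"id": 1}, {"id": 3}]
report = reconcile_load(source, loaded)
# record 2 was dropped in transit, so the report flags it
```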


Conclusion


As data and devices become increasingly interconnected and sophisticated, it is indispensable to identify relationships within data and derive insights from it, enabling faster decision making by analysts. Graphs are well suited to this thanks to their exceptional capability of revealing patterns in connected data. As organizations and enterprises continue leveraging the power of big data and data analytics, the unique capabilities of graphs are essential to today's wants and tomorrow's triumphs.


May 19, 2020

Data Fabric - The Futuristic Data Management Solution

Global research and advisory firm Gartner has identified the top data and analytics trends for 2020, which will have significant transformative potential over the next two to five years; Data Fabric is one of the most prominent. With the enormous growth of both structured and unstructured data from smartphones, IoT devices, and digital channels, there is a need to process large amounts of data, mine it, analyze it, and make it accessible. Data Fabric is a method of understanding large amounts of data traversing cloud systems.

In this blog, we will understand What is a Data Fabric, The Data Fabric Stack, and a few Use Cases.

A. What is Data Fabric

Data Fabric is a unified architecture, and a set of data services running on that architecture, that helps organizations manage their data across on-premise, cloud, and hybrid cloud systems. It is a single, unified platform for data integration that simplifies data management across platforms to accelerate digital transformation.

 

  • Connects to platforms using pre-packaged functions and connectors
  • Integrates and manages data from on-premise and cloud environments
  • Supports batch and real-time data streams
  • Provides data quality, data enrichment, and data governance capabilities
  • Supports API development and integration

 

B. The Data Fabric Stack includes the following layers:


  • Data Collection & Storage: Ingests and integrates data, events, and APIs from any source, on premise and in the cloud
  • Data Services: Manages several services at this layer, including data governance, data protection, data quality, and adherence to compliance standards
  • Transformation Layer: Cleans and enriches batch and real-time data to enable informed decisions
  • Analytics/Sharing Layer: Realizes data value by making it available internally and externally via self-service capabilities, analytics portals, and APIs


C. Successful Use Cases

  • A leading global pizza company, with both delivery and carry-out operations, uses data fabric to maintain its competitive advantage. It allows ordering pizza from a plethora of devices including TV, smartwatch, and smart car, resulting in 25 TB of data from 100,000 structured and unstructured data sources. Using data fabric, the company gathered and analyzed data from its POS systems, multiple supply chain centers, and digital channels including text messages, Twitter, and Amazon Echo
  • A leading pharma company applied AI to weed identification, enabling farmers to apply the exact solution needed to kill the weed species. It developed an app that used machine learning and artificial intelligence to match photos that farmers uploaded. This resulted in better choice of seed variety, better application of crop protection products, and better harvest timing
  • A leading insurance company uses data fabric to store and analyze claim data: claim reports, incident data, police reports, claim history, claim details, counterparty details, etc. This has helped settle claims faster and has made policies more compelling and competitively priced

Conclusion

In a world where technology is changing everyday lives, digital transformation tops the strategic agenda of most organizations and their leaders. To succeed in a digital transformation journey, data is the lifeline: it enables new customer touchpoints, innovative business propositions, and optimized operations. Data fabric enables businesses to achieve these by offering connectors for hybrid systems, advanced data integration capabilities, and analytical capabilities. Demand for data fabric will only grow stronger as organizations look to stay on top of emerging technologies and new trends to remain competitive and relevant and maintain their business edge.

May 13, 2020

'By failing to prepare, you are preparing to Fail' - Data Analytics and its role in dealing with Covid-19

During the past few months, the world has been fighting a battle to contain a lesser-known virus with no cure or vaccine in sight, one that has had a far-reaching impact on the world's healthcare and economic stability. Informed decisions are the need of the hour, and these decisions make use of data generated from various feeds globally across sectors. COVID-19 has brought to the fore the importance of big data analytics.


September 16, 2019

UI vs UX: Revisiting the age old debate

With technological advancement reaching the common man's hands in the 21st century, everybody seeks to experience technology without spending much brainpower or time. Mobile and web consumers nowadays expect quick, consistent navigation and a seamless experience; hence the growing emphasis on professional UI/UX design in software applications.

While we realize the immense importance of a visually appealing and user-friendly application experience, UI and UX are terms that are generally used interchangeably in the software world. As a matter of fact, the terms are closely linked in the software design landscape.

UI is not UX

By definition, UI or User Interface is the graphical layout of an application which a user interacts with. This includes buttons, input controls, screen layout and every micro-interaction. UI designers create the look and feel of a user interface for an application.


UX, or User Experience, determines how easy or difficult it is to interact with the user interface elements of an application. This is the main reason people generally confuse UI and UX and use the terms interchangeably.

Granted, it is completely fine to use UI/UX together in software design, wherein UX designers are also concerned with the application UI to ensure smooth navigation and a seamless application experience. However, it should be understood that UI is just one of the salient elements of UX, as shown in the figure. Designers work on both user interface and user experience design to build a customer-friendly application.

When it comes to application testing, UX/UI are mostly covered during the user acceptance testing phase of the SDLC. While teams do realize the importance of early UI testing (alongside functional testing) to avoid defects percolating into later tests, usability testing (UX and UI together) is generally scheduled after or alongside integration tests to accommodate application agility. As a result, teams end up doing highly expensive rework due to last-minute customer feedback on supposedly less important non-functional aspects like user interface and experience.

UI testing: Test early test often

Validating a seamless user experience may seem most relevant toward the end of the application testing process; however, validating the UI against interface, design, and navigation requirements needs to be taken up much earlier.

With the rise in customer-centric business requirements, it is prudent that UI tests be planned early and run repeatedly until all functional and non-functional requirements are met. UI automation scripts come in handy when planning repetitive tests like performance, load, and device/browser compatibility. Early performance or compatibility testing provides capacity planners and infrastructure architects with early warnings of potential scalability problems in the architecture. The UI layout and navigation may be volatile during the early stages of application development, so teams must carefully isolate application UI and functionality to enable independent tests and better results. Automation scripts should be used where functional or UI requirements are stable. QA techniques in early forms can be applied to usability or design testing even before the UI is integrated with functionality. Automated regression tests should be run as often as possible throughout the lifecycle, not just as part of final QA activities or just before system integration.

A volatile UI can be a ticking bomb toward the end of the application lifecycle, adversely affecting the user experience the application offers. It is thus wise to stub out non-functional testing, especially UI testing, for early defect detection and to avoid rework after integration tests.

Happy testing!

September 3, 2019

Winning the Test Automation game

Enough has been said about writing better tests, optimizing automation scripts, and planning test cycles. Teams can choose from a plethora of test accelerators available in the market, depending on the features and automation maturity offered. However, with frequent changes to application behavior and business requirements (release planning, test maintenance, and test criteria), the selected automation tools are often unable to cope with the pace at which testing workflows change over time.

It is therefore prudent to consider maintainability during the engineering cycle of the automation solution[i]. That said, it is equally important for testing teams to lay out a viable plan with realistic automation goals that also accommodates incremental automation.

Lay out an Automation Roadmap

Project teams always thrive on the 'automate whatever possible' mantra. They therefore end up addressing only the pertinent challenges and get away with minimalistic automation, ignoring the potential troublemakers. Automation tests may work wonders for progression cycles; however, once teams get into regression, they start realizing the side effects of not having proper test maintenance in place. While they come up with corrective measures to improve regression planning and to reuse or correct automation tests, the overhead is tough to crack.

The major flaw lies in the haphazard adoption of automation in pockets, where the tools in use may address only a few aspects of the process workflow. The rest is carried out either manually or using other tools. Teams generally rely on macros, client-side stored procedures, or scripts to make different tools or manual processes work together. The lack of end-to-end workflow support leads to issues like flaky tests and unmanaged automation, and hence depleting ROI<reference to debacles>.

It is thus essential to plan test automation meticulously, with an incremental roadmap and test traceability as top priorities. Provision to accommodate incremental automation, such as new tools to address complex application features or leverage the latest technologies for better coverage, will help testing teams achieve better results over time.

Pick the right ingredients

While the benefits of test automation are proven, with a variety of mature scripting tools available, testing teams are still struggling with test maintenance debacles. Correcting the approach may help automation teams engineer a sustainable framework, but to realize the best possible results, testing teams must use the right ingredients in the first place.

During the test planning phase, testing teams tend to focus on tracking application changes to ensure the best possible test coverage. Automation test cases keep piling up for lack of regular optimization, with zero traceability and usability. Leveraging the relationships between test assets (test data, automation scripts) from previous cycles and application objects, functions, and their properties can augment automation planning and the self-healing of test scripts. A calculated impact analysis of application changes on the test scripts thus helps testing teams reuse, optimize, and write better automation tests.

Additionally, preparing test data in pockets also leads to unmanaged automation workflows. It is high time due diligence was put into preparing test data for application testing through appropriate channels. Automation tools for test data preparation and workflow management can be leveraged for test data traceability and consistency, especially in SIT environments.

Putting it all together

The responsibility for providing a viable test automation solution lies with the automation engineering team; however, it is the implementation team who can make the best of the available offerings and make the test automation process a success.



[i] https://www.infosysblogs.com/testing-services/2019/05/building_a_sustainable_test_au.html

July 14, 2019

Building a Sustainable Test Automation Solution

Test automation is an essential part of QA processes in the software testing industry. Once a mere optimization aid supporting manual testing, test automation has now become a primary driver in QA. However, successful test automation is much more than writing code to de-manualize a step-wise process.

During a digital transformation journey, everything may seem straightforward while using automation frameworks and scripts. In reality, this automation success is short-term. Less than a year into the implementation, many teams get pulled into a vicious circle of automation maintenance. Issues like flaky test results, changes in the expected behavior of the system, and environment/infrastructure changes diminish the ROI from a test automation framework[i]. It is thus essential to realize that success in automation solutioning, especially in the software testing landscape, is more about avoiding mistakes than just getting it right!

The need of the hour is to look beyond the surface and come up with a futuristic, self-healing, and sustainable test automation solution built on best practices and technology. There is no checklist for 'right automation', but 'quick solutions' are certainly expensive, or nearly impossible, to maintain. Consider the following tips while embarking on your test automation journey. It may not be a cakewalk, but it will definitely bring long-term maintainability.


  1. Simplify
    Testing requirements are as vast as application development. Automation tests are expected to match the pace of application complexity as features mature. However, flexibility in the system should not be so syntactically complex that it bogs down the user. An ideal solution should truly serve the testing goals while being fluid enough to handle real-world testing complexities.
  2. Modularity & reusability
    The testing approach or type may differ per project, depending on factors like application type and lifecycle process. Automation components must be tailored to be non-cohesive to landscape diversity. These modules can then be reused as common automation assets across projects. Such a system eases test maintainability and enhances traceability.
  3. Handle the dynamic nature of the application
    Identifying frequent changes and dynamic elements in an application are the two major challenges in real-world test automation. Most test automation frameworks are unable to identify these dynamics, so test planning becomes ineffective and may let defects through. Hence, a provision for identifying application changes and updates to dynamic objects and properties is vital for effective test automation.
  4. Centralize test services wherever possible
    Testing as a Service (TaaS) is an outsourcing model: a one-stop solution for dynamic test assets and on-demand test tools and environments, requiring minimal domain knowledge and solution expertise. It is recommended to offer software testing as a service over the cloud, especially in projects involving extensive automation and short execution cycles. The components can be used on the fly, per subscription, on a centralized infrastructure.
  5. Domain-flavored automation
    While testing an application, a tester must think like an end user. Especially in Banking, Financial Services and Insurance (BFSI) and telecom domains, it is essential to know working procedures and domain keywords to write and execute tests better. Similarly, an automation solution needs a distinct edge in domain knowledge to ensure maximum coverage of the functional and non-functional aspects of the application under test.
  6. Build intelligent automation
    Automation is not a one-time solution but a process. A smart automation solution should ideally be self-learning and adaptive. Blending AI/ML into QA automation helps induce traceability in progression tests and self-healing in regression tests. Alternatively, for less sophisticated continuous automation delivery, code-less test automation can be explored.

To sustain test automation, an appropriate framework with the right mix of infrastructure and technology is vital, and it is equally essential to streamline testing processes at the practice level. Success in test automation requires immaculate planning and design work. Remember, automation in testing is not just a fancy UI to perform test steps; it should aim to build a solution with long-term maintainability and traceability. Hence, a sustainable automation solution suited to real-world testing problems is a must!

June 19, 2019

Test Automation Debacles

In the era of digital transformation and mounting competition, organizations are joining the automation bandwagon without a second thought. In software testing especially, automation has acquired an important place in addressing the needs of agile and continuous testing processes. While the benefits of test automation are well proven, with plenty of mature scripting frameworks available in the market, the death knell to the automation journey sounds when testing teams start to struggle with test maintenance failures.

The Mayhem

We studied a few QA projects closely, from test planning through execution, only to realize the damage that unmanaged test automation can do to automation ROI.

Based on process maturity and the application under test, testing teams use various tools and frameworks, implementing automation at varied levels. There are tools and techniques that offer automation in pockets and enable teams to realize instant benefits. However, shortsighted test automation cannot keep pace with evolving application features. With teams going agile, short test cycles combined with immature testing practices add to the debacles, lowering the maturity graph of the automation tests, while the application under test continues to evolve at exponential speed. The gap between automation tests and application features therefore keeps widening at an alarming rate over time, as shown in Figure 1. With frequent changes in the application, poorly written tests and unmanaged 'quick' automation test cycles, teams get caught in a vicious cycle of maintenance costs and declining test coverage. This upsurge in test maintenance leads to regression defects, diminishing the ROI from automation.

[Figure 1: Test automation debacles]

As a matter of fact, project teams land in a worse situation with test maintenance during automation cycles than during manual ones! We observed teams spending almost 200 times longer fixing test scripts than they spend on manual tests. That explains the steep decline in automation ROI in Figure 1.

Where does the mistake lie?

We have blamed automation practices and application dynamics enough. Test automation experts are already looking into streamlining test automation and devising ways to leverage application changes to plan tests better. Did it really help? I don't think so! It may slow the damage, but the irreversible destruction caused by haphazard test automation is realized sooner or later. The primary cause lies in expectations. Testing teams do realize that end-to-end automation is an incremental process, and that 100% automation may or may not be achievable; as a result, end users' expectations from test automation solutions stoop to a minimal level. That's where the problem starts. It is the automation engineering that needs to be addressed!

We may standardize tools or processes for a testing team to adhere to, wherein test automation is implemented in pockets and benefits are realized in the short term. However, teams must realize that a tool's benefits are only as good as the features it offers and the automation coverage it achieves, which may turn into a nightmare in the long term if no test maintenance is available. Hence, it is now necessary to look beyond test automation and address the gaps in how automation tools and accelerators are engineered, to offer better and more sustainable automation.

You may find my recommendations on picking the right ingredients and building a sustainable test automation solution in upcoming blogs.

May 17, 2019

QA Paradigms in Open Banking

Open Banking started as a regulation in British banking circles, and now countries around the world are racing to adopt it. Australia makes its first move towards Open Banking this July. The European Union is adopting PSD2 along the lines of Open Banking. Countries implementing Open Banking are being watched intently by those planning to adopt these standards, such as Israel, Canada, Hong Kong, Japan and Singapore. Everyone is waiting to see the outcome of the Open Banking imperatives. What is Open Banking? Why should the IT world take notice? And what would be the implications of Open Banking for the software testing world? I am going to take a stab at these in the next few paragraphs.

To start with, Open Banking is a directive by the UK's Competition and Markets Authority which mandates that all banks expose their customers' data via open APIs to third-party providers, such as competitor banks and FinTechs, with the express consent of the customer. What started as a regulation now broadly refers to the unbundling of banking services and the opening of customers' data to partners outside the incumbent banking system, with express consent. Open Banking has created avenues for FinTechs and challenger banks to use technology that leverages customer data to help secure loans, provide a level playing field to pick and choose, help with payments, and more. It has truly enabled FinTech firms to compete with large banks by helping them design more customer-friendly products, and it has provided much-needed competition between banks to deliver more value to the customer. Until recently, if another bank or FinTech wanted access to a customer's financial data, either the customer had to fill in the data fields manually, or the bank or FinTech would obtain the customer's login credentials and scrape the incumbent bank's pages for the required data, which is a poor practice in terms of cyber security and a rather crude way to garner data. Open Banking has now made it very convenient for customers to expose their data via open APIs. Additionally, it has empowered customers to switch banks easily, and it has created a level playing field where FinTech firms can leverage data and technology to come up with creative solutions against the larger banks. Overall, Open Banking has increased competition and innovation while adding value to the end customer.

The European Union implemented its own Open Banking regulation, known as PSD2, an abbreviation for the Second Payment Services Directive. PSD2 is a regulatory directive overseen by the European Banking Authority (EBA) and applicable to European Union markets. PSD2 requires banks to grant customers the right to choose their payment partners. It was conceived with the intent of making payments easier in terms of innovation and use. There were a few salient differences between the UK's Open Banking regulations and PSD2, but in November 2017 the Competition and Markets Authority mandated that Open Banking should be compliant with all PSD2 directives. Open Banking will now cover all payment products that are part of PSD2, such as credit cards, debit cards and e-wallets. Both the PSD2 and Open Banking regulations have evolved to complement each other by increasing the scope of financial products under Open Banking.

Payments will also be simplified via Open Banking. For example, a typical payment on an ecommerce site currently passes through various intermediaries: the merchant, a payment gateway, card associations like Visa or Mastercard, the issuing bank and the acquiring bank. With Open Banking, online retailers can conduct payment transactions directly with your bank, without any intermediaries. Again, this benefits the end customer, as the surcharges demanded by these intermediaries are eliminated.

Which brings us to the ultimate question of this blog: how will Open Banking affect the IT industry, especially software testing? Interoperability through common standards is one of the keystone objectives of Open Banking. To achieve this, banks will have to build open APIs that comply with regulatory standards, security protocols, safe data transfer, and all the directives. Open Banking creates a plethora of opportunities in regulatory testing; penetration and security testing, to make sure all security protocols are in place and thwart cyber criminals; performance testing, for when many customers try to access or transfer data at the same time; API testing; accessibility testing; consent testing; and Strong Customer Authentication testing.
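To give a flavor of what such API tests look like in practice, the sketch below automates a few basic checks on an account-information response: transport security headers, consent scope, and payload shape. The response structure, header names and fields here are hypothetical, not taken from any bank's actual Open Banking specification:

```python
# Illustrative Open Banking API checks a QA team might automate.
# All endpoint/field names below are hypothetical stand-ins.

def check_account_response(status, headers, body, consented_scopes):
    """Return a list of findings; an empty list means all checks passed."""
    findings = []
    if status != 200:
        findings.append(f"unexpected status {status}")
    # Security: responses should enforce HTTPS via HSTS.
    if "Strict-Transport-Security" not in headers:
        findings.append("missing HSTS header")
    # Consent: data returned must stay within what the customer authorized.
    if body.get("scope") not in consented_scopes:
        findings.append("data returned outside customer consent")
    # Contract: the payload must carry the expected account structure.
    if "Accounts" not in body.get("Data", {}):
        findings.append("missing Data.Accounts in payload")
    return findings

# Simulated response from a third-party provider call
status = 200
headers = {"Strict-Transport-Security": "max-age=31536000"}
body = {"scope": "accounts", "Data": {"Accounts": [{"AccountId": "A1"}]}}

findings = check_account_response(status, headers, body,
                                  consented_scopes={"accounts"})
print(findings)  # [] -> all checks passed
```

In a real suite these assertions would run against live sandbox endpoints, alongside the performance, accessibility and Strong Customer Authentication scenarios mentioned above.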

Open Banking has the potential to grow into a niche QA area where domain experts with testing skills work on ensuring that the APIs and platforms perform optimally. QA experts who are well versed in the Open Banking landscape will be very much in demand, and since all financial institutions operating in Europe and the UK must conform to Open Banking standards, all of them will require support in this area. There are 9,000+ financial institutions in Europe, and all of them will have to comply with Open Banking/PSD2, which translates to immense QA opportunities.

This write-up serves as a generic introduction to Open Banking and the opportunities in store. My next post on Open Banking will look at more granular details of how it will affect QA in various industries, like retail!

Reference:
https://bankingthefuture.com/a-primer-on-open-banking/
https://www.openbanking.org.uk/wp-content/uploads/What-Is-Open-Banking-Guide.pdf
https://www.paymentscardsandmobile.com/psd2-explained-payment-services-directive-created/
https://www.starlingbank.com/blog/explaining-psd2-without-tlas-tough/

