The Infosys Labs research blog tracks trends in technology with a focus on applied research in Information and Communication Technology (ICT)


September 20, 2019

Towards a Consistent and Responsive Web Design

The visual interface is a user's first interaction with a website. Even minor lapses in visual consistency are picked up, often without the user being consciously aware of them; they drive up bounce rates, and the user experience suffers as a result. Unlike in the desktop-only Web era, it is now imperative to deliver a consistently high-quality Web experience across a growing number of devices.

While the importance of responsive, consistent web design is widely recognized today, design-consistency and compliance testing is far from standard practice in web testing portfolios. The majority of tools and utilities available for detecting HTML failures depend on image oracles: recognizing a failure may be straightforward, but tracing it back to the offending code can be tricky. UI/UX (User Interface and User Experience) testing is therefore largely limited to error-prone and expensive manual processes.

Faults need to be corrected, not just reported

A substantial amount of research, with varying degrees of success, has addressed the detection of HTML faults. However, tracing a fault back to the code element that must be corrected remains a challenge. It is therefore prudent to work from the basic design of a web page in order to detect and localize possible UI issues. A holistic, more reliable approach is recommended: parse the web page into its individual elements and compare their structure and style to find faults.

Research[1] recommends leveraging the Document Object Model (DOM)[2] to validate visual consistency, media sources, HTML markup, hyperlinks and browser compatibility in web applications.

Comparative checking of the style and structural attributes of individual DOM elements on a given web page can help identify the following UX/UI faults:
(1) visual inconsistencies, i.e. style, text and structure mismatches,
(2) broken or unreachable hyperlinks,
(3) invalid image sources, and
(4) HTML markup errors and warnings.
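As a hedged illustration of such DOM-element checks, the sketch below parses a page with Python's standard `html.parser`, records each element's inline style keyed by its structural identity, and collects hyperlink and image sources for later validation. All class and function names here are illustrative, not taken from the cited research.

```python
from html.parser import HTMLParser

class DomAuditor(HTMLParser):
    """Collects per-element style attributes plus hyperlink and image
    sources so they can be compared or validated afterwards.
    Illustrative sketch, not a real tool's API."""
    def __init__(self):
        super().__init__()
        self.styles = {}   # (tag, id-or-class) -> inline style string
        self.links = []    # href values to probe for reachability
        self.images = []   # src values to check for validity

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        key = (tag, a.get("id") or a.get("class", ""))
        self.styles[key] = a.get("style", "")
        if tag == "a" and "href" in a:
            self.links.append(a["href"])
        if tag == "img":
            self.images.append(a.get("src", ""))

def style_mismatches(baseline_html, candidate_html):
    """Return DOM elements whose inline style differs between two renderings,
    i.e. candidate visual inconsistencies (fault type 1 above)."""
    base, cand = DomAuditor(), DomAuditor()
    base.feed(baseline_html)
    cand.feed(candidate_html)
    return [k for k in base.styles
            if k in cand.styles and base.styles[k] != cand.styles[k]]
```

The collected `links` and `images` lists would then be fed to reachability and validity checks (fault types 2 and 3); markup errors (type 4) are best left to a dedicated validator.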

While reporting consistency and compliance issues is vital, listing the HTML code surrounding the faulty element makes it far easier for developers to locate it in the code and correct it.
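One illustrative way to surface that surrounding code, assuming the raw HTML source is available, is a small helper that returns the numbered source lines around a flagged element. The function name and report format are invented for this sketch.

```python
def code_context(html_source, needle, radius=2):
    """Return the source lines (with 1-based line numbers) surrounding the
    first line containing `needle` - e.g. an element id or a broken href -
    so a fault report can point developers at the code to fix.
    Illustrative helper, not part of any real tool."""
    lines = html_source.splitlines()
    for i, line in enumerate(lines):
        if needle in line:
            lo, hi = max(0, i - radius), i + radius + 1
            return "\n".join(f"{n + 1}: {text}"
                             for n, text in enumerate(lines[lo:hi], start=lo))
    return ""  # needle not found in the source
```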

Additionally, analytics over the consistency and compliance test results can reveal a great deal about the UX health of a web application. For example, a low link-validation score suggests that many of the hyperlinks listed on the page are unreachable or broken. Similarly, a low HTML markup score reveals that the code does not follow the markup and coding standards recommended by the W3C. Either can hinder the user experience of the page under test and should be corrected promptly.
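Such category scores can be computed as simple pass rates. The input shape below, a mapping from check category to a list of per-element pass/fail booleans, is an assumption made for illustration:

```python
def ux_health_scores(results):
    """Compute a 0-100 pass-rate score per check category from raw
    pass/fail results; e.g. a low 'links' score means many hyperlinks
    failed the reachability check. Input shape is assumed:
    {category: [True/False per checked element]}."""
    return {cat: round(100 * sum(oks) / len(oks), 1) if oks else None
            for cat, oks in results.items()}
```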

Lately, UX/UI testing has gained a lot of traction. However, it tends to be limited to usability testing during user-acceptance tests. The need of the hour is to adopt UX testing as a salient part of web testing processes in production, in order to stand out in a competitive Web. And yes, with immediate effect!



[2] A programming API for XML and HTML documents that defines the logical tree structure of the document for access and manipulation

September 11, 2019

Analytics in Software Testing

The use of analytics in software testing, leveraging AI, ML and big-data techniques, focuses on identifying potential problem areas and suggesting the next best actions for high-quality product delivery. Automated software testing using analytics is essentially quality control for the operational aspects of the product, with the larger aim of creating a robust testing process that operates through one or more test-automation frameworks. The analytics used in testing fall into three types:
a) descriptive analytics, which produces simple counts, distributions and visualizations describing test results;
b) predictive analytics, which anticipates outcomes, for example predicting the types of defects likely at the beginning of a release;
c) prescriptive analytics, which prescribes corrections and suggests mitigations once risks have been identified.
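A minimal sketch of the descriptive case: counting defects per type from raw test results and expressing each as a share of the total. The defect record shape (`{"type": ...}`) is assumed for illustration.

```python
from collections import Counter

def defect_distribution(defects):
    """Descriptive analytics over test results: count defects per type and
    report each type's (count, share of total), most frequent first.
    Field names are illustrative, not from any specific tool."""
    counts = Counter(d["type"] for d in defects)
    total = sum(counts.values())
    return {t: (n, round(n / total, 2)) for t, n in counts.most_common()}
```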

From a market perspective, the global market for automation testing (AI, big data and IoT) solutions is expected to grow at a CAGR of 17.7% from 2018 to reach $19.27 billion by 2023. Software testing services is a maturing market, expected to grow by 6.5% over the 2016-2021 period to $27.9 billion, with specialized testing gaining pace at 48% of total spending and digital-transformation testing the fastest-growing segment at a 15% CAGR.

The need for analytics in testing stems from the fact that predictable outcomes and zero defects have become hygiene factors for mission-critical and high-risk applications. Despite the crucial importance of software testing, there have been many instances where software bugs caused major disruptions to business operations: the Amazon Christmas glitch, the HSBC payment-system failure, Microsoft Azure crashes, the Screwfix price glitch, HMRC's tax blunder, and so on.

Traditional testing methods are static in nature and designed to detect possible defects in the early stages of the SDLC, with the goal of finding lexical, syntactic and even some semantic mistakes. The focus is largely functional: the system is tested against functional requirements and specifications, with less attention to specialized services. Traditional methods rely mostly on descriptive and diagnostic analysis for post-testing improvements, and, with many manual processes and little automation, a software rollout usually takes weeks or months. They also carry major drawbacks such as weak customer focus, reactive handling of feedback, difficulty in detecting and isolating defects, and poor adaptation to new environments.

There are several drivers for the adoption of analytics in software testing methodologies:

·         Mission-critical dependency: With mission-critical and high-risk operations running on software applications, testing is expected to achieve zero defects and trusted quality. In many cases software bugs have created havoc for customers, companies and their reputations.

·         Financial implications of bugs: Besides the hit to customer experience and company reputation, many software bugs and failures have resulted in huge financial costs for the organizations involved.

·         Emerging tech: Massive adoption of mobile and cloud services has reshaped Test Environment Management (TEM) as well as Test Data Management (TDM), yet traditional approaches fail to deliver the required level of efficiency.

·         Faster, ongoing testing: The advent of Agile and DevOps has made short software development cycles the new normal, but even more is needed to meet today's demand for faster go-to-market (GTM).

The use of analytics in software testing will positively impact the efficiency and effectiveness of the quality process as well as overall release cycles. Several analytics capabilities can be leveraged across the testing landscape. AI- and ML-led platforms can help with defect analysis, log analytics, test prioritization, improvement in overall test coverage, and automated test-case generation. Predictive analytics can spot future failures from past data sources, enabling pre-cognitive defect healing and proactive defect prevention. Advanced visualization of real-time test-performance data, test-history charts and error logs on a dashboard creates significant grounds for improving the testing procedures of the future. The longer-term goal is touchless, fully automated testing across the software development life cycle, combining robotics, ML, conversational AI and natural language processing. Techniques such as ANNs, NLP, reinforcement learning and deep learning can automate testing methods and processes, while open-source test-automation tools such as Selenium and Appium (for web and mobile respectively) have a high impact on automation and digital transformation.
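As one concrete, deliberately simplified instance of analytics-led test prioritization, the sketch below ranks test cases by historical failure rate so that the most failure-prone tests run first. The history format is an assumption for illustration, not any real tool's API.

```python
def prioritize_tests(history):
    """Rank test cases by historical failure rate (failures / runs),
    highest first, so the riskiest tests run earliest in the cycle.
    `history` maps test name -> list of past outcomes, where True
    means that run failed (assumed shape)."""
    def failure_rate(outcomes):
        return sum(outcomes) / len(outcomes) if outcomes else 0.0
    return sorted(history, key=lambda t: failure_rate(history[t]), reverse=True)
```

A real platform would weigh in further signals such as code churn, coverage and defect severity, but the ranking principle is the same.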

Many players in the market are showcasing their 'analytics in testing' capabilities via AI/ML. AI-powered tools on the market include NIA, Applitools, SauceLabs, Testim, Sealights, Mabl and ReTest. There is huge potential for touchless, fully automated, analytics-based software testing that focuses on automated defect identification as well as automated healing (via bots). Most competitors have rolled out their solutions under the umbrella of QA services or automated software testing.