Testing Services provides a platform for QA professionals to discuss and gain insights into the business value delivered by testing, the best practices and processes that drive it, and the emerging technologies that will shape the future of the profession.


September 16, 2019

UI vs UX: Revisiting the age old debate

With technology now in everyone's hands in the 21st century, users want to experience it without spending much time or effort. Mobile and web consumers today expect quick, consistent navigation and a seamless experience. Hence the growing emphasis on professional UI/UX design in software applications.

While we recognize the immense importance of a visually appealing and user-friendly application experience, UI and UX are terms that are often used interchangeably in the software world. The terms are indeed closely linked in the software design landscape, but they are not the same thing.

UI is not UX

By definition, UI, or User Interface, is the graphical layout of an application with which a user interacts. This includes buttons, input controls, screen layout and every micro-interaction. UI designers create the look and feel of an application's interface.


UX, or User Experience, determines how easy or difficult it is to interact with the user interface elements of the application. This is the main reason people confuse UI and UX and use the terms interchangeably.

Granted, it is perfectly fine to use UI/UX together in software design, since UX designers are also concerned with the application UI to ensure smooth navigation and a seamless experience. However, it should be understood that UI is just one of the salient elements of UX. Designers work on both user interface and user experience design to build a customer-friendly application.

When it comes to application testing, UI/UX are mostly covered during the user acceptance testing phase of the SDLC. Teams do realize the importance of early UI testing (alongside functional testing) to stop defects percolating into later tests, yet usability testing (UX and UI combined) is generally scheduled with or after integration tests to accommodate application agility. As a result, teams end up doing highly expensive rework when last-minute customer feedback arrives on supposedly less important non-functional aspects like the interface and the experience.

UI testing: Test early test often

Validating a seamless user experience may seem most relevant towards the end of the application testing process; validating the UI against interface, design and navigation requirements, however, needs to be taken up much earlier.

With the rise in customer-centric business requirements, it is prudent that UI tests be planned early and run repeatedly until all functional and non-functional requirements are met. UI automation scripts come in handy for repetitive tests such as performance, load and device/browser compatibility. Early performance or compatibility testing gives capacity planners and infrastructure architects early warning of potential problems with the scalability of the architecture.

UI layout and navigation may be volatile during the early stages of development, so teams must carefully isolate the application UI from its functionality to enable independent tests. Automation scripts should be used where functional or UI requirements are stable, and QA techniques can be applied in early form to usability or design testing even before the UI is integrated with the functionality. Automated regression tests should run as often as possible throughout the course of development, not just as part of final QA or just before system integration.
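One common way to isolate UI details from test logic is the page-object pattern. The sketch below is a minimal illustration with hypothetical locators and a stubbed driver (so it is self-contained and does not depend on a real browser automation library); a volatile screen layout then touches only the page class, not every test that uses it.

```python
# Minimal page-object sketch: locators live in one class, so a UI layout
# change is absorbed there instead of rippling through every test script.

class FakeDriver:
    """Stand-in for a real WebDriver; records the actions a test performs."""
    def __init__(self):
        self.actions = []

    def type(self, locator, text):
        self.actions.append(("type", locator, text))

    def click(self, locator):
        self.actions.append(("click", locator))


class LoginPage:
    # Hypothetical locators, kept as data apart from the test logic.
    USERNAME = "input#username"
    PASSWORD = "input#password"
    SUBMIT = "button#login"

    def __init__(self, driver):
        self.driver = driver

    def login(self, user, password):
        self.driver.type(self.USERNAME, user)
        self.driver.type(self.PASSWORD, password)
        self.driver.click(self.SUBMIT)


driver = FakeDriver()
LoginPage(driver).login("alice", "s3cret")
print(driver.actions)
```

With a real driver in place of the stub, the same `LoginPage` class can back both early UI-only checks and later integrated regression runs.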

A volatile UI can be a ticking bomb towards the end of the application lifecycle, adversely affecting the user experience the application offers. It is thus wise to stub out the surrounding functionality and test non-functional aspects, especially the UI, early, detecting defects sooner and avoiding rework after integration tests.

Happy testing!

September 3, 2019

Winning the Test Automation game

Enough has been said about writing better tests, optimizing automation scripts and planning test cycles. Teams can choose from a plethora of test accelerators on the market, depending on the features and automation maturity offered. However, with frequent changes to application behavior and to business requirements (release planning, test maintenance, test criteria), the selected automation tools cannot keep up with the pace at which testing workflows change over time.

Hence it is prudent to consider maintainability during the engineering cycle of the automation solution[i]. Having said that, it is equally important for testing teams to lay out a viable plan with realistic automation goals and to accommodate incremental automation.

Lay out an Automation Roadmap

Project teams thrive on the 'automate whatever possible' mantra. They end up addressing only the most pertinent challenges, settling for minimalistic automation and ignoring the likely troublemakers. Automated tests may work wonders during progression cycles; once teams get into regression, however, they start realizing the side effects of not having proper test maintenance in place. While they come up with corrective measures to improve regression planning and to reuse or repair automation tests, the overhead is tough to crack.

The major flaw lies in the haphazard adoption of automation in pockets, where the tools in use address only some aspects of the process workflow. The rest is carried out manually or with other tools. Teams generally resort to macros, client-side stored procedures or scripts to make the different tools and manual processes work together. The lack of end-to-end workflow support leads to issues like flaky tests and unmanaged automation, and hence a depleting ROI.

It is thus essential to plan test automation meticulously, with an incremental roadmap and test traceability as top priorities. Provision for incremental automation, such as adding new tools to address complex application features or leveraging the latest technologies for better coverage, will help testing teams achieve better results over time.

Pick the right ingredients

While the benefits of test automation are proven, with a variety of mature scripting tools available, testing teams still struggle with test maintenance debacles. Correcting the approach may help automation teams engineer a sustainable framework, but to realize the best possible results, testing teams must use the right ingredients in the first place.

During the test planning phase, testing teams tend to focus on covering application changes to ensure the best possible test coverage. Without regular optimization, automated test cases keep piling up, with little traceability or reusability. Leveraging the relationships between test assets (test data, automation scripts) from previous cycles and the application's objects, functions and their properties can augment automation planning and enable self-healing of test scripts. A calculated impact analysis of application changes on the test scripts thus helps testing teams reuse, optimize and write better automation tests.

Additionally, preparing test data in pockets also leads to unmanaged automation workflows. It is high time due diligence went into preparing test data for application testing through appropriate channels. Automation tools for test data preparation and workflow management can be leveraged for test data traceability and consistency, especially in SIT environments.
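One low-cost step towards consistent test data is to generate it from a single seeded source instead of preparing it ad hoc per team. The sketch below assumes hypothetical customer fields; the point is that the same seed reproduces the same dataset, so independent SIT runs stay comparable and traceable.

```python
# Sketch of centralized, reproducible test data: one seeded generator
# replaces data prepared "in pockets" by individual teams.

import random

def make_customers(n, seed=42):
    """Generate n deterministic customer records; same seed, same data."""
    rng = random.Random(seed)  # private generator, isolated from global state
    return [
        {
            "id": f"CUST-{i:04d}",
            "segment": rng.choice(["retail", "corporate"]),
            "credit_limit": rng.randrange(1_000, 50_000, 500),
        }
        for i in range(n)
    ]

# Two independent runs with the same seed see identical data.
assert make_customers(3) == make_customers(3)
print(make_customers(2))
```

Because the seed fully determines the dataset, a defect report only needs to record the seed and record count to make the failing data reproducible.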

Putting it all together

The responsibility for providing a viable test automation solution lies with the automation engineering team; it is the implementation team, however, that makes the best of the available offerings and makes the test automation process a success.



[i] https://www.infosysblogs.com/testing-services/2019/05/building_a_sustainable_test_au.html