Testing Services provides a platform for QA professionals to discuss and gain insights in to the business value delivered by testing, the best practices and processes that drive it and the emergence of new technologies that will shape the future of this profession.


October 12, 2012

How far would you go to test network performance of your mobile application?

Mobile users expect a consistent experience from their applications, whether they are on a crowded train in the NY subway, enjoying a holiday in the Alps, or camping in the rugged terrain of the Grand Canyon. An application faces its real test when it encounters the varying network conditions of such real-world environments. Varying bandwidth, delay, jitter and packet loss are a few of the unpredictable but decisive parameters that influence network performance. An application should be versatile enough to perform under such varied conditions, smart enough to detect any degradation in network performance, and should keep the user informed of these scenarios to the greatest extent possible.
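To make these parameters concrete, here is a minimal sketch of how a lab setup might express them as test profiles and turn each into a Linux `tc netem` shaping command. The profile names and values are purely hypothetical illustrations, not part of any published framework, and the approach assumes a Linux box sitting in the network path of the device under test.

```python
# Sketch: hypothetical network profiles expressed as `tc netem` commands.
# Values below are illustrative only; real profiles would come from
# field measurements logged on the device.

PROFILES = {
    "crowded_subway": {"delay_ms": 300, "jitter_ms": 80,  "loss_pct": 5.0,  "rate_kbit": 256},
    "alpine_resort":  {"delay_ms": 150, "jitter_ms": 40,  "loss_pct": 1.0,  "rate_kbit": 1024},
    "remote_canyon":  {"delay_ms": 600, "jitter_ms": 200, "loss_pct": 10.0, "rate_kbit": 64},
}

def netem_command(iface: str, profile: str) -> str:
    """Build the tc command that would shape `iface` to the given profile."""
    p = PROFILES[profile]
    return (
        f"tc qdisc add dev {iface} root netem "
        f"delay {p['delay_ms']}ms {p['jitter_ms']}ms "
        f"loss {p['loss_pct']}% rate {p['rate_kbit']}kbit"
    )

if __name__ == "__main__":
    # Print the shaping command for each simulated environment.
    for name in PROFILES:
        print(netem_command("eth0", name))
```

Running each command (with root privileges, and `tc qdisc del dev eth0 root` between runs) lets the same functional test suite be repeated under each simulated environment, which is the basic idea behind lab-based network variability testing.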

Smartphones and tablets today have an inherent capability to store and log network parameters, and tools are available that simulate network variances in a lab environment. The need of the hour is to merge these two into a comprehensive mobile network performance test framework. In our article "4E framework for Mobile network variability testing" (published in the September 2012 edition of Testing Experience magazine), Kiran Marri and I describe a structured approach to network performance testing in the lab that achieves near real-time network variability coverage while minimizing the amount of on-field testing required. The article can be accessed at http://www.infosys.com/IT-services/independent-validation-testing-services/Pages/effective-network-performance-QA.aspx

October 1, 2012

'Fix it' or 'Build it right'? How do we improve quality through collaboration and shared accountability?

The past few years have witnessed a considerable shift in the IT management focus of financial organizations, from cost optimization to improving the quality of the product or service. Reducing costs does not help in retaining market position if the products or services don't sell. With the economy already challenged, making for an extremely competitive environment, it is no longer good enough to just identify and fix issues. Every defect found before a release contributes to a delay in time to market, and defects that slip through reduce customer confidence in the quality of the product or service. Companies that have moved from a defect detection to a defect prevention mindset have reaped the resulting benefits: shorter time to market, better mitigation of business risks, and compliance of IT systems with regulations. Improving upstream quality is therefore something IT leaders have at the top of their minds. Collaboration between different IT and business functions plays a key role in upstream quality, and it also involves breaking away from some of the existing silos of software development.