Welcome to the world of Infosys Engineering! It is a half-a-billion-plus organization that takes pride in shaping our engineering aspirations and dreams and bringing them to fruition. We provide engineering services and solutions across the lifecycle of our clients' offerings, ranging from product ideation to realization and sustenance, catering to a cross-section of industries: aerospace, automotive, medical devices, retail, telecommunications, hi-tech, financial services, and energy and utilities, to name a few major ones.

April 13, 2014

Geneva Motor Show, March 2014 - Who is "calling"?

Among the headline makers this year, at the Geneva Motor Show held in March 2014, was the launch of CarPlay.

CarPlay is a system for integrating the iPhone 5 and later models with the vehicle's entertainment system. Users can then control phone features such as calls, texts, music and navigation using in-vehicle controls like steering wheel buttons and the touchscreen, or via Siri on the iPhone.

So, what is the buzz about? Why is there such anticipation building among analysts, industry honchos and customers around this launch? Haven't we been witnessing the increasing footprint of telematics for the last decade or so?

Here are some of the larger scenarios and ramifications (for both the automobile and consumer electronics industries) being discussed by analysts in various forums:

1. Automobiles - A new platform for consumer electronics to expand: With one in two new cars produced globally (about 50 million) forecast to be connected by 2020, this presents a relatively unexplored market for the hi-tech/consumer electronics industry giants to tap into. Potentially, we are looking at millions of cars as iOS and Android platforms for customized apps and accessories.

(Source: http://www.forbes.com/sites/sarwantsingh/2014/03/19/apple-in-cars-what-does-it-mean-for-the-car-industry-and-google/)

Earlier this year, at CES 2014, Google also announced an alliance with leading car manufacturers that may see infotainment systems running on its open-source Android operating system this year.

2. Standardization of the user interface: Imagine customers of iPhone and Android smartphones getting the same user interface and ease of navigation on their in-vehicle entertainment systems as they are used to on their smartphones. This will translate into a big surge in app and service usage for the device manufacturers.

3. Battle for supremacy on multiple fronts, and competition for OEMs: We will witness contests of Apple Maps vs. Google Maps and iTunes Radio vs. other radio/entertainment services, along with serious competition for telematics OEMs and suppliers of the likes of Delphi, Harman and so on.

4. Good news for car companies: The biggest advantage for car companies is that they can focus their R&D and spend on hardware, mechanical design and innovation while benefiting from the shorter product release cycles of the consumer devices industry. It also gives them the ability to launch upgrades of in-vehicle entertainment software within a one-year timeframe, quite unlike the extensive four-to-six-year cycles otherwise needed to bring innovation to market.

5. Customers are set to gain and be happy: Every time a market sees more competition, customers benefit from more feature-rich products, quicker upgrades, dropping costs and increasingly better customer service, which in turn feeds higher demand. And that is what every sector and economy needs to keep growing.

With the hi-tech giants descending on the auto industry in all seriousness, we can expect their next battle to be fought in our car consoles.

April 30, 2013

Automotive Telematics - What Lies Ahead

Telematics typically refers to the integrated use of telecommunications and informatics, also known as ICT (Information and Communication Technology). Though telematics has found applications in a number of domains, automotive telematics remains one of the most prominent and promising areas of its application. As per Machina Research, 'From less than 90 million connections globally in 2010 the automotive M2M market will grow to almost 1.4 billion connections by the end of 2020'.

Continue reading "Automotive Telematics - What Lies Ahead" »

April 5, 2013

Enable your application for IPv6

As per data collected by the Internet Society, IPv6 adoption is gaining momentum across the globe. Though IPv4 is not going away anytime soon, it is clear that IPv6 adoption is on the rise. This makes a good case for software applications that follow a client-server architecture and use TCP/IP-based communication to enable themselves to communicate over the IPv6 protocol. Abundant text is already available about why IPv6 is not just about overcoming the IP address space crunch and how it is a more efficient protocol than IPv4. In this blog, I explore the challenges associated with the changes needed at the application level to enable support for IPv6.
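One of the most common application-level changes is to stop hard-coding IPv4-only socket calls and instead resolve addresses in a protocol-agnostic way. Here is a minimal Python sketch of that idea (the host and port are hypothetical, not from the original post):

    import socket

    def connect(host: str, port: int) -> socket.socket:
        """Connect over IPv6 or IPv4, whichever address resolution yields first."""
        last_error = None
        # AF_UNSPEC makes getaddrinfo return both IPv6 (AF_INET6) and IPv4 (AF_INET) results.
        for family, socktype, proto, _canonname, addr in socket.getaddrinfo(
                host, port, socket.AF_UNSPEC, socket.SOCK_STREAM):
            try:
                sock = socket.socket(family, socktype, proto)
                sock.connect(addr)
                return sock
            except OSError as err:
                last_error = err
        raise last_error or OSError("no usable address found")

    # Usage (hypothetical endpoint):
    # conn = connect("example.com", 8080)

The same thinking applies to storage and parsing: any field sized for dotted-quad addresses, or validated with IPv4-only pattern matching, needs to accommodate IPv6 notation as well.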

Continue reading "Enable your application for IPv6 " »

Adopting Big Data - Challenges and Success Factors

Big Data is commonly described by three V factors, namely Volume, Velocity and Variety. However, considering the well-understood benefits enterprises reap on adopting Big Data, one could be tempted to add another V (Value) to the existing troika. The McKinsey Global Institute's Big Data study says that 'The total amount of data created and replicated in 2009 was 800 exabytes -- enough to fill a stack of DVDs reaching to the moon and back' (source: McKinsey Global Institute, Big Data: The next frontier for innovation, competition, and productivity, May 2011). Though Big Data adoption is well established in industry segments like retail, financial services and insurance, and manufacturing, there is still a need to continuously innovate and implement the factors that guarantee success and enable rising returns. The primary challenge in a Big Data implementation is the need to handle highly voluminous data (spread across multiple storage sources and data formats) at the speed it is received, and to process it to generate intelligent business insights. This leads to an even more complex problem: the management aspect, where enterprise structure and processes need to change in response to findings from Big Data analysis so that the enterprise can evolve and reap business benefits.

 

Continue reading "Adopting Big Data - Challenges and Success Factors" »

April 1, 2013

Off-board Telematics Services in India

India's rapid economic growth over the last decade, its emergence as a leading producer and exporter of cars, and its large population of mobile phone subscribers all point to a potentially lucrative market for telematics (high reward, high risk). It is early days yet compared to telematics service adoption in the West, but players have already started making the right moves to gauge adoption potential. Our recent interactions with vehicle OEMs and suppliers seem to indicate the beginnings of a rat race to build telematics platforms and innovate around services on those platforms. Make no mistake - the Indian market has already been exposed to basic telematics services like GPS-enabled navigation and vehicle tracking systems over the years, but adoption has been niche and isolated.

Continue reading "Off-board Telematics Services in India" »

January 8, 2013

Internet Laws and Regulations in today's extended societies


The Internet is not constrained by geographical boundaries, so localized or national laws cannot be applied to it neatly. Internet laws nevertheless exist in most countries around the world and deal with issues related to the legal use of the Internet. Considering that the Internet is not a physical space, there are quite a few challenges around the implementation of these laws: online material that is legal in the country where it is hosted may violate the local laws of another country where it is viewed or downloaded. To regulate the consumption of material that violates local laws or can potentially cause distress in local societies, individual governments do work out mechanisms to regulate what is being served to users consuming and publishing online content in their countries.

 

Continue reading "Internet Laws and Regulations in today's extended societies" »

October 7, 2012

Big Data, Cloud and Analytics

Big Data is the biggest buzzword today, as it poses enormous challenges and complex problems that technologists around the world are busy trying to solve. Big Data refers to huge sets of structured and unstructured data. Structured data is data that can be classified and stored in a pre-defined table schema. For example, when we submit an online payment form, the required metadata is known beforehand and can be stored in a well-defined schema. Semi-structured or unstructured data, on the other hand, is free-form data that does not adhere to any particular schema and is hard to parse and process. Examples are Twitter hashtags appearing in tweets, Facebook updates, comments and mentions, logs, and the topics and sub-topics that go into a wiki page.
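To make the distinction concrete, here is a small illustrative Python sketch (not from the original post) contrasting a structured payment record with hashtag extraction from free-form tweet text:

    import re

    # Structured: fields are known up front and map directly to a table schema.
    payment = {"order_id": 1001, "amount": 49.99, "currency": "USD", "card_last4": "4242"}

    # Unstructured: free-form text; any structure (hashtags, mentions) must be extracted.
    tweet = "Loving the new in-car apps! cc @infosys #telematics #bigdata"
    hashtags = re.findall(r"#(\w+)", tweet)
    mentions = re.findall(r"@(\w+)", tweet)

    print(sorted(payment))        # ['amount', 'card_last4', 'currency', 'order_id']
    print(hashtags, mentions)     # ['telematics', 'bigdata'] ['infosys']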

 

Organizations capture data from various sources relevant to their business. Captured data volumes have grown rapidly in the last several years, and managing data has always been a difficult problem to solve. Big Data adds complexity to this problem mainly in the following ways:

Continue reading "Big Data, Cloud and Analytics" »

September 27, 2012

SQL Server Integration Services: Evolution as a Data Transformation and Warehousing Utility

Integration services typically come with a rich set of tasks, containers, transformation functions and data adapters for the support and maintenance of business applications. SQL Server Integration Services (SSIS) provides these capabilities and has evolved over the years from the simple DTS executable utility into a full-fledged integration service offered with the SQL Server database platform. SSIS can help solve complex business problems using ETL and BI, and can also manage SQL Server databases and other SQL Server objects.

Business Scenarios for SSIS Usage

SSIS is used in a variety of data transformation, migration and warehousing scenarios. A few typical business scenarios are:

·         Copying a subset of data from a large-volume transactional database to a reporting database to generate operational and analytical data reports (a minimal illustrative sketch follows this list).

·         Migrating and replicating data from a development or test database server to production servers, and vice versa.

·         Merging data from heterogeneous data sources into a single destination database store for further use or analysis, such as archiving information for trend analysis reports or standardizing data formats for future transactions of an application.

·         Aggregating data on a periodic basis using batch processing so that reporting applications can process report data faster.

·         Cleaning and standardizing data from various data sources before loading it into the destination database.

·         Helping database administrators automate administrative functions like backup and restore. These can be scheduled as SQL Server Agent jobs, allowing memory-intensive jobs to run at non-peak hours so they do not interfere with regular transaction performance, especially for OLTP and OLAP applications.
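As a concrete illustration of the first scenario above, the following hypothetical sketch copies a subset of transactional rows into a reporting table. It uses Python with pyodbc purely for illustration (SSIS packages themselves are built in the designer or via the SSIS object model), and the server, database, table and column names are assumptions, not taken from the original post:

    import pyodbc

    # Hypothetical connection strings; replace with real servers and databases.
    SRC = ("DRIVER={ODBC Driver 17 for SQL Server};SERVER=oltp-server;"
           "DATABASE=Sales;Trusted_Connection=yes")
    DST = ("DRIVER={ODBC Driver 17 for SQL Server};SERVER=report-server;"
           "DATABASE=Reporting;Trusted_Connection=yes")

    with pyodbc.connect(SRC) as src, pyodbc.connect(DST) as dst:
        # Extract only the subset of rows the reports need.
        rows = src.cursor().execute(
            "SELECT OrderID, CustomerID, OrderDate, TotalAmount "
            "FROM dbo.Orders WHERE OrderDate >= ?", "2012-09-01").fetchall()

        # Load into the reporting database in batches.
        cur = dst.cursor()
        cur.fast_executemany = True
        cur.executemany(
            "INSERT INTO dbo.OrdersReporting "
            "(OrderID, CustomerID, OrderDate, TotalAmount) VALUES (?, ?, ?, ?)",
            [tuple(r) for r in rows])
        dst.commit()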

 

In summary, SSIS provides a business intelligence tool for data transformation processes. From a simple data extraction, transformation and load utility, SSIS has evolved into a more flexible tool that is easier to debug, thanks to its graphical user interface, and that can perform tasks ranging from workflow execution and data transformations (aggregate, merge and copy) to administrative tasks like data backup and restore. It also exposes an application programming interface for programming integration tasks through the SSIS object model.

DTS to SSIS

DTS (Data Transformation Services) was Microsoft's first ETL utility, introduced with SQL Server 7.0 to improve its business intelligence capabilities. With SSIS, Microsoft introduced a new processing engine that provides in-memory buffers along with data extraction and transformation capabilities to modify data and make it available to other processes. SSIS enhances the DTS capabilities in the following ways:

·         Faster data processing and support for high-volume data

·         Enterprise-level capabilities, from simple import/export functionality to complex data transformations that can be utilized in large-scale data warehousing applications

·         Easier customization and better scalability, since it leverages the .NET Framework to build custom components when one is not provided out of the box, unlike DTS, which did not have a common framework

·         A robust mechanism for iterative processing using the For Loop and Foreach Loop containers, which were not present in DTS

SSIS Performance Issues and Optimization

Since SSIS operates on large chunks of data, it comes with a set of performance concerns: memory utilization during package execution, CPU load balancing, I/O throughput and network utilization. Even so, as per the SQL Server performance team, SSIS can load 1.18 TB of data in 30 minutes and can process 4.5 million sales transaction rows per second.

Some design practices that help SSIS packages perform at expected levels are:

·         Ensure that all transformations occur in memory

·         Perform capacity planning by understanding resource utilization

·         Optimize SQL Server data sources, destinations and transformation lookups using optimized SQL queries and stored procedures, indexed tables and efficient data transformation methods

·         Keep logging in an SSIS package minimal to reduce memory usage

There are various methods to troubleshoot and debug low-memory conditions in SSIS. A few of them are:

·         Execute SSIS on a separate computer that is not running a SQL Server instance.

·         During package execution, set SQL Server's maximum server memory to a smaller value to increase the memory available to SSIS.

·         Wherever applicable, execute SSIS tasks sequentially rather than in parallel to cope with low-memory conditions.

·         Tune the data flow task's DefaultBufferMaxRows and DefaultBufferSize properties to fit as many rows as possible into each buffer.

·         Avoid unnecessary columns in data flow tasks and configure data types correctly to reduce the estimated row size.

·         Partially blocking transformations such as Merge, Merge Join and Union All, and blocking transformations such as Sort and Aggregate, consume the most memory, since a separate buffer is created for their outputs and a new thread is introduced into the data flow. These transformations are asynchronous in nature and should be used carefully to avoid low-memory conditions.

·         Key tools and techniques for debugging SSIS issues are SSIS logging, SSIS performance counter monitoring and SQL Server Profiler.

SSIS Improvements and Capabilities in SQL Server 2012

With the release of SQL Server 2012, further improvements have been made to the troubleshooting and logging features of SSIS, such as capturing data flow component timing information and row counts for all paths within a data flow. Data taps can also be added to a data flow path to capture the data in CSV format during package execution, which aids in troubleshooting data issues.
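For example, when a package runs from the SSIS catalog with verbose logging, the per-path row counts mentioned above can be read back from the SSISDB catalog views. The following is a hypothetical Python/pyodbc sketch, assuming the standard catalog.executions and catalog.execution_data_statistics views (the server name and query filter are illustrative):

    import pyodbc

    # Hypothetical connection to the server hosting the SSISDB catalog.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};SERVER=etl-server;"
        "DATABASE=SSISDB;Trusted_Connection=yes")

    # Row counts per data flow path for the most recent execution.
    rows = conn.cursor().execute("""
        SELECT TOP (20) s.package_name, s.task_name, s.dataflow_path_name, s.rows_sent
        FROM catalog.execution_data_statistics AS s
        JOIN catalog.executions AS e ON e.execution_id = s.execution_id
        WHERE e.execution_id = (SELECT MAX(execution_id) FROM catalog.executions)
        ORDER BY s.rows_sent DESC
    """).fetchall()

    for package, task, path, sent in rows:
        print(f"{package} / {task} / {path}: {sent} rows")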

SSIS is also a useful utility for migrating data from an on-premises SQL Server to SQL Azure. There is, however, a limitation: only data can be moved, not database objects like tables, stored procedures or triggers. The reason is that SSIS uses SMO (SQL Server Management Objects) for moving objects, and as of now SMO does not support SQL Azure. Database objects such as tables and stored procedures therefore need to be moved using the Generate and Publish Scripts wizard, followed by SSIS for data transformation and migration. There are other alternatives for migrating data to SQL Azure, such as using the BCP utility to bulk-copy data from the source to the destination (SQL Azure). However, BCP lacks SSIS's ability to convert data from one data type to another, as well as SSIS's workflow components.

As of now, Microsoft has not come out with a clear strategy for supporting all SSIS tasks on SQL Azure. At one Microsoft PDC (Professional Developers Conference) session on SQL Server services, ETL in the cloud was mentioned, but there was no specific mention of SSIS. Many bloggers have questioned the need for ETL in the cloud, which makes the future of SSIS in the cloud look uncertain.

Detailed reading and further references on various aspects of SSIS are available at MSDN, TechNet, Microsoft Support, MSSQLTips and SQLServerPedia.

Continue reading "SQL Server Integration Services: Evolution as a Data Transformation and Warehousing Utility" »

Collaboration possibilities with Lync 2013 preview release

A Unified Communications (UC) platform provides a real-time collaboration experience for the enterprise and its customers. Apart from unified messaging and presence notifications, a UC platform provides business process integration, conferencing capabilities and integrated collaboration tools. Microsoft's first major UC platform release was OCS 2005, and the company has recently announced the Lync 2013 preview along with the Microsoft Office 2013 suite of products. See my previous blog on Microsoft's earlier release, Lync 2010.

Here we check out the new features available in Lync 2013 and see how these developments in the unified communications platform are going to change the way enterprises collaborate.

Continue reading "Collaboration possibilities with Lync 2013 preview release" »

September 14, 2012

Evolution of SharePoint 2010 as an Enterprise Content Management Solution

Essentially, SharePoint was designed as a collaboration platform that makes it easier for people to work together. It allows people and organizations to share information among themselves and with others, helps manage documents and information libraries, and publishes reports for improved decision making.

Over the years, however, SharePoint has evolved into an Enterprise Content Management (ECM) system with a focus on managing documents, records, web content and rich media content.

In this article we review a few areas where SharePoint 2010 has evolved from earlier versions of SharePoint as an ECM platform, as well as areas that require further improvement.

Web content management solutions have a set of objectives that need to be reviewed while designing the solution. Some of these objectives are:

  • Utilizing corporate websites as a source of sales leads
  • Allowing customer self-service
  • Centralizing the content lifecycle across all digital assets
  • Managing the content lifecycle
  • Giving content owners control over their content
  • Promoting thought leadership
  • Supporting complex approval processes to adhere to regulatory and compliance needs
Keeping these objectives in mind, some of the key feature enhancements and capabilities provided in SharePoint 2010 that make it a viable ECM option are:

  • Enhanced web authoring experience, which includes editing content, applying customized styles and themes, and using the new SharePoint UI to change page layouts and structure. SharePoint 2010 has an improved Rich Text Editor (RTE) that provides a "Word-like" editing experience: rich formatting of text, live preview of formatting options, easy embedding of images and videos directly into the RTE, and drag-and-drop placement of images and video wherever required. SharePoint 2010 has also enhanced its theming capabilities for applying a new set of colors to a site, and the new Ribbon interface offers easy-to-use text formatting options such as styles and fonts.
  • Document Sets is another feature that enhances the ECM capabilities of SharePoint 2010, as it makes it faster and easier to work on project-related documents and artifacts. Features that make document sets more useful than the folder-based archiving and management of earlier versions are:
    • A welcome page web part that displays the contents of the document set and can be customized to display information other than documents.
    • Default documents that can be provisioned automatically when a document set is created.
    • Shared metadata that enforces the same metadata on all items inside the document set.
  • SharePoint 2010 provides an enhanced Content Query Web Part (CQWP) that supports filtering on the metadata of the items being queried or on a value passed to the page in the URL query string. This supports dynamic content viewing: for example, if a page displays news or announcements in an intranet portal and we need to show all news related to the current item, the CQWP helps filter the required information on the page.
  • The new Managed Metadata Service in SharePoint 2010 helps represent corporate taxonomy. It allows a hierarchical collection of terms to be defined and used as attributes for items. The terms are also helpful for tagging content in SharePoint, driving dynamic navigation and search engine optimization.
  • Improved reliability of the content deployment feature in SharePoint 2010, which helps deploy content from an authoring/staging environment to a production environment. To improve reliability, better logging information is available, providing insight into content deployment jobs and their issues. Database snapshots can also be used so that a site can continue to be authored while content deployment jobs are in progress.
  • Improved flexibility in the publishing approval process depending on the type of WCM deployment. Workflows can also be modeled in Microsoft Visio 2010, imported into SharePoint Designer 2010, and then reused on content types and site templates.
  • SharePoint 2010 provides a new range of web analytics capabilities to monitor different aspects of site usage, thereby providing an understanding of a website's performance. One important feature added in SharePoint 2010 is "Search Insights", which helps monitor what visitors are searching for on a site, what the top search queries are, which search queries are failing, and so on; this can then be used to fine-tune content and meta tags to improve user discovery and experience.
  • SharePoint 2010 has made significant improvements in social collaboration capabilities such as blogs and wikis, and in social networking capabilities like discussion forums, ratings and comments.
  • A new offline client, SharePoint Workspace, helps manage SharePoint content offline.
  • Improved line-of-business (LOB) integration capabilities using Business Connectivity Services (BCS).
  • Improved search capabilities thanks to integration with FAST Search Server 2010.

Continue reading "Evolution of SharePoint 2010 as an Enterprise Content Management Solution" »
