The Infosys Utilities Blog seeks to discuss and answer the industry’s burning Smart Grid questions through the commentary of the industry’s leading Smart Grid and Sustainability experts. This blogging community offers a rich source of fresh new ideas on the planning, design and implementation of solutions for the utility industry of tomorrow.

July 20, 2017

Utility Procurement - a New Vision

Innovation is part of the 'DNA' of Infosys, and we are always being asked to innovate by our clients. All too often, however, the procurement process constrains our ability to offer that innovation. The deliverables are given strict bounds, and we are only able to offer specific solutions. For example, the need may be for improvements in asset management, but the tender is constrained to configuring and installing a particular software package. Whilst in a few cases that may be due to a poor procurement strategy, in most cases it is due to the constraints, both regulatory and corporate, that control how procurement can be undertaken.


Does it have to be this way? I believe that clients could procure in an innovative way that allows their suppliers to show their ability to offer novel ways to solve problems. The process could have two stages: the first a simple pre-qualification exercise to determine a shortlist (as is currently undertaken); the second to deliver an outline design of the solution, where the client pays a small fee to the tenderers to go into far more detail than current tenders allow. This would enable the supplier to demonstrate their ability to deliver innovation, and the client both to understand that ability and to know how the supplier performs in a work situation. Such a process would enable the client to tackle much larger issues than generally covered in a tender, and indeed a few utility clients are already using a more agile approach. I will demonstrate with an example in asset management.


This example tender could be phrased "Devise a solution that will deliver an x% reduction in asset management costs, whilst producing a y% improvement in performance, without increasing overheads." In the pre-qualification, tenderers would need to demonstrate experience in such areas (although not necessarily in the same industry), and provide good and pertinent references: this would allow the client to shortlist. Tenderers could also consider partners to add to their bid, for example instrumentation suppliers and installers. In the tender, the client would allow a certain sum for each tenderer to produce their innovative solution, with sufficient access to client staff to determine constraints, both technological and business. This phase would of course need to be undertaken under non-disclosure agreements to protect all parties. Once the 'tender' is completed, the client would be able to select a supplier with a much greater understanding of that supplier's ability to innovate in a way that will benefit their business.


Whilst this system may seem strange to some in utility procurement, it is similar to those employed in fields like architecture, which have allowed buildings such as the Sydney Opera House to be developed. Do we want our future to be full of bland boxes, or Guggenheims?

March 14, 2017

The Security trap

Security in IT is very important. Unauthorised access to confidential information can cause major disruption to companies, and to individuals' lives. Some disruption can have life-changing impacts on finances and reputation. Even 'lesser' security issues, such as viruses, can cause massive damage to company systems. Breaches of Operational Technology (OT) systems (such as SCADA) in utilities could cause countrywide failures, and put lives at risk. IT security is therefore quite rightly taken very seriously by governments, organisations and individuals.


However IT security is just one amongst the many risks we all face on a daily basis. Even a major breach of a utility OT system would not have the impact of an atomic bomb, and yet the world managed to increase overall wealth, and made great strides to reduce poverty, throughout the Cold War, under the threat of mutually assured destruction. IT security is therefore just another risk that we all have to manage.


Unfortunately, in too many organisations IT security is used as a reason not to implement technological improvements. For example, video conferencing between computers, and even mobile devices, is something many of us use regularly, yet video conferencing between organisations is very rare, generally because of 'IT security' concerns. Sharing of information is frequently blocked, and yet shared information often increases knowledge and opportunity for all of the participating organisations. For example, Transport for London (TfL) made most of the information for its transport systems (e.g. timetables) publicly available: there is now a plethora of 'apps' to help travellers plan their journeys, all of which have been produced at no expense to TfL, and which increase customer satisfaction.


I believe it is a duty of those of us in the IT world to ensure that IT security is managed appropriately, and not used as an excuse to block the business and personal benefits that our innovative technology can bring. Like any other risk it should be managed appropriately and balanced against the benefits. We cannot let the few who would wish to take advantage of us through IT security breaches constrain our future.

March 3, 2017

The Asset Management Journey - into Adaptive

For utilities, traditionally most asset management was based on cycles of planned maintenance, interrupted by many occurrences of reactive work. The planned maintenance was generally based on historic norms, often with little feedback on benefit. With the advent of asset management systems, both IT (e.g. EAM/WAM) and process (e.g. PAS55, now ISO 55000), work became more planned, and more based on benefit, drawing particularly on asset risk and criticality. Such changes made major improvements in efficiency, with reductions of reactive work from 70% to 30% not uncommon. However, planned work was, and in many cases still is, based on expectations of asset lifecycle performance, and not on actual asset feedback. Whilst such proactive strategies reduced service impacts, they led to higher levels of planned maintenance than necessary to ensure optimum asset life.


Over the last 20 years industries have increasingly turned to predictive methodologies, using sensors and instrumentation, coupled with appropriate analytic software, to predict and prevent asset failure through understanding trends. For example, a large transmission operator uses transformer load measured against ambient and internal temperature. A band of 'normal' internal temperature against load and ambient temperature is mapped, and the system flags when internal temperature is outside this range, so that checks can be made before any failure. Increasingly such tools use machine learning, which further helps to predict 'normal' asset behaviour. Asset management has therefore moved from Reactive through Proactive to Predictive.
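
To make the banding idea concrete, here is a minimal sketch (not the operator's actual system) of how such a 'normal' band could be learned from history and applied to new readings. The column names (load_mva, ambient_c, internal_c), the quantile binning and the three-sigma band are all illustrative assumptions:

```python
# Sketch: learn a 'normal' internal-temperature band per (load, ambient) cell,
# then flag readings that fall outside their cell's band.
import numpy as np
import pandas as pd

def fit_bands(history: pd.DataFrame, n_bins: int = 8):
    """history: columns load_mva, ambient_c, internal_c (assumed names)."""
    load_edges = np.quantile(history["load_mva"], np.linspace(0, 1, n_bins + 1))
    amb_edges = np.quantile(history["ambient_c"], np.linspace(0, 1, n_bins + 1))
    cells = history.groupby(
        [pd.cut(history["load_mva"], load_edges, include_lowest=True),
         pd.cut(history["ambient_c"], amb_edges, include_lowest=True)],
        observed=True)["internal_c"]
    bands = cells.agg(["mean", "std"])
    bands["lo"] = bands["mean"] - 3 * bands["std"]  # three-sigma band: an assumption
    bands["hi"] = bands["mean"] + 3 * bands["std"]
    return bands[["lo", "hi"]], load_edges, amb_edges

def is_abnormal(reading: dict, bands, load_edges, amb_edges) -> bool:
    """True if a new reading's internal temperature sits outside its cell's band."""
    cell = (pd.cut([reading["load_mva"]], load_edges, include_lowest=True)[0],
            pd.cut([reading["ambient_c"]], amb_edges, include_lowest=True)[0])
    if cell not in bands.index:
        return True  # unseen operating point: escalate for a manual check
    lo, hi = bands.loc[cell]
    return not lo <= reading["internal_c"] <= hi
```

In practice the band would be re-fitted as new history accumulates, which is where the machine learning mentioned above takes over from simple statistics.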


Artificial Intelligence (AI) tools, such as Infosys NIA, are now starting to be used in asset management. These new methodologies use the AI engine to collate, compare, analyse, and highlight risks and opportunities. The tools can use structured and unstructured data, static and real-time, and can take data from disparate sources. The systems will increasingly refine their understanding of asset behaviour based on multiple inputs, such as sensors/instrumentation, third-party data (weather), social media feeds, and impacts flagged by external but publicly available sources. The tools will then be able to advise courses of action based on current events. They could also be used to model possible scenarios, and advise actions and impacts based on their understanding of inputs against outputs (stochastic modelling +). Such tools will enable an organisation to continuously adapt its asset management strategies and implementation to current and future events.
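
As a toy illustration of what 'adaptive' means here, the sketch below re-weights a maintenance priority score as different inputs arrive. Every input, weight and threshold is invented for illustration; a real AI engine such as NIA would of course be far richer:

```python
# Toy 'adaptive' prioritisation: a risk score re-weighted as inputs change.
# Inputs, weights and scoring are invented purely for illustration.
from dataclasses import dataclass

@dataclass
class AssetInputs:
    anomaly_rate: float   # 0..1, e.g. from sensor analytics
    storm_alert: bool     # e.g. from a third-party weather feed
    public_reports: int   # e.g. impacts flagged via social media monitoring

def priority(inputs: AssetInputs) -> float:
    score = 100 * inputs.anomaly_rate
    if inputs.storm_alert:
        score *= 1.5                      # bring weather-exposed assets forward
    score += 5 * min(inputs.public_reports, 10)  # cap the social-media influence
    return round(score, 1)

print(priority(AssetInputs(0.2, True, 3)))  # -> 45.0
```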


I call this Adaptive Asset Management.

October 14, 2015

10 key pointers for an effective Web-GIS implementation leveraging ArcGIS Server

The following pointers came out of experience on a couple of large Web GIS implementations in the Utilities domain, using ArcGIS for Server version 10.2.1.

1. Never try to replicate your Desktop GIS on the Web
We have been using GIS as a desktop application for ages, so it is a natural tendency to adopt a similar view on the web as well. Long lists of layers in the Table of Contents, a plethora of seldom-used tools, a North arrow and a measurement scale are a few of the things that remind us of a Desktop GIS. Build the application for a targeted audience - give users no more features than they absolutely need. Restrict them within a (work)flow so that they can navigate your app with ease. Always remember that your web GIS users are not GIS experts.

2. Map services are the key to success!
Pay special attention while creating your map services. ESRI has made it very easy to serve your spatial data; serving it optimally, however, can be very tricky - particularly if you're targeting hundreds of concurrent users. Follow some basic rules of thumb: create multiple map services instead of one; put no more than 8 to 12 layers in a single map service; keep symbols as simple as possible; try not to use definition queries; follow the n+1 rule when setting the 'maximum number of instances per machine', n being the number of cores; and allow Windows to manage the page file automatically (in the case of virtual memory).
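
For the n+1 rule, a hedged sketch of automating the setting through the ArcGIS Server Admin REST API is shown below. The URL layout and the maxInstancesPerNode property follow the 10.2-era Admin API, but verify them against your server version; HOST, the credentials and the service path are placeholders:

```python
# Hedged sketch: apply the n+1 instance rule via the ArcGIS Server Admin REST API.
# Verify endpoint names against your server's version before relying on this.
import json
import os
import requests

ADMIN = "https://HOST:6443/arcgis/admin"   # placeholder host
CORES = os.cpu_count() or 4                # n in the n+1 rule

def get_token():
    r = requests.post(f"{ADMIN}/generateToken", data={
        "username": "USER", "password": "PASSWORD",   # placeholders
        "client": "requestip", "f": "json"}, verify=False)
    return r.json()["token"]

def set_max_instances(service_path, token):
    """Read the service JSON, set maxInstancesPerNode = cores + 1, write it back."""
    svc = requests.post(f"{ADMIN}/services/{service_path}",
                        data={"token": token, "f": "json"}, verify=False).json()
    svc["maxInstancesPerNode"] = CORES + 1
    r = requests.post(f"{ADMIN}/services/{service_path}/edit",
                      data={"service": json.dumps(svc), "token": token, "f": "json"},
                      verify=False)
    return r.json()

# Example (hypothetical service): set_max_instances("Utilities/Network.MapServer", get_token())
```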

3. Avail yourself of free base maps and other services from Bing, Google or ESRI
No matter how cleverly you prepare your own base maps, I can assure you they will not be better than the base maps that are available for free. Instead of concentrating on a killer base map to use as a backdrop for your GIS data, use one that is free - as a bonus, you will save yourself the trouble of updating it as well.

4. Choose your frontend technology carefully
Not many options are currently available for delivering a frontend API. For a wider audience, use JavaScript and HTML5 - unless you're developing features that are not yet mature in that environment.

5. Keep mobile devices in mind during design
More and more people go online through mobile devices than through their PCs. Although the majority of this mobile time is spent on social networking sites, users do view maps on their mobile devices (http://marketingland.com/outside-us-60-percent-internet-access-mostly-mobile-74498). Think of the different screen sizes your users will use to browse your app, and plan to accommodate 'taps' along with 'clicks'.

6. Initial load time should never exceed 8 seconds
The average adult's attention span, for a page-load event, is around 8 seconds (http://www.entrepreneur.com/article/232266). Today's users, with information available at their fingertips (taps?), are increasingly impatient with wait times. If a page takes more than 8 seconds to open, the majority of users will 'X' it out. If you want a wider footprint for your web application, restrict the initial load time to 8 seconds; the quicker the better.

7. Display non-spatial data spatially
Integration is commonplace in today's GIS: display of non-GIS data within GIS is the norm rather than the exception. There are various ways you can integrate - try displaying data on the map as graphic text rather than in a table within the map. Spatial distribution helps us see patterns that a tabular display fails to reveal.

8. Pay more attention to User Experience than User Interface
User Experience (UX) is mostly (but not completely) achieved through the User Interface (UI). For example, when you provide a zoom-in feature in a mapping application, you can implement it as a command (fixed zoom-in) or as a tool (where the user draws a rectangle on the map to zoom into). That is UI. However, implementing zoom-in as a tool can give a different UX depending on how you have programmed the cursor for the 'after zoom-in' event: retain it in zoom-in mode, or return it to the default mode (usually pan) when finished. For a better UX, always provide feedback to the user for each action they perform.

9. Know your users (behind the scenes!)
Knowing your users is the best thing you can do for your application. There are products that can capture user statistics, map server performance, number of hits, etc., but they cannot capture an individual user's feedback. If dissatisfied with your product, the majority of users will not complain or raise issues; they will simply stop using your application. User surveys are another option, but they often fail to give a clear picture because of poor participation. It is always a good idea to capture user feedback behind the scenes. For example, if you have a customized 'search' button, log each of its click events. Try to capture who is searching for what, and how long it takes before they see results. You can fine-tune your application based on this log, and even give users a 'hint' on effective searching.
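
A minimal sketch of such behind-the-scenes capture is below: a wrapper that logs who searched for what, how many hits came back, and how long it took. The field names, log destination and search function are all assumptions:

```python
# Sketch: log each search action 'behind the scenes' as a JSON record.
import json
import logging
import time

logging.basicConfig(filename="usage.log", level=logging.INFO)
audit = logging.getLogger("webgis.audit")

def logged_search(user, term, search_fn):
    """Run a search, recording who searched what and how long it took."""
    start = time.perf_counter()
    results = search_fn(term)
    audit.info(json.dumps({
        "event": "search",
        "user": user,
        "term": term,
        "hits": len(results),
        "elapsed_ms": round((time.perf_counter() - start) * 1000),
    }))
    return results

# Example with a hypothetical search function:
# results = logged_search("jdoe", "valve 221", my_search)
```

Because each log line is a JSON record, the same file can later be loaded into your analytics tooling to spot slow searches or common terms worth a 'hint'.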

10. Secure your application
Security comes at a price. While confidentiality and integrity are achieved easily, availability is sometimes compromised: security restricts your application to a smaller footprint. Whether to 'share' or to 'secure' will be dictated by the business requirements. At the least, you should always secure your map services through tokens and Secure Sockets Layer (SSL), and make sure Server Manager is not visible from outside your firewall.
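
As an illustration of the token pattern, the sketch below requests a token from the standard ArcGIS Server token endpoint and passes it on a layer query over HTTPS. The endpoint layout follows the usual 10.2-era REST pattern; the host, credentials, service and layer are placeholders to adapt:

```python
# Sketch: consume a token-secured map service over HTTPS.
import requests

HOST = "https://HOST:6443/arcgis"   # placeholder server

def get_token(username, password):
    r = requests.post(f"{HOST}/tokens/generateToken", data={
        "username": username, "password": password,
        "client": "requestip", "f": "json"})
    r.raise_for_status()
    return r.json()["token"]

def query_layer(service, layer, where="1=1"):
    """Query a secured layer, passing the token with the request."""
    token = get_token("USER", "PASSWORD")   # placeholders
    r = requests.get(f"{HOST}/rest/services/{service}/MapServer/{layer}/query",
                     params={"where": where, "outFields": "*",
                             "f": "json", "token": token})
    return r.json()

# Example (hypothetical service/layer): query_layer("Utilities/Network", 0)
```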



June 3, 2015

I want obedience

I am increasingly becoming frustrated by 'smart tech': systems designed to help us complete tasks, but that all too often actually impede delivery. For example, my work means I need to leave home early and, as I wish to watch the news when eating breakfast, I activate my satellite box. This is however connected to the television in my lounge, and that TV decides it also should switch on: I then have to quickly switch this TV off to avoid waking the rest of the family!

Talking to others in the utility industry, it appears they suffer similar frustrations. Whilst IT applications deciding you meant to type one word when you meant another can be annoying, IT that wrongly alters the settings of operational equipment can have very severe consequences. Even in sectors that are extremely safety-focussed, such as amusement parks, software errors have caused serious incidents, such as the Big One rollercoaster crash at Blackpool (http://www.computerweekly.com/news/2240040871/Software-fix-failed-to-avert-Blackpool-crash). In utilities, where errors can have fatal consequences, the need for caution is even greater.

The benefits of the latest 'smart tech' are however great, and utilities are keen to embrace them, but at the same time are rightly concerned about the potential risk. I thus believe that we need to focus on 'obedient tech': systems and devices that can advise and help us to make effective decisions, but require human input to effect an action. A good example of this is a 'well known' web-based shopping service that gives excellent advice on potential purchases (similar items, etc.) and guides users through simple processes, but leaves all decisions to the user. In utilities, an example is a grid operator who uses Operational Decision Support Analytics to monitor transformer temperature against ambient temperature and load, building normal profiles. The system then flags in near real time when transformer temperature goes outside the determined normal operating ranges; however, it is the operator who decides on the appropriate intervention. There are systems that do automate actions (e.g. the Cardiff East Control Strategy - http://www.waterprojectsonline.com/case_studies/2010/DCWW_Cardiff_East_2010.pdf), but these work within very strictly controlled boundaries. It may be that 'self-learning' devices are developed to a point where utilities have confidence in their decisions, however I suspect that will not be for some time.

Perhaps it is time we all aimed to be a little less 'smart', and a bit more 'obedient'!

April 2, 2015

The Utilities Data Dilemma

Increasingly utilities are being directed to big data, and all the benefits that it appears to offer. However, such calls miss a fundamental issue: asset data is expensive for utilities, both to obtain and to maintain. Most utility physical assets are widely spaced geographically, sometimes in locations that are difficult to access. Costs can be quite high; for example, a manhole survey can average >$100. The EPA estimates there are 12 million municipal manholes in the US, so a 5% validation survey (600,000 manholes) would cost circa $60 million! Surveys can also carry complex health and safety risks that need to be managed. For these reasons asset data is often limited, and of dubious quality. Sensors and instrumentation are improving, becoming cheaper to install, run and maintain, and more robust; nonetheless, they are still relatively expensive items.

With asset data limited, suspect and costly to improve, and sensors and instrumentation expensive to deploy, smarter utilities are looking to make better use of the information they already hold. By combining engineering knowledge with effective analytics, trends can be mapped and normal asset behaviour determined. Where data is readily available such analysis is relatively simple; where asset data is limited, engineering knowledge and understanding can be used to define relationships between seemingly unrelated data sets. The key is in understanding how data sources can be meaningfully linked.
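
As a small illustration of such linking, the sketch below joins two seemingly unrelated sets, rainfall records and sewer blockage work orders, on catchment and month to look for a relationship. All file and column names are invented for the example:

```python
# Sketch: link two disparate data sources on (catchment, month) and test for
# a relationship. File and column names are assumptions for illustration.
import pandas as pd

rain = pd.read_csv("rainfall.csv", parse_dates=["date"])          # catchment, date, mm
orders = pd.read_csv("work_orders.csv", parse_dates=["raised"])   # catchment, raised, type

# Aggregate both sources to the same (catchment, month) grain so they can be joined
rain_m = (rain.assign(month=rain["date"].dt.to_period("M"))
              .groupby(["catchment", "month"])["mm"].sum()
              .rename("rain_mm").reset_index())
blk_m = (orders[orders["type"] == "blockage"]
              .assign(month=orders["raised"].dt.to_period("M"))
              .groupby(["catchment", "month"]).size()
              .rename("blockages").reset_index())

linked = rain_m.merge(blk_m, on=["catchment", "month"], how="left").fillna({"blockages": 0})
print(linked["rain_mm"].corr(linked["blockages"]))  # a first hint of a relationship
```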

Large Business Information systems may thus be of limited value to utilities in terms of managing their assets. Of more value is the effective linking of dispersed data sources, coupled with an effective, easily configurable analytics engine. Such tools have already been used to answer many asset-related questions, such as the viability of rainwater harvesting in differing regions and climates. It is indeed possible to answer many of the asset-related questions posed by utilities, even with the limited asset data many hold. Each question is however individual to the specific situation, so only those who can understand both the engineering and system elements will be able to successfully deliver beneficial results.

March 21, 2014

A framework for upgrading Oracle MDM V1.6 to MDM V2


Overview: IT companies need to come up with a framework for upgrading Oracle Meter Data Management to its new 2.0 version. This framework will give utility clients clear visibility into the upgrade project, including its scope, time and effort.

Meter data management: All utilities, whether electric, gas or water, deal with huge amounts of data related to consumption of the underlying commodity. A meter data management system collects this information, called meter data, and performs validation, editing and estimation operations. The MDM stores and provides the data required for downstream business processes (e.g. billing, settlement and forecasting), and therefore forms the heart of the utility landscape. Oracle Utilities MDM is one of the leading MDM systems and has a large install base across the Americas, Europe and the Asia Pacific region.

Why upgrade: Oracle has released a new version of MDM, Oracle Utilities MDM 2.0, and MDM V1.x is therefore set to be phased out. Existing clients on versions prior to 2.0 are looking to upgrade. MDM V2.x offers plenty of benefits in terms of advanced functionality, with much easier navigation and a user-friendly GUI. MDM V2.x is completely different from V1.x: it is an entirely new product, built on a new framework, and requires a completely new skillset to upgrade and support. In effect, moving from the older versions to MDM 2.x is a re-implementation.

Utilities therefore need an upgrade framework that can effectively migrate as much data and configuration as possible from the older version to the newer one:

- Migrating a key set of data will reduce the time of implementation.

- Manual entry from the old system to the new may introduce data errors.

- It may be impossible to configure all the different sets of data manually within a short timeline.

- The effort involved would be too great if performed manually.

There is a need for a tool that can be easily configured to the client's system requirements, that can handle migration of the different categories of data, and that speeds up the implementation.

An upgrade tool would be a breather for utilities that are using Oracle MDM V1.6.

The objective is to develop a migration tool which can migrate data from the different data categories in a reasonable time period. It should be easily customizable and maintainable, considering that it could be used across multiple clients. There are Administrative, Master and Transactional data categories, each with a different functional use and expected volume of data.

- Administrative data is typically very low in volume but defines the setup of other application areas. The framework needs to identify the key customizations in a typical MDM implementation that would require admin data migration.

- Master data includes entities such as Service Points and Meters that define physical and financial aspects of business operations. To support the integration of MDM V2 with CC&B, tools for "Initial Sync" migration of master data have already been built into MDM V2.

- Transactional data from business operations, such as meter reads, is typically very high volume; a direct-to-database approach therefore has to be developed to minimize migration time, as sketched below.
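
As an illustration of what 'direct to database' might look like, the sketch below streams meter reads between the two schemas in large batches. The table and column names are invented, and a real V1.6 to V2 mapping would involve far more transformation:

```python
# Hedged sketch of a direct-to-database approach for high-volume transactional
# data (meter reads). Table/column names are invented for illustration only.
import cx_Oracle  # Oracle client library

BATCH = 50_000

def migrate_reads(src_connstr, tgt_connstr):
    """src/tgt_connstr: e.g. 'user/password@host:1521/service' (placeholders)."""
    src = cx_Oracle.connect(src_connstr)
    tgt = cx_Oracle.connect(tgt_connstr)
    cur_in = src.cursor()
    cur_out = tgt.cursor()
    cur_in.execute("SELECT meter_id, read_ts, read_value FROM v1_meter_reads")
    while True:
        rows = cur_in.fetchmany(BATCH)   # stream in chunks, never all at once
        if not rows:
            break
        cur_out.executemany(
            "INSERT INTO v2_meter_reads (device_id, msrmt_dttm, msrmt_val) "
            "VALUES (:1, :2, :3)", rows)
        tgt.commit()                     # commit per batch to bound redo/undo
```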


October 29, 2013

Leverage consumption data to enhance rental house promotion

Another commercial use of smart meter data I can think of is on rental websites for apartments and commercial buildings. Almost all house rental websites provide detailed descriptions of house structure, amenities and ease of access to nearby facilities such as schools, hospitals and freeways. House descriptions include the floor plan, HVAC availability, thermostats and lighting, helping the prospective tenant choose the best house or building space for their needs. In addition to these attributes, house owners and managers can include utility usage data and minimum bill information in their advertisements. Utilities, through smart meters, are already collecting meter reads and sharing them with their customers. Apartment owners can leverage analytical tools to determine seasonal minimum/average consumption and neighborhood comparison data, and use these metrics in advertising their house.

Customers are increasingly sensitive about their utility bills, and information about a site's basic electricity/water/gas cost will help them with budgetary planning. Using this consumption information in advertisements would also give managers an opportunity to highlight savings in utility bills from energy-efficiency techniques, such as better house or building layout (better natural lighting, low heat loss, solar protection, controlled ventilation) and measures to reduce energy consumption (energy-efficient equipment, self-controlled switches, etc.). Including usage data in house advertisements would also compel owners to implement energy-efficiency measures on their houses and surrounding property to maintain a competitive edge.

Alongside these new attributes, rental websites can add search filters and side-by-side comparison options on usage and billing data, which will enhance their competitiveness with similar websites. Additional data, such as equipment age, maintenance schedules and replacement plans, would be helpful for long-term occupants. This information may help tenants estimate an approximate bill, and could be used to choose an energy supplier or a site offering a cheaper tariff. Commercial tenants can plan their operations around scheduled off-peak times, minimising consumption during on-peak hours and so saving on their energy bills.

Current tenants must be convinced to share utility data, with assurance that their privacy will be guarded through anonymous, secure use and by inferring a basic bill cost without disclosing the actual figures. Moreover, the house advertisement would use 'derived' minimum/average consumption and comparison data for a particular type of house, withholding consumption patterns and end-user details. The key to the success of this idea is to develop analytical tools that can extract meaningful value from historic usage and the energy-efficiency features of the house, and present it in a form that makes the house attractive for occupation.
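
As a sketch of the kind of analytical tool involved, the following derives the advertised figures, seasonal average and minimum monthly consumption, from a property's smart-meter reads. The column names and season mapping are illustrative assumptions:

```python
# Sketch: derive seasonal average/minimum monthly consumption from meter reads.
# Column names ('ts', 'kwh') and the season mapping are assumptions.
import pandas as pd

SEASONS = {12: "winter", 1: "winter", 2: "winter",
           3: "spring", 4: "spring", 5: "spring",
           6: "summer", 7: "summer", 8: "summer",
           9: "autumn", 10: "autumn", 11: "autumn"}

def ad_metrics(reads: pd.DataFrame) -> pd.DataFrame:
    """reads: one property's smart-meter data with 'ts' (timestamp) and 'kwh'."""
    monthly = reads.set_index("ts")["kwh"].resample("MS").sum()  # kWh per month
    season = monthly.index.month.map(SEASONS)
    return monthly.groupby(season).agg(["mean", "min"]).round(1)
```

The output (mean and minimum kWh per season) is exactly the 'derived' figure an advertisement could quote without disclosing any individual bill.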

Benefits for Apartment Owner -

  1. Showcase the energy/water/gas saving features of the house and their impact on the tenant's utility bill.
  2. Increase tenant leads.
  3. Gain prominent listing in "Consumption"-based searches on the rental website.
  4. Appear in rental website promotions for the new attribute.

Benefits for Tenants -

  1. Would-be renters get an additional attribute, in the form of an approximate average utility bill, to help decide on an apartment.
  2. Helps tenants plan purchases of electrical equipment.
  3. Helps commercial tenants plan operational costs.

Benefits for Renting portals -

  1. House usage information, together with comparison and analytics tools, would attract more visitors to the site.
  2. It provides an advantage over competitors and increases the portal's popularity.
  3. With data in hand, house owners will make use of these portal features in their advertisements, contributing to the portal's revenue.

September 4, 2013

Using 4G technology in Smart Grid

Smart grid initiatives are helping electric utilities to improve operational efficiency, power reliability, data analytics, decision making and much more. Communication plays a key role in achieving true value from smart grid applications. With the deployment of smart meters, a huge amount of bidirectional data flows across the grid network, and handling this data is a real challenge for utilities. Utilities are therefore focusing on new and innovative technologies that can help them develop a fast, robust and secure communication infrastructure, so that network and data traffic can be managed effectively.

Smart grid applications require fast, effective, secure and reliable communication technology for real-time bidirectional data transfer across the network. Utilities do not want smart grid data from various sources transmitted over a network shared with other customers. Fourth-generation (4G) wireless technology, on the other hand, provides better control and security than technologies such as GSM/GPRS, Wi-Fi and RF, and supports real-time applications, including voice and video, with two-way communication. With 4G, outages can be found and restored remotely and quickly, increasing grid reliability.

Utilities choosing 4G will have the following advantages:

· A fast and reliable network with advanced traffic-management tools under the utility's control; with other networks, these tools remain under the service provider's control.

· Low latency, enabling real-time application support including video and voice data.

· Support for applications such as mobile workforce, SCADA, demand response and HAN that require real-time data.

· A highly secure network, allowing the utility to be assured of confidentiality.

· Availability of a wide range of frequency spectrum.

· Backhaul connectivity with other networks.

However, 4G, being a new technology, also has some drawbacks: it is highly capital intensive, and infrastructure setup costs are very high. Technical expertise is needed to set up and maintain the network if a utility chooses to deploy its own. Taking the long-term view, however, 4G can deliver a substantial ROI.

June 19, 2013

Are water utilities actively engaging their customers?

Is it innovative for a utility to have active customer engagement? I think the answer, historically, is that it would be.

However, there have been significant changes in customer perceptions and expectations of how customers want to be engaged. A key factor driving change in the water utility sector is the changing regulatory environment - specifically the penalties and incentives for customer service captured in the Service Incentive Mechanism (SIM). Another regulatory impact is the move towards 'retail' separation, which is starting with non-domestic customers in the form of the ability to select supply contracts.

Today, through our Water UK partnership, we are participating in their Innovation Hub day focused on Active Customer Engagement - many executives and managers from key UK water companies will be there, along with other interested and influential parties such as CC Water and regulators (http://www.water.org.uk/events/view_event.php?eventid=73&setstyle=&dateid=45).

I will be posting updates later based upon discussions held throughout the day.