Infosys’ blog on industry solutions, trends, business process transformation and global implementation in Oracle.

December 30, 2008

Is Perfect Estimation Achievable?

Historically, many projects experience cost, effort, and schedule overruns, or suffer from poor quality. In most cases, the project ends up taking alternate paths to stay within the budget constraint and delivers an inadequate product, application, or service. A changing market scenario further adds to estimation overruns. In such cases, is a perfect estimate ever possible?

To analyze this, let us start by understanding the factors that can affect the estimation of any project:

Lack of information necessary for estimation

Lack of knowledge on how to estimate and estimation techniques

Inexperience in doing estimation for similar kind of work

Inappropriate estimation methodologies

Lack of historical information, or a poor understanding of historical conditions and how they compare with current conditions

Inadequate timeframe to perform estimates

How does one overcome these shortcomings?

Gain in-depth knowledge of estimation tools and techniques through proper training sessions.

Apply the estimation tools and techniques on smaller projects first, and then on larger projects.

Use different estimation techniques across the different phases of the project, based on the phase type and the expectations from that phase.

Estimation is an ongoing process throughout the project life cycle, not a one-time activity, so re-estimate whenever any of the assumptions behind the estimates change.

Baseline estimates by following proper guidelines.

Document change requests and re-estimate accordingly.

Provide for known and unknown contingencies in the estimate.

Document project inputs and outputs clearly for better estimation.

Leverage historical information about project effort, schedule, cost, risk, and resources, which can be treated as lessons learned and best practices from engagements executed in the past. Statistical baselines should be created for each factor affecting project effort, e.g. a user training effort baseline or a project management effort baseline, and these baselines should be revised periodically.

Understand the scope of the project thoroughly.

Break the entire work down into work packages; estimate at the lowest level and then roll the estimates up to the higher levels (see the sketch after this list).

Every organization influences the overall project effort, and this should be considered while estimating project timelines, since each organization has its own tried and tested way of executing projects based on its available skill set and its capability with the client.

Estimate not only cost and time but also size. Size estimation helps in calculating budgeted organizational productivity.

Proper risk management will reduce uncertainty and help provide better estimates.

Project health and decision making should be validated regularly through proper tools and techniques.
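
To make the bottom-up, work-package-based approach concrete, here is a minimal sketch in Python. The work-package names, effort figures, and contingency percentage are illustrative assumptions, not data from any real project; the point is simply to estimate at the lowest level, provide for contingencies, and roll the figures up.

# Bottom-up estimation sketch: estimate each work package at the lowest
# level, add a contingency provision, and roll up to a project total.
# All names and figures below are illustrative assumptions.
work_packages = {
    "Requirements workshop": 80,   # person-hours
    "Solution design":       120,
    "Configuration":         200,
    "Custom reports":        160,
    "User training":          60,
    "Cutover support":        40,
}

CONTINGENCY = 0.15  # 15% provision for known and unknown risks (assumed)

def rolled_up_estimate(packages, contingency):
    """Sum the low-level estimates and apply the contingency provision."""
    base = sum(packages.values())
    return base, base * (1 + contingency)

base_effort, total_effort = rolled_up_estimate(work_packages, CONTINGENCY)
print(f"Base effort: {base_effort} person-hours")
print(f"With {CONTINGENCY:.0%} contingency: {total_effort:.0f} person-hours")

Re-running this roll-up whenever an assumption or a work package changes keeps the estimate in line with the re-estimation guideline above.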

What does one Estimate?

Some of the parameters that one needs to estimate for the successful execution of a project are:

Project Duration

Finish Time

Cost

Resource

Success rate

Quality

Why do estimates deviate from actuals?

At the start of every project, the objective or business requirement is just a thought, together with our understanding of the customer's business. Once requirements are converted into core designs things become clearer, and as development nears completion the uncertainties start reducing.

Does that mean that we cannot give estimates?

Estimates can always be provided, as they represent costs and other values based on certain assumptions. The distance between Pune and Mumbai is approximately 94 miles (about 150 km), so at an average vehicle speed of 60 km per hour one reaches in about 2.5 hours. Does that mean we can commit to reaching Mumbai from Pune in 2.5 hours? The answer is no. But one can certainly commit to reaching Mumbai from Pune in a minimum of 2.5 hours and a maximum of around 3.2 hours, based on the speed of the vehicle, traffic on the road, stoppage time in between, and so on.
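
The same analogy can be written out as a simple range calculation. The sketch below is in Python; the speeds and stoppage buffer are illustrative assumptions used only to reproduce the 2.5 to 3.2 hour range.

# Range-estimate sketch for the Pune-Mumbai analogy.
# Distance, speeds, and stoppage buffer are illustrative assumptions.
DISTANCE_KM = 150          # roughly 94 miles
BEST_SPEED_KMPH = 60       # free-flowing traffic
WORST_SPEED_KMPH = 50      # heavier traffic
STOPPAGE_HOURS = 0.2       # breaks, toll queues, etc.

best_case = DISTANCE_KM / BEST_SPEED_KMPH
worst_case = DISTANCE_KM / WORST_SPEED_KMPH + STOPPAGE_HOURS

print(f"Commit to a range of {best_case:.1f} to {worst_case:.1f} hours")
# Commit to a range of 2.5 to 3.2 hours

Committing to the range, rather than to a single number, is what keeps the estimate dependable even though the underlying assumptions vary.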

Conclusion

The recommendations above will improve estimation effectiveness, which in turn enables the business to:

Make better investment decisions

Generate more return on investment (ROI)

Gain advantage over competitors by taking appropriate and timely decisions

Exercise more control over project execution

Finally, improve organizational productivity.

Essentially, estimation effectiveness is nothing more than how close your estimates are to actuals.

 

December 24, 2008

Are you still using function points for estimating the size of package implementations?

Package Points is the buzzword in the Oracle practice at Infosys Technologies Ltd for sizing implementation projects, for reasons I put forward here based on my experience in sizing many development and package implementation projects.

Function Point methodology is a universally accepted size measure for any software project; it takes a user view of the requirements while sizing. However, package implementation projects have certain characteristics that hinder the effective use of the FP methodology for sizing. Implementing an ERP mostly involves configuring parameters in the package, and custom development is only a part of the project. Also, implementation projects can be done in partial life cycles: for example, scoping or discovery can be done by vendor A and the downstream implementation by vendor B, or discovery can happen once for the entire client organization while implementation happens in multiple phases. In view of these characteristics, Package Points is a better fit for sizing implementations, as it differentiates the size of configuration work from custom development work and also addresses partial implementation life cycles. Size is often used as an input for deciding the cost of implementation, so if these aspects are not addressed properly they can shake confidence in the estimates on either side.

The Package Points methodology also scores high in terms of usability when compared with the Function Point methodology. A project manager or an implementation consultant needs to be sufficiently enabled on either methodology to derive accurate estimates for a project. The Function Point methodology is more complicated in terms of understanding and arriving at its five components: External Inputs, External Outputs, External Inquiries, Internal Logical Files, and External Interface Files. Understanding Record Element Types still remains a difficult job for many project managers, and without thoroughly understanding these concepts it is highly improbable that an accurate FP size can be arrived at. To use the FP methodology, every project manager has to be trained thoroughly before being put on the task of estimation. The Package Points methodology, on the other hand, is designed so that the underlying framework and calculations need not be understood by every user. A user just needs to know the functionalities and setups to be done in the package, the life-cycle tasks the project involves, and the client complexity parameters; with these inputs, he or she can derive the size of the implementation. So a user of the methodology needs neither to be an expert nor to be formally trained in the methodology to use it for sizing.
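
Since the internals of the Package Points framework are not described here, the sketch below is purely hypothetical. It is written in Python and only illustrates the shape of the inputs a user supplies (functionality and setup counts, life-cycle tasks in scope, client complexity) and a weighted calculation; the weights, factor values, and the package_size function are my own assumptions for illustration, not the actual Infosys methodology.

# Hypothetical sizing sketch in the spirit of Package Points.
# The weights and factors below are illustrative assumptions,
# NOT the actual Package Points framework.
FUNCTIONALITY_WEIGHT = 3.0   # points per functionality configured (assumed)
SETUP_WEIGHT = 1.0           # points per setup step (assumed)
TASK_WEIGHTS = {             # share of a full life cycle (assumed)
    "discovery": 0.2,
    "implementation": 0.6,
    "rollout": 0.2,
}
COMPLEXITY_FACTORS = {"low": 0.9, "medium": 1.0, "high": 1.2}  # assumed

def package_size(functionalities, setups, tasks_in_scope, complexity):
    """Weight the configuration scope, scale it by the life-cycle
    coverage, and adjust for client complexity."""
    base = functionalities * FUNCTIONALITY_WEIGHT + setups * SETUP_WEIGHT
    coverage = sum(TASK_WEIGHTS[task] for task in tasks_in_scope)
    return base * coverage * COMPLEXITY_FACTORS[complexity]

# Example: a partial life cycle (implementation only) for a medium-complexity client.
print(package_size(functionalities=40, setups=120,
                   tasks_in_scope=["implementation"], complexity="medium"))

Because the life-cycle coverage is an explicit input in this sketch, a partial engagement such as an implementation-only phase gets a proportionally smaller size, which is the property the methodology is credited with above.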

Please post your views on the Package Points methodology, especially in comparison with other popular techniques available in the industry, or from your own experience.

 

Other blogs -

http://infosysblogs.com/oracle/2008/08/improving_package_implementati.html#trackback

http://infosysblogs.com/oracle/2008/12/how_quick_can_oracles_peopleso.html

http://infosysblogs.com/oracle/2008/12/is_perfect_estimation_achievab.html

http://infosysblogs.com/oracle/2009/01/operational_excellence_metrics_1.html#more

http://infosysblogs.com/oracle/2008/10/earned_value_management_for_er.html

December 22, 2008

Redefine your measurements to stimulate operational excellence

A difficult business climate such as today's gives corporations an opportunity to take a hard look at their operational procedures and weed out inefficiencies that may exist in various business functions. Focusing on reducing waste and improving operational processes helps businesses in their journey toward operational excellence. It also allows for cost savings, which can provide stability to profit margins in adverse economic times such as these.

Today, most businesses have formal processes for continuous improvement in manufacturing and the supply chain. However, these improvement initiatives often stop short of attaining their maximum potential because of the way people and processes are measured within the organization. Traditional accounting and operational measures sometimes get in the way of achieving operational excellence because they often promote waste such as excess inventory.

As an example, consider idle capacity. Usually idle time is considered bad, and so higher machine utilization is considered good. But higher machine utilization is desirable only if there is sufficient demand to back up the supply created by a highly utilized machine. A machine operating 90% of the time and producing goods that fill up warehouses (without any immediate customer demand) would show high machine utilization, but it would lead to higher inventory levels as well. In a lean manufacturing environment, a machine should run only in response to pull from customer demand; targeting high utilization alone runs contrary to that. A better measure of utilization would therefore be schedule reliability: what percentage of the schedule was met on time so as to satisfy the demand created by customer pull. Operating the machine only in response to customer demand also means flexible capacity is being created, and machine maintenance can be planned better.
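
As a rough illustration of measuring schedule reliability rather than raw utilization, the short sketch below (in Python, with a made-up schedule) computes the percentage of scheduled jobs completed on time.

# Schedule-reliability sketch: percentage of scheduled jobs completed on
# time against customer-driven demand. The schedule data is made up.
scheduled_jobs = [
    {"job": "A-101", "due_day": 3, "completed_day": 3},
    {"job": "A-102", "due_day": 5, "completed_day": 6},
    {"job": "A-103", "due_day": 7, "completed_day": 7},
    {"job": "A-104", "due_day": 9, "completed_day": 8},
]

on_time = sum(1 for job in scheduled_jobs if job["completed_day"] <= job["due_day"])
reliability = on_time / len(scheduled_jobs)
print(f"Schedule reliability: {reliability:.0%}")  # Schedule reliability: 75%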

Another traditional measure often used is the purchase price variance: the smaller the variance from the standard price, the better. Suppliers often give discounts for large order quantities. In order to use those discounts to keep procurement costs down, buyers place orders for larger-than-required quantities, or place orders for items that might not have immediate demand. This does reduce the purchase price variance in general, but it also adds to the inventory levels in the warehouse. In addition to the working capital that gets tied up, the administrative costs of maintaining the inventory also go up. An alternative measure for the buyer could be their contribution to right-sizing the inventory.
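
A quick comparison (in Python, with invented prices and carrying-cost rates) illustrates why a quantity discount that looks good on purchase price variance can still cost more once inventory carrying cost is counted.

# Purchase-price-variance sketch: a bulk discount versus the carrying cost
# of the extra inventory it creates. All figures are invented.
annual_demand = 1_200     # units needed per year
unit_price = 50.0         # standard price
bulk_discount = 0.05      # 5% off when ordering two years' worth at once
carrying_rate = 0.20      # annual carrying cost as a fraction of inventory value

# Option A: order a year's demand each year at the standard price.
yearly_orders = 2 * annual_demand * unit_price

# Option B: order two years' demand up front at a discount; the second
# year's worth sits in the warehouse for roughly a year.
bulk_purchase = 2 * annual_demand * unit_price * (1 - bulk_discount)
carrying_cost = annual_demand * unit_price * (1 - bulk_discount) * carrying_rate
bulk_total = bulk_purchase + carrying_cost

print(f"Two yearly orders:         {yearly_orders:,.0f}")  # 120,000
print(f"One bulk order + carrying: {bulk_total:,.0f}")     # 125,400

In this made-up case the favorable purchase price variance of 6,000 is more than offset by 11,400 of carrying cost, which is exactly the trade-off the traditional measure hides.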

Another example of traditional costing measures not providing the best results is factory overhead allocation. Overhead rates are usually determined based on the maximum theoretical capacity, and the actual overhead costs within a period are pro-rated to each unit that gets manufactured, irrespective of how much of the overhead each product actually consumes. When profitability analyses are done by product lines or families, this can inflate or deflate costs incorrectly for different products. Rationalizing product lines to limit the product offering based on product profitability is a lean principle that leads to better capacity management, reduced inventory, and increased flexibility. However, because of the way high overhead costs are pro-rated, the product profitability analyses can lead to erroneous results, thereby hiding the non-profitable products.
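
A small numeric sketch (in Python, with invented cost and volume figures) shows how pro-rating overhead uniformly per unit, versus allocating it by actual consumption of the overhead driver, can shift the apparent cost of a product line.

# Overhead-allocation sketch: uniform pro-rating per unit versus allocation
# by actual consumption of the overhead driver. All figures are invented.
total_overhead = 100_000.0
products = {
    # units produced and machine-hours per unit (the overhead driver)
    "Standard widget": {"units": 8_000, "hours_per_unit": 0.5},
    "Custom widget":   {"units": 2_000, "hours_per_unit": 2.0},
}

total_units = sum(p["units"] for p in products.values())
total_hours = sum(p["units"] * p["hours_per_unit"] for p in products.values())

for name, p in products.items():
    per_unit_prorated = total_overhead / total_units
    per_unit_by_usage = total_overhead * p["hours_per_unit"] / total_hours
    print(f"{name}: {per_unit_prorated:.2f} pro-rated vs {per_unit_by_usage:.2f} by usage")

With these invented figures, uniform pro-rating charges both products 10.00 of overhead per unit, while allocation by machine-hours charges the standard widget 6.25 and the custom widget 25.00, so the profitability of the custom line is overstated under the traditional approach.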

An important underlying concept directly attached to on-time delivery is lead time. This is a common measure I have seen customers use across industries: they often measure their on-time delivery against the date scheduled by their ERP systems. To ensure on-time delivery, planners often build a buffer into the lead times so that the ERP schedules with some amount of safety built into the date. If a BOM consists of several levels, then a little bit of buffer in each component's lead time can considerably inflate the parent lead time. Components are then procured and made in advance of when they are actually needed, which also adds to inventory and might require rescheduling of transportation activity. Having said that, on-time delivery is an important measurement and people are justifiably appraised on it; however, efforts should be made to ensure that lead times are not inflated just to get a good grade on on-time delivery.
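
The compounding effect of per-level buffers is easy to see with a few numbers. The sketch below (in Python, with invented lead times for a four-level BOM) compares the cumulative lead time with and without a small buffer at each level.

# Lead-time inflation sketch: a small buffer at each BOM level compounds
# into a much earlier start for the parent item. Figures are invented.
bom_levels = [
    {"level": "raw material",  "lead_days": 10},
    {"level": "component",     "lead_days": 5},
    {"level": "sub-assembly",  "lead_days": 7},
    {"level": "finished good", "lead_days": 3},
]
BUFFER_DAYS_PER_LEVEL = 2  # safety padding a planner might add (assumed)

true_total = sum(level["lead_days"] for level in bom_levels)
padded_total = sum(level["lead_days"] + BUFFER_DAYS_PER_LEVEL for level in bom_levels)

print(f"Cumulative lead time without buffers: {true_total} days")    # 25 days
print(f"Cumulative lead time with buffers:    {padded_total} days")  # 33 days

Eight extra days of cushion across the structure means components are started more than a week earlier than the true lead times require, and that cushion shows up as inventory.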

Another way of looking at this is that traditional measures do not give due credit to the parameters that improve as an outcome of operational improvements. For example, a lean operation would lead to reduced inventory levels, reduced cycle times, freed-up working capital, fewer bottlenecks, and so on. However, even though these may be measured, they are not tied to any incentives. In my consulting experience, I have observed that employees are appraised on the traditional measures, and seldom on how much they have reduced inventory in a period while meeting customer demand, what cycle time reduction a department has achieved, by what percentage supplier reliability has gone up, or what the schedule adherence for a line is.

Operational excellence is a journey, and the commitment to get leaner is an important part of that journey. However, backing this journey up should be appropriate accounting and operational measures that provide incentives for these initiatives. Traditional accounting measures usually serve the needs of the finance department; in the modern world, they should also serve the needs of a lean operational system.

The above are just some examples of measurements that need to be re-examined on a case-by-case basis. Can you think of other such measurements that would create the incentives for operational excellence?
