4. Hierarchy integrated probability cost analysis (HIPCA) models for EPC cost estimation

In this section we introduce the hierarchy probability cost analysis (HIPCA) methodology, which incorporates all of the aforementioned concepts for determining the total project cost (TPC) of EPC projects. Our objective is to develop an optimal but realistic TPC, for a given probability of success (PoS) that we assume has been specified, by allocating the baseline budgets and managing the contingency according to the desire
to win the project and the risk tolerance.

4.1. Correlation coefficient and its feasibility verification

Once historical data are available, two different measures are used in the literature to reflect the degree of relation between cost elements. The first is the ordinary product-moment (Pearson) correlation coefficient and the second is the rank (Spearman) correlation coefficient. The latter is a non-parametric (distribution-free) rank statistic proposed by Spearman in 1904 as a measure of the strength of the association between two variables (Lehmann & D'Abrera, 1998). The Spearman rank correlation coefficient can be used to give a realistic estimate, and it is a measure of monotone association that is used when the distribution of the data makes Pearson's correlation coefficient undesirable or misleading.
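As a minimal illustration of the two measures, the sketch below computes both coefficients for a small set of hypothetical historical cost observations on two cost elements; the data, variable names, and figures are invented for illustration only.

```python
# Minimal sketch (hypothetical data): Pearson vs. Spearman correlation
# between two cost elements observed on past projects.
import numpy as np
from scipy import stats

# Hypothetical historical costs (same eight projects, two cost elements).
civil_works = np.array([12.1, 14.8, 13.5, 18.2, 16.9, 21.4, 15.3, 19.8])
equipment   = np.array([30.5, 33.2, 31.8, 39.0, 36.4, 55.0, 34.1, 41.7])

pearson_r, _ = stats.pearsonr(civil_works, equipment)
spearman_r, _ = stats.spearmanr(civil_works, equipment)

# Spearman depends only on ranks, so the outlying project (55.0) and
# non-normal data distort it far less than Pearson's coefficient.
print(f"Pearson  r = {pearson_r:.2f}")
print(f"Spearman r = {spearman_r:.2f}")
```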
While it may be difficult to justify a specific numeric value for the correlation between two cost elements, it is important to avoid the temptation to omit the correlation altogether when a precise value cannot be established. Such an omission sets the correlation in question to exactly zero, whereas positive values of the correlation coefficient tend to widen the total-cost probability distribution and thus increase the gap between a specific cost percentile (e.g., 70%) and the best-estimate cost; that is to say, the contingency could be larger. Therefore, using reasonable non-zero values, such as 0.2 or 0.3, generally leads to a more realistic representation of total-cost uncertainty.

Subjective judgment also finds application in specifying the correlations between cost elements qualitatively. In this respect, researchers have subjectively chosen two sets of correlations to represent strong, moderate, and weak relations: 0.8, 0.45, 0.15 (Touran, 1993) and 0.85, 0.55, 0.25 (Chau, 1995). More recent scholars explain that, as a rule of thumb, correlations of less than 0.30 indicate little if any relationship between the variables, and that reasonable correlation values in the range 0.3–0.6 should lead to more realistic cost estimates than the overly optimistic values assuming independence or the overly pessimistic values assuming perfect correlation (Kujawski et al., 2004).
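The effect of the assumed correlation on the total-cost distribution is easy to reproduce with a small Monte Carlo experiment. The sketch below uses entirely hypothetical cost elements and parameters: it sums five normally distributed elements under a common pairwise correlation of 0, 0.3, and 1.0, and reports the mean and the 70th percentile (P70) of the total; the gap between them, i.e., the implied contingency at P70, grows with the assumed correlation.

```python
# Minimal sketch (hypothetical figures): how the assumed pairwise
# correlation between cost elements shifts the 70th percentile (P70)
# of the total cost relative to its mean.
import numpy as np

rng = np.random.default_rng(42)
means  = np.array([10.0, 25.0, 40.0, 15.0, 30.0])   # best-estimate costs
sigmas = 0.20 * means                                # 20% standard deviation
n_sims = 100_000

for rho in (0.0, 0.3, 1.0):
    # Equicorrelated standard normals via a one-factor construction:
    # X_i = sqrt(rho)*Z0 + sqrt(1-rho)*Z_i  =>  corr(X_i, X_j) = rho.
    z0 = rng.standard_normal((n_sims, 1))
    zi = rng.standard_normal((n_sims, len(means)))
    x = np.sqrt(rho) * z0 + np.sqrt(1.0 - rho) * zi
    total = (means + sigmas * x).sum(axis=1)

    mean, p70 = total.mean(), np.percentile(total, 70)
    print(f"rho={rho:0.1f}: mean={mean:6.1f}  P70={p70:6.1f}  "
          f"implied contingency={p70 - mean:5.1f}")
```

Under independence the element-level uncertainties largely cancel and P70 sits close to the mean, while perfect correlation gives the widest spread; intermediate values such as 0.3 fall in between, consistent with the recommendation quoted above.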
Matrix theory implies that a valid correlation matrix cannot have a negative determinant. When a correlation matrix is used in simulation, an important requirement is to ensure its feasibility, which restricts the matrix to be positive semi-definite regardless of its type (product-moment or rank) or the way it is estimated (historical or subjective) (Lurie & Goldberg, 1998). Being positive semi-definite means that the eigenvalues of the correlation matrix must be non-negative. That is to say, internal consistency checking between cost elements is necessary for cost estimation. In the literature it has frequently occurred that the correlation matrix is not positive definite, as indicated by Ranasinghe (2000). This is particularly an issue as the number of dimensions increases, because the possibility of specifying an infeasible correlation matrix grows rapidly with the dimension (Kurowicka & Cooke, 2001).
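Such a feasibility (eigenvalue) check is straightforward to script. The sketch below uses a hypothetical, subjectively specified 3 × 3 correlation matrix whose individual entries each look plausible but which, taken together, has a negative eigenvalue and therefore cannot be realized by any joint distribution.

```python
# Minimal sketch: eigenvalue test of a correlation matrix's feasibility.
import numpy as np

# Hypothetical, subjectively specified correlations between three cost
# elements; pairwise each value looks plausible, but jointly they are not.
R = np.array([
    [ 1.0,  0.9, -0.8],
    [ 0.9,  1.0,  0.6],
    [-0.8,  0.6,  1.0],
])

eigenvalues = np.linalg.eigvalsh(R)       # symmetric => real eigenvalues
feasible = eigenvalues.min() >= -1e-10    # allow tiny numerical error

print("eigenvalues:", np.round(eigenvalues, 3))
print("positive semi-definite (feasible):", bool(feasible))
```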
Touran's approach was to reduce all the correlations slightly (say, by 0.01) and to repeat this until the correlation matrix becomes feasible (Touran, 1993). This approach overlooks the possibility of increasing some correlations while reducing others. Ranasinghe (2000) developed a computer program that iteratively calculates and lists the bounds within which each correlation must lie for the matrix to be positive semi-definite. The program then asks the estimator to change the original values and re-checks the feasibility and the new bounds; this process continues until feasibility is reached. This approach, however, may be time consuming because of its iterative nature. Yang (2005) developed an automatic procedure to check the feasibility of the correlation matrix and adjust it if necessary, but it is complicated and difficult to understand because it decomposes the correlation matrix into a diagonal vector of eigenvalues and normalizes the diagonal elements to ensure unit diagonals.
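The sketch below illustrates the kind of adjustment these procedures perform, using a generic eigenvalue-clipping repair followed by rescaling to unit diagonals; it is a simple heuristic for illustration and not a reproduction of Touran's, Ranasinghe's, or Yang's exact algorithms.

```python
# Minimal sketch: repair an infeasible correlation matrix by clipping
# negative eigenvalues to (near) zero and rescaling so the diagonal is 1.
# A generic heuristic for illustration, not any author's exact procedure.
import numpy as np

def repair_correlation_matrix(R, eps=1e-8):
    """Return an eigenvalue-clipped, unit-diagonal version of R."""
    eigenvalues, vectors = np.linalg.eigh(R)
    clipped = np.clip(eigenvalues, eps, None)          # drop negative modes
    R_psd = vectors @ np.diag(clipped) @ vectors.T     # rebuild the matrix
    d = np.sqrt(np.diag(R_psd))
    R_fixed = R_psd / np.outer(d, d)                   # restore unit diagonal
    np.fill_diagonal(R_fixed, 1.0)
    return R_fixed

R = np.array([[ 1.0,  0.9, -0.8],
              [ 0.9,  1.0,  0.6],
              [-0.8,  0.6,  1.0]])

R_fixed = repair_correlation_matrix(R)
print(np.round(R_fixed, 2))
print("min eigenvalue after repair:", np.linalg.eigvalsh(R_fixed).min())
```

Comparing the repaired entries with the original ones shows how far the stated judgments had to move to become feasible, which is the same after-the-fact inspection advocated below when Crystal Ball adjusts the coefficients.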
Here we advocate that Crystal Ball can be adopted to conduct the eigenvalue test on the correlation matrix and uncover this problem. The program warns the user of inconsistent correlations, as shown in Fig. 2. Adjusting the coefficients allows the user to ensure that the correlation matrix is at least not demonstrably impossible. A simple approach to using the correlation algorithm in the program is to let it adjust the coefficients permanently after recording their original values; in this way the analyst finds out after the simulation what Crystal Ball had to do to the coefficients to make them feasible. This is a minimal test and does not ensure that the correlation coefficients are right in any sense; after examining what the program needed to do, the risk analyst must still take responsibility for the coefficients actually used.

4.2. Dilemma for PCA methodology
The only point value from independent constituent distributions that can be added to obtain the corresponding statistical point value of the sum of the constituent distributions is the mean. Therefore, task-level contingencies derived from individual task distributions cannot be added to obtain the project's total contingency. Traditional contingency calculations add an arbitrary factor to task-level costs and then sum these amounts to a project total; this can produce very conservative project budgets that would lie completely outside the calculated distribution of expected results. We review some statistical notation in order to expose the potential problem with typical PCA. For two random variables x and y, the following relations hold on the basis of probability and statistical theory.
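The notation block itself is not reproduced in this excerpt; in standard form the relations presumably intended are along the following lines, which make the dilemma explicit: means add exactly, variances pick up a covariance term, and percentiles do not add in general.

\[
\mathrm{E}[x+y] = \mathrm{E}[x] + \mathrm{E}[y], \qquad
\mathrm{Var}(x+y) = \mathrm{Var}(x) + \mathrm{Var}(y) + 2\rho_{xy}\,\sigma_x\sigma_y ,
\]
\[
F_{x+y}^{-1}(p) \neq F_x^{-1}(p) + F_y^{-1}(p) \quad \text{in general},
\]

where \(F^{-1}(p)\) denotes the \(p\)-th percentile. Summing, say, the 70th-percentile costs of the individual tasks therefore does not yield the 70th percentile of the total cost, which is why task-level contingencies cannot simply be added.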
4.3. Hierarchy PCA model

During the bidding stage, the EPC project must be structured into a limited