Design Driven by Accurate Cost Data

Using independent, up-to-date sources helps assure project success
Sponsored by RSMeans data from Gordian
By Peter J. Arsenault, FAIA, NCARB, LEED AP

Predicting Future Costs

There is one other aspect of cost estimating that needs to be taken into account: time. Projects of any size can take years to plan, design, and develop before any actual construction contracts are signed and costs agreed upon. Therefore, an estimate prepared during the design process needs to reflect the future point when the construction will actually be paid for. Historical data and construction inflation trends have typically been used to adjust current costs, but that approach assumes a stable construction industry environment and little cost volatility. Anyone who lived through 2008 knows that those conditions are never guaranteed. Hence, construction cost estimates done during design may quickly become outdated by the time construction starts if conditions change.
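At its simplest, the escalation adjustment described above is a compounding calculation: a present-day estimate is carried forward to the expected construction date at an assumed annual rate. The sketch below uses hypothetical amounts and a constant rate, which is precisely the assumption that volatile markets undermine:

```python
# A minimal sketch of cost escalation with an assumed constant annual rate.
# Rate and amounts are hypothetical, for illustration only.

def escalate(current_cost: float, annual_rate: float, years: float) -> float:
    """Compound a present-day estimate forward by `years` at `annual_rate`."""
    return current_cost * (1 + annual_rate) ** years

# A $10M design-phase estimate, escalated 2 years at an assumed 4% per year:
future_cost = escalate(10_000_000, 0.04, 2)
print(round(future_cost))  # → 10816000
```

A single fixed rate cannot capture the labor and material volatility the article goes on to describe, which is the gap predictive cost data aims to close.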

Predictive cost data is based on a combination of traditional cost estimating and data-mining principles with the use of contemporary computer capabilities.

The recession of 2008 caused considerable disruption in the construction industry and is the source of a number of valuable, albeit painful, lessons in addressing future costs. In particular, cost-estimating specialists and economists have observed the following.

  • Labor force: Following the crash, a significant number of subcontractors and smaller contractors left the construction industry. A few years later, owners and serial builders began to slowly plan for regrowth; however, in the midst of this planning, the construction labor force had shrunk by three-fifths (60 percent). This reduced supply of qualified labor has not only pushed up the cost of that labor but has also led to real delays, which can have their own impact on total project costs.
  • Material costs: The volatility and price fluctuations of individual materials used can vary greatly year-over-year based on interactions of various commodities markets and construction demand (i.e., volume of materials needed). Natural disasters and economic conditions around the country have also created volatility in the availability and pricing of many construction materials. In recent times, international trade issues have affected the cost of materials, including the imposition of tariffs, which has created uncertain pricing.
  • Total cost impacts: Bare material, labor, and equipment rates account for 79 percent of total construction costs on average. Thus, the volatility of commodity markets and the decreased supply of labor have a dramatic impact on the cost to build. Historical build costs and factors used in previous years became obsolete, and more importantly, building owners became keenly aware of these escalating costs. This led to design and construction professionals being held more accountable than ever for managing to their forecasted total budgets.

All of these impacts directly affect the ability of any estimator to create a trustworthy cost estimate. Without a means to address them, serious problems of project management and liability can occur.

Predictive Cost Data

Based on the above, it is easy to see why traditional forecast data, developed during a time of far less computing power and availability of “big data,” simply does not meet today’s professionals’ needs for accurate planning and future budgeting. A technique to address these shortcomings has been developed by at least one construction-data company and is referred to as “predictive cost data.” It was created and is maintained using a hybrid methodology that combines classical econometric techniques with contemporary data-mining methodology. However, it differs from traditional econometric forecasts in two ways.

First, traditional econometric forecasts are based on macroeconomic theory, even when analyses of historical values of those macroeconomic indicators show them to be statistically insignificant predictors. Predictive cost models disregard theory altogether and are instead based exclusively on data-driven empirical evidence. This empirical evidence is the result of extensive exploratory data analyses and pattern-seeking visualizations of historical cost data alongside economic and market indicators prior to model development. This approach has been extensively researched and validated by Dr. Edward Leamer, professor of global economics and management at the University of California, Los Angeles. Only economic indicators vetted in these exploratory analyses become candidates for model development, testing, validation, and the resulting predictive cost estimates.

Second, predictive cost data uses data-mining techniques and principles to improve traditional econometric modeling practices. This family of processes and analyses has evolved since the 1990s from a mix of classical statistical principles and more contemporary computer science and machine-learning methods. In this case, the data-mining methodology is specifically designed to analyze observational data rather than experimental data, as in classical statistical and econometric techniques. It is a robust methodology that takes advantage of recent increases in computing power, data visualization techniques, and updated statistical procedures to find patterns and determine structural drivers of construction material and labor costs. Measures of these drivers and their relationships to each other and to construction costs, along with their associated lead or lag times, are then represented in a statistical algorithm that predicts future values for a defined material and location.
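The kind of model described above, in which a cost series is predicted from economic drivers observed at their lead times, can be sketched as a simple lagged regression. This is an illustrative toy on synthetic data, not the vendor’s actual algorithm, and a real predictive model would combine many drivers and be revalidated continuously:

```python
# Toy sketch of a lagged-indicator cost model on synthetic data.
# Here the cost series follows a single economic indicator with a
# 3-quarter lead; a regression recovers that relationship and uses
# today's indicator value to predict cost 3 quarters out.
import numpy as np

rng = np.random.default_rng(0)
n, lead = 40, 3                              # quarters of history; indicator leads cost
indicator = np.cumsum(rng.normal(0.5, 1.0, n))   # synthetic driver series
cost = 100 + 2.0 * np.roll(indicator, lead)      # cost tracks the driver, lagged
cost[:lead] = 100                                # pad the undefined start

# Fit cost[t] ~ a + b * indicator[t - lead] over the valid region
X = np.column_stack([np.ones(n - lead), indicator[:-lead]])
y = cost[lead:]
(a, b), *_ = np.linalg.lstsq(X, y, rcond=None)

# Today's indicator value predicts cost `lead` quarters ahead
predicted_cost = a + b * indicator[-1]
```

Because the synthetic cost series is built from the indicator exactly, the fit recovers the intercept (100) and slope (2.0) almost perfectly; with real observational data, the exploratory analysis described above is what determines which indicators and lead times are worth fitting at all.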

Predictive cost data models have been shown to be very accurate when used properly.

How well does this predictive cost data model work? Combined validation results have shown 95.6 percent of predicted values falling within plus or minus 3 percent of year-over-year actual values when applied to the approximately 56,000 surveyed materials. Further, quality predictive models are constantly monitored for degradation of accuracy, which is to be expected as economic and market conditions change. Decisions can then be made as to whether a model needs to be refit, remodeled, or rebuilt altogether based upon ongoing quarterly updates of external economic, construction-specific, and market-condition indicator data. In addition, special analyses and model checking can be performed as changes in market conditions are announced, such as with recent tariffs imposed on steel and aluminum.
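The accuracy figure quoted above is, in effect, a hit rate: the share of predictions whose error against actuals falls within a tolerance band. A minimal sketch of that check follows, using made-up numbers rather than the published validation data:

```python
# Sketch of a tolerance-band validation check: what fraction of
# predictions land within ±3% of actual values? Data are hypothetical.

def within_tolerance(predicted, actual, tol=0.03):
    """Fraction of predictions whose relative error is within ±tol."""
    hits = sum(abs(p - a) / a <= tol for p, a in zip(predicted, actual))
    return hits / len(actual)

actual    = [100.0, 250.0, 80.0, 40.0]
predicted = [101.5, 245.0, 84.0, 40.5]   # third prediction misses the ±3% band
share = within_tolerance(predicted, actual)
print(share)  # → 0.75
```

Tracking this share quarter over quarter is one simple way to detect the accuracy degradation the article says must trigger a model refit or rebuild.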


Originally published in Architectural Record, August 2019
