The Future of Practice

Sponsored by Benjamin Moore & Co.
Architectural Record
1 AIA LU/Elective; 1 AIBD P-CE; 0.1 IACET CEU*; AAA 1 Structured Learning Hour; AANB 1 Hour of Core Learning; AAPEI 1 Structured Learning Hour; This course can be self-reported to the AIBC, as per their CE Guidelines.; MAA 1 Structured Learning Hour; NLAA 1 Hour of Core Learning; NSAA 1 Hour of Core Learning; NWTAA 1 Structured Learning Hour; OAA 1 Learning Hour; SAA 1 Hour of Core Learning

Learning Objectives:

  1. Explain the promise and limitations of machine learning as they relate to design and architecture.
  2. Discuss how architecture firms of various sizes are adapting to and staying ahead of rapid technological change.
  3. Describe new services and areas of practice that firms are expanding into, beyond the traditional boundaries of architecture.
  4. Explain how an outcome-based delivery method could improve architectural compensation.

This course is part of the Interiors Academy


Thinking in Approximations

A structural engineer offers a perspective.

By Robert Silman

The use of computers in analyzing building structures is undeniably a great step forward in our profession. When I trained as a structural engineer in the 1950s, computers were a brand-new wonder, and there were no packaged programs available. If you wanted to use a computer, you had to write the program yourself.


PHOTOGRAPHY: COURTESY SILMAN

Silman completed the renovations of Frank Lloyd Wright’s Fallingwater (1937) in Bear Run, PA, in 2002. The process required the firm to shore up the main-floor cantilever as well as the waterfall’s rocky ledge.

Our firm, Silman, founded in 1966, was one of the first to write its own structural-analysis and design programs. In 1970, we took our successful composite-steel-beam design program to the New York City Department of Buildings and asked them how we should file calculations. Fortunately, they realized that this was the wave of the future and suggested that we develop prototype calculations by hand in the conventional way and then submit parallel results performed by the computer, illustrating that the solutions were the same. To do so, we rented an IBM 1130 with 8k capacity, which was fed by decks of punch cards grinding away for many minutes on fairly simple problems. This became standard protocol for the Department of Buildings, and the first nine programs filed were from our office.

So I am a great advocate of the use of computers for structural analysis and design, and I always have been. But there are drawbacks. When I was studying structural engineering, I used a slide rule, a wonderful apparatus and now an archaeological artifact. A slide rule multiplies and divides, handles exponential functions, logarithms, and trigonometry. But it does not tell you where to place the decimal point. Is the answer 10.00 or 100.00 or 1,000.00?

So most of us, before we even started to fiddle with the slider and the cursor window, estimated the answer in advance. We learned to think in approximations. I can remember designing flat-plate concrete buildings with completely irregular column layouts. We used Hardy Cross’s method of moment distribution and generated pages of incredible calculations for different column configurations. The process became repetitive, and we could guess the required reinforcing pretty accurately before putting pen to paper.

This arcane process gave us a “feel” for the buildings that we were designing. They were not some abstract product of machine technology but were rather tactile creations of our very selves. We had used our intuition, which became sharper with experience. There was no way that a large-scale mistake would find its way into the work–we would notice it as a glaring intruder on our orderly process.

In my present role, I review drawings produced by the engineering staff. When I spot an error, the young engineer inevitably will say, “How did you see that so quickly?” I shrug and reply that this is how I was trained: to think through the approximate answer before computing the exact one. When skipping that intuitive step, one can be easily seduced by computer results that look so neat and orderly.

I am not a Luddite: Our early design methods had enormous shortcomings. Perhaps two of the most grievous were the inability to model the building in three dimensions, as a whole entity, and the difficulty of computing building movements. Even structural-analysis problems of modest indeterminacy were often impossible to solve. Anyone could write the compatibility equations, but as the unknowns grew beyond four or five, finding solutions loomed as a lifetime chore.

So we developed neat techniques called approximate methods. Large mathematical matrices of the compatibility equations could be partitioned and manipulated with all sorts of tricks. Indeed, some very complicated buildings were analyzed using these tricks, and they have behaved beautifully over their lifespans, much to the credit of their designers.

For sure, the complicated geometries and configurations of buildings today could never have been analyzed with any degree of confidence using some of these approximate techniques. Computer analysis provides a higher level of mathematical certainty about the behavior of a structure—advantageous in new construction as well as in the renovation of historic buildings. One example is Fallingwater, which we helped renovate in 2002. To fix the sagging cantilevers, we needed to determine the stresses in the main cantilever girders that support the house. We knew accurately the building geometry and the reinforcing in the girders, as well as the actual deflections that had occurred over the first 60 years. By performing a three-dimensional analysis, and accounting for the participation of the slabs in two-way action by computer, we were able to manipulate various stiffness factors until the calculated deflections of every cantilever matched the actual measured deflections. With this information we could then design the repair, placing the right amount of post-tensioning where needed. Approximate methods would not have provided the precise answer required.

So how do we train ourselves to get the utmost out of computer analysis without losing an intuitive sense of how a building should behave and what its constituent members should look like? And, as our buildings become more complicated, is it really possible to develop that sort of grasp of their structural elements? We should at least start with some training in approximate analysis of simple structures. Following the example of my professor in my first graduate course in indeterminate structures, instructors should demand that, for the first four weeks of the class, students not be allowed to use any mechanical aids–no calculator, no slide rule, and certainly no computer. Professors should encourage them to sketch the shear and moment diagrams and the shape of the deflected structure; students should thus be able to determine the critical points and quantify them to within 15 percent.

It seems to me that we cannot depend wholly on the answers high technology can give us. Rather we must develop a feel for structures by using some of the educational techniques of the past—fostering the ability to see the whole, which technology supports but cannot replace.

Robert Silman, president emeritus of Silman, the structural engineering firm, is on the faculty of the Graduate School of Design at Harvard University.

Originally published in Architectural Record, June 2018
