Abstract
This paper investigated the accuracy of the height-to-thickness ratio (h/t) correction factors given in the American Society for Testing and Materials (ASTM) standards, which convert the compressive strength measured on a tested prism to that of a standard prism size. In international masonry codes, an h/t of 5.0 is commonly taken as the reference, with a correction factor of 1.0. The ASTM standard uses the same reference h/t of 5.0 for clay masonry; for concrete masonry, however, the reference h/t is 2.0. Moreover, although the ASTM standard permits both full-block and half-block length prisms in compression tests, no distinction is made between the two prism sizes in the correction factors. In this study, finite element models of concrete masonry prisms were developed and calibrated against experimental results. A parametric study was then performed to examine the effects of prism thickness, h/t, and length-to-thickness ratio (l/t) on the compressive strength. The results show that, with an h/t of 2.0 as the reference, the compressive strength of a full-block concrete masonry prism can be over-predicted by about 22%, a significant error on the unsafe side. Disregarding the effect of prism length also introduced significant errors in the estimated compressive strength. It was concluded that the ASTM standard does not accurately evaluate the actual strength of concrete masonry, and an immediate revision of the procedure for evaluating masonry compressive strength appears necessary. It was recommended that the strength of concrete masonry prisms be normalized to an h/t of 5.0 and that the effects of l/t and prism thickness be accounted for through correction factors. Correction factors were suggested based on the finite element results.
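As a minimal illustration of the magnitude involved (assumed numbers, not values reported in this paper), consider a hypothetical full-block prism whose compressive strength, referenced to the ASTM h/t of 2.0, is taken as 15.0 MPa. If this value over-predicts the actual strength by about 22%, the strength normalized to the proposed reference of h/t = 5.0 would be approximately

\[
f'_{m,\,h/t=5} \approx \frac{f'_{m,\,h/t=2}}{1.22} = \frac{15.0}{1.22} \approx 12.3\ \text{MPa}.
\]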