Abstract:
Accurate owner budget estimates are critical to the initial decision-to-build process for highway construction projects. However, transportation projects have historically experienced significant construction cost overruns from the time the decision to build is taken by the owner. This paper addresses the problem of why highway projects overrun their predicted costs. It identifies the owner risk variables that contribute to significant cost overrun and then uses factor analysis, expert elicitation, and the nominal group technique to establish groups of importance-ranked owner risks. Stepwise multivariate regression analysis is also used to investigate any correlation of the percentage of cost overrun with risks, together with attributes such as highway project type, indexed cost, geographic location, and project delivery method. The research results indicate a correlation between the reciprocal of project budget size and percentage cost overrun. This can be useful for owners in determining more realistic decision-to-build highway budget estimates by taking into account the economies of scale associated with larger projects.
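As an illustration of the kind of stepwise regression described above, the sketch below fits percentage cost overrun against the reciprocal of budget size plus categorical project attributes; the file name, column names, and predictor set are hypothetical, and this is not the authors' actual model.

```python
# Minimal sketch: regressing percentage cost overrun on the reciprocal of
# budget size plus categorical project attributes (all names hypothetical).
import pandas as pd
import statsmodels.formula.api as smf

projects = pd.read_csv("highway_projects.csv")           # hypothetical file
projects["inv_budget"] = 1.0 / projects["budget_size"]   # reciprocal budget term

# Ordinary least squares with the reciprocal budget term and project attributes.
model = smf.ols(
    "pct_cost_overrun ~ inv_budget + C(project_type) + C(delivery_method)",
    data=projects,
).fit()
print(model.summary())
```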
Abstract:
This paper discusses a component of the research study conducted to provide construction organizations with a generic benchmarking framework to assess their extent of information communication technology (ICT) adoption for building project management processes. It defines benchmarking and discusses the objectives of the required benchmarking framework and the development of the framework. The study focuses on ICT adoption by small and medium enterprises (SMEs) in the construction industry; with respect to SMEs, it is important to understand processes, their indicators, and measures in the local context. The structure of the suggested benchmarking framework was derived from an extensive literature survey and a questionnaire survey conducted in the Indian construction industry. The suggested benchmarking process is an iterative process divided into four stages. It can be implemented at organization and industry levels for rating construction organizations on ICT adoption and performance measurement. The framework has a generic structure and can be generalized and applied to other countries with due consideration.
Abstract:
Focuses on a study that introduced an iterative modeling method combining properties of ordinary least squares (OLS) with hierarchical tree-based regression (HTBR) in transportation engineering. Covers information on OLS and HTBR; comparison and contrast of OLS and HTBR; and conclusions.
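One way to sketch a hybrid of OLS and tree-based regression, under the assumption that a regression tree partitions the data and OLS models are fitted within the partitions, is shown below; this illustrates the general idea only, not the specific iterative method in the study.

```python
# Illustrative hybrid of tree-based partitioning and OLS (not the study's
# exact iterative method): fit a shallow regression tree, then fit a separate
# OLS model within each leaf node.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.linear_model import LinearRegression

def fit_tree_then_ols(X, y, max_depth=2):
    tree = DecisionTreeRegressor(max_depth=max_depth).fit(X, y)
    leaves = tree.apply(X)                     # leaf index for each sample
    leaf_models = {}
    for leaf in np.unique(leaves):
        mask = leaves == leaf
        leaf_models[leaf] = LinearRegression().fit(X[mask], y[mask])
    return tree, leaf_models

def predict(tree, leaf_models, X):
    leaves = tree.apply(X)
    return np.array([leaf_models[leaf].predict(row.reshape(1, -1))[0]
                     for leaf, row in zip(leaves, X)])
```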
Abstract:
Large trucks are involved in a disproportionately small fraction of total crashes but a disproportionately large fraction of fatal crashes. Large truck crashes often result in significant congestion due to the trucks' large physical dimensions and the difficulty of clearing crash scenes. Consequently, preventing large truck crashes is critical to improving highway safety and operations. This study identifies high-risk sites (hot spots) for large truck crashes in Arizona and examines potential risk factors related to the design and operation of the high-risk sites. High-risk sites were identified using both state-of-the-practice methods (accident reduction potential using negative binomial regression with long crash histories) and a newly proposed method using Property Damage Only Equivalents (PDOE). The hot spots identified via the count model generally exhibited few fatalities and major injuries but many minor injuries and PDO crashes, while the opposite trend was observed using the PDOE methodology. The hot spots based on the count model exhibited large AADTs, whereas those based on the PDOE showed relatively small AADTs but large fractions of trucks and high posted speed limits. Documented site investigations of hot spots revealed numerous potential risk factors, including weaving activities near freeway junctions and ramps, absence of acceleration lanes near on-ramps, shoulders too small to accommodate large trucks, narrow lane widths, inadequate signage, and poor lighting conditions within a tunnel.
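The Property Damage Only Equivalents idea can be illustrated as a severity-weighted crash count for ranking sites; the weights below are placeholders, not the values used in the study.

```python
# Sketch of a Property-Damage-Only-Equivalent (PDOE) score for ranking sites.
# The severity weights are placeholders, not the study's calibrated values.
SEVERITY_WEIGHTS = {"fatal": 100.0, "major_injury": 25.0,
                    "minor_injury": 5.0, "pdo": 1.0}   # hypothetical weights

def pdoe_score(crash_counts):
    """crash_counts: dict mapping severity level -> crash count at a site."""
    return sum(SEVERITY_WEIGHTS[level] * count
               for level, count in crash_counts.items())

# Example: a site with 1 fatal, 2 major-injury, 10 minor-injury, 40 PDO crashes.
print(pdoe_score({"fatal": 1, "major_injury": 2, "minor_injury": 10, "pdo": 40}))
```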
Abstract:
This paper presents the results of a structural equation model (SEM) that describes and quantifies the relationships between corporate culture and safety performance. The SEM is estimated using 196 individual questionnaire responses from three companies with better than average safety records. A multiattribute analysis of corporate safety culture characteristics resulted in a hierarchical description of corporate safety culture comprising three major categories: people, process, and value. These three major categories were decomposed into 54 measurable questions and used to develop a questionnaire to quantify corporate safety culture. The SEM identified five latent variables that describe corporate safety culture: (1) a company's safety commitment; (2) the safety incentives that are offered to field personnel for safe performance; (3) the subcontractor involvement in the company culture; (4) the field safety accountability and dedication; and (5) the disincentives for unsafe behaviors. These characteristics of company safety culture serve as indicators of a company's safety performance. Based on the findings from this limited sample of three companies, this paper proposes a list of practices that companies may consider to improve corporate safety culture and safety performance. A more comprehensive study based on a larger sample is recommended to corroborate the findings of this study.
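As a rough illustration of how a latent-variable model of this kind can be specified, the sketch below uses the open-source semopy package with placeholder latent variables and questionnaire items; it is not the authors' model specification, and the item names are hypothetical.

```python
# Illustrative SEM specification with placeholder latent variables and items
# (not the paper's actual measurement model), using the semopy package.
import pandas as pd
from semopy import Model

responses = pd.read_csv("safety_survey.csv")   # hypothetical questionnaire data

model_desc = """
safety_commitment =~ q1 + q2 + q3
safety_incentives =~ q4 + q5 + q6
safety_performance ~ safety_commitment + safety_incentives
"""

model = Model(model_desc)
model.fit(responses)
print(model.inspect())   # estimated loadings and path coefficients
```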
Abstract:
Many studies focused on the development of crash prediction models have resulted in aggregate crash prediction models to quantify the safety effects of geometric, traffic, and environmental factors on the expected number of total, fatal, injury, and/or property damage crashes at specific locations. Crash prediction models focused on predicting different crash types, however, have rarely been developed. Crash type models are useful for at least three reasons. The first is motivated by the need to identify sites that are high risk with respect to specific crash types but that may not be revealed through crash totals. Second, countermeasures are likely to affect only a subset of all crashes (usually called target crashes), and so examination of crash types will lead to improved ability to identify effective countermeasures. Finally, there is a priori reason to believe that different crash types (e.g., rear-end, angle, etc.) are associated with road geometry, the environment, and traffic variables in different ways and as a result justify the estimation of individual predictive models. The objectives of this paper are to (1) demonstrate that different crash types are associated with predictor variables in different ways (as theorized) and (2) show that estimation of crash type models may lead to greater insights regarding crash occurrence and countermeasure effectiveness. This paper first describes the estimation results of crash prediction models for angle, head-on, rear-end, sideswipe (same direction and opposite direction), and pedestrian-involved crash types. Serving as a basis for comparison, a crash prediction model is also estimated for total crashes. Based on 837 motor vehicle crashes collected at two-lane rural intersections in the state of Georgia, six prediction models are estimated, resulting in two Poisson (P) models and four negative binomial (NB) models. The analysis reveals that factors such as the annual average daily traffic, the presence of turning lanes, and the number of driveways have a positive association with each type of crash, whereas median widths and the presence of lighting are negatively associated. For the best-fitting models, covariates are related to crash types in different ways, suggesting that crash types are associated with different precrash conditions and that modeling total crash frequency may not be helpful for identifying specific countermeasures.
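A minimal sketch of fitting crash-type-specific Poisson and negative binomial count models, with hypothetical column names and a simplified AIC-based choice between families, could look like this:

```python
# Sketch: per-crash-type count models (column names hypothetical).
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

crashes = pd.read_csv("intersection_crashes.csv")   # hypothetical file
formula = ("crash_count ~ log_aadt + turn_lanes + driveways "
           "+ median_width + lighting")

models = {}
for crash_type, subset in crashes.groupby("crash_type"):
    poisson = smf.glm(formula, data=subset,
                      family=sm.families.Poisson()).fit()
    negbin = smf.glm(formula, data=subset,
                     family=sm.families.NegativeBinomial()).fit()
    # Keep whichever family fits better by AIC (a simplified criterion).
    models[crash_type] = min([poisson, negbin], key=lambda m: m.aic)
```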
Abstract:
The track allocation problem (TAP) at a multi-track, multi-platform mainline railway station is defined by the station track layout and service timetable, which implies combinations of spatial and temporal conflicts. Feasible solutions are available from either traditional planning or advanced intelligent searching methods and their evaluations with respect to operational requirements are essential for the operators. To facilitate thorough analysis, a timed Coloured Petri Nets (CPN) model is presented here to encapsulate the inter-relationships of the spatial and temporal constraints in the TAP.
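The spatial and temporal conflicts that the CPN model encapsulates can be illustrated with a plain pairwise check: two services conflict if they are allocated the same track and their occupation intervals overlap. The data structures below are hypothetical and are not the Coloured Petri Net formulation itself.

```python
# Illustrative conflict check underlying the track allocation problem (not the
# Coloured Petri Net formulation): two services conflict when they use the
# same track and their platform-occupation intervals overlap in time.
from dataclasses import dataclass

@dataclass
class Allocation:
    service_id: str
    track: int
    arrive: int   # minutes after midnight
    depart: int

def conflicts(allocations):
    clashes = []
    for i, a in enumerate(allocations):
        for b in allocations[i + 1:]:
            same_track = a.track == b.track                       # spatial
            overlap = a.arrive < b.depart and b.arrive < a.depart  # temporal
            if same_track and overlap:
                clashes.append((a.service_id, b.service_id))
    return clashes
```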
Abstract:
This paper demonstrates the application of the reliability-centred maintenance (RCM) process to analyse and develop preventive maintenance tasks for electric multiple units (EMU) in the East Rail of the Kowloon-Canton Railway Corporation (KCRC). Two systems, the 25 kV electrical power supply and the air-conditioning system of the EMU, have been chosen for the study. The RCM approach to the two systems is delineated step by step in the paper. This study confirms the feasibility and effectiveness of RCM applications in the maintenance of electric trains.
Abstract:
This paper presents a Genetic Algorithm (GA) approach to searching for the optimized path for a class of transportation problems. The formulation of the problems for suitable application of the GA is discussed. Exchanging genetic information in the sense of neighborhoods is introduced for generation reproduction. The performance of the GA is evaluated by computer simulation. The proposed algorithm uses simple coding with a population size of 1 and converges to reasonable optimality within several minutes.
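A minimal sketch in the spirit of the described approach, assuming a single-individual population and neighbourhood-based moves on a route ordering, might look like the following; the distance matrix and the mutation operator are hypothetical choices.

```python
# Illustrative (1+1)-style evolutionary search over a route ordering, echoing
# the paper's population of size 1 and neighbourhood-based exchanges.
import random

def route_length(route, dist):
    return sum(dist[route[i]][route[i + 1]] for i in range(len(route) - 1))

def evolve_route(nodes, dist, iterations=10_000):
    current = nodes[:]                     # single-individual "population"
    random.shuffle(current)
    best_len = route_length(current, dist)
    for _ in range(iterations):
        child = current[:]
        i, j = sorted(random.sample(range(len(child)), 2))
        child[i:j + 1] = reversed(child[i:j + 1])   # neighbourhood exchange
        child_len = route_length(child, dist)
        if child_len <= best_len:          # keep the child if it is no worse
            current, best_len = child, child_len
    return current, best_len
```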
Abstract:
This research shows that gross pollutant traps (GPTs) continue to play an important role in preventing visible street waste (gross pollutants) from contaminating the environment. The demand for these GPTs calls for stringent quality control, and this research provides a foundation to rigorously examine the devices. A novel and comprehensive testing approach to examine a dry sump GPT was developed. The GPT is designed with internal screens to capture gross pollutants (organic matter and anthropogenic litter). This device has not been previously investigated. Apart from the review of GPTs and gross pollutant data, the testing approach includes four additional aspects to this research, which are: field work and an historical overview of street waste/stormwater pollution, calibration of equipment, hydrodynamic studies, and gross pollutant capture/retention investigations. This work is the first comprehensive investigation of its kind and provides valuable practical information for the current research and any future work pertaining to the operations of GPTs and management of street waste in the urban environment. Gross pollutant traps, including patented and registered designs developed by industry, have specific internal configurations and hydrodynamic separation characteristics which demand individual testing and performance assessments. Stormwater devices are usually evaluated by environmental protection agencies (EPAs), professional bodies and water research centres. In the USA, the American Society of Civil Engineers (ASCE) and the Environmental Water Resource Institute (EWRI) are examples of professional and research organisations actively involved in these evaluation/verification programs. These programs largely rely on field evaluations alone, which are limited in scope, mainly for cost and logistical reasons. In Australia, evaluation/verification programs of new devices in the stormwater industry are not well established. The current limitations in the evaluation methodologies of GPTs have been addressed in this research by establishing a new testing approach. This approach uses a combination of physical and theoretical models to examine in detail the hydrodynamic and capture/retention characteristics of the GPT. The physical model consisted of a 50% scale model GPT rig with screen blockages varying from 0 to 100%. This rig was placed in a 20 m flume and various inlet and outflow operating conditions were modelled on observations made during the field monitoring of GPTs. Due to infrequent cleaning, the retaining screens inside the GPTs were often observed to be blocked with organic matter. Blocked screens can radically change the hydrodynamic and gross pollutant capture/retention characteristics of a GPT, as shown by this research. This research involved the use of equipment, such as acoustic Doppler velocimeters (ADVs) and dye concentration (Komori) probes, which were deployed for the first time in a dry sump GPT. Hence, it was necessary to rigorously evaluate the capability and performance of these devices, particularly in the case of the custom-made Komori probes, about which little was known. The evaluation revealed that the Komori probes have a frequency response of up to 100 Hz, which is dependent upon fluid velocities, and this was adequate to measure the relevant fluctuations of dye introduced into the GPT flow domain. The outcome of this evaluation resulted in establishing methodologies for the hydrodynamic measurements and gross pollutant capture/retention experiments.
The hydrodynamic measurements consisted of point-based acoustic Doppler velocimeter (ADV) measurements, flow field particle image velocimetry (PIV) capture, head loss experiments and computational fluid dynamics (CFD) simulation. The gross pollutant capture/retention experiments included the use of anthropogenic litter components, tracer dye and custom modified artificial gross pollutants. Anthropogenic litter was limited to tin cans, bottle caps and plastic bags, while the artificial pollutants consisted of 40 mm spheres with a range of four buoyancies. The hydrodynamic results led to the definition of global and local flow features. The gross pollutant capture/retention results showed that when the internal retaining screens are fully blocked, the capture/retention performance of the GPT rapidly deteriorates. The overall results showed that the GPT will operate efficiently until at least 70% of the screens are blocked, particularly at high flow rates. This important finding indicates that cleaning operations could be more effectively planned when the GPT capture/retention performance deteriorates. At lower flow rates, the capture/retention performance trends were reversed. There is little difference in the poor capture/retention performance between a fully blocked GPT and a partially filled or empty GPT with 100% screen blockages. The results also revealed that the GPT is designed with an efficient high flow bypass system to avoid upstream blockages. The capture/retention performance of the GPT at medium to high inlet flow rates is close to maximum efficiency (100%). With regard to the design appraisal of the GPT, a raised inlet offers a better capture/retention performance, particularly at lower flow rates. Further design appraisals of the GPT are recommended.
Abstract:
Information behavior models generally focus on one of many aspects of information behavior: information finding (conceptualized as information seeking, information foraging, or information sense-making), information organizing, and information using. This ongoing study is developing an integrated model of information behavior. The research design involves a 2-week-long daily information journal self-maintained by the participants, combined with two interviews, one before and one after the journal-keeping period. The data from the study will be analyzed using grounded theory to identify when the participants engage in the various behaviors that have already been observed, identified, and defined in previous models, in order to generate useful sequential data and an integrated model.
Abstract:
Project alliancing is a new alternative to traditional project delivery systems, especially in the commercial building sector. The Collaborative Process is a theoretical model of the people and systems characteristics required to reduce the adversarial nature of most construction projects. Although developed separately, both are responses to the same pressures. Project alliancing was recently used successfully to complete the National Museum of Australia. This project was analyzed as a case study to determine the extent to which it could be classified as a collaborative project. Five key elements of The Collaborative Process were reviewed, and numerous examples from the management of this project were cited that support the theoretical recommendations of this model. In the case of this project, significant added value was delivered to the client and many innovations resulted from the collective work of the parties to the contract. It was concluded that project alliances for commercial buildings offer many advantages over traditional project delivery systems, which are related to increasing the levels of collaboration among a project management team.
Abstract:
Pragmatic construction professionals, accustomed to intense price competition and focused on the bottom line, have difficulty justifying investments in advanced technology. Researchers and industry professionals need improved tools to analyze how technology affects the performance of the firm. This paper reports the results of research to begin answering the question: does technology matter? The researchers developed a set of five dimensions for technology strategy, collected information regarding these dimensions along with four measures of competitive performance in five bridge construction firms, and analyzed the information to identify relationships between technology strategy and competitive performance. Three technology strategy dimensions (competitive positioning, depth of technology strategy, and organizational fit) showed particularly strong correlations with the competitive performance indicators of absolute growth in contract awards and contract award value per technical employee. These findings indicate that technology does matter. The research also provides ways to analyze options for approaching technology and ways to relate technology to competitive performance for use by managers. It also provides a valuable set of research measures for technology strategy.
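A simple way to illustrate the correlation analysis described above is to compute pairwise correlations between strategy-dimension scores and performance indicators; the file and column names below are hypothetical.

```python
# Sketch: correlating technology-strategy dimension scores with competitive
# performance indicators (file and column names hypothetical).
import pandas as pd
from scipy.stats import pearsonr

firms = pd.read_csv("bridge_firms.csv")
dimensions = ["competitive_positioning", "strategy_depth", "organizational_fit"]
outcomes = ["contract_award_growth", "awards_per_technical_employee"]

for dim in dimensions:
    for out in outcomes:
        r, p = pearsonr(firms[dim], firms[out])
        print(f"{dim} vs {out}: r = {r:.2f}, p = {p:.3f}")
```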
Abstract:
Rapid mineralization of cultured osteoblasts could be a useful characteristic in stem-cell mediated therapies for fracture and other orthopaedic problems. Dimethyl sulfoxide (DMSO) is a small amphipathic solvent molecule capable of stimulating cell differentiation. We report that, in primary human osteoblasts, DMSO dose-dependently enhanced the expression of the osteoblast differentiation markers alkaline phosphatase (ALP) activity and extracellular matrix mineralization. Furthermore, similar DMSO-mediated mineralization enhancement was observed in primary osteoblast-like cells differentiated from mouse mesenchymal cells derived from fat, a promising source of starter cells for cell-based therapy. Using the convenient mouse pre-osteoblast model cell line MC3T3-E1, we further investigated this phenomenon, showing that numerous osteoblast-expressed genes were elevated in response to DMSO treatment and correlated with enhanced mineralization. Myocyte enhancer factor 2c (Mef2c) was identified as the transcription factor most induced by DMSO among numerous DMSO-induced genes, suggesting a role for Mef2c in osteoblast gene regulation. Immunohistochemistry confirmed expression of Mef2c in osteoblast-like cells in mouse mandible, cortical and trabecular bone. shRNAi-mediated Mef2c gene silencing resulted in defective osteoblast differentiation, decreased ALP activity and matrix mineralization, and knockdown of osteoblast-specific gene expression, including osteocalcin and bone sialoprotein. Flow-on knockdown of the bone-specific transcription factors Runx2 and osterix following shRNAi knockdown of Mef2c suggests that Mef2c lies upstream of these two important factors in the cascade of gene expression in osteoblasts.