964 results for Crash Predictions
Abstract:
The Highway Safety Manual (HSM) estimates roadway safety performance based on predictive models that were calibrated using national data. Calibration factors are then used to adjust these predictive models to local conditions for local applications. The HSM recommends that local calibration factors be estimated using 30 to 50 randomly selected sites that experienced at least a total of 100 crashes per year. It also recommends that the factors be updated every two to three years, preferably on an annual basis. However, these recommendations are primarily based on expert opinions rather than data-driven research findings. Furthermore, most agencies do not have data for many of the input variables recommended in the HSM. This dissertation is aimed at determining the best way to meet three major data needs affecting the estimation of calibration factors: (1) the required minimum sample sizes for different roadway facilities, (2) the required frequency for calibration factor updates, and (3) the influential variables affecting calibration factors. In this dissertation, statewide segment and intersection data were first collected for most of the HSM recommended calibration variables using a Google Maps application. In addition, eight years (2005-2012) of traffic and crash data were retrieved from existing databases from the Florida Department of Transportation. With these data, the effect of sample size criterion on calibration factor estimates was first studied using a sensitivity analysis. The results showed that the minimum sample sizes not only vary across different roadway facilities, but they are also significantly higher than those recommended in the HSM. In addition, results from paired sample t-tests showed that calibration factors in Florida need to be updated annually. 
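The local calibration factor discussed above is, in the HSM's formulation, the ratio of total observed crashes to total predicted crashes over the calibration sites. A minimal sketch with illustrative numbers (not the dissertation's Florida data):

```python
# Sketch of the HSM local calibration factor: the ratio of total observed
# crashes to total SPF-predicted crashes over the selected calibration sites.
# The site data below are illustrative, not from the dissertation.

def calibration_factor(observed, predicted):
    """C = sum(observed) / sum(predicted) over the calibration sites."""
    total_pred = sum(predicted)
    if total_pred == 0:
        raise ValueError("predicted crashes sum to zero")
    return sum(observed) / total_pred

# Hypothetical yearly crash totals for a small sample of segments.
observed = [4, 2, 7, 1, 5]              # crashes recorded at each site
predicted = [3.1, 2.4, 5.0, 1.8, 4.2]   # uncalibrated SPF predictions

C = calibration_factor(observed, predicted)
print(round(C, 3))  # C > 1 means the national SPF under-predicts locally
```

A sensitivity analysis of the sample-size criterion amounts to recomputing C over many random site subsets of varying size and examining the spread of the estimates.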
To identify influential variables affecting the calibration factors for roadway segments, the variables were prioritized by combining the results from three different methods: negative binomial regression, random forests, and boosted regression trees. Only a few variables were found to explain most of the variation in the crash data. Traffic volume was consistently found to be the most influential. In addition, roadside object density, major and minor commercial driveway densities, and minor residential driveway density were also identified as influential variables.
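The variable-ranking idea behind the random-forest and boosted-tree methods mentioned above can be illustrated with permutation importance: shuffle one predictor at a time and measure how much the model's error grows. This toy sketch uses a fixed linear "model" and synthetic data, not the actual crash models:

```python
import random

# Toy permutation-importance sketch: variables whose shuffling degrades the
# model's fit the most are ranked as most influential. The model and data
# here are synthetic stand-ins, not the dissertation's crash models.

def mse(model, X, y):
    return sum((model(row) - yi) ** 2 for row, yi in zip(X, y)) / len(y)

def permutation_importance(model, X, y, seed=0):
    rng = random.Random(seed)
    base = mse(model, X, y)
    importances = []
    for j in range(len(X[0])):
        col = [row[j] for row in X]
        rng.shuffle(col)  # break the association between predictor j and y
        Xp = [row[:j] + [v] + row[j + 1:] for row, v in zip(X, col)]
        importances.append(mse(model, Xp, y) - base)
    return importances

# Synthetic data: y depends strongly on x0 (think "traffic volume") and
# only weakly on x1 (think a minor density variable).
rng = random.Random(1)
X = [[rng.random(), rng.random()] for _ in range(200)]
y = [5 * x0 + 0.1 * x1 for x0, x1 in X]
model = lambda row: 5 * row[0] + 0.1 * row[1]

imp = permutation_importance(model, X, y)
print(imp[0] > imp[1])  # the strong predictor ranks higher
```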
Abstract:
The literature contains a large number of two-fluid simulations of gas-solid flows in risers. In spite of that, the question of quantitative accuracy remains mostly untouched. This state of affairs seems to be mainly a consequence of modeling shortcomings, notably the lack of realistic closures. In this article, predictions from a two-fluid model are compared to other published two-fluid model predictions applying the same closures, and to experimental data. A particular matter of concern is whether or not the predictions are generated inside the statistical steady state regime that characterizes riser flows. The present simulation was performed inside the statistical steady state regime. Time-averaged results are presented for time-averaging intervals of 5, 10, 15, and 20 s inside the statistical steady state regime. The independence of the averaged results from the time-averaging interval is addressed, and the results averaged over the intervals of 10 and 20 s are compared to both experiment and other two-fluid predictions. It is concluded that the two-fluid model used is still very crude and cannot provide quantitatively accurate results, at least for the particular case considered. (C) 2009 Elsevier Inc. All rights reserved.
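The interval-independence check described above can be sketched as computing time averages over nested windows of a statistically steady signal and comparing them. The signal below is synthetic, not riser-flow data:

```python
import math

# Sketch of the time-averaging check: once a signal is in a statistical
# steady state, its time average should be nearly independent of the
# averaging interval. The signal is a synthetic steady fluctuation, not
# simulation output.

def time_average(signal, dt, interval):
    n = int(interval / dt)
    window = signal[:n]
    return sum(window) / len(window)

dt = 0.01
# Steady-state fluctuation around a mean of 2.0 (a stand-in for, e.g.,
# a solids flux signal at a probe location).
signal = [2.0 + 0.3 * math.sin(7.3 * k * dt) for k in range(2000)]

averages = {T: time_average(signal, dt, T) for T in (5, 10, 15, 20)}
spread = max(averages.values()) - min(averages.values())
print(spread < 0.05)  # averages agree closely across the four intervals
```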
Abstract:
The DSSAT/CANEGRO model was parameterized and its predictions evaluated using data from five sugarcane (Saccharum spp.) experiments conducted in southern Brazil. The data used are from two of the most important Brazilian cultivars. Some parameters whose values were either directly measured or considered to be well known were not adjusted. Ten of the 20 parameters were optimized with a Generalized Likelihood Uncertainty Estimation (GLUE) algorithm using the leave-one-out cross-validation technique. Model predictions were evaluated against measured data of leaf area index (LAI), stalk and aerial dry mass, sucrose content, and soil water content, using bias, root mean squared error (RMSE), modeling efficiency (Eff), correlation coefficient, and agreement index. The Decision Support System for Agrotechnology Transfer (DSSAT)/CANEGRO model simulated the sugarcane crop in southern Brazil well, using the parameterization reported here. The soil water content predictions were better for the rainfed (mean RMSE = 0.122 mm) than for the irrigated treatment (mean RMSE = 0.214 mm). Predictions were best for aerial dry mass (Eff = 0.850), followed by stalk dry mass (Eff = 0.765) and then sucrose mass (Eff = 0.170). Number of green leaves showed the worst fit (Eff = -2.300). The cross-validation technique permits using multiple datasets that would otherwise have limited use independently, because of the heterogeneity of measures and measurement strategies.
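Two of the fit statistics reported above, RMSE and modeling efficiency (the Nash-Sutcliffe form, where Eff = 1 is a perfect fit and Eff <= 0 is no better than the observed mean), can be computed directly. A minimal sketch with illustrative numbers, not the experimental data:

```python
import math

# RMSE and Nash-Sutcliffe modeling efficiency, as used to evaluate the
# CANEGRO predictions. The observed/simulated values below are illustrative.

def rmse(obs, sim):
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

def efficiency(obs, sim):
    """Eff = 1 - SS_res / SS_tot; Eff = 1 is perfect, Eff <= 0 is poor."""
    mean_obs = sum(obs) / len(obs)
    ss_res = sum((o - s) ** 2 for o, s in zip(obs, sim))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return 1 - ss_res / ss_tot

obs = [10.0, 14.0, 18.0, 22.0]   # e.g. stalk dry mass, t/ha (illustrative)
sim = [11.0, 13.5, 18.5, 21.0]

print(round(rmse(obs, sim), 3), round(efficiency(obs, sim), 3))
# → 0.791 0.969
```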
Abstract:
In this work, we have used molecular dynamics, density functional theory, virtual screening, ADMET predictions, and molecular interaction field studies to design and propose eight novel potential inhibitors of CDK2. The eight molecules proposed showed interesting structural characteristics that are required for inhibiting the CDK2 activity and show potential as drug candidates for the treatment of cancer. The parameters related to the Rule of Five were calculated, and only one of the molecules violated more than one parameter. One of the proposals and one of the drug-like compounds selected by virtual screening indicated to be promising candidates for CDK2-based cancer therapy.
Abstract:
We have used various computational methodologies, including molecular dynamics, density functional theory, virtual screening, ADMET predictions, and molecular interaction field studies, to design and analyze four novel potential inhibitors of farnesyltransferase (FTase). Evaluation of two proposals regarding their drug potential as well as their lead-compound profiles has indicated them as novel promising FTase inhibitors, with theoretically interesting pharmacotherapeutic profiles, when compared to the very active and most cited FTase inhibitors with reported activity data, which are launched drugs or compounds in clinical tests. One of our two proposals appears to be a more promising drug candidate and FTase inhibitor, but both derivative molecules indicate potentially very good pharmacotherapeutic profiles in comparison with Tipifarnib and Lonafarnib, two reference pharmaceuticals. Two other proposals were selected with virtual screening approaches and investigated by LIS, which suggest novel and alternative scaffolds for designing future potential FTase inhibitors. Such compounds can be explored as promising molecules to initiate a research protocol aimed at discovering novel anticancer drug candidates targeting farnesyltransferase, in the fight against cancer. (C) 2009 Elsevier Inc. All rights reserved.
Abstract:
Monoamine oxidase is a flavoenzyme bound to the outer mitochondrial membrane, responsible for the oxidative deamination of neurotransmitter and dietary amines. It has two distinct isozymic forms, designated MAO-A and MAO-B, each displaying different substrate and inhibitor specificities. They are well-known targets for antidepressant, Parkinson's disease, and neuroprotective drugs. Elucidation of the X-ray crystallographic structure of MAO-B has opened the way for molecular modeling studies. In this work we have used molecular modeling, density functional theory with correlation, virtual screening, flexible docking, molecular dynamics, ADMET predictions, and molecular interaction field studies in order to design new molecules with potentially higher selectivity and inhibitory activity toward MAO-B.
Abstract:
Models of population dynamics are commonly used to predict risks in ecology, particularly risks of population decline. There is often considerable uncertainty associated with these predictions. However, alternatives to predictions based on population models have not been assessed. We used simulation models of hypothetical species to generate the kinds of data that might typically be available to ecologists and then invited other researchers to predict risks of population declines using these data. The accuracy of the predictions was assessed by comparison with the forecasts of the original model. The researchers used either population models or subjective judgement to make their predictions. Predictions made using models were only slightly more accurate than subjective judgements of risk. However, predictions using models tended to be unbiased, while subjective judgements were biased towards over-estimation. Psychology literature suggests that the bias of subjective judgements is likely to vary somewhat unpredictably among people, depending on their stake in the outcome. This will make subjective predictions more uncertain and less transparent than those based on models. (C) 2004 Elsevier SAS. All rights reserved.
Abstract:
This paper describes the construction of Australia-wide soil property predictions from a compiled national soils point database. The properties considered include pH, organic carbon, total phosphorus, total nitrogen, thickness, texture, and clay content. Many of these soil properties are used directly in environmental process modelling, including global climate change models. Models are constructed at 250-m resolution using decision trees. These relate the soil property to the environment through a suite of environmental predictors at the locations where measurements are observed. The models are then used to extend predictions to the continental extent by applying the derived rules to the exhaustively available environmental predictors. The methodology and performance are described in detail for pH and summarized for the other properties. Environmental variables are found to be important predictors, even at the 250-m resolution at which they are available here, as they can describe broad changes in soil properties.
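The point-to-map workflow described above (fit a tree at sampled points, then apply its rules wherever the predictors are mapped) can be sketched with a one-split regression tree. The data and the single-predictor "tree" are synthetic toys, far simpler than the decision trees in the paper:

```python
# Minimal sketch of the point-to-map workflow: fit a one-split regression
# tree relating a soil property to an environmental predictor at sampled
# points, then apply the learned rule to unsampled grid cells. All values
# below are synthetic and illustrative.

def fit_stump(x, y):
    """One-split regression tree: choose the threshold minimizing SSE."""
    best = None
    for t in sorted(set(x))[1:]:
        left = [yi for xi, yi in zip(x, y) if xi < t]
        right = [yi for xi, yi in zip(x, y) if xi >= t]
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((yi - ml) ** 2 for yi in left)
               + sum((yi - mr) ** 2 for yi in right))
        if best is None or sse < best[0]:
            best = (sse, t, ml, mr)
    _, t, ml, mr = best
    return lambda xi: ml if xi < t else mr

# Point observations: pH is lower at high "elevation" (a synthetic rule).
elev = [100, 150, 200, 600, 700, 800]
ph = [6.8, 6.6, 6.7, 5.1, 5.0, 5.2]

predict = fit_stump(elev, ph)
# Extend the prediction to unsampled grid cells via the mapped predictor.
grid = [120, 650]
print([round(predict(e), 2) for e in grid])  # → [6.7, 5.1]
```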
Abstract:
We show that stochastic electrodynamics and quantum mechanics give quantitatively different predictions for the quantum nondemolition (QND) correlations in travelling wave second harmonic generation. Using phase space methods and stochastic integration, we calculate correlations in both the positive-P and truncated Wigner representations, the latter being equivalent to the semi-classical theory of stochastic electrodynamics. We show that the semiclassical results are different in the regions where the system performs best in relation to the QND criteria, and that they significantly overestimate the performance in these regions. (C) 2001 Published by Elsevier Science B.V.
Abstract:
In this paper, we present a method for estimating local thickness distribution in finite element models, applied to injection molded and cast engineering parts. This method offers considerably improved performance compared to two previously proposed approaches, and has been validated against thickness measurements made by different human operators. We also demonstrate that using this method to assign a distribution of local thickness in FEM crash simulations results in a much more accurate prediction of real part performance, thus increasing the benefits of computer simulation in engineering design by enabling zero-prototyping and so reducing product development costs. The simulation results have been compared to experimental tests, evidencing the advantage of the proposed method. The proposed approach to considering local thickness distribution in FEM crash simulations therefore has high potential for the product development process of complex and highly demanding injection molded and cast parts, and is currently being used by Ford Motor Company.
Abstract:
A Work Project, presented as part of the requirements for the Award of a Master's Degree in Finance from the NOVA – School of Business and Economics
Abstract:
In traumatic financial times, both shareholders and the media are quick to blame companies for the lack of decent corporate governance mechanisms. Proxy statement proposals have increasingly been used by the more active shareholders to press managers to correct anomalies and restore confidence in financial markets. I examine the proposals of the largest companies in the S&P 500 index after the Lehman Brothers crash and their effect on stock prices. Proposals initiated by shareholders negatively impact the company's stock price, particularly if the proposers are unions, pension funds, or institutional investors. I also find that corporate governance proposals harm a firm's market performance, unlike compensation and social policy proposals, whose effects are intangible. The exception to these disappointing attempts to improve companies' conduct lies in proposals shared by several investors.
Abstract:
We examine whether earnings manipulation around seasoned equity offerings (SEOs) is associated with an increase in the likelihood of a stock price crash post-issue, and test whether the enactment of securities regulations attenuates the relation between SEOs and crash risk. Empirical evidence documents that the managerial tendency to conceal bad news increases the likelihood of a stock price crash (Jin and Myers, 2006; Hutton, Marcus, and Tehranian, 2009). We test this hypothesis using a sample of firms from 29 EU countries that enacted the Market Abuse Directive (MAD). Consistent with our hypothesis, we find that equity issuers that engage in earnings management experience a significant increase in crash risk post-SEO relative to control groups of non-issuers; this effect is stronger for equity issuers with poor information environments. In addition, our findings show a significant decline in post-issue crash risk after the enactment of MAD that is stronger for firms that actively manage earnings. This decline in post-issue crash risk is more pronounced in countries with high ex-ante institutional quality and enforcement. These results suggest that the implementation of MAD helps to mitigate managers' ability to manipulate earnings around SEOs.
Abstract:
Doctoral Thesis in Business Sciences.