946 results for Over-dispersion, Crash prediction, Bayesian method, Intersection safety


Relevance: 100.00%

Abstract:

CHAPTER II - This study evaluated the effects of two different types of acute aerobic exercise on the osmotic stability of the human erythrocyte membrane and on hematological and biochemical variables associated with this membrane property. The study population consisted of 20 healthy, active men. Participants performed single sessions of two types of exercise. The first session consisted of 60 min of moderate-intensity continuous exercise (MICE). The second session, executed a week later, consisted of high-intensity interval exercise (HIIE) until exhaustion. The osmotic stability of the erythrocyte membrane was represented by the inverse of the salt concentration (1/H50) at the midpoint of the sigmoidal curve relating hemoglobin absorbance to NaCl concentration. The values of 1/H50 changed from 2.29 ± 0.1 to 2.33 ± 0.09 after MICE and from 2.30 ± 0.08 to 2.23 ± 0.12 after HIIE. After MICE, an increase in mean corpuscular volume occurred, probably due to in vivo lysis of older erythrocytes, with preservation of cells that were larger and more resistant to in vitro lysis. The study showed that a single bout of acute exercise affected erythrocyte osmotic stability, which increased after MICE and decreased after HIIE.
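The stability index above is read off the midpoint of the absorbance-versus-NaCl curve. As a rough illustration of how H50 can be located on such a curve, the sketch below finds the half-maximum crossing of a synthetic sigmoid by linear interpolation; the concentrations, curve shape, and midpoint are invented for the example, and the study itself fits a full sigmoidal model rather than interpolating.

```python
import math

# Hypothetical sketch: locate H50, the NaCl concentration at the midpoint
# of the absorbance curve, by linear interpolation between measured points.
# The synthetic curve has its midpoint at 0.435 g/dL, so 1/H50 ~ 2.30,
# matching the order of magnitude of the values reported above.

def estimate_h50(nacl, absorbance):
    """Return the NaCl concentration where absorbance crosses half-maximum."""
    half = (max(absorbance) + min(absorbance)) / 2.0
    # Absorbance falls as NaCl rises, so walk the curve until the
    # half-maximum is bracketed, then interpolate linearly.
    for i in range(len(nacl) - 1):
        a0, a1 = absorbance[i], absorbance[i + 1]
        if (a0 - half) * (a1 - half) <= 0:
            t = (half - a0) / (a1 - a0)
            return nacl[i] + t * (nacl[i + 1] - nacl[i])
    raise ValueError("half-maximum not bracketed by the data")

nacl = [0.1 + 0.05 * k for k in range(15)]                       # 0.10 ... 0.80
absorb = [1 / (1 + math.exp((c - 0.435) / 0.03)) for c in nacl]  # sigmoid
h50 = estimate_h50(nacl, absorb)
print(round(1 / h50, 2))
```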

Relevance: 100.00%

Abstract:

Incremental (uplift) models are statistical models originally developed in the marketing field. They involve two groups, a control group and a treatment group, both compared with respect to a binary response variable (the possible responses being "yes" or "no"). These models aim to detect the effect of the treatment on the individuals under study. Since these individuals are not all customers, we will call them "prospects". This effect can be negative, null, or positive depending on the characteristics of the individuals in the different groups. The objective of this thesis is to compare incremental models from a Bayesian point of view and from a frequentist point of view. The incremental models used in practice are those of Lo (2002) and Lai (2004), which were originally developed from a frequentist perspective. In this thesis, the Bayesian approach is therefore applied and compared to the frequentist approach. Simulations are carried out on data generated with logistic regressions. The parameters of these regressions are then estimated with Monte Carlo simulations in the Bayesian approach and compared to those obtained in the frequentist approach. Parameter estimation has a direct influence on the model's ability to correctly predict the treatment effect on individuals. We consider three prior distributions for Bayesian parameter estimation, chosen to be non-informative: the transformed beta distribution, the Cauchy distribution, and the normal distribution. Over the course of the study, we find that the Bayesian methods have a real positive impact on the targeting of individuals in small samples.
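The abstract above describes incremental models built from logistic regressions on a treatment and a control group. A minimal frequentist sketch of the underlying idea, assuming nothing about the specific formulations of Lo (2002) or Lai (2004): fit one logistic regression per group on synthetic data, then score the difference in predicted response probabilities as the estimated treatment effect.

```python
import numpy as np

# Hedged two-model uplift sketch on synthetic data. For simplicity the
# same covariate matrix is reused for both groups; real treatment and
# control groups would contain different prospects.

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Plain gradient-ascent logistic regression (intercept + weights)."""
    Xb = np.hstack([np.ones((len(X), 1)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w += lr * Xb.T @ (y - p) / len(y)
    return w

def predict(w, X):
    Xb = np.hstack([np.ones((len(X), 1)), X])
    return 1.0 / (1.0 + np.exp(-Xb @ w))

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 1))
# The treatment raises response probability only for prospects with large x;
# the control response does not depend on x at all.
y_treat = (rng.random(400) < 1 / (1 + np.exp(-(0.5 + 2.0 * X[:, 0])))).astype(float)
y_ctrl = (rng.random(400) < 1 / (1 + np.exp(-0.5))).astype(float)

w_t = fit_logistic(X, y_treat)
w_c = fit_logistic(X, y_ctrl)
uplift = predict(w_t, X) - predict(w_c, X)  # estimated incremental effect
print(uplift[X[:, 0] > 1].mean(), uplift[X[:, 0] < -1].mean())
```

Prospects with x > 1 should show positive estimated uplift (worth targeting), while those with x < -1 show negative uplift (the treatment appears to hurt).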

Relevance: 100.00%

Abstract:

Detailed characterization of vast territories poses a major challenge and is often limited by the available resources and time. The work in this master's thesis is part of the ParaChute project, which concerns the development of a Québec rockfall hazard assessment method (MEDCP, Méthode québécoise d'Évaluation du Danger des Chutes de Pierres) along linear infrastructure. To optimize the use of resources and time, a partially automated method facilitating the planning of fieldwork was developed. It is based mainly on 3D rockfall trajectory modeling to better target potentially problematic natural cliffs. Automation tools were developed to allow the modeling to be carried out over vast territories. The sectors where the infrastructure is most likely to be reached by rockfalls are identified from the portions of the infrastructure most frequently crossed by the simulated trajectories. The method was applied along the railway of the company ArcelorMittal Infrastructures Canada. The study area begins about ten kilometres north of Port-Cartier (Québec) and extends 260 km to the north of the Monts Groulx. The topography obtained from airborne LiDAR surveys is used to model the 3D trajectories with the Rockyfor3D software. This thesis presents an approach facilitating the characterization of rockfalls along a linear route. Preliminary trajectory studies are carried out before fieldwork. The information drawn from these models makes it possible to target potentially problematic sectors and to eliminate those unlikely to generate rockfalls capable of reaching the elements at risk along the linear infrastructure.

Relevance: 100.00%

Abstract:

Secure Multi-party Computation (MPC) enables a set of parties to collaboratively compute, using cryptographic protocols, a function over their private data such that the participants see only the final output, never each other's data. Typical MPC examples include statistical computations over joint private data, private set intersection, and auctions. While these applications are examples of monolithic MPC, richer MPC applications move repeatedly between "normal" (i.e., per-party local) and "secure" (i.e., joint, multi-party secure) modes, resulting overall in mixed-mode computations. For example, we might use MPC to implement the role of the dealer in a game of mental poker -- the game would be divided into rounds of local decision-making (e.g., bidding) and joint interaction (e.g., dealing). Mixed-mode computations are also used to improve performance over monolithic secure computations. Starting with the Fairplay project, several MPC frameworks have been proposed in the last decade to help programmers write MPC applications in a high-level language, while the toolchain manages the low-level details. However, these frameworks are either not expressive enough to allow writing mixed-mode applications or lack formal specification and reasoning capabilities, thereby diminishing the parties' trust in such tools and in the programs written using them. Furthermore, none of the frameworks provides a verified toolchain to run MPC programs, leaving open the potential for security holes that can compromise the privacy of the parties' data. This dissertation presents language-based techniques to make MPC more practical and trustworthy. First, it presents the design and implementation of a new MPC domain-specific language, called Wysteria, for writing rich mixed-mode MPC applications. Wysteria provides several benefits over previous languages, including a conceptual single thread of control, generic support for more than two parties, high-level abstractions for secret shares, and a fully formalized type system and operational semantics. Using Wysteria, we have implemented several MPC applications, including, for the first time, a card-dealing application. The dissertation next presents Wys*, an embedding of Wysteria in F*, a full-featured verification-oriented programming language. Wys* improves on Wysteria along three lines: (a) it enables programmers to formally verify the correctness and security properties of their programs -- as far as we know, Wys* is the first language to provide verification capabilities for MPC programs; (b) it provides a partially verified toolchain to run MPC programs; and (c) it enables MPC programs to use, with no extra effort, standard language constructs from the host language F*, making Wys* more usable and scalable. Finally, the dissertation develops static analyses that help optimize monolithic MPC programs into mixed-mode MPC programs while providing privacy guarantees similar to those of the monolithic versions.
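One of the abstractions mentioned above is secret shares. As a toy illustration of the primitive behind that abstraction, the sketch below uses additive secret sharing so that three parties can learn the sum of their private inputs without revealing the inputs themselves; this is a didactic example, not Wysteria's actual implementation or protocol.

```python
import random

# Additive secret sharing: a secret is split into random shares that sum
# to it modulo a prime, so no single share reveals anything about it.

P = 2_147_483_647  # a Mersenne prime used as the share modulus

def share(secret, n_parties):
    """Split `secret` into n additive shares modulo P."""
    shares = [random.randrange(P) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares

def reconstruct(shares):
    return sum(shares) % P

# Three parties jointly compute the sum of their private inputs:
inputs = [12, 30, 7]
all_shares = [share(x, 3) for x in inputs]
# Each party locally adds the one share it holds of every input ...
local_sums = [sum(col) % P for col in zip(*all_shares)]
# ... and only the total is revealed when the local sums are combined.
print(reconstruct(local_sums))  # the parties learn 49, not 12, 30, or 7
```

Real MPC protocols build multiplication, comparison, and control flow on top of primitives like this; the mixed-mode question is when a program may drop out of the shared representation into fast local computation.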

Relevance: 100.00%

Abstract:

Experimental work was carried out to obtain information on the growth of Staphylococcus aureus. Two strains of Staphylococcus aureus were used: one isolated from chicken patties and the reference strain ATCC 29213. The strains were subjected to three pH values (4, 5.5, and 7), three NaCl concentrations (0.5%, 7%, and 15%), and three growth temperatures (7 °C, 37 °C, and 50 °C). Two methods were used to evaluate the growth of Staphylococcus aureus over time: the turbidimetric method and the counting of colony-forming units (method of successive dilutions).

Relevance: 100.00%

Abstract:

Concurrent software executes multiple threads or processes to achieve high performance. However, concurrency results in a huge number of different system behaviors that are difficult to test and verify. The aim of this dissertation is to develop new methods and tools for modeling and analyzing concurrent software systems at the design and code levels. This dissertation consists of several related results. First, a formal model of Mondex, an electronic purse system, is built using Petri nets from user requirements and formally verified using model checking. Second, Petri net models are automatically mined from event traces generated by scientific workflows. Third, partial-order models are automatically extracted from instrumented concurrent program executions, and potential atomicity-violation bugs are automatically verified against the partial-order models using model checking. Our formal specification and verification of Mondex have contributed to the worldwide effort to develop a verified software repository. Our method for automatically mining Petri net models from provenance offers a new approach to building scientific workflows. Our dynamic prediction tool, named McPatom, can predict several known bugs in real-world systems, including one that evades several other existing tools. McPatom is efficient and scalable because it takes advantage of the nature of atomicity violations and considers only a pair of threads and accesses to a single shared variable at a time. However, predictive tools need to consider the tradeoff between precision and coverage. Based on McPatom, this dissertation presents two methods for improving the coverage and precision of atomicity-violation predictions: 1) a post-prediction analysis method to increase coverage while ensuring precision; and 2) a follow-up replaying method to further increase coverage. Both methods are implemented in a completely automatic tool.
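The abstract notes that McPatom restricts attention to a pair of threads and a single shared variable. The sketch below is a simplified illustration of that idea: it scans a trace of accesses to one shared variable for the classic unserializable three-access patterns. The event format and pattern table are illustrative rather than McPatom's actual design, and a real tool would not require the remote access to sit immediately between the local pair.

```python
# Unserializable interleavings for one shared variable and two threads,
# written as (local-first, remote, local-second) access types. These four
# patterns are the standard atomicity-violation cases from the literature.
UNSERIALIZABLE = {
    ("r", "w", "r"),  # remote write between two local reads
    ("w", "w", "r"),  # local read sees a remote overwrite of its own write
    ("w", "r", "w"),  # remote read observes an intermediate value
    ("r", "w", "w"),  # local write based on a stale read (check-then-act)
}

def find_violations(trace):
    """trace: list of (thread_id, access) with access in {'r', 'w'},
    all on the same shared variable. Returns flagged (index, pattern)."""
    violations = []
    for i in range(len(trace) - 2):
        (t1, a1), (t2, a2), (t3, a3) = trace[i : i + 3]
        if t1 == t3 and t1 != t2 and (a1, a2, a3) in UNSERIALIZABLE:
            violations.append((i, (a1, a2, a3)))
    return violations

# Thread 0 performs a check-then-act (read, then dependent write);
# thread 1's write slips in between -- the textbook atomicity violation:
trace = [(0, "r"), (1, "w"), (0, "w")]
print(find_violations(trace))  # flags the (r, w, w) pattern at index 0
```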

Relevance: 60.00%

Abstract:

On October 7 and 8, 2008, a road safety audit was conducted for the intersection of US 61/Harrison Street and West Locust Street in Davenport, Iowa. US 61/Harrison Street is a one-way street and a principal arterial route through Davenport, with three southbound lanes. Locust Street is a four-lane, two-way minor arterial running across the city from west to east. The last major improvement at this intersection was implemented approximately 20 years ago. The Iowa Department of Transportation requested a safety audit of this intersection in response to a high incidence of crashes at the location over the past several years, given that no major improvements are anticipated for this intersection in the immediate future. The road safety audit team discussed current conditions at the intersection and reviewed the last seven years of crash data. The team also made daytime and nighttime field visits to examine field conditions and observe traffic flow and crossing guard operations with younger pedestrians. After discussing the key issues, the road safety audit team drew conclusions and suggested possible enforcement, engineering, public information, and educational strategies for mitigation.

Relevance: 50.00%

Abstract:

We analyze crash data collected by the Iowa Department of Transportation using Bayesian methods. The data set includes monthly crash numbers, estimated monthly traffic volumes, site length, and other information collected at 30 paired sites in Iowa over more than 20 years, during which an intervention experiment was set up. The intervention consisted of converting 15 undivided road segments from four lanes to three, while an additional 15 segments, thought to be comparable in terms of traffic safety-related characteristics, were not converted. The main objective of this work is to find out whether the intervention reduces the number of crashes and the crash rates at the treated sites. We fitted a hierarchical Poisson regression model with a change-point to the number of monthly crashes per mile at each of the sites. Explanatory variables in the model included estimated monthly traffic volume, time, an indicator reflecting whether the site was a “treatment” or a “control” site, and various interactions. We accounted for seasonal effects in the number of crashes at a site by including smooth trigonometric functions with three different periods to reflect the four seasons of the year. A change-point at the month and year in which the intervention was completed was also included for treated sites. The number of crashes at a site can be thought to follow a Poisson distribution. To estimate the association between crashes and the explanatory variables, we used a log link function and added a random effect to account for overdispersion and for autocorrelation among observations obtained at the same site. We used proper but non-informative priors for all parameters in the model, and carried out all calculations using Markov chain Monte Carlo methods implemented in WinBUGS.
We evaluated the effect of the four- to three-lane conversion by comparing the expected number of crashes per year per mile during the years preceding and following the conversion for treatment and control sites. We estimated this difference using the observed traffic volumes at each site and also on a per-100,000,000-vehicles basis. We also conducted a prospective analysis to forecast the expected number of crashes per mile at each site in the study one year, three years, and five years following the conversion. Posterior predictive distributions of the number of crashes, the crash rate, and the percent reduction in crashes per mile were obtained for each site for the months of January and June one, three, and five years after completion of the intervention. The model appears to fit the data well. We found that in most sites the intervention was effective and reduced the number of crashes. Overall, and for the observed traffic volumes, the reduction in the expected number of crashes per year and mile at converted sites was 32.3% (31.4% to 33.5% with 95% probability), while at the control sites the reduction was estimated to be 7.1% (5.7% to 8.2% with 95% probability). When the reduction in the expected number of crashes per year, mile, and 100,000,000 AADT was computed, the estimates were 44.3% (43.9% to 44.6%) and 25.5% (24.6% to 26.0%) for converted and control sites, respectively. In both cases, the percent reduction in the expected number of crashes during the years following the conversion was significantly larger at converted sites than at control sites, even though the number of crashes appears to decline over time at all sites. Results indicate that the reduction in the expected number of crashes per mile has a steeper negative slope at converted sites than at control sites.
Consistent with this, the forecasted reduction in the number of crashes per year and mile during the years after completion of the conversion is more pronounced at converted sites than at control sites. Seasonal effects on the number of crashes have been well documented. In this dataset we found that, as expected, the expected number of monthly crashes per mile tends to be higher during winter months than during the rest of the year. Perhaps more interestingly, we found an interaction between the four- to three-lane conversion and season: the reduction in the number of crashes appears to be more pronounced during months with mild weather than during other times of the year, even though a reduction was estimated for the entire year. Thus, it appears that the four- to three-lane conversion, while effective year-round, is particularly effective in reducing the expected number of crashes in mild weather.
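The model described above combines a log link, a change-point indicator, and trigonometric seasonal terms. The sketch below simulates such a monthly series and recovers the intervention effect by plain gradient ascent on the Poisson log-likelihood; all numbers are synthetic, one harmonic stands in for the study's three, and the hierarchical random effects, overdispersion term, and WinBUGS MCMC of the actual analysis are omitted.

```python
import numpy as np

# Synthetic Poisson log-linear change-point model: intercept, a
# post-conversion indicator, and one seasonal harmonic in the log mean.

rng = np.random.default_rng(1)
months = np.arange(240)                        # 20 years of monthly data
after = (months >= 120).astype(float)          # conversion at month 120
season = np.cos(2 * np.pi * months / 12)       # winter/summer harmonic
X = np.column_stack([np.ones(240), after, season])

true_beta = np.array([1.5, -0.6, 0.3])         # conversion lowers the log-rate
y = rng.poisson(np.exp(X @ true_beta))

beta = np.zeros(3)
for _ in range(3000):                          # score equations: X'(y - mu)
    mu = np.exp(X @ beta)
    beta += 0.005 * X.T @ (y - mu) / len(y)

print(np.round(beta, 2))                       # beta[1] estimates the effect
```

On the log scale, beta[1] should come out clearly negative (near the simulated -0.6), mirroring how the study reads the conversion effect off the change-point coefficient.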

Relevance: 50.00%

Abstract:

Traffic safety engineers are among the early adopters of Bayesian statistical tools for analyzing crash data. As in many other areas of application, empirical Bayes methods were their first choice, perhaps because they represent an intuitively appealing, yet relatively easy to implement, alternative to purely classical approaches. With the enormous progress in numerical methods made in recent years and with the availability of free, easy-to-use software that permits implementing a fully Bayesian approach, however, there is now ample justification to progress towards fully Bayesian analyses of crash data. The fully Bayesian approach, in particular as implemented via multi-level hierarchical models, has many advantages over the empirical Bayes approach. In a full Bayesian analysis, prior information and all available data are seamlessly integrated into posterior distributions on which practitioners can base their inferences. All uncertainties are thus accounted for in the analyses, and there is no need to pre-process data to obtain Safety Performance Functions and other such prior estimates of the effect of covariates on the outcome of interest. In this light, fully Bayesian methods may well be less costly to implement and may result in safety estimates with more realistic standard errors. In this manuscript, we present the full Bayesian approach to analyzing traffic safety data and focus on highlighting the differences between the empirical Bayes and full Bayes approaches. We use an illustrative example to discuss a step-by-step Bayesian analysis of the data and to show some of the types of inferences that are possible within the full Bayesian framework.

Relevance: 50.00%

Abstract:

The Iowa Department of Transportation (DOT) requested a road safety audit (RSA) of the US 59/IA 9 intersection in northwestern Iowa, just south of the Minnesota border, to assess intersection environmental issues and crash history and recommend appropriate mitigation to address the identified safety issues at the intersection. Although the number of crashes at the location has not been significantly higher than the statewide average for similar intersections, the severity of these crashes has been of concern. This RSA was unique in that it included intersection video observation and recorded traffic conflict data analysis, along with the daylight and nighttime field reviews. This report outlines the findings and recommendations of the RSA team for addressing the safety concerns at this intersection.

Relevance: 50.00%

Abstract:

The Highway Safety Manual (HSM) is the national safety manual that provides quantitative methods for analyzing highway safety. The HSM presents crash modification factors related to work zone characteristics such as work zone duration and length. These crash modification factors were based on high-impact work zones in California; therefore, there was a need to use work zone and safety data from the Midwest to calibrate them for use in the Midwest. Almost 11,000 Missouri freeway work zones were analyzed to derive a representative, stratified sample of 162 work zones, more than four times the number of work zones used in the HSM. This dataset was used for modeling and testing crash modification factors applicable to the Midwest. It contained work zones ranging from 0.76 to 9.24 miles in length, with durations from 16 to 590 days. A combined fatal/injury/non-injury model produced an R² fit of 0.9079 and a prediction slope of 0.963. The resulting crash modification factors of 1.01 for duration and 0.58 for length were smaller than the values in the HSM. Two practical application examples illustrate the use of the crash modification factors for comparing alternate work zone setups.
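As a purely arithmetic illustration of how crash modification factors enter a prediction, the snippet below applies the two reported factors multiplicatively to a hypothetical base crash frequency, following the HSM's general convention of multiplying CMFs with a base prediction. The base value is invented, and the HSM's actual work-zone relationships are expressed in terms of percentage changes in duration and length rather than flat multipliers.

```python
# Hypothetical application of the Midwest-calibrated CMFs reported above.
# The base crash frequency is a made-up placeholder, standing in for a
# Safety Performance Function prediction for the segment.
base_crashes_per_year = 10.0   # hypothetical base prediction
cmf_duration = 1.01            # reported duration factor
cmf_length = 0.58              # reported length factor

# Standard multiplicative combination of CMFs with the base prediction:
predicted = base_crashes_per_year * cmf_duration * cmf_length
print(round(predicted, 2))  # 5.86 expected crashes per year
```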

Relevance: 50.00%

Abstract:

In April 2008, a preliminary investigation of fatal and major-injury crashes on Iowa's primary road system from 2001 through 2007 was conducted by the Iowa Department of Transportation, Office of Traffic and Safety. A mapping of these data revealed an apparent concentration of serious crashes on a section of IA 25 north of Creston. Based on this information, a road safety audit of this roadway section was requested by the Office of Traffic and Safety. IA 25 is a two-lane asphaltic concrete pavement roadway, 22 ft wide with approximately 6-ft-wide granular shoulders. Originally constructed in 1939, the roadway was last rehabilitated in 1996 with a 4-in. asphalt overlay. Except for shoulder paving through a curve area, no work beyond routine maintenance has been done on the section. The 2004 traffic map indicates that IA 25 carries approximately 2,070 vehicles per day, including 160 commercial vehicles. The posted speed is 55 mph. This report contains a discussion of audit team findings, crash and roadway data, and recommendations for possible mitigation of safety concerns for this roadway section.

Relevance: 50.00%

Abstract:

On June 22, 2009, a road safety audit was initiated for the intersection of US 218 and County Road C-57 in Black Hawk County, Iowa. Due to the traffic volumes and the number of conflicting traffic movements on these two roadways, this intersection has developed a crash history that concerns the Iowa Department of Transportation (Iowa DOT), the Iowa State Patrol, and local agencies. The intersection ranks seventh in Iowa for the number of at-grade expressway intersection crashes. Considering this, Black Hawk County and the Iowa DOT requested that a road safety audit be conducted to address the safety concerns and recommend possible mitigation strategies.