955 results for event-driven simulation
Abstract:
In Switzerland, the annual cost of damage caused by natural hazards is considerable and has been increasing for several years despite the introduction of protective measures and the deployment of substantial resources. Mainly resulting from material destruction, the majority of this cost for buildings is borne by building insurance companies. In many European countries, governments and insurance companies now frame their prevention strategies in terms of reducing vulnerability. In Switzerland, since 2004 the cost of damage due to natural hazards has exceeded the cost of damage due to fire, the traditional activity of the cantonal insurance establishments (ECA). This development, with its many strategic implications for public risk management, results in particular from a fire prevention policy pursued effectively for several years, notably through the reduction of the vulnerability of buildings. By exploiting actuarial data and developing analytical tools, the thesis seeks to illustrate the relevance of such an approach when applied to damage caused by natural hazards. It examines the place of insurance and its involvement in targeted prevention of natural disasters. Integrated risk management requires a sound command and understanding of all risk parameters. The first part of the thesis is therefore devoted to the theoretical development of the key concepts that influence risk management, such as hazard, vulnerability, exposure and damage. The literature on this subject, very prolific in recent years, was reviewed and put into perspective in the context of this study, namely building insurance. Among the risk parameters, the thesis shows that vulnerability is a factor that can be influenced effectively in order to limit the cost of damage to buildings. This is first confirmed through the development of an analysis method, which led to a tool for assessing flood damage to buildings. The tool, designed for property insurers and, where appropriate, owners, comprises several steps, namely: - assessment of vulnerability and damage potential; - proposals for remedial and risk-reduction measures derived from an analysis of the costs of a potential flood; - adaptation of a global strategy in high-risk areas according to the elements at risk. The final part of the thesis is devoted to the study of a hail event in order to provide a better understanding of damage to buildings and their structure. For this, two samples were selected and analysed from the claims data available to the study. The results, both at the level of the insured portfolio and of individual analyses, reveal new trends. A second objective of the study was to develop a hail model based on the available data. The model simulates a random distribution of intensities and, coupled with a risk model, provides a simulation of damage costs for the defined study area. The perspectives opened by this work allow a sharper focus on the role of insurance and its prevention needs.
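As a rough illustration of the hail modelling approach summarised above (a random distribution of intensities coupled with a risk model to estimate damage costs), the following Python sketch runs a Monte Carlo simulation over a hypothetical building portfolio. The intensity distribution, vulnerability curve, hit probability and insured values are illustrative assumptions, not the parameters estimated in the thesis.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical portfolio: insured values (CHF) of buildings in the study area.
insured_values = rng.lognormal(mean=13.0, sigma=0.8, size=5_000)

def simulate_event_loss(values, rng):
    """Simulate the building damage cost of one synthetic hail event.

    Intensities are drawn from an assumed log-normal distribution; an assumed
    sigmoidal vulnerability curve maps intensity to a mean damage ratio, and a
    Bernoulli draw decides whether a given building is hit at all.
    """
    intensity = rng.lognormal(mean=0.0, sigma=0.6, size=values.size)    # relative hail intensity per building
    mean_ratio = 1.0 / (1.0 + np.exp(-(intensity - 1.5) / 0.4))         # assumed vulnerability curve
    hit = rng.random(values.size) < 0.3                                 # assume 30% of buildings affected
    damage_ratio = np.where(hit, rng.beta(2.0, 2.0, size=values.size) * mean_ratio, 0.0)
    return float(np.sum(values * damage_ratio))

# Monte Carlo over many synthetic events gives a distribution of event losses.
losses = np.array([simulate_event_loss(insured_values, rng) for _ in range(1_000)])
print(f"mean event loss: {losses.mean():,.0f} CHF; 99th percentile: {np.percentile(losses, 99):,.0f} CHF")
```

Repeating the event simulation many times yields a loss distribution from which summary statistics such as the mean per-event loss or high percentiles for the study area can be read off.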
Abstract:
BACKGROUND: Risks of significant infant drug exposure through breast milk are poorly defined for many drugs, and large-scale population data are lacking. We used population pharmacokinetic (PK) modeling to predict fluoxetine exposure levels of infants via mother's milk in a simulated population of 1000 mother-infant pairs. METHODS: Using our original data on fluoxetine PK in 25 breastfeeding women, a population PK model was developed with NONMEM and parameters, including milk concentrations, were estimated. An exponential distribution model was used to account for individual variation. Simulation with random and distribution-constrained assignment of doses, dosing times, feeding intervals and milk volumes was conducted to generate 1000 mother-infant pairs with characteristics such as the steady-state serum concentration (Css) and infant dose relative to the maternal weight-adjusted dose (relative infant dose: RID). Full bioavailability and a conservative point estimate of 1-month-old infant CYP2D6 activity of 20% of the adult value (adjusted by weight), according to a recent study, were assumed for infant Css calculations. RESULTS: A linear 2-compartment model was selected as the best model. Derived parameters, including milk-to-plasma ratios (mean: 0.66; SD: 0.34; range: 0-1.1), were consistent with the values reported in the literature. The estimated RID was below 10% in >95% of infants. The model-predicted median infant-mother Css ratio was 0.096 (range 0.035-0.25); the literature-reported mean is 0.07 (range 0-0.59). Moreover, the predicted incidence of an infant-mother Css ratio of >0.2 was less than 1%. CONCLUSION: Our in silico model prediction is consistent with clinical observations, suggesting that substantial systemic fluoxetine exposure in infants through human milk is rare, but further analysis should include active metabolites. Our approach may be valid for other drugs. [Supported by CIHR and the Swiss National Science Foundation (SNSF)]
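The relative infant dose (RID) logic summarised above can be sketched in a few lines: sample maternal clearance with an exponential (log-normal) inter-individual variability term as in the abstract, derive milk concentrations via the milk-to-plasma ratio, and compare the infant's weight-adjusted intake with the maternal weight-adjusted dose. Only the milk-to-plasma mean and SD below are taken from the abstract; the typical clearance, dose levels and milk intake are generic assumptions, and no infant metabolism (CYP2D6) step is modelled.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000  # simulated mother-infant pairs

# Typical values below are illustrative assumptions, not the fitted NONMEM estimates.
dose_mg   = rng.choice([20.0, 40.0, 60.0], size=n)        # assumed daily fluoxetine doses (mg)
wt_kg     = rng.normal(70.0, 12.0, size=n).clip(45, 110)  # maternal body weight (kg)
cl_typ    = 40.0                                           # assumed typical apparent clearance (L/h)
cl        = cl_typ * np.exp(rng.normal(0.0, 0.3, size=n))  # exponential inter-individual variability
css_serum = dose_mg / (cl * 24.0)                          # average steady-state serum conc. (mg/L), F = 1

mp_ratio  = rng.normal(0.66, 0.34, size=n).clip(0.0, 1.1)  # milk-to-plasma ratio (abstract: mean 0.66, SD 0.34)
css_milk  = css_serum * mp_ratio

milk_intake = 0.15                                         # assumed infant milk intake (L/kg/day)
infant_dose = css_milk * milk_intake                       # mg/kg/day received via milk
maternal_dose_per_kg = dose_mg / wt_kg
rid = 100.0 * infant_dose / maternal_dose_per_kg           # relative infant dose (%)

print(f"RID < 10% in {np.mean(rid < 10.0) * 100:.1f}% of simulated pairs")
```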
Abstract:
Rockfall propagation areas can be determined using a simple geometric rule known as the shadow angle or energy line method, based on a simple Coulomb frictional model implemented in the CONEFALL computer program. Runout zones are estimated from a digital terrain model (DTM) and a grid file containing the cells representing potential rockfall source areas. The cells of the DTM that are lower in altitude and located within a cone centered on a rockfall source cell belong to the potential propagation area associated with that source cell. In addition, the CONEFALL method allows estimation of the mean and maximum velocities and energies of blocks in the rockfall propagation areas. Previous studies indicate that the slope angle of the cone ranges from 27° to 37°, depending on the assumptions made, i.e. slope morphology, probability of reaching a point, maximum runout and field observations. Different solutions based on previous work and an example of an actual rockfall event are presented here.
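A minimal sketch of the cone (shadow angle / energy line) test on a gridded DTM is given below: a cell is counted as reachable if it lies below a cone of slope angle phi centred on the source cell, and its height below the energy line gives a rough velocity estimate v = sqrt(2·g·Δh) in the spirit of the energy line method. The 32° angle and the synthetic planar slope are illustrative choices, not CONEFALL's defaults.

```python
import numpy as np

def cone_runout(dtm, sources, phi_deg=32.0, cell_size=10.0):
    """Shadow-angle / energy-line runout test on a gridded DTM.

    A cell belongs to the propagation area of a source if it lies below the
    cone of slope angle phi centred on that source; its height below the
    energy line gives a rough velocity estimate v = sqrt(2 * g * dh).
    phi_deg = 32 is a mid-range value from the 27-37 degree interval above.
    """
    g = 9.81
    tan_phi = np.tan(np.radians(phi_deg))
    rows, cols = np.indices(dtm.shape)
    runout = np.zeros(dtm.shape, dtype=bool)
    v_max = np.zeros(dtm.shape, dtype=float)

    for (ri, ci) in sources:                                   # rockfall source cells
        dist = cell_size * np.hypot(rows - ri, cols - ci)      # horizontal distance to source
        dh = (dtm[ri, ci] - dtm) - dist * tan_phi              # height of each cell below the energy line
        inside = dh > 0.0                                      # below the cone -> potentially reachable
        runout |= inside
        v_max = np.maximum(v_max, np.where(inside, np.sqrt(2.0 * g * dh.clip(min=0.0)), 0.0))
    return runout, v_max

# Tiny synthetic example: a steep planar slope with one source cell at the top.
dtm = np.linspace(500.0, 0.0, 50)[:, None] * np.ones((50, 50))
reachable, v = cone_runout(dtm, sources=[(0, 25)])
print(reachable.sum(), "cells reachable, max velocity estimate:", round(float(v.max()), 1), "m/s")
```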
Abstract:
In recent decades, globalized competition among cities and regions has led them to develop new strategies for branding and promoting their territory to attract tourists, investors, companies and residents. Major sports events - such as the Olympic Games, the FIFA World Cup or World and Continental Championships - have played an integral part in these strategies. Believing, with or without evidence, in the capacity of those events to improve the visibility and the economy of the host destination, many cities, regions and even countries have engaged in establishing sports events hosting strategies. The problem of globalized competition in the sports events "market" is that many cities and regions do not have the resources - financial, human or in terms of infrastructure - to compete in hosting major sports events. Consequently, many cities or regions have to turn to second-tier sports events. Organising those smaller events means less media coverage and more difficulty in finding sponsors, while the costs - both financial and in terms of services - remain high for the community. This paper analyses how Heritage Sporting Events (HSE) might be an opportunity for cities and regions engaged in sports events hosting strategies. HSE is an emerging concept that to date has been under-researched in the academic literature. Therefore, this paper aims to define the concept of HSE through an exploratory research study. A multidisciplinary literature review reveals two major characteristics of HSEs: sustainability in the territory and the authenticity of the event, constructed through a differentiation process. These characteristics, defined through multiple variables, give us the opportunity to observe the process by which a sports event is constructed into a heritage object. This paper argues that HSEs can be seen as territorial resources that can represent a competitive advantage for host destinations. In conclusion, academics are invited to further research HSEs to better understand their construction process and their impacts on the territory, while local authorities are invited to consider HSEs for the branding and promotion of their territory.
Abstract:
OBJECTIVE: To identify the association between the use of web-based electrocardiography simulation and the learning approaches, strategies and styles of nursing degree students. METHOD: A descriptive and correlational design with a one-group pretest-posttest measurement was used. The study sample included 246 students enrolled in a Basic and Advanced Cardiac Life Support course of a nursing degree programme. RESULTS: No significant differences between genders were found in any dimension of learning styles or approaches to learning. After the introduction of web-based electrocardiography simulation, significant differences were found in some item scores for learning styles, namely theorist (p < 0.040) and pragmatic (p < 0.010), and for approaches to learning. CONCLUSION: The use of web-based electrocardiogram (ECG) simulation is associated with the development of active and reflective learning styles, improving motivation and a deep approach to learning in nursing students.
Abstract:
In this paper, the core functions of an artificial intelligence (AI) for controlling a debris-collecting robot are designed and implemented. Using the Robot Operating System (ROS) as the basis of this work, a multi-agent system with task-planning capabilities is built.
Abstract:
The model developed at the Institut universitaire de médecine sociale et préventive de Lausanne uses a computer program to simulate admissions to and discharges from general-care hospitals. The simulation is based on data routinely collected in hospitals; in particular, it takes into account certain daily and seasonal variations, the number of admissions, and the hospital's case mix, i.e. the distribution of cases by clinical group and patient age.
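A minimal event-driven sketch of such an admission/discharge model is shown below, with Poisson arrivals whose rate varies by weekday and season and lengths of stay drawn per clinical group. All rates, weekday factors and case-mix figures are invented for illustration and are not the Lausanne model's parameters.

```python
import heapq
import random

# Illustrative parameters -- not the actual figures of the Lausanne model.
MEAN_ADMISSIONS_PER_DAY = 30.0
WEEKDAY_FACTOR = {0: 1.2, 1: 1.1, 2: 1.0, 3: 1.0, 4: 1.1, 5: 0.7, 6: 0.6}  # Mon..Sun
CASE_MIX = {              # clinical group -> (probability, mean length of stay in days)
    "surgery":    (0.35, 5.0),
    "medicine":   (0.45, 8.0),
    "geriatrics": (0.20, 14.0),
}

def simulate(days=365, seed=1):
    """Event-driven simulation of admissions and discharges of a general-care hospital."""
    rng = random.Random(seed)
    discharges = []          # min-heap of scheduled discharge times (the event queue)
    occupancy = []
    t = 0.0
    while t < days:
        day = int(t)
        season = 1.15 if (day % 365) < 90 else 0.95               # crude winter peak
        rate = MEAN_ADMISSIONS_PER_DAY * WEEKDAY_FACTOR[day % 7] * season
        t += rng.expovariate(rate)                                # next admission (Poisson process)
        group = rng.choices(list(CASE_MIX), weights=[p for p, _ in CASE_MIX.values()])[0]
        los = rng.expovariate(1.0 / CASE_MIX[group][1])           # length of stay for this case
        heapq.heappush(discharges, t + los)                       # schedule the discharge event
        while discharges and discharges[0] <= t:                  # process discharges already due
            heapq.heappop(discharges)
        occupancy.append(len(discharges))                         # patients currently in hospital
    return occupancy

occ = simulate()
print("mean beds occupied:", round(sum(occ) / len(occ), 1), "peak:", max(occ))
```

Feeding the model with routinely collected admission counts and case-mix proportions, as described in the abstract, would amount to replacing the invented constants above with empirical estimates.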
Abstract:
Introduction: Streptomycin, like other aminoglycosides, exhibits concentration-dependent bacterial killing but has a narrow therapeutic window. It is primarily eliminated unchanged by the kidneys. Data and dosing information for achieving a safe regimen in patients with chronic renal failure undergoing hemodialysis (HD) are scarce. Although the main adverse reactions are related to prolonged, elevated serum concentrations, the recommendation in the literature is to administer streptomycin after each HD session. Patients (or Materials) and Methods: We report the case of a patient with end-stage renal failure, undergoing HD, who was successfully treated with streptomycin for gentamicin-resistant Enterococcus faecalis bacteremia with prosthetic arteriovenous fistula infection. Streptomycin 7.5 mg/kg was administered intravenously 3 hours before each dialysis session (3 times a week) for 6 weeks in combination with amoxicillin. Streptomycin plasma levels were monitored with repeated blood sampling before, after, and between HD sessions. A 2-compartment model was used to reconstruct the concentration-time profile over days on and off HD. Results: The streptomycin trough plasma concentration was 2.8 mg/L. It peaked at 21.4 mg/L 30 minutes after intravenous administration, decreased to 18.2 mg/L immediately before HD, and dropped to 4.5 mg/L at the end of a 4-hour HD session. The plasma level increased again to 5.7 mg/L 2 hours after the end of HD and was 2.8 mg/L 48 hours later, before the next administration and HD. The pharmacokinetics of streptomycin was best described by a 2-compartment model. The computer simulation fitted the observed concentrations during and between HD sessions fairly well. Redistribution between the 2 compartments after the end of HD reproduced the rebound of plasma concentrations after HD. No significant toxicity was observed during treatment. The outcome of the infection was favorable, and no sign of relapse was observed after a follow-up of 3 months. Conclusion: Streptomycin administration at 7.5 mg/kg 3 hours before HD sessions in a patient with end-stage renal failure resulted in an effective and safe dosing regimen. Monitoring of plasma levels, along with pharmacokinetic simulation, documents the suitability of this dosing scheme, which should replace current dosage recommendations for streptomycin in HD.
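The concentration-time profile described above, including the post-dialysis rebound, can be reproduced qualitatively with a two-compartment model in which an extra clearance term is switched on during the 4-hour HD sessions. The sketch below uses a simple explicit Euler integration and illustrative parameter values (volumes, clearances, 70 kg body weight) that are assumptions, not the fitted values from this case report.

```python
import numpy as np

# Illustrative two-compartment parameters -- assumptions, not the fitted values from this case.
V1, V2 = 20.0, 25.0      # central / peripheral distribution volumes (L)
CL_res = 0.2             # residual clearance in end-stage renal failure (L/h)
CL_hd = 6.0              # additional clearance during a hemodialysis session (L/h)
Q = 4.0                  # inter-compartmental clearance (L/h)
dose_mg = 7.5 * 70.0     # 7.5 mg/kg for a 70 kg patient

dt = 0.01                                                    # explicit Euler step (h)
n_steps = int(7 * 24.0 / dt)                                 # simulate one week
dose_steps = {round(day * 24.0 / dt) for day in (0, 2, 4)}   # IV doses Mon/Wed/Fri at t = 0 h

def on_hd(t_hours):
    """4-hour HD sessions starting 3 h after each dose (Mon/Wed/Fri)."""
    day, hour = divmod(t_hours % (7 * 24.0), 24.0)
    return int(day) in (0, 2, 4) and 3.0 <= hour < 7.0

A1 = np.zeros(n_steps)   # drug amount in the central compartment (mg)
A2 = np.zeros(n_steps)   # drug amount in the peripheral compartment (mg)

for i in range(1, n_steps):
    t = (i - 1) * dt
    cl = CL_res + (CL_hd if on_hd(t) else 0.0)
    c1, c2 = A1[i - 1] / V1, A2[i - 1] / V2
    A1[i] = A1[i - 1] + dt * (-cl * c1 - Q * c1 + Q * c2) + (dose_mg if (i - 1) in dose_steps else 0.0)
    A2[i] = A2[i - 1] + dt * (Q * c1 - Q * c2)

conc = A1 / V1
# The post-dialysis rebound emerges from redistribution out of the peripheral compartment
# once the dialysis clearance term is switched off.
print(f"peak ~ {conc.max():.1f} mg/L, 48 h trough ~ {conc[round(47.9 / dt)]:.1f} mg/L")
```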
Abstract:
In conditions of T lymphopenia, interleukin (IL) 7 levels rise and, via T cell receptor for antigen-self-major histocompatibility complex (MHC) interaction, induce residual naive T cells to proliferate. This pattern of lymphopenia-induced "homeostatic" proliferation is typically quite slow and causes a gradual increase in total T cell numbers and differentiation into cells with features of memory cells. In contrast, we describe a novel form of homeostatic proliferation that occurs when naive T cells encounter raised levels of IL-2 and IL-15 in vivo. In this situation, CD8(+) T cells undergo massive expansion and rapid differentiation into effector cells, thus closely resembling the T cell response to foreign antigens. However, the responses induced by IL-2/IL-15 are not seen in MHC-deficient hosts, implying that the responses are driven by self-ligands. Hence, homeostatic proliferation of naive T cells can be either slow or fast, with the quality of the response to self being dictated by the particular cytokine (IL-7 vs. IL-2/IL-15) concerned. The relevance of the data to the gradual transition of naive T cells into memory-phenotype (MP) cells with age is discussed.