983 results for structural calculate criteria


Relevance:

30.00%

Publisher:

Abstract:

In the last few decades, bioinformatics has played a fundamental role in making sense of the huge amount of data produced. Once the complete sequence of a genome has been obtained, the major problem becomes learning as much as possible about its coding regions. Protein sequence annotation is challenging and, due to the size of the problem, only computational approaches can provide a feasible solution. As recently pointed out by the Critical Assessment of Function Annotations (CAFA), the most accurate methods are those based on the transfer-by-homology approach, and the most incisive contribution is given by cross-genome comparisons. The present thesis describes a non-hierarchical sequence clustering method for automatic large-scale protein annotation, called “The Bologna Annotation Resource Plus” (BAR+). The method is based on an all-against-all alignment of more than 13 million protein sequences, characterized by a very stringent metric. BAR+ can safely transfer functional features (Gene Ontology and Pfam terms) inside clusters by means of a statistical validation, even in the case of multi-domain proteins. Within BAR+ clusters it is also possible to transfer the three-dimensional structure (when a template is available). This is possible by means of cluster-specific HMM profiles that can be used to calculate reliable template-to-target alignments even in the case of distantly related proteins (sequence identity < 30%). Other BAR+ based applications have been developed during my doctorate, including the prediction of magnesium-binding sites in human proteins, the classification of the ABC transporter superfamily, and the functional prediction (GO terms) of the CAFA targets. Remarkably, in the CAFA assessment, BAR+ placed among the ten most accurate methods. At present, as a web server for functional and structural protein sequence annotation, BAR+ is freely available at http://bar.biocomp.unibo.it/bar2.0.
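As an illustration of the transfer-by-homology idea described above, the following Python sketch builds single-linkage clusters from pairwise alignments that pass identity and coverage cut-offs and propagates GO terms to unannotated cluster members. The thresholds, toy data and simple support rule are assumptions for demonstration only; they are not the actual BAR+ metric or its statistical validation.

```python
"""Illustrative sketch of cluster-based annotation transfer (not the actual
BAR+ implementation). Thresholds and the validation rule are assumptions."""
from collections import defaultdict

# Hypothetical all-against-all alignment results: (seq_a, seq_b, identity %, coverage %)
alignments = [
    ("P1", "P2", 55.0, 95.0),
    ("P2", "P3", 48.0, 92.0),
    ("P4", "P5", 30.0, 60.0),   # too divergent: stays outside the cluster
]

# Stringent metric (illustrative values): both identity and coverage cut-offs
MIN_IDENTITY, MIN_COVERAGE = 40.0, 90.0

# Union-find to build single-linkage clusters from accepted pairs
parent = {}
def find(x):
    parent.setdefault(x, x)
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x
def union(a, b):
    parent[find(a)] = find(b)

for a, b, ident, cov in alignments:
    find(a); find(b)
    if ident >= MIN_IDENTITY and cov >= MIN_COVERAGE:
        union(a, b)

clusters = defaultdict(set)
for seq in parent:
    clusters[find(seq)].add(seq)

# Known annotations for some cluster members (hypothetical GO terms)
annotations = {"P1": {"GO:0016787"}, "P3": {"GO:0016787", "GO:0005737"}}

# Transfer a term to unannotated members only if enough annotated members agree
MIN_SUPPORT = 0.5  # assumed rule, a stand-in for BAR+'s statistical validation
for members in clusters.values():
    annotated = [m for m in members if m in annotations]
    if not annotated:
        continue
    counts = defaultdict(int)
    for m in annotated:
        for term in annotations[m]:
            counts[term] += 1
    supported = {t for t, c in counts.items() if c / len(annotated) >= MIN_SUPPORT}
    for m in members - set(annotated):
        print(f"{m}: inherit {sorted(supported)}")
```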

Relevance:

30.00%

Publisher:

Abstract:

The aim of this research study is to explore the opportunity to set up Performance Objective (PO) parameters for specific risks in RTE products, to be proposed to food industries and food authorities. In fact, even though microbiological criteria for Salmonella and Listeria monocytogenes in Ready-to-Eat (RTE) products are included in the European Regulation, these parameters are not risk based, and no microbiological criterion for Bacillus cereus in RTE products is present. For these reasons, the behaviour of Salmonella enterica in RTE mixed salad, the microbiological characteristics of RTE spelt salad, and the definition of POs for Bacillus cereus and Listeria monocytogenes in RTE spelt salad have been assessed. Based on the data produced, the following conclusions can be drawn: 1. A rapid growth of Salmonella enterica may occur in mixed-ingredient salads, and strict temperature control during the production chain of the product is critical. 2. Spelt salad is characterized by the presence of a high number of Lactic Acid Bacteria (LAB). Listeria spp. and Enterobacteriaceae, on the contrary, did not grow during the shelf life, probably due to the relevant metabolic activity of LAB. 3. The use of spelt and cheese compliant with the suggested POs might significantly reduce the incidence of foodborne intoxications due to Bacillus cereus and Listeria monocytogenes, as well as the proportion of recalls, which cause huge economic losses for food companies commercializing RTE products. 4. The approach used to calculate the PO values reported in this work can be easily adapted to different food/risk combinations, as well as to any change in the formulation of the same food products. 5. Optimized sampling plans, in terms of the number of samples to collect, can be derived in order to verify compliance with the selected PO values.
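The sampling-plan idea in point 5 can be sketched as follows: assuming within-lot log10 counts are normally distributed, one can compute the number of samples needed, under a zero-tolerance acceptance rule, to reject with a chosen confidence a lot whose mean level exceeds the PO. The distributional assumption and all numeric values below are illustrative, not those derived in the thesis.

```python
"""Illustrative sampling-plan sketch for verifying a Performance Objective (PO).
The distributional assumption (log10 counts ~ Normal) and all numbers are
assumptions for demonstration, not values derived in the thesis."""
import math

def prob_sample_exceeds(limit_log, mean_log, sd_log):
    """P(one sample exceeds the limit), assuming log10 CFU/g ~ Normal(mean, sd)."""
    z = (limit_log - mean_log) / sd_log
    return 1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def samples_needed(limit_log, mean_log, sd_log, confidence=0.95):
    """Smallest n (with c = 0 acceptance) that rejects such a lot with the given confidence."""
    p = prob_sample_exceeds(limit_log, mean_log, sd_log)
    if p <= 0:
        return None  # a lot at this mean level is practically undetectable
    # P(at least one positive among n samples) >= confidence -> 1 - (1 - p)^n >= confidence
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p))

# Hypothetical PO of 2 log10 CFU/g for B. cereus, lot mean 2.5 log10, sd 0.8 log10
print(samples_needed(limit_log=2.0, mean_log=2.5, sd_log=0.8))
```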

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: In a previous study, twenty consecutive patients with a rerupture of the rotator cuff, as documented with magnetic resonance imaging, were found to have significantly less pain and better function and strength, compared with the preoperative state, at 3.2 years postoperatively. It was the purpose of this study to determine the clinical and structural outcomes of these reruptures in the same twenty patients after a longer period of follow-up. METHODS: At a mean of 7.6 years postoperatively, the twenty patients were reexamined clinically and with standard radiographs and magnetic resonance imaging with use of the same clinical, radiographic, and magnetic resonance imaging criteria as were utilized in the review at 3.2 years. The mean age at the time of final follow-up was sixty-six years. RESULTS: Nineteen of the twenty patients continued to be either very satisfied or satisfied with the outcome. The relative Constant score averaged 88% and was not significantly different from the score at 3.2 years, which averaged 83%. The mean scores for pain, function, and strength also had not changed significantly. Overall, the twenty reruptures had not increased in size, and eight of them had healed structurally at the time of the 7.6-year follow-up. Seven of these eight reruptures had been of the supraspinatus tendon only, and seven had been smaller than 400 mm² at 3.2 years. Twelve reruptures persisted, and five were larger than the preoperative tear. Fatty infiltration of the infraspinatus muscle progressed significantly (p = 0.015) and the acromiohumeral distance decreased significantly (p = 0.006) between the two follow-up periods. Neither fatty infiltration of the supraspinatus and subscapularis muscles nor glenohumeral osteoarthritis progressed significantly. CONCLUSIONS: At an average of 7.6 years, the clinical outcomes after structural failure of rotator cuff repairs remained significantly improved over the preoperative state in terms of pain, function, strength, and patient satisfaction. Overall, the reruptures that had been present at 3.2 years did not increase in size. We also found that reruptures of the supraspinatus that had been smaller than 400 mm² had the potential to heal.

Relevance:

30.00%

Publisher:

Abstract:

Indications for the most frequently used imaging modalities in implant dentistry are proposed based on clinical need and biologic risk for the patient. To calculate the biologic risk, the authors carried out dose measurements. They demonstrated that the risk from a periapical radiograph is 20% of that from a panoramic radiograph. A panoramic radiograph and a series of 4 conventional tomographs of a single-tooth gap in the molar region carry 5% and 13% of the risk from computed tomography of the maxilla, respectively. Panoramic radiography is considered the standard radiographic examination for treatment planning of implant patients, because it imparts a low dose while giving the best radiographic survey. Periapical radiographs are used to elucidate details or to complete the findings obtained from the panoramic radiograph. Other radiographic methods, such as conventional film tomography or computed tomography, are applied only in special circumstances, film tomography being preferred for smaller regions of interest and computed tomography being justified for the complete maxilla or mandible when methods for dose reduction are followed. During follow-up, intraoral radiography is considered the standard radiographic examination, particularly for implants in the anterior region of the maxilla or for scientific studies. In patients requiring more than 5 periapical images, panoramic radiography is preferred.
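The relative-risk figures quoted above can be combined arithmetically to compare examination protocols. The short sketch below uses only the ratios stated in the abstract (periapical = 20% of panoramic, panoramic = 5% of maxillary CT, a 4-image tomograph series = 13% of CT); the example planning protocol is hypothetical.

```python
"""Worked example of the relative-risk arithmetic reported above. Only the
ratios stated in the abstract are used; no absolute doses are implied."""

# Risk of each modality relative to a panoramic radiograph (= 1.0)
RELATIVE_RISK = {
    "periapical": 0.20,          # 20% of a panoramic radiograph
    "panoramic": 1.00,
    "ct_maxilla": 1.00 / 0.05,   # panoramic is 5% of maxillary CT -> CT = 20 panoramics
}
# A series of 4 conventional tomographs carries 13% of the CT risk
RELATIVE_RISK["tomograph_series_4"] = 0.13 * RELATIVE_RISK["ct_maxilla"]

def protocol_risk(protocol):
    """Sum the relative risk of a list of (modality, count) pairs."""
    return sum(RELATIVE_RISK[m] * n for m, n in protocol)

# Hypothetical planning protocol: one panoramic plus three periapical radiographs
planning = [("panoramic", 1), ("periapical", 3)]
print(f"protocol ~ {protocol_risk(planning):.1f}x the risk of a panoramic radiograph")
print(f"CT of the maxilla ~ {RELATIVE_RISK['ct_maxilla']:.0f}x a panoramic radiograph")
```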

Relevance:

30.00%

Publisher:

Abstract:

High flexural strength and stiffness can be achieved by forming a thin panel into a wave shape perpendicular to the bending direction. The use of corrugated shapes to gain flexural strength and stiffness is common in metal and reinforced plastic products. However, there is no commercial production of corrugated wood composite panels. This research focuses on the application of corrugated shapes to wood strand composite panels. Beam theory, classical plate theory and finite element models were used to analyze the bending behavior of corrugated panels. The most promising shallow corrugated panel configuration was identified based on structural performance and compatibility with construction practices. The corrugation profile selected has a wavelength of 8”, a channel depth of ¾”, a sidewall angle of 45 degrees, and a panel thickness of 3/8”. 16”x16” panels were produced using random mats and 3-layer aligned mats with surface flakes parallel to the channels. Strong axis and weak axis bending tests were conducted. The test results indicate that flake orientation has little effect on the strong axis bending stiffness. The 3/8” thick random mat corrugated panels exhibit bending stiffness (400,000 lbs-in²/ft) and bending strength (3,000 in-lbs/ft) higher than 23/32” or 3/4” thick APA Rated Sturd-I-Floor with a 24” o.c. span rating. Shear and bearing test results show that the corrugated panel can withstand more than 50 psf of uniform load at 48” joist spacings. Molding trials on 16”x16” panels provided data for full-size panel production. Full-size 4’x8’ shallow corrugated panels were produced with only minor changes to the current oriented strandboard manufacturing process. Panel testing was done to simulate floor loading during construction, without a top underlayment layer, and during occupancy, with an underlayment over the panel to form a composite deck. Flexural tests were performed in single-span and two-span bending with line loads applied at mid-span. The average strong axis bending stiffness and bending strength of the full-size corrugated panels (without the underlayment) were over 400,000 lbs-in²/ft and 3,000 in-lbs/ft, respectively. The composite deck system, which consisted of OSB sheathing (15/32” thick) nail-glued (using 3d ringshank nails and AFG-01 subfloor adhesive) to the corrugated subfloor, achieved about 60% of the full composite stiffness, resulting in about 3 times the bending stiffness of the corrugated subfloor (1,250,000 lbs-in²/ft). Based on the LRFD design criteria, the corrugated composite floor system can carry 40 psf of unfactored uniform loads, limited by the L/480 deflection limit state, at 48” joist spacings. Four 10-ft long composite T-beam specimens were built and tested for the composite action and the load sharing between a 24” wide corrugated deck system and the supporting I-joist. The average bending stiffness of the composite T-beam was 1.6 times higher than the bending stiffness of the I-joist. An 8-ft x 12-ft mock-up floor was built to evaluate construction procedures. The assembly of the composite floor system is relatively simple. The corrugated composite floor system might be able to offset the lower labor costs of the single-layer Sturd-I-Floor through material savings. However, no conclusive result can be drawn in terms of construction costs at this point without an in-depth cost analysis of the two systems.
The shallow corrugated composite floor system might be a potential alternative to the Sturd-I-Floor in the near future because of its excellent flexural stiffness.
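A simplified serviceability check conveys how a stiffness value such as 1,250,000 lbs-in²/ft translates into an allowable uniform load at an L/480 deflection limit. The sketch below assumes a single simply supported 48” span, which is conservative compared with the continuous, composite configurations tested in the work, so it understates the reported capacity.

```python
"""Simplified serviceability check for a floor deck: a single-span, simply
supported strip under uniform load, limited to L/480 deflection. This is a
sketch only; the thesis results are based on LRFD design checks and tested
single- and two-span configurations, not on this idealized case."""

def allowable_uniform_load_psf(EI_lb_in2_per_ft, span_in, deflection_limit_ratio=480.0):
    """Uniform load (psf) at which midspan deflection reaches span/limit.
    delta = 5 w L^4 / (384 EI) for a simply supported span."""
    w_per_inch = 384.0 * EI_lb_in2_per_ft / (5.0 * deflection_limit_ratio * span_in ** 3)
    return w_per_inch * 12.0  # lb/in of span per ft of width -> lb/ft^2

# Composite deck stiffness from the abstract, 48 in joist spacing
print(f"{allowable_uniform_load_psf(1_250_000, 48):.0f} psf (conservative single-span estimate)")
```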

Relevance:

30.00%

Publisher:

Abstract:

The responses of many real-world problems can only be evaluated subject to noise. In order to make an efficient optimization of these problems possible, intelligent optimization strategies that successfully cope with noisy evaluations are required. In this article, a comprehensive review of existing kriging-based methods for the optimization of noisy functions is provided. In total, ten methods for choosing the sequential samples are described using a unified formalism. They are compared on analytical benchmark problems for which the usual assumption of homoscedastic Gaussian noise made in the underlying models is met. Different problem configurations (noise level, budget of observations, initial sample size) and covariance functions are considered. It is found that the choices of the initial sample size and the covariance function are not critical. The choice of the method, however, can result in significant differences in performance. In particular, the three most intuitive criteria are found to be poor alternatives. Although no criterion is found to be consistently more efficient than the others, two specialized methods appear more robust on average.
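A minimal kriging-based loop for a noisy function is sketched below using scikit-learn: a Gaussian process with a White (noise) kernel is refitted after each evaluation, and the next sample maximizes expected improvement over the best predicted mean (a "plug-in" incumbent). This is a generic illustration of the class of methods reviewed, not any particular criterion from the article; the test function, kernel settings and budget are arbitrary.

```python
"""Sketch of kriging-based optimization of a noisy function: GP with a noise
kernel plus an expected-improvement criterion on the predicted-mean incumbent.
Illustrative only; not a specific method from the review."""
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern, WhiteKernel

rng = np.random.default_rng(0)

def noisy_f(x):
    """Toy objective observed with homoscedastic Gaussian noise."""
    return np.sin(3 * x) + x**2 - 0.7 * x + rng.normal(0, 0.2, size=np.shape(x))

# Initial design
X = rng.uniform(-1, 2, size=(6, 1))
y = noisy_f(X[:, 0])

kernel = Matern(length_scale=0.5, nu=2.5) + WhiteKernel(noise_level=0.04)
grid = np.linspace(-1, 2, 200).reshape(-1, 1)

for _ in range(15):  # sequential sampling budget
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
    mu, sd = gp.predict(grid, return_std=True)
    best = mu.min()                      # plug-in incumbent: best *predicted* mean
    imp = best - mu
    z = imp / np.maximum(sd, 1e-12)
    ei = imp * norm.cdf(z) + sd * norm.pdf(z)
    x_next = grid[np.argmax(ei)]
    X = np.vstack([X, x_next])
    y = np.append(y, noisy_f(x_next[0]))

# Final refit including the last observation, then report the predicted minimizer
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
mu, _ = gp.predict(grid, return_std=True)
print("estimated minimizer:", float(grid[np.argmin(mu), 0]))
```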

Relevance:

30.00%

Publisher:

Abstract:

Discrepancies in finite-element model predictions of bone strength may be attributed to the simplified modeling of bone as an isotropic structure, due to the resolution limitations of clinical-level Computed Tomography (CT) data. The aim of this study is to calculate the preferential orientations of bone (the principal directions) and the extent to which bone is deposited more in one direction than another (the degree of anisotropy). Using 100 femoral trabecular samples, the principal directions and degree of anisotropy were calculated with a Gradient Structure Tensor (GST) and a Sobel Structure Tensor (SST) using clinical-level CT. The results were compared against those calculated with the gold-standard Mean-Intercept-Length (MIL) fabric tensor using micro-CT. There was no significant difference between the GST and SST in the calculation of the main principal direction (median error = 28°), and the error was inversely correlated with the degree of transverse isotropy (r = −0.34, p < 0.01). The degree of anisotropy measured using the structure tensors was weakly correlated with the MIL-based measurements (r = 0.2, p < 0.001). Combining the principal directions with the degree of anisotropy resulted in a significant increase in the correlation of the tensor distributions (r = 0.79, p < 0.001). Both structure tensors were robust against simulated noise, kernel sizes, and bone volume fraction. We recommend the use of the GST because of its computational efficiency and ease of implementation. This methodology holds promise for predicting the structural anisotropy of bone in areas with a high degree of anisotropy, and may improve the in vivo characterization of bone.
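The GST computation itself is compact, which is part of why it is recommended here: smooth the volume, take image gradients, average their outer products into a 3x3 tensor, and read the principal direction and an anisotropy measure off its eigendecomposition. In the sketch below the smoothing scale, the synthetic test volume, and the degree-of-anisotropy definition (1 − λmin/λmax) are illustrative choices, not the study's exact parameters.

```python
"""Sketch of a sample-level Gradient Structure Tensor (GST) on a 3-D volume.
Kernel size and the DA definition are illustrative assumptions."""
import numpy as np
from scipy import ndimage

def gradient_structure_tensor(volume, sigma=1.0):
    """Average outer product of the image gradient over the whole sample."""
    vol = ndimage.gaussian_filter(volume.astype(float), sigma)
    grads = np.gradient(vol)                       # gradients along axes 0, 1, 2
    T = np.empty((3, 3))
    for i in range(3):
        for j in range(3):
            T[i, j] = np.mean(grads[i] * grads[j])
    evals, evecs = np.linalg.eigh(T)               # ascending eigenvalues
    principal_dir = evecs[:, 0]                    # least intensity variation = main structural direction
    degree_of_anisotropy = 1.0 - evals[0] / evals[-1]   # illustrative DA in [0, 1]
    return principal_dir, degree_of_anisotropy

# Synthetic test volume: rod-like structures aligned with the last (x) axis
vol = np.zeros((40, 40, 40))
vol[::5, ::5, :] = 1.0
direction, da = gradient_structure_tensor(vol)
print("principal direction:", np.round(direction, 2), " DA:", round(da, 2))
```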

Relevance:

30.00%

Publisher:

Abstract:

Background Concurrent cardiac diseases are frequent among elderly patients and invite simultaneous treatment to ensure an overall favourable patient outcome. Aim To investigate the feasibility of combined single-session percutaneous cardiac interventions in the era of transcatheter aortic valve implantation (TAVI). Methods This prospective, case–control study included 10 consecutive patients treated with TAVI, left atrial appendage occlusion and percutaneous coronary interventions. Some in addition had patent foramen ovale or atrial septal defect closure in the same session. The patients were matched in a 1:10 manner with TAVI-only cases treated within the same time period at the same institution regarding their baseline factors. The outcome was validated according to the Valve Academic Research Consortium (VARC) criteria. Results Procedural time (126±42 vs 83±40 min, p=0.0016), radiation time (34±8 vs 22±12 min, p=0.0001) and contrast dye (397±89 vs 250±105 mL, p<0.0001) were higher in the combined intervention group than in the TAVI-only group. Despite these drawbacks, no difference in the VARC endpoints was evident during the in-hospital period and after 30 days (VARC combined safety endpoint 32% for TAVI only and 20% for combined intervention, p=1.0). Conclusions Transcatheter treatment of combined cardiac diseases is feasible even in a single session in a high-volume centre with experienced operators.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND AND PURPOSE Eligibility criteria are a key factor for the feasibility and validity of clinical trials. We aimed to develop an online tool to assess the potential effect of inclusion and exclusion criteria on the proportion of patients eligible for an acute stroke trial. METHODS We identified relevant inclusion and exclusion criteria of acute stroke trials. Based on these criteria and using a cohort of 1537 consecutive patients with acute ischemic stroke from 3 stroke centers, we developed a web portal feasibility platform for stroke studies (FePASS) to estimate proportions of eligible patients for acute stroke trials. We applied the FePASS resource to calculate the proportion of patients eligible for 4 recent stroke studies. RESULTS Sixty-one eligibility criteria were derived from 30 trials on acute ischemic stroke. FePASS, publicly available at http://fepass.uni-muenster.de, displays the proportion of patients in percent to assess the effect of varying values of relevant eligibility criteria, for example, age, symptom onset time, National Institutes of Health Stroke Scale, and prestroke modified Rankin Scale, on this proportion. The proportion of eligible patients for 4 recent stroke studies ranged from 2.1% to 11.3%. Slight variations of the inclusion criteria could substantially increase the proportion of eligible patients. CONCLUSIONS FePASS is an open-access online resource to assess the effect of inclusion and exclusion criteria on the proportion of eligible patients for a stroke trial. FePASS can help to design stroke studies, optimize eligibility criteria, and estimate the potential recruitment rate.
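The core calculation behind such a feasibility tool is a filter over a patient cohort followed by a proportion. The sketch below applies a few adjustable criteria (age, onset time, NIHSS range, pre-stroke mRS) to a synthetic cohort; the cohort and the criterion values are made up for illustration and are not FePASS data or any specific trial's protocol.

```python
"""Sketch of an eligibility-feasibility calculation on a synthetic stroke cohort."""
import random

random.seed(1)
# Synthetic cohort: age, hours from symptom onset, NIHSS, pre-stroke mRS
cohort = [
    {
        "age": random.randint(35, 95),
        "onset_h": random.uniform(0.5, 24.0),
        "nihss": random.randint(0, 30),
        "prestroke_mrs": random.choice([0, 0, 0, 1, 1, 2, 3]),
    }
    for _ in range(1537)
]

def eligible_fraction(cohort, max_age, max_onset_h, nihss_range, max_mrs):
    """Percentage of patients meeting all (hypothetical) inclusion criteria."""
    lo, hi = nihss_range
    n = sum(
        1
        for p in cohort
        if p["age"] <= max_age
        and p["onset_h"] <= max_onset_h
        and lo <= p["nihss"] <= hi
        and p["prestroke_mrs"] <= max_mrs
    )
    return 100.0 * n / len(cohort)

# Tightening or relaxing a single criterion shows its effect on recruitment
for max_onset in (4.5, 6.0, 9.0):
    pct = eligible_fraction(cohort, max_age=80, max_onset_h=max_onset,
                            nihss_range=(6, 25), max_mrs=1)
    print(f"onset <= {max_onset} h: {pct:.1f}% eligible")
```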

Relevance:

30.00%

Publisher:

Abstract:

Introduction: Schizophrenia patients frequently suffer from complex motor abnormalities, including fine and gross motor disturbances, abnormal involuntary movements, neurological soft signs and parkinsonism. These symptoms occur early in the course of the disease, continue in chronic patients and may deteriorate with antipsychotic medication. Furthermore, gesture performance is impaired in patients, including the pantomime of tool use. Whether schizophrenia patients show difficulties in actual tool use has not yet been investigated. Human tool use is complex and relies on a network of distinct and distant brain areas. We therefore aimed to test whether schizophrenia patients have difficulties in tool use and to assess associations with structural brain imaging using voxel-based morphometry (VBM) and tract-based spatial statistics (TBSS). Methods: In total, 44 patients with schizophrenia (DSM-5 criteria; 59% men, mean age 38) underwent structural MR imaging and performed the Tool Use test. The test examines the use of a scoop and a hammer in three conditions: pantomime (without the tool), demonstration (with the tool) and actual use (with a recipient object). T1-weighted images were processed using SPM8 and DTI data using FSL TBSS routines. To assess structural alterations related to impaired tool use, we first compared gray matter (GM) volume in VBM and white matter (WM) integrity in TBSS data of patients with and without difficulties in actual tool use. Next, we explored correlations of tool use scores with VBM and TBSS data. Group comparisons were family-wise error corrected for multiple tests. Correlations were uncorrected (p < 0.001) with a minimum cluster threshold of 17 voxels (equivalent to a map-wise false positive rate of alpha < 0.0001 using a Monte Carlo procedure). Results: Tool use was impaired in schizophrenia (43.2% pantomime, 11.6% demonstration, 11.6% use). Impairment was related to reduced GM volume and WM integrity. Whole-brain group analyses detected an effect in the SMA. Correlations of tool use scores and brain structure revealed alterations in brain areas of the dorso-dorsal pathway (superior occipital gyrus, superior parietal lobule, and dorsal premotor area) and the ventro-dorsal pathway (middle occipital gyrus, inferior parietal lobule) of the action network, as well as the insula and the left hippocampus. Furthermore, significant correlations within connecting fiber tracts, particularly alterations within the bilateral superior and anterior corona radiata as well as the corpus callosum, were associated with tool use performance. Conclusions: Tool use performance was impaired in schizophrenia, and this impairment was associated with reduced GM volume in the action network. Our results are in line with reports of impaired tool use in patients with brain lesions, particularly of the dorso-dorsal and ventro-dorsal streams of the action network. In addition, an association between tool use and WM integrity was shown within fiber tracts connecting regions important for planning and executing tool use. Furthermore, the hippocampus is part of a brain system responsible for spatial memory and navigation. The results suggest that structural brain alterations in the common praxis network contribute to impaired tool use in schizophrenia.
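The cluster-extent correction mentioned in the methods can be illustrated with a small Monte Carlo sketch: simulate smooth noise volumes, threshold them voxel-wise at p < 0.001, and record the largest suprathreshold cluster to find an extent that is rarely exceeded by chance. Volume size, smoothness, the number of simulations and the alpha used below are illustrative (the study's map-wise alpha < 0.0001 would require far more simulations); this is not the study's actual procedure or software.

```python
"""Monte Carlo sketch of a cluster-extent threshold for smooth noise maps.
All parameters are illustrative, not the study's settings."""
import numpy as np
from scipy import ndimage
from scipy.stats import norm

rng = np.random.default_rng(42)
shape, fwhm_vox, n_sims = (40, 48, 40), 2.0, 1000
z_thresh = norm.ppf(1 - 0.001)              # voxel-wise p < 0.001, one-sided
sigma = fwhm_vox / 2.3548                   # FWHM -> Gaussian sigma

max_cluster = np.zeros(n_sims, dtype=int)
for i in range(n_sims):
    noise = ndimage.gaussian_filter(rng.standard_normal(shape), sigma)
    noise /= noise.std()                    # re-standardize after smoothing
    labels, n_lab = ndimage.label(noise > z_thresh)
    if n_lab:
        max_cluster[i] = np.bincount(labels.ravel())[1:].max()

# Cluster extent rarely reached by chance anywhere in the map.  The study used
# alpha < 0.0001, which would need far more simulations; alpha = 0.05 here.
alpha = 0.05
k = int(np.quantile(max_cluster, 1 - alpha)) + 1
print("minimum cluster extent:", k, "voxels")
```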

Relevance:

30.00%

Publisher:

Abstract:

INTRODUCTION Conventional 2-dimensional radiography uses defined criteria for outcome assessment of apical surgery. However, these radiographic healing criteria are not applicable for 3-dimensional radiography. The present study evaluated the repeatability and reproducibility of new cone-beam computed tomographic (CBCT)-based healing criteria for the judgment of periapical healing 1 year after apical surgery. METHODS CBCT scans taken 1 year after apical surgery (61 roots of 54 teeth in 54 patients, mean age = 54.4 years) were evaluated by 3 blinded and calibrated observers using 4 different indices. Reformatted buccolingual CBCT sections through the longitudinal axis of the treated roots were analyzed. Radiographic healing was assessed at the resection plane (R index), within the apical area (A index), of the cortical plate (C index), and regarding a combined apical-cortical area (B index). All readings were performed twice to calculate the intraobserver agreement (repeatability). Second-time readings were used for analyzing the interobserver agreement (reproducibility). Various statistical tests (Cohen kappa, Fleiss kappa, Fisher, and Spearman) were performed to measure the intra- and interobserver concurrence, the variability of score ratios, and the correlation of indices. RESULTS For all indices, the rates of identical first- and second-time scores were always higher than 80% (intraobserver Cohen κ values ranging from 0.793 to 0.963). The B index (94.0%) showed the highest intraobserver agreement. Regarding interobserver agreement, the highest rate was found for the B index (72.1%). The Fleiss κ values for the R and B indices exhibited substantial agreement (0.626 and 0.717, respectively), whereas the values for the A and C indices showed moderate agreement (0.561 and 0.573, respectively). The Spearman correlation coefficients for the R, A, C, and B indices all exhibited a moderate to very strong correlation, with the highest correlation found between the C and B indices (rs = 0.8069). CONCLUSIONS All indices showed excellent intraobserver agreement (repeatability). With regard to interobserver agreement (reproducibility), the B index (healing of apical and cortical defects combined) and the R index (healing at the resection plane) showed substantial congruence and are thus recommended for future studies using buccolingual CBCT sections for radiographic outcome assessment of apical surgery.
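The agreement statistics reported here are straightforward to reproduce on score data. The sketch below computes Cohen's kappa for one observer's repeated readings and a Spearman correlation between two indices, using synthetic ordinal scores (the real study rated 61 roots with three observers).

```python
"""Sketch of the agreement statistics: Cohen's kappa for repeatability and
Spearman correlation between indices. Scores are synthetic examples."""
from scipy.stats import spearmanr
from sklearn.metrics import cohen_kappa_score

# Hypothetical first- and second-time B-index scores of one observer
first_reading  = [1, 1, 2, 4, 3, 1, 2, 2, 1, 4, 3, 1, 1, 2, 1]
second_reading = [1, 1, 2, 4, 3, 1, 2, 3, 1, 4, 3, 1, 1, 2, 1]
print("intraobserver kappa:", round(cohen_kappa_score(first_reading, second_reading), 3))

# Hypothetical C-index vs B-index scores (second-time readings) for the same roots
c_index = [1, 1, 2, 4, 3, 1, 2, 3, 1, 4, 3, 2, 1, 2, 1]
b_index = [1, 1, 2, 4, 3, 1, 2, 2, 1, 4, 4, 2, 1, 2, 1]
rho, p = spearmanr(c_index, b_index)
print("Spearman rs:", round(rho, 3), "p:", round(p, 4))
```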

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this study is to determine the critical wear levels of the contact wire of the catenary on metropolitan lines. The study has focussed on the zones of the contact wire where localised wear is produced, normally associated with the appearance of electric arcs. To this end, a finite element model has been developed to study the dynamics of pantograph-catenary interaction. The model includes a zone of localised wear and a singularity in the contact wire in order to simulate the worst-case scenario from the point of view of stresses. In order to consider the different stages of the wire wear process, different depths and widths of the localised wear zone were defined. The results of the dynamic simulations performed for each stage of wear allow the minimum resistant cross-sectional area of the contact wire to be determined, at which stresses exceed the allowable stress. The maximum tensile stress reached in the contact wire shows a clear sensitivity to the size of the local wear zone, defined by its width and depth. In this way, if the wear measurements taken with an overhead line recording vehicle are analysed, it will be possible to calculate the potential breakage risk of the wire. A strong dependence on the tensile force applied to the contact wire has also been observed. These results will allow priorities to be set for replacing the most critical sections of wire, thereby making maintenance much more efficient. The results obtained show that the wire replacement criteria currently applied have turned out to be appropriate, although in some wear scenarios these criteria could be further adjusted, thereby prolonging the life cycle of the contact wire.
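A purely static version of the section check can be sketched as follows: wear is idealized as a flat cut of given depth on a round wire, the residual area is the circle minus the removed circular segment, and the axial stress from the tensioning force is compared with an allowable value. Wire size, tension and allowable stress are illustrative, and the sketch ignores the dynamic stresses, the wear-zone width and the wire singularity that the finite element simulations account for.

```python
"""Simplified static check of a worn contact wire cross-section.
All numbers are illustrative; the thesis uses dynamic simulation."""
import math

def residual_area_mm2(diameter_mm, wear_depth_mm):
    """Area of a circle minus the circular segment removed by flat wear."""
    r, h = diameter_mm / 2.0, wear_depth_mm
    segment = r**2 * math.acos((r - h) / r) - (r - h) * math.sqrt(2 * r * h - h**2)
    return math.pi * r**2 - segment

# Illustrative data: 150 mm^2-class copper wire (~13.8 mm diameter), 15 kN tension
tension_N, allowable_MPa = 15_000.0, 180.0
for depth in (2.0, 4.0, 6.0, 8.0):
    area = residual_area_mm2(13.8, depth)
    stress = tension_N / area          # N/mm^2 = MPa (axial component only)
    flag = "REPLACE" if stress > allowable_MPa else "ok"
    print(f"wear depth {depth:.1f} mm -> area {area:6.1f} mm^2, stress {stress:5.1f} MPa [{flag}]")
```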

Relevance:

30.00%

Publisher:

Abstract:

After the devastating earthquake of 12 January 2010 in Port-au-Prince, Haiti, the local authorities, numerous NGOs and national and international organizations have been working on strategies to minimize the country's high seismic risk. To do so it is first necessary to estimate the risk associated with possible future earthquakes, evaluating the degree of losses they could generate, in order to gauge the scale of the catastrophe and act accordingly, both in terms of preventive measures and of the adoption of emergency plans. In this sense, this Master's Thesis provides a detailed analysis of the seismic risk associated with a future earthquake that could occur with reasonable probability and cause major damage in Port-au-Prince. A risk calculation methodology adapted to the conditions of the area is proposed, with models calibrated using data from the 2010 event. The work was carried out within the framework of the Sismo-Haití cooperation project, funded by the Universidad Politécnica de Madrid, which began ten months after the 2010 earthquake in response to a request for help from the Haitian government. The risk calculation requires two inputs: the seismic hazard, i.e. the ground motion expected for the defined scenario (an earthquake of a given magnitude and location), and the elements exposed to that hazard (a classification of the building stock into construction typologies, together with their vulnerability). The vulnerability of these typologies is described by means of damage functions: capacity spectra, which represent their behaviour under the horizontal forces induced by earthquakes, and fragility curves, which represent the probability that the structures suffer damage when the maximum horizontal inter-storey displacement due to that horizontal force is reached. The proposed methodology specifies guidelines and criteria to estimate the ground motion, assign the vulnerability and evaluate the damage, covering the three stages of the process. On the one hand, different strong-motion models including the local site effect are considered, and those that best fit the 2010 observations are identified. On the other hand, the building stock is classified into construction typologies on the basis of information collected in a field campaign and of a database provided by the Ministry of Public Works of Haiti, which contains relevant information on all the buildings of the city, resulting in a total of 6 typologies. Finally, the damage is estimated by applying the capacity-demand method implemented in the SELENA software (Molina et al., 2010). First, using damage data from the 2010 earthquake, the proposed seismic risk calculation model was calibrated: four strong-motion models, three soil-type models and a set of damage functions. Then, with the calibrated model, a deterministic seismic scenario corresponding to a possible earthquake with an epicentre close to Port-au-Prince was simulated. The results show that structural damage would be considerable and could lead to economic and human losses with a great impact on the country, which highlights the existing high structural vulnerability. This result will be provided to the local authorities, constituting a solid basis for decision-making and for the adoption of risk prevention and mitigation policies. It is recommended that efforts be directed towards reducing the structural vulnerability (by reinforcing vulnerable buildings and adopting a seismic-resistant building code) and towards the development of emergency plans.
Abstract: After the devastating 12 January 2010 earthquake that hit the city of Port-au-Prince, Haiti, strategies to minimize the high seismic risk are being developed by local authorities, NGOs, and national and international institutions. Two important tasks to reach this objective are, on the one hand, the evaluation of the seismic risk associated with possible future earthquakes in order to know the dimensions of the catastrophe; on the other hand, the design of preventive measures and emergency plans to minimize the consequences of such events. In this sense, this Master's Thesis provides a detailed estimation of the damage that a possible future earthquake would cause in Port-au-Prince. A methodology to calculate the seismic risk is proposed, adapted to the study area conditions. This methodology has been calibrated using data from the 2010 earthquake. The work has been conducted in the frame of the Sismo-Haiti cooperative project, supported by the Technical University of Madrid, which started ten months after the 2010 earthquake as an answer to an aid call of the Haitian government. The seismic risk calculation requires two inputs: the seismic hazard (expected ground motion due to a scenario earthquake given by magnitude and location) and the elements exposed to the hazard (classification of the building stock into building typologies, as well as their vulnerability). This vulnerability is described through damage functions: capacity curves, which represent the structural performance against the horizontal forces caused by earthquakes, and fragility curves, which represent the probability of damage as the structure reaches the maximum spectral displacement due to the horizontal force. The proposed methodology specifies certain guidelines and criteria to estimate the ground motion, assign the vulnerability, and evaluate the damage, covering the whole process. Firstly, different ground motion prediction equations including the local effect are considered, and the ones that best correlate with the observations of the 2010 earthquake are identified. Secondly, the classification of building typologies is made by using the information collected during a field campaign, as well as a database provided by the Ministry of Public Works of Haiti. This database contains relevant information about all the buildings in the city, leading to a total of 6 different typologies. Finally, the damage is estimated using the capacity-spectrum method as implemented in the software SELENA (Molina et al., 2010). Data about the damage caused by the 2010 earthquake have been used to calibrate the proposed calculation model: different choices of ground motion relationships, soil models, and damage functions. Then, with the calibrated model, a deterministic scenario corresponding to an epicenter close to Port-au-Prince has been simulated. The results show high structural damage and therefore point out the high structural vulnerability of the city. Besides, the economic and human losses associated with the damage would cause a great impact in the country. This result will be provided to the Haitian Government, constituting a scientific base for decision making and for the adoption of measures to prevent and mitigate the seismic risk. It is highly recommended to drive efforts towards the quality control of new buildings (through reinforcement and construction according to a seismic code) and towards the development of emergency planning.
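The fragility-curve step of the capacity-spectrum calculation reduces to evaluating lognormal exceedance curves at the performance-point spectral displacement and differencing them into damage-state probabilities, as sketched below. The median displacements and dispersions are illustrative values for a hypothetical typology, not the functions calibrated for Port-au-Prince.

```python
"""Sketch of the fragility-curve step of a capacity-spectrum risk calculation.
Median displacements and dispersions are illustrative assumptions."""
import math

def p_exceed(sd, median_sd, beta):
    """P(damage >= state | spectral displacement sd), lognormal fragility curve."""
    return 0.5 * (1.0 + math.erf(math.log(sd / median_sd) / (beta * math.sqrt(2.0))))

# Hypothetical fragility parameters for one building typology:
# damage state -> (median spectral displacement in cm, lognormal dispersion)
fragility = {
    "slight":    (1.0, 0.7),
    "moderate":  (2.5, 0.7),
    "extensive": (6.0, 0.8),
    "complete":  (15.0, 0.9),
}

sd_pp = 4.0   # spectral displacement (cm) at the capacity-demand performance point

states = list(fragility)                                   # ordered slight .. complete
exceed = [p_exceed(sd_pp, m, b) for m, b in fragility.values()]

# Convert exceedance probabilities into discrete damage-state probabilities
discrete = {"none": 1.0 - exceed[0]}
for i, state in enumerate(states):
    upper = exceed[i + 1] if i + 1 < len(states) else 0.0
    discrete[state] = exceed[i] - upper

for state, p in discrete.items():
    print(f"P({state}) = {p:.2f}")
```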

Relevance:

30.00%

Publisher:

Abstract:

Using the Bayesian approach as the model selection criterion, the main purpose of this study is to establish a practical road accident model that can provide better interpretation and prediction performance. For this purpose we use a structural explanatory model with an autoregressive error term. The model estimation is carried out through Bayesian inference, and the best model is selected based on goodness-of-fit measures. To cross-validate the model estimation, further prediction analyses were carried out. As the road safety measure, the number of fatal accidents in Spain during 2000-2011 was employed. The results of the variable selection process show that the factors explaining fatal road accidents are mainly exposure, economic factors, and surveillance and legislative measures. The model selection shows that the impact of economic factors on fatal accidents during the period under study has been higher than that of surveillance and legislative measures.
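A structural explanatory model with an autoregressive error term can be estimated with fairly little machinery; the sketch below fits a regression with AR(1) errors to synthetic data by random-walk Metropolis. The data, priors and single covariate are illustrative and do not reproduce the thesis's model, explanatory variables, or Bayesian model-selection procedure.

```python
"""Minimal sketch of Bayesian estimation of a regression with AR(1) errors,
using a random-walk Metropolis sampler on synthetic data. Illustrative only."""
import numpy as np

rng = np.random.default_rng(3)

# Synthetic yearly data: y_t = b0 + b1 * x_t + u_t,  u_t = rho * u_{t-1} + e_t
T = 60
x = rng.normal(size=T)
rho_true, beta_true, sd_true = 0.6, np.array([2.0, -1.5]), 0.5
u = np.zeros(T)
for t in range(1, T):
    u[t] = rho_true * u[t - 1] + rng.normal(0, sd_true)
y = beta_true[0] + beta_true[1] * x + u

def log_post(theta):
    """Conditional AR(1) log-likelihood (dropping the first observation) + weak priors."""
    b0, b1, rho, log_sd = theta
    if abs(rho) >= 1:
        return -np.inf                      # stationarity constraint (implicit uniform prior)
    sd = np.exp(log_sd)
    resid = y - b0 - b1 * x
    e = resid[1:] - rho * resid[:-1]        # AR(1) innovations
    loglik = -0.5 * np.sum(e**2) / sd**2 - (T - 1) * np.log(sd)
    logprior = -0.5 * (b0**2 + b1**2) / 100.0 - 0.5 * log_sd**2 / 4.0
    return loglik + logprior

# Random-walk Metropolis over (b0, b1, rho, log_sd)
theta = np.zeros(4)
lp = log_post(theta)
samples = []
for i in range(20000):
    prop = theta + rng.normal(0, 0.05, size=4)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    if i >= 5000:                           # discard burn-in
        samples.append(theta.copy())

samples = np.array(samples)
est = np.column_stack([samples[:, :3], np.exp(samples[:, 3])]).mean(axis=0)
for name, value in zip(["b0", "b1", "rho", "sigma"], est):
    print(f"{name}: {value:.2f}")
```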