9 results for Probabilistic methodology
in Repositório Científico do Instituto Politécnico de Lisboa - Portugal
Abstract:
In this article, we present the first study on probabilistic tsunami hazard assessment for the Northeast (NE) Atlantic region related to earthquake sources. The methodology combines probabilistic seismic hazard assessment, tsunami numerical modeling, and statistical approaches. We consider three main tsunamigenic areas, namely the Southwest Iberian Margin, the Gloria fault, and the Caribbean. For each tsunamigenic zone, we derive the annual recurrence rate for each magnitude range, from Mw 8.0 up to Mw 9.0 at regular intervals, using a Bayesian method that incorporates seismic information from historical and instrumental catalogs. A numerical code solving the shallow water equations is employed to simulate tsunami propagation and compute nearshore wave heights. The probability of exceeding a specific tsunami hazard level during a given time period is calculated using the Poisson distribution. The results are presented in terms of the probability of exceedance of a given tsunami amplitude for 100- and 500-year return periods. The hazard level varies along the NE Atlantic coast, reaching its maximum along the northern segment of the Morocco Atlantic coast, the southern Portuguese coast, and the Spanish coast of the Gulf of Cadiz. We find that the probability that the maximum wave height exceeds 1 m somewhere in the NE Atlantic region reaches 60 % and 100 % for 100- and 500-year return periods, respectively. These probabilities decrease to about 15 % and 50 %, respectively, when considering an exceedance threshold of 5 m for the same return periods.
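To make the exceedance step concrete, the following is a minimal Python sketch of the Poisson computation described above, assuming a purely hypothetical annual exceedance rate (the paper's actual rates come from the Bayesian recurrence analysis, which is not reproduced here):

    import math

    def poisson_exceedance_probability(annual_rate, period_years):
        # Probability of at least one exceedance within the period,
        # assuming events follow a homogeneous Poisson process:
        # P = 1 - exp(-lambda * T)
        return 1.0 - math.exp(-annual_rate * period_years)

    # Hypothetical annual rate at which the 1 m wave-height threshold is
    # exceeded somewhere in the region (illustrative value only).
    rate_1m = 0.009
    for T in (100, 500):
        p = poisson_exceedance_probability(rate_1m, T)
        print(f"P(exceedance in {T} years) = {p:.0%}")

With the invented rate of 0.009 per year, the sketch yields roughly 59 % and 99 % for the 100- and 500-year windows, the same order of magnitude as the 1 m figures quoted above.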
Abstract:
Exposure assessment is an important step of the risk assessment process and has evolved more quickly than perhaps any other aspect of the four-step risk paradigm (hazard identification, exposure assessment, dose-response analysis, and risk characterization). Nevertheless, some epidemiological studies have associated adverse health effects with a chemical exposure despite inadequate or absent exposure quantification. In addition to the metric used, how truly measurements represent exposure depends on the sampling strategy, the random collection of measurements, and the similarity between the measured and unmeasured exposure groups. Two environmental monitoring methodologies for occupational exposure to formaldehyde were used to assess the influence of metric selection on exposure assessment and, consequently, on the risk assessment process.
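As a toy illustration of why metric selection matters, the sketch below computes two common exposure metrics, a time-weighted average and a peak value, from the same hypothetical series of formaldehyde measurements; all concentrations and limit values are invented for illustration and are not taken from the study:

    # Hypothetical 1-hour formaldehyde samples over a work shift (ppm).
    samples_ppm = [0.08, 0.12, 0.35, 0.10, 0.09, 0.41, 0.07, 0.11]

    twa = sum(samples_ppm) / len(samples_ppm)  # time-weighted average metric
    peak = max(samples_ppm)                    # peak/ceiling metric

    LIMIT = 0.30  # hypothetical limit value applied to either metric
    print(f"TWA  = {twa:.3f} ppm -> exceeds limit: {twa > LIMIT}")
    print(f"Peak = {peak:.3f} ppm -> exceeds limit: {peak > LIMIT}")

Under these invented numbers, the average-based metric stays below the limit while the peak metric exceeds it, which is exactly the kind of divergence that comparing two monitoring methodologies can expose.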
Abstract:
Collaborative networks are typically formed by heterogeneous and autonomous entities, so it is natural that each member has its own set of core-values. Since these values drive the behaviour of the involved entities, the ability to quickly identify partners with compatible or common core-values is an important element for the success of collaborative networks. However, tools to assess or measure the level of alignment of core-values are lacking. Since the concept of 'alignment' in this context is still ill-defined and multifaceted, three perspectives are discussed. The first uses a causal maps approach to capture, structure, and represent the influence relationships among core-values. This representation provides the basis for measuring alignment in terms of the structural similarity and influence among value systems. The second perspective considers the compatibility and incompatibility among core-values to define the alignment level. Under this perspective we propose a fuzzy inference system to estimate the alignment level, since this approach can deal with variables that are vaguely defined and whose inter-relationships are difficult to specify. Another advantage of this method is the possibility of incorporating expert human judgment in the definition of the alignment level. The last perspective uses a Bayesian belief network method, selected to assess the alignment level based on members' past behaviour. An application example is presented in which the details of each method are discussed.
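As an illustration of the fuzzy-inference perspective, here is a minimal zero-order Sugeno-style sketch in Python; the inputs, membership functions, and rule base are all assumptions made for this example and are not those proposed in the paper:

    def tri(x, a, b, c):
        # Triangular membership function peaking at b.
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def alignment_level(compatible, incompatible):
        # Inputs in [0, 1]: fractions of compatible / incompatible core-values.
        low_c, high_c = tri(compatible, -1, 0, 1), tri(compatible, 0, 1, 2)
        low_i, high_i = tri(incompatible, -1, 0, 1), tri(incompatible, 0, 1, 2)
        # Rules as (firing strength, consequent alignment value).
        rules = [
            (min(high_c, low_i), 1.0),   # many compatible, few incompatible -> high
            (min(low_c, high_i), 0.0),   # few compatible, many incompatible -> low
            (min(high_c, high_i), 0.5),  # conflicting evidence -> medium
            (min(low_c, low_i), 0.5),    # weak evidence either way -> medium
        ]
        num = sum(w * v for w, v in rules)
        den = sum(w for w, _ in rules)
        return num / den if den else 0.5

    print(f"{alignment_level(0.8, 0.1):.2f}")  # ~0.79: mostly compatible value systems

The weighted-average defuzzification keeps the example short; a full Mamdani system with expert-elicited rules follows the same pattern.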
Abstract:
Bearing in mind the potential adverse health effects of ultrafine particles, it is of paramount importance to perform effective monitoring of nanosized particles in several microenvironments, which may include ambient air, indoor air, and occupational environments. In fact, effective and accurate monitoring is the first step to obtaining a data set that can then be used for subsequent evaluations such as risk assessment and epidemiologic studies, and to propose good working practices, such as containment measures, to reduce occupational exposure. This paper presents a useful methodology for monitoring ultrafine particles/nanoparticles in several microenvironments, using online analyzers as well as sampling systems that allow further characterization of the collected nanoparticles. The methodology was validated in three case studies presented in the paper, which assess the monitoring of nanosized particles in the outdoor atmosphere, during cooking operations, and in a welding workshop.
Abstract:
The harmony between the stump and the prosthesis is critical if the prosthesis is to fulfill its function and enable an efficient gait. A well-fitted socket with an efficient and comfortable suspension allows the amputee to continue their activities of daily living and keeps the stump functional, making the correlation between socket and suspension very important for the functionality of the prosthesis, mobility, and overall satisfaction with the device. To our knowledge, the quantitative correlation between all of these factors has not yet been assessed. Aim of the study: to verify and confirm the decision-making process for four different trans-tibial prostheses with suspension systems: Hypobaric (A), PIN (B), Classic Suction (C), and Vacuum Active - VASS (D), according to data provided by gait efficiency (ml O2/kg/m), imaging (pistoning), and amputee perception.
Abstract:
This study focuses on the probabilistic modelling of mechanical properties of prestressing strands, based on data collected from tensile tests carried out at the Laboratório Nacional de Engenharia Civil (LNEC), Portugal, for certification purposes, covering a period of about 9 years of production. The strands studied were produced by six manufacturers from four countries, namely Portugal, Spain, Italy, and Thailand. The variability of the most important mechanical properties is examined, and the results are compared with the recommendations of the Probabilistic Model Code, as well as with the Eurocodes and earlier studies. The results show very low variability, which benefits structural safety. Based on these results, probabilistic models for the most important mechanical properties of prestressing strands are proposed.
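As a minimal illustration of how such variability can be quantified, the sketch below computes the sample moments and coefficient of variation of hypothetical tensile strengths; all values are invented, and neither the LNEC data nor the code-recommended limits are reproduced here:

    import statistics

    # Hypothetical ultimate tensile strengths of strand specimens (MPa).
    f_p = [1880, 1895, 1902, 1875, 1890, 1898, 1885, 1893, 1900, 1887]

    mean = statistics.mean(f_p)
    std = statistics.stdev(f_p)
    cov = std / mean  # coefficient of variation
    print(f"mean = {mean:.0f} MPa, std = {std:.1f} MPa, COV = {cov:.3%}")

A COV well below 1 %, as in this invented sample, is the kind of low variability the study reports; comparing such estimates against the Probabilistic Model Code recommendations is then a direct moment-by-moment check.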
Abstract:
A design methodology for the monolithic integration of inductor-based DC-DC converters is proposed in this paper. A power loss model of the power stage, including the drive circuits, is defined in order to optimize efficiency. Based on this model, and taking as reference a 0.35 μm CMOS process, a buck converter was designed and fabricated. For a given set of operating conditions, the defined power loss model allows the design parameters of the power stage to be optimized, including the gate-driver tapering factor and the width of the power MOSFETs. Experimental results obtained from a buck converter at a 100 MHz switching frequency are presented to validate the proposed methodology.
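To illustrate the kind of trade-off such a loss model captures, here is a simplified Python sketch in which conduction loss falls with MOSFET width while gate-drive loss grows with it; every coefficient is hypothetical and not taken from the paper's 0.35 μm design:

    import math

    I_RMS = 0.3        # A, RMS switch current (assumed)
    R_ON_UNIT = 2.0    # ohm*mm, on-resistance of a 1 mm wide device (assumed)
    C_G_UNIT = 2e-12   # F/mm, gate capacitance per unit width (assumed)
    V_DD = 3.3         # V, supply voltage
    F_SW = 100e6       # Hz, switching frequency

    def total_loss(width_mm):
        p_cond = I_RMS**2 * R_ON_UNIT / width_mm       # conduction loss
        p_gate = C_G_UNIT * width_mm * V_DD**2 * F_SW  # gate-drive loss
        return p_cond + p_gate

    # Setting d(total_loss)/dW = 0 balances the two terms.
    w_opt = math.sqrt(I_RMS**2 * R_ON_UNIT / (C_G_UNIT * V_DD**2 * F_SW))
    print(f"optimal width = {w_opt:.2f} mm, loss = {total_loss(w_opt) * 1e3:.1f} mW")

The real model in the paper also accounts for the gate-driver tapering factor and further loss terms; the closed-form optimum above only shows why an optimal power-MOSFET width exists at a fixed switching frequency.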
Abstract:
Clustering ensemble methods produce a consensus partition of a set of data points by combining the results of a collection of base clustering algorithms. In the evidence accumulation clustering (EAC) paradigm, the clustering ensemble is transformed into a pairwise co-association matrix, thus avoiding the label correspondence problem, which is intrinsic to other clustering ensemble schemes. In this paper, we propose a consensus clustering approach based on the EAC paradigm, which is not limited to crisp partitions and fully exploits the nature of the co-association matrix. Our solution determines probabilistic assignments of data points to clusters by minimizing a Bregman divergence between the observed co-association frequencies and the corresponding co-occurrence probabilities expressed as functions of the unknown assignments. We additionally propose an optimization algorithm to find a solution under any double-convex Bregman divergence. Experiments on both synthetic and real benchmark data show the effectiveness of the proposed approach.
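As background, the evidence accumulation step that produces the co-association matrix can be sketched in a few lines; the toy ensemble below is invented, and the paper's probabilistic assignment via Bregman-divergence minimization is not reproduced:

    import numpy as np

    # Toy ensemble: each row is a label vector from one base clustering run.
    ensemble = np.array([
        [0, 0, 1, 1, 2],  # run 1
        [0, 0, 0, 1, 1],  # run 2
        [1, 1, 2, 2, 2],  # run 3 (label names differ; only co-membership matters)
    ])

    n_runs, n_points = ensemble.shape
    coassoc = np.zeros((n_points, n_points))
    for labels in ensemble:
        # Add 1 for every pair of points placed in the same cluster.
        coassoc += (labels[:, None] == labels[None, :]).astype(float)
    coassoc /= n_runs  # co-association frequencies in [0, 1]
    print(coassoc)

The proposed method then treats these frequencies as observations and finds probabilistic cluster assignments whose implied co-occurrence probabilities minimize a double-convex Bregman divergence to them.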
Abstract:
Master's degree in Cardiovascular Diagnostic and Intervention Technology - Specialization area: Cardiovascular ultrasonography