917 results for FMEA (Failure Mode Effects Analysis)


Relevance: 40.00%
Publisher:
Abstract:

Genital human papillomavirus (HPV) is a public health concern because persistent infection with certain HPV types can cause cervical cancer. In response to a nationwide push for cervical cancer legislation, Texas Governor Rick Perry bypassed the traditional legislative process and issued an executive order mandating compulsory HPV vaccination for all female public school students prior to their entry into the sixth grade. By bypassing the legislative process, Governor Perry did not effectively mitigate the risk-perception issues that arose around the need for and usefulness of the vaccine mandate. This policy paper uses a social policy paradigm to identify perception as the key intervening factor in how the public responds to risk information. To demonstrate how the HPV mandate failed, it analyzes four factors that shape perception and influence the public's response: economics, politics, knowledge, and culture. By understanding the factors that influence the public's perception, public health practitioners and policy makers can more effectively create preventive health policy at the state level.

Relevance: 40.00%
Publisher:
Abstract:

In December 1980, following increasing congressional and constituent interest in problems associated with hazardous waste, the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) was passed. During its development, the legislative initiative was seriously compromised, which resulted in a less exhaustive approach than was originally sought. Still, CERCLA (Superfund), which established, among other things, authority to clean up abandoned waste dumps and to respond to emergencies caused by releases of hazardous substances, was welcomed by many as an important initial law critical to the cleanup of the nation's hazardous waste. Expectations raised by passage of this bill were tragically unmet. By the end of four years, only six sites had been declared by the EPA as cleaned. Seemingly, even those determinations were liberal; of the six sites, two were subsequently identified as requiring further cleanup. This analysis is focused upon the implementation failure of the Superfund. In light of that focus, the discussion develops linkages between flaws in the legislative language and foreclosure of chances for implementation success. Specification of such linkages is achieved through examination of the legislative initiative, identification of its flaws, and characterization of attendant deficits in implementation ability. Subsequent analysis addresses how such legislative frailties might have been avoided, and the attendant regulatory weaknesses that have contributed to implementation failure. Each of these analyses is accomplished through application of an expanded approach to the backward-mapping analytic technique presented by Elmore. Results and recommendations follow. Consideration is devoted to a variety of regulatory issues as well as to those pertinent to legislative and implementation analysis. Problems in assessing legal liability associated with hazardous waste management are presented, as is a detailed review of the legislative development of Superfund and its initial implementation by Gorsuch's EPA.

Relevance: 40.00%
Publisher:
Abstract:

Mixture modeling is commonly used to model categorical latent variables that represent subpopulations in which population membership is unknown but can be inferred from the data. In relatively recent years, the potential of finite mixture models has been applied to time-to-event data. However, the commonly used survival mixture model assumes that the effects of the covariates involved in failure times differ across latent classes, but that the covariate distribution is homogeneous. The aim of this dissertation is to develop a method to examine time-to-event data in the presence of unobserved heterogeneity under a framework of mixture modeling. A joint model is developed to incorporate the latent survival trajectory along with the observed information for the joint analysis of a time-to-event variable, its discrete and continuous covariates, and a latent class variable. It is assumed that the effects of covariates on survival times and the distribution of covariates vary across different latent classes. The unobservable survival trajectories are identified by estimating the probability that a subject belongs to a particular class based on the observed information. We applied this method to a Hodgkin lymphoma study with long-term follow-up and observed four distinct latent classes in terms of long-term survival and distributions of prognostic factors. Our results from simulation studies and from the Hodgkin lymphoma study demonstrated the superiority of our joint model compared with the conventional survival model. This flexible inference method provides more accurate estimation and accommodates unobservable heterogeneity among individuals while taking the interactions between covariates into consideration.
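
As a minimal illustration of the latent-class idea behind such survival mixtures, the sketch below fits a two-component exponential mixture to simulated, uncensored failure times with an EM algorithm. The class count, the exponential form, and all parameter values are assumptions made for the example, not the joint model developed in the dissertation (which additionally lets covariate effects and covariate distributions vary by class).

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate two latent classes with different exponential hazards (illustrative values).
n, pi_true, rate = 2000, 0.4, (0.2, 1.0)          # P(class 1) and class-specific rates
z = rng.random(n) < pi_true                        # latent class membership (unobserved)
t = np.where(z, rng.exponential(1 / rate[1], n), rng.exponential(1 / rate[0], n))

# EM for a two-component exponential survival mixture (no censoring, no covariates).
pi_hat, lam = 0.5, np.array([0.1, 2.0])            # crude starting values
for _ in range(200):
    # E-step: posterior probability that each subject belongs to class 1.
    d1 = pi_hat * lam[1] * np.exp(-lam[1] * t)
    d0 = (1 - pi_hat) * lam[0] * np.exp(-lam[0] * t)
    w = d1 / (d0 + d1)
    # M-step: update the mixing proportion and the class-specific rates.
    pi_hat = w.mean()
    lam = np.array([(1 - w).sum() / ((1 - w) * t).sum(), w.sum() / (w * t).sum()])

print(f"estimated mixing proportion ~ {pi_hat:.2f}, rates ~ {np.round(lam, 2)}")
```

Subjects can then be assigned to the class with the highest posterior probability w, which is the same device the abstract describes for identifying the unobservable survival trajectories from observed information.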

Relevance: 40.00%
Publisher:
Abstract:

This paper introduces a novel method for examining the effects of vertical integration. The basic idea is to estimate the parameters of a vertical entry game. By carefully specifying firms' payoff equations and constructing appropriate tests, it is possible to use estimates on rival profit effects to make inferences about the existence of vertical foreclosure. I estimate the vertical entry model using data from the US generic pharmaceutical industry. The estimates indicate that vertical integration is unlikely to generate anticompetitive foreclosure effects. On the other hand, significant efficiency effects are found to arise from vertical integration. I use the parameter estimates to simulate a policy that bans vertically integrated entry. The simulation results suggest that such a ban is counterproductive; it is likely to reduce entry into smaller markets.
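
Purely to illustrate the flavor of such a counterfactual, the sketch below computes free-entry equilibria in simulated markets under an invented payoff specification and then recomputes entry when integrated entrants are excluded. None of the functional forms, parameter values, or firm counts come from the paper, and the ban is crudely modeled as simply removing the integrated potential entrants.

```python
import numpy as np

rng = np.random.default_rng(1)

def equilibrium_entrants(shifters, market_size, beta=1.0, delta=1.2):
    """Largest N such that the N-th most profitable potential entrant still earns a
    non-negative payoff when N firms are in the market (free-entry equilibrium)."""
    s = np.sort(shifters)[::-1]
    for n in range(len(s), 0, -1):
        if beta * np.log(market_size) - delta * np.log(n) + s[n - 1] >= 0:
            return n
    return 0

markets = np.exp(rng.normal(1.0, 1.0, 500))               # hypothetical market sizes
entrants = {"baseline": [], "ban on integrated entry": []}
for size in markets:
    integrated = rng.normal(0.5, 0.3, 3)                   # integrated entrants: efficiency edge
    independent = rng.normal(0.0, 0.3, 5)                  # non-integrated potential entrants
    entrants["baseline"].append(
        equilibrium_entrants(np.concatenate([integrated, independent]), size))
    entrants["ban on integrated entry"].append(equilibrium_entrants(independent, size))

small = markets < np.median(markets)                       # focus on the smaller markets
for policy, n in entrants.items():
    print(f"{policy}: mean entrants in small markets = {np.asarray(n)[small].mean():.2f}")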

Relevance: 40.00%
Publisher:
Abstract:

The consideration of real operating conditions for the design and optimization of a multijunction solar cell receiver-concentrator assembly is indispensable. Such a requirement involves the need for suitable modeling and simulation tools in order to complement the experimental work and circumvent its well-known burdens and restrictions. Three-dimensional distributed models have been demonstrated in the past to be a powerful choice for the analysis of distributed phenomena in single- and dual-junction solar cells, as well as for the design of strategies to minimize solar cell losses when operating under high concentrations. In this paper, we present the application of these models to the analysis of triple-junction solar cells under real operating conditions. The impact of different chromatic aberration profiles on the short-circuit current of triple-junction solar cells is analyzed in detail using the developed distributed model. Current spreading conditions the impact of a given chromatic aberration profile on the solar cell I-V curve. The focus is on determining the role of current spreading in the connection between the photocurrent profile, subcell voltage and current, and the sheet resistance of the semiconductor layers.
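
To see why current spreading conditions the impact of chromatic aberration, it helps to compare the two limiting cases that a distributed model interpolates between: perfect lateral current spreading (each subcell's photocurrent is fully collected before the series connection limits the total) and purely local current matching (every point is limited by its locally worst subcell). The sketch below contrasts these bounds for made-up Gaussian illumination profiles; the spot widths and normalization are assumptions for illustration only.

```python
import numpy as np

# Illustrative radial illumination profiles for the three subcells of a
# triple-junction cell under chromatic aberration: each subcell is assumed to
# see a differently focused spot (made-up Gaussian widths, arbitrary units).
r = np.linspace(1e-4, 1.0, 2000)          # normalized cell radius
dr = r[1] - r[0]
area_w = 2 * np.pi * r                     # annular area weight

def spot(width):
    """Gaussian spot normalized to unit total photocurrent over the cell."""
    g = np.exp(-(r / width) ** 2)
    return g / np.sum(g * area_w * dr)

profiles = np.vstack([spot(0.15), spot(0.25), spot(0.40)])   # top, middle, bottom subcells

# Limit 1: perfect lateral current spreading -- each subcell collects all of its
# photocurrent, and the series connection is limited by the smallest subcell total.
isc_spreading = np.sum(profiles * area_w * dr, axis=1).min()

# Limit 2: no lateral spreading -- every point is limited by its locally worst
# subcell, and that local minimum is then integrated over the cell area.
isc_local = np.sum(profiles.min(axis=0) * area_w * dr)

print(f"Isc, perfect spreading: {isc_spreading:.3f}")
print(f"Isc, no spreading:      {isc_local:.3f}")
```

A distributed three-dimensional model sits between these bounds, because the sheet resistances of the semiconductor layers determine how much lateral current spreading actually takes place, which is precisely the connection the paper examines.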

Relevance: 40.00%
Publisher:
Abstract:

The research work summarized here falls within the area of railway safety dynamics and measures, specifically the study of the influence of cross wind on high-speed trains, as well as the study of new mitigation measures, such as windbreaking structures or wind fences with optimized shapes. The work has been developed at the Research Center in Rail Technology (CITEF) and supported by the Universidad Politécnica de Madrid, Spain.

Relevance: 40.00%
Publisher:
Abstract:

One of the common pathologies of brickwork masonry structural elements and walls is the cracking associated with differential settlements and/or excessive deflections of the slabs over the life of the structure. The limited capacity of masonry to accommodate the movements of the structural elements that surround it, such as floors, beams or foundations, makes brickwork masonry an element that frequently presents this kind of problem. This is a fracture problem in which the wall cracks under mixed-mode fracture, a combination of tensile and shear stresses, under static loading. Consequently, it is necessary to advance in the simulation and prediction of the mechanical behaviour of brickwork masonry under tensile and shear loading. The quasi-brittle behaviour of brickwork masonry can be studied using the cohesive crack model, whose application to other quasi-brittle materials such as concrete has traditionally provided very satisfactory results.
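
As a minimal sketch of the cohesive crack idea invoked above, the snippet below defines a linear-softening mode-I traction-separation law parameterized by a tensile strength and a fracture energy. The numerical values are illustrative placeholders rather than measured masonry properties, and a mixed-mode law would add a shear counterpart and an interaction criterion.

```python
import numpy as np

# Linear-softening cohesive law for mode-I opening: the transmitted traction drops
# from the tensile strength f_t to zero at the critical opening w_c = 2*G_f / f_t,
# so the area under the curve equals the fracture energy G_f (illustrative values).
f_t = 0.3e6      # tensile strength [Pa]
G_f = 50.0       # mode-I fracture energy [J/m^2]
w_c = 2.0 * G_f / f_t

def cohesive_traction(w):
    """Normal traction transmitted across a crack with opening w [m]."""
    w = np.asarray(w, dtype=float)
    return np.where(w < w_c, f_t * np.maximum(1.0 - w / w_c, 0.0), 0.0)

# Sanity check: integrating the law over the opening recovers the fracture energy.
w = np.linspace(0.0, w_c, 10_000)
dissipated = np.sum(cohesive_traction(w)) * (w[1] - w[0])
print(f"w_c = {w_c * 1e3:.3f} mm, integrated energy = {dissipated:.1f} J/m^2")
```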

Relevance: 40.00%
Publisher:
Abstract:

In this work, robustness and stability of continuum damage models applied to material failure in soft tissues are addressed. In implicit damage models equipped with softening, the presence of negative eigenvalues in the tangent elemental matrix degrades the condition number of the global matrix, leading to a reduction in the computational performance of the numerical model. Two strategies have been adapted from the literature to mitigate this degradation of computational performance: the IMPL-EX integration scheme [Oliver, 2006], which renders the elemental matrix contribution positive definite, and arc-length-type continuation methods [Carrera, 1994], which make it possible to capture the unstable softening branch in brittle ruptures. The major drawback of the IMPL-EX integration scheme is the need to use small time steps to keep the numerical error below an acceptable value. A convergence study, limiting the maximum allowed increment of the internal variables of the damage model, is presented. Finally, numerical simulation of failure problems with fibre-reinforced materials illustrates the performance of the adopted methodology.
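
The key move in the IMPL-EX scheme mentioned above is to advance the internal variable of the damage model by explicit extrapolation from the two previous (implicitly converged) steps, so that within the current step the damage is a known constant and the tangent contribution stays positive definite. The sketch below shows that extrapolation for a scalar, strain-driven damage model; the softening law, parameters, and loading history are invented for illustration.

```python
import numpy as np

# Scalar isotropic damage model: sigma = (1 - d(r)) * E * eps, where the internal
# variable r tracks the maximum equivalent strain reached, with a simple softening law.
E, r0, A = 30e9, 1e-4, 0.99          # Young's modulus, damage threshold, softening parameter

def damage(r):
    r = max(r, r0)
    return A * (1.0 - r0 / r)        # illustrative softening law, d grows from 0 towards A

eps_hist = np.linspace(0.0, 8e-4, 81)     # monotonic strain history, uniform step size

r_prev, r_curr = r0, r0                   # r_{n-1}, r_n
for eps in eps_hist:
    # IMPL-EX: extrapolate the internal variable from the two previous implicit values
    # (uniform steps, so the time-step ratio is 1) and evaluate the stress explicitly.
    r_tilde = r_curr + (r_curr - r_prev)
    sigma_implex = (1.0 - damage(r_tilde)) * E * eps
    # Standard implicit update at the end of the step; it is stored and only feeds the
    # extrapolation of the *next* step, so the in-step tangent (1 - d) * E stays positive.
    r_new = max(r_curr, abs(eps))
    sigma_implicit = (1.0 - damage(r_new)) * E * eps
    r_prev, r_curr = r_curr, r_new

print(f"final internal variable r = {r_curr:.2e}")
print(f"last-step stress, IMPL-EX vs implicit: {sigma_implex/1e6:.2f} vs {sigma_implicit/1e6:.2f} MPa")
```

The gap between the two stresses is the extrapolation error the abstract refers to, which is why the scheme needs small steps (or a limit on the increment of the internal variables) to stay accurate.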

Relevance: 40.00%
Publisher:
Abstract:

AlGaN/GaN high electron mobility transistors (HEMTs) are key devices for the next generation of high-power, high-frequency and high-temperature electronics applications. Although significant progress has recently been achieved [1], stability and reliability are still among the main issues under investigation, particularly at high temperatures [2-3]. Taking into account that the gate contact metallization is one of the weakest points in AlGaN/GaN HEMTs, the reliability of Ni, Mo, Pt and refractory metal gates is crucial [4-6]. This work has been focused on the thermal stress and reliability assessment of AlGaN/GaN HEMTs. After unbiased storage at 350 °C for 2000 hours, devices with Ni/Au gates exhibited detrimental IDS-VDS degradation in pulsed mode. In contrast, devices with Mo/Au gates showed no degradation after similar storage conditions. Further capacitance-voltage characterization as a function of temperature and frequency revealed two distinct trap-related effects in both kinds of devices. At low frequency (< 1 MHz), an increased capacitance near the threshold voltage was present at high temperatures, more pronounced for the Ni/Au gate HEMTs and at lower frequencies. Such an anomalous "bump" has previously been related to H-related surface polar charges [7]. This anomalous behavior in the C-V characteristics was also observed in Mo/Au gate HEMTs after 1000 h at calculated channel temperatures from around 250 °C (T2) up to 320 °C (T4), under a DC bias (VDS = 25 V, IDS = 420 mA/mm) (DC life test). The devices showed a larger "bump" at higher channel temperatures (Fig. 1). At 1 MHz, the steeper C-V curve slope of the Ni/Au gated HEMTs indicated a higher trap density than for the Mo/Au metallization (Fig. 2). These results highlight that temperature is an acceleration factor in the device degradation, in good agreement with [3]. Interface state density analysis is being performed in order to estimate the trap density and activation energy.
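
Because temperature acts as an acceleration factor, the activation energy mentioned at the end is typically extracted from an Arrhenius fit of a degradation metric (for example, the time to reach a failure criterion) against inverse channel temperature. The sketch below only illustrates that extraction procedure; the three time-to-failure values are invented and do not correspond to the measurements reported here.

```python
import numpy as np

K_B = 8.617e-5                                   # Boltzmann constant [eV/K]

# Hypothetical times to a failure criterion at three calculated channel
# temperatures (invented values, only to illustrate the Arrhenius extraction).
temps_c = np.array([250.0, 285.0, 320.0])        # channel temperatures [degC]
ttf_h = np.array([4000.0, 1500.0, 600.0])        # times to failure criterion [h]

inv_kt = 1.0 / (K_B * (temps_c + 273.15))        # 1 / (k_B * T) in 1/eV
# ln(TTF) = ln(A) + Ea / (k_B * T)  ->  the slope of a straight-line fit gives Ea.
ea, ln_a = np.polyfit(inv_kt, np.log(ttf_h), 1)
print(f"activation energy ~ {ea:.2f} eV (illustrative data)")
```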

Relevance: 40.00%
Publisher:
Abstract:

Magnetoencephalography (MEG) allows the real-time recording of neural activity and oscillatory activity in distributed neural networks. We applied a non-linear complexity analysis to resting-state neural activity measured using whole-head MEG. Recordings were obtained from 20 unmedicated patients with major depressive disorder and 19 matched healthy controls. After 6 months of pharmacological treatment with the antidepressant mirtazapine (30 mg/day), patients received a second MEG scan. A measure of the complexity of neural signals, the Lempel–Ziv Complexity (LZC), was derived from the MEG time series. We found that depressed patients showed higher pre-treatment complexity values compared with controls, and that complexity values decreased after 6 months of effective pharmacological treatment, although this effect was statistically significant only in younger patients. The main treatment effect was to recover the tendency, observed in controls, of a positive correlation between age and complexity values. Importantly, the reduction of complexity with treatment correlated with the degree of clinical symptom remission. We suggest that LZC, a formal measure of neural activity complexity, is sensitive to the dynamic physiological changes observed in depression and may offer an objective marker of depression and its remission after treatment.
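
For reference, here is a compact sketch of how an LZC value can be obtained from a single time series: the signal is binarized around its median, the resulting symbol string is parsed into Lempel–Ziv (LZ76) phrases, and the phrase count is length-normalized. The exact binarization and normalization used in the study may differ, and the random and sinusoidal test signals below merely stand in for MEG channels.

```python
import numpy as np

def lz76_phrases(s: str) -> int:
    """Count the phrases in the LZ76 parsing of a symbol string."""
    i, c, n = 0, 0, len(s)
    while i < n:
        k = 1
        # Extend the candidate phrase while it can still be reproduced from the past,
        # i.e. while s[i:i+k] already occurs somewhere inside s[:i+k-1].
        while i + k <= n and s[i:i + k] in s[:i + k - 1]:
            k += 1
        c += 1
        i += k
    return c

def lzc(signal: np.ndarray) -> float:
    """Median-binarized, length-normalized Lempel-Ziv complexity of a 1-D signal."""
    med = np.median(signal)
    symbols = "".join("1" if x > med else "0" for x in signal)
    n = len(symbols)
    return lz76_phrases(symbols) * np.log2(n) / n   # ~1 for white noise, lower if regular

rng = np.random.default_rng(0)
noise = rng.normal(size=4000)                              # stands in for one MEG channel
regular = np.sin(2 * np.pi * np.arange(4000) / 60.0)       # a highly regular oscillation
print(f"LZC noise ~ {lzc(noise):.2f}, LZC sine ~ {lzc(regular):.2f}")
```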

Relevance: 40.00%
Publisher:
Abstract:

We propose an analysis for detecting procedures and goals that are deterministic (i.e. that produce at most one solution), or predicates whose clause tests are mutually exclusive (which implies that at most one of their clauses will succeed) even if they are not deterministic (because they call other predicates that can produce more than one solution). Applications of such determinacy information include detecting programming errors, performing certain high-level program transformations for improving search efficiency, optimizing low-level code generation and parallel execution, and estimating tighter upper bounds on the computational costs of goals and data sizes, which can be used for program debugging, resource consumption and granularity control, etc. We have implemented the analysis and integrated it in the CiaoPP system, which also infers automatically the mode and type information that our analysis takes as input. Experiments performed on this implementation show that the analysis is fairly accurate and efficient.

Relevance: 40.00%
Publisher:
Abstract:

Non-failure analysis aims at inferring that predicate calls in a program will never fail. This type of information has many applications in functional/logic programming. It is essential for determining lower bounds on the computational cost of calls, useful in the context of program parallelization, instrumental in partial evaluation and other program transformations, and has also been used in query optimization. In this paper, we recast the non-failure analysis proposed by Debray et al. as an abstract interpretation, which not only allows us to investigate it within a standard and well-understood theoretical framework, but also has several practical advantages. It allows us to incorporate non-failure analysis into a standard, generic abstract interpretation engine. The analysis thus benefits from the fixpoint propagation algorithm, which leads to improved information propagation. Also, the analysis takes advantage of the multi-variance of the generic engine, so that it is now able to infer separate non-failure information for different call patterns. Moreover, the implementation is simpler, and allows non-failure and covering analyses to be performed alongside other analyses, such as those for modes and types, in the same framework. Finally, besides the precision improvements and the additional simplicity, our implementation (in the Ciao/CiaoPP multiparadigm programming system) also shows better efficiency.

Relevance: 40.00%
Publisher:
Abstract:

We propose an analysis for detecting procedures and goals that are deterministic (i.e., that produce at most one solution at most once), or predicates whose clause tests are mutually exclusive (which implies that at most one of their clauses will succeed) even if they are not deterministic. The analysis takes advantage of the pruning operator in order to improve the detection of mutual exclusion and determinacy. It also supports arithmetic equations and disequations, as well as equations and disequations on terms, for which we give a complete satisfiability testing algorithm, w.r.t. available type information. We have implemented the analysis and integrated it in the CiaoPP system, which also infers automatically the mode and type information that our analysis takes as input. Experiments performed on this implementation show that the analysis is fairly accurate and efficient.
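
To make the mutual-exclusion idea concrete: if the tests guarding two clauses cannot both be satisfied by any call, then at most one of the clauses can succeed for that call. The Python sketch below checks this for single-variable interval constraints only, a deliberate simplification of the complete satisfiability test described in the abstract, and the sign/2 predicate it analyzes is a made-up example.

```python
import math
from itertools import combinations

# Each clause of a hypothetical predicate is guarded by a comparison of its first
# argument X against a constant; a test is encoded as (operator, constant).
def interval(test):
    """Interval of X values satisfying a test, as (lo, hi, lo_open, hi_open)."""
    op, c = test
    return {">":  (c, math.inf, True,  True),
            ">=": (c, math.inf, False, True),
            "<":  (-math.inf, c, True,  True),
            "=<": (-math.inf, c, True,  False),
            "=":  (c, c, False, False)}[op]

def satisfiable_together(t1, t2):
    """True iff some X satisfies both tests, i.e. their intervals intersect."""
    lo1, hi1, lo1_open, hi1_open = interval(t1)
    lo2, hi2, lo2_open, hi2_open = interval(t2)
    lo, hi = max(lo1, lo2), min(hi1, hi2)
    lo_open = (lo == lo1 and lo1_open) or (lo == lo2 and lo2_open)
    hi_open = (hi == hi1 and hi1_open) or (hi == hi2 and hi2_open)
    return lo < hi or (lo == hi and not lo_open and not hi_open)

# Clause guards of a made-up predicate sign/2:
#   sign(X, neg) :- X < 0.   sign(X, zero) :- X = 0.   sign(X, pos) :- X > 0.
guards = {"neg": ("<", 0), "zero": ("=", 0), "pos": (">", 0)}

mutually_exclusive = all(not satisfiable_together(guards[a], guards[b])
                         for a, b in combinations(guards, 2))
print("clause tests mutually exclusive:", mutually_exclusive)   # True: at most one clause succeeds
```

As the abstract notes, the actual analysis also handles disequations and tests on terms, exploits the pruning operator, and uses the available type information; this sketch only conveys the shape of the pairwise check.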

Relevance: 40.00%
Publisher:
Abstract:

This study analyzes users' intentions regarding the use of LMS (Learning Management Systems) e-learning platforms, based on a model that integrates the Technology Acceptance Model (TAM), the Theory of Planned Behavior (TPB) and the Unified Theory of Acceptance and Use of Technology (UTAUT), with age as a moderating variable. The article thus studies the influence of behavioral intention, attitude toward use, perceived ease of use, perceived usefulness, subjective norm and social influence on the intention to use LMS e-learning systems. System and user characteristics are posited as antecedents of these influencing factors. The result of the theoretical review is a unified model that has been validated with data collected from 94 students through an online questionnaire. These data were analyzed using the partial least squares technique, and the main results confirm the predictive relevance of the model for users aged between 26 and 35 and between 36 and 45.