940 results for Aircraft failure analysis
Abstract:
Decision-making in product quality is an indispensable stage in product development, as it reduces product development risk. Based on the identified deficiencies of quality function deployment (QFD) and failure modes and effects analysis (FMEA), a novel decision-making method is presented that draws upon a knowledge network of failure scenarios. An ontological expression of failure scenarios is presented together with a framework of the failure knowledge network (FKN). According to the roles of quality characteristics (QCs) in failure processing, QCs are divided into three categories, namely perceptible QCs, restrictive QCs, and controllable QCs, which represent the monitoring, control, and improvement targets, respectively, for quality management. A mathematical model and algorithms based on the analytic network process (ANP) are introduced for calculating the priority of QCs with respect to different development scenarios. A case study is provided according to the proposed decision-making procedure based on the FKN. This methodology is applied in the propeller design process to solve the problem of prioritising QCs. This paper provides a practical approach for decision-making in product quality. Copyright © 2011 Inderscience Enterprises Ltd.
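The ANP priority calculation summarised above can be sketched in a few lines. The supermatrix entries below are purely illustrative assumptions, not values from the paper's propeller case study:

```python
import numpy as np

# Hypothetical column-stochastic supermatrix over the three QC categories
# (perceptible, restrictive, controllable); each column holds assumed
# pairwise-comparison weights, not data from the case study.
W = np.array([
    [0.0, 0.3, 0.5],
    [0.6, 0.0, 0.5],
    [0.4, 0.7, 0.0],
])

# ANP limit priorities: raise the supermatrix to a high power; its columns
# converge to the global priority vector of the QC categories.
M = np.linalg.matrix_power(W, 100)
priorities = M[:, 0]
print(priorities.round(3))
```

A full ANP model would also carry scenario clusters and cluster-level weighting in the supermatrix; the limiting power-iteration step, however, stays the same.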
Abstract:
Large-scale mechanical products, such as aircraft and rockets, consist of large numbers of small components, which introduces additional difficulty for assembly accuracy and error estimation. Planar surfaces, as key product characteristics, are usually utilised for positioning small components in the assembly process. This paper focuses on assembly accuracy analysis of small components with planar surfaces in large-scale volume products. To evaluate the accuracy of the assembly system, an error propagation model for measurement error and fixture error is proposed, based on the assumption that all errors are normally distributed. In this model, the general coordinate vector is adopted to represent the position of the components. The error transmission functions are simplified into a linear model, and the coordinates of the reference points are composed of a theoretical value and a random error. The installation of a Head-Up Display is taken as an example to analyse the assembly error of small components based on the propagation model. The result shows that the final coordination accuracy is mainly determined by the measurement error of the planar surface in small components. To reduce the uncertainty of the plane measurement, an evaluation index of the measurement strategy is presented. This index reflects the distribution of the sampling point set and can be calculated from an inertia moment matrix. Finally, a practical application is introduced to validate the evaluation index.
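The linearised propagation step described above has a standard covariance form. The Jacobian and variances below are assumptions for illustration, not values from the Head-Up Display case:

```python
import numpy as np

# If the assembled position y depends linearly on reference-point errors,
# y = y0 + J x with x ~ N(0, Sigma_in), the output scatter is J Sigma_in J^T.
J = np.array([[1.0, 0.5, 0.2],
              [0.0, 1.0, 0.8]])                # assumed linearised transmission matrix
Sigma_in = np.diag([0.01, 0.02, 0.015]) ** 2   # assumed measurement/fixture std devs (mm)

Sigma_out = J @ Sigma_in @ J.T         # propagated covariance of the component pose
std_out = np.sqrt(np.diag(Sigma_out))  # 1-sigma accuracy per output coordinate
print(std_out.round(4))
```

Because every term is linear-Gaussian, the output covariance follows in one matrix product; a nonlinear transmission function would first be linearised about the nominal geometry to obtain J.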
Abstract:
The stress sensitivity of polymer optical fibre (POF) based Fabry-Perot sensors formed by two uniform Bragg gratings with finite dimensions is investigated. POF has attracted considerable interest in recent years due to its material properties, which differ from those of its silica counterpart: biocompatibility, a higher failure strain, and the highly elastic nature of POF are some of the main advantages. The much lower Young’s modulus of polymer materials compared to silica offers enhanced stress sensitivity to POF-based sensors, which makes them strong candidates for acoustic wave receivers and any kind of force detection. The main drawback in POF technology is perhaps the high fibre loss. In a lossless fibre, the sensitivity of an interferometer is proportional to its cavity length. However, attenuation along the optical path can significantly reduce the finesse of the Fabry-Perot interferometer and, beyond a certain length, can negatively affect its sensitivity. The reflectivity of the two gratings used to form the interferometer can also be reduced as the fibre loss increases. In this work, a numerical model is developed to study the performance of POF-based Fabry-Perot sensors formed by two uniform Bragg gratings with finite dimensions. Various optical and physical properties are considered, such as the grating physical length, the grating effective length (which indicates the point where the light is effectively reflected), the refractive index modulation of the grating, the cavity length of the interferometer, the attenuation, and the operating wavelength. Using this model, we are able to identify the regimes in which the PMMA-based sensor offers enhanced stress sensitivity compared to a silica-based one.
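The trade-off described above, sensitivity growing with cavity length while attenuation eats into it, can be illustrated with a deliberately simplified figure of merit. This is an assumption for illustration only, not the paper's numerical model:

```python
import numpy as np

alpha = 0.5                         # assumed power attenuation coefficient (1/m)
L = np.linspace(0.01, 5.0, 500)     # candidate cavity lengths (m)

# Toy figure of merit: linear growth with cavity length, damped by the
# round-trip loss exp(-2*alpha*L); it peaks at L = 1/(2*alpha).
merit = L * np.exp(-2 * alpha * L)
L_opt = L[np.argmax(merit)]
print(round(L_opt, 2))              # near 1/(2*alpha) = 1.0 m
```

The interior optimum captures the qualitative point of the abstract: beyond a loss-dependent cavity length, adding more fibre reduces rather than improves sensitivity.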
Abstract:
Private nonprofit human service organizations provide a spectrum of services that aim to resolve societal problems. Their failure may leave needed and desired services unprovided or not provided sufficiently to meet public demand. However, the concept of organizational failure has not been examined for the nonprofit organization. This research addresses the deficiency in the literatures of organization failure and nonprofit organizations. An eight-category typology, developed from a review of the current literature and findings from expert interviews, is initially presented to define nonprofit organization failure. A multiple case study design is used to test the typology in four nonprofit human service delivery agencies. The case analysis reduces the typology to five types salient to nonprofit organization failure: input failure, legitimacy failure, adaptive failure, management failure, and leadership failure. The resulting five-category typology is useful to both theory builders and nonprofit practitioners. For theory development, the interaction of the failure types extends the literature and lays a foundation for a theory of nonprofit organization failure that diffuses management and leadership across all of the failure types, highlights management and leadership failure as collective functions shared by paid staff and the volunteer board of directors, and emphasizes the importance of organization legitimacy. From a practical perspective, the typology provides a tool for diagnosing failure in the nonprofit organization. Using the management indicators developed for the typology, a checklist of the warning signals of potential failure, emphasizing the key types of management and leadership, offers nonprofit decision makers an a priori examination of an organization's propensity for failure.
Abstract:
Expositions of student work at the end of the extended school year are one of many reform efforts in a specially formed School Improvement Zone in Miami Dade schools. This descriptive analysis offers examples of successful attempts to engender pride even in the face of formidable social and cultural obstacles.
Abstract:
The extensive impact and consequences of the 2010 Deepwater Horizon oil drilling rig failure in the Gulf of Mexico, together with expanding drilling activities in the Cuban Exclusive Economic Zone, have cast a spotlight on Cuban oil development. The threat of a drilling rig failure has evolved from being only hypothetical to a potential reality with the commencement of active drilling in Cuban waters. The disastrous consequences of a drilling rig failure in Cuban waters would spread over a number of vital interests of the US and of nations in the Caribbean in the general environs of Cuba. The US fishing and tourist industries would take major blows from a significant oil spill in Cuban waters. Substantial ecological damage and damage to beaches could occur for the US, Mexico, Haiti, and other countries as well. The need exists for the US to have the ability to independently monitor the reality of Cuban oceanic oil development. The advantages of having an independent US early warning system providing essential real-time data on the possible failure of a drilling rig in Cuban waters are numerous. An ideal early warning system would inform the US, essentially in real time, that an event has occurred or is likely to occur. Presently operating monitoring systems that could provide early warning information are satellite-based. Such systems can indicate the locations of both drilling rigs and operational drilling platforms. The system discussed and proposed in this paper relies upon low-frequency underwater sound. The proposed system can complement existing monitoring systems, which offer ocean-surface information, by providing sub-ocean-surface, near-real-time information. This “integrated system” utilizes and combines (integrates) many different forms of information, some gathered through sub-ocean-surface systems, some through electromagnetic-based remote sensing (satellites, aircraft, unmanned aerial vehicles), and some through other methods as well.
Although the proposed integrated system is in the developmental stage, it is based upon well-established technologies.
Abstract:
In their dialogue entitled “The Food Service Industry Environment: Market Volatility Analysis,” Alex F. De Noble, Assistant Professor of Management, San Diego State University, and Michael D. Olsen, Associate Professor and Director, Division of Hotel, Restaurant & Institutional Management at Virginia Polytechnic Institute and State University, preface the discussion by saying: “Hospitality executives, as a whole, do not believe they exist in a volatile environment and spend little time or effort in assessing how current and future activity in the environment will affect their success or failure. The authors highlight potential differences that may exist between executives' perceptions and objective indicators of environmental volatility within the hospitality industry and suggest that executives change these perceptions by incorporating the assumption of a much more dynamic environment into their future strategic planning efforts. Objective, empirical evidence of the dynamic nature of the hospitality environment is presented and compared to several studies pertaining to environmental perceptions of the industry.” That weighty thesis statement presumes that hospitality executives/managers do not fully comprehend the environment in which they operate. The authors provide a contrast, which conventional wisdom would seem to support and satisfy. “Broadly speaking, the operating environment of an organization is represented by its task domain,” say the authors. “This task domain consists of such elements as a firm's customers, suppliers, competitors, and regulatory groups.” These are dynamic actors and the underpinnings of change, say the authors by way of citation. “The most difficult aspect for management in this regard tends to be the development of a proper definition of the environment of their particular firm.
Being able to precisely define who the customers, competitors, suppliers, and regulatory groups are within the environment of the firm is no easy task, yet is imperative if proper planning is to occur,” De Noble and Olsen further contribute in support of their thesis statement. The article is bloated, and that’s not necessarily a bad thing, with tables, both survey and empirically driven, to illustrate market volatility. One such table is the Bates and Eldredge outline, Table 6 in the article. “This comprehensive outline…should prove to be useful to most executives in expanding their perception of the environment of their firm,” say De Noble and Olsen. “It is, however, only a suggested outline,” they advise. “…risk should be incorporated into every investment decision, especially in a volatile environment,” say the authors. De Noble and Olsen close with an intriguing formula to gauge volatility in an environment.
Abstract:
The adverse health effects of long-term exposure to lead are well established, with major uptake into the human body occurring mainly through oral ingestion by young children. Lead-based paint was frequently used in homes built before 1978, particularly in inner-city areas. Minority populations experience the effects of lead poisoning disproportionately. Lead-based paint abatement is costly. In the United States, residents of about 400,000 homes, occupied by 900,000 young children, lack the means to correct lead-based paint hazards. The magnitude of this problem demands research on affordable methods of hazard control. One method is encapsulation, defined as any covering or coating that acts as a permanent barrier between the lead-based paint surface and the environment. Two encapsulants were tested for reliability and effective life span through an accelerated lifetime experiment that applied stresses exceeding those encountered under normal use conditions. The resulting time-to-failure data were used to extrapolate the failure time under conditions of normal use. Statistical analysis and models of the test data allow forecasting of long-term reliability relative to the 20-year encapsulation requirement. Typical housing material specimens simulating walls and doors coated with lead-based paint were overstressed before encapsulation. A second, un-aged set was also tested. Specimens were monitored after the stress test with a surface chemical testing pad to identify the presence of lead breaking through the encapsulant. Graphical analysis proposed by Shapiro and Meeker and the general log-linear model developed by Cox were used to obtain results. Findings for the 80% reliability time to failure varied, with close to 21 years of life under normal use conditions for encapsulant A. The application of product A on the aged gypsum and aged wood substrates yielded slightly lower times. Encapsulant B had an 80% reliable life of 19.78 years.
This study reveals that encapsulation technologies can offer safe and effective control of lead-based paint hazards and may be less expensive than other options. The U.S. Department of Health and Human Services and the CDC are committed to eliminating childhood lead poisoning by 2010. This ambitious target is feasible, provided there is an efficient application of innovative technology, a goal to which this study aims to contribute.
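The extrapolation logic of an accelerated lifetime experiment can be sketched with a log-linear life-stress model. The stress levels and failure times below are invented for illustration and are not the study's measurements:

```python
import numpy as np

stress = np.array([2.0, 3.0, 4.0])   # assumed accelerated stress levels
mttf = np.array([8.0, 4.5, 2.6])     # assumed mean times to failure (years)

# Log-linear life-stress model: log(MTTF) = a + b * stress, fitted by least
# squares at the accelerated levels and extrapolated to normal use.
b, a = np.polyfit(stress, np.log(mttf), 1)
use_stress = 1.0                     # assumed normal-use stress level
mttf_use = np.exp(a + b * use_stress)

# Time at which reliability drops to 80%, assuming exponential lifetimes.
t80 = -mttf_use * np.log(0.8)
print(round(mttf_use, 2), round(t80, 2))
```

The study's actual analysis (Shapiro-Meeker graphical methods and Cox's general log-linear model) is richer than this sketch, but the core move is the same: fit under overstress, then extrapolate life to normal-use conditions.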
Abstract:
This qualitative study used grounded theory methods and purposeful sampling to explore perceptions of caring and being cared-for. Twenty-four adolescent male participants, identified as at risk for school failure, completed a two-phase interview process exploring these phenomena within three relationships: with the friend, with the most caring person they knew, and with the teacher they felt cared for them. Each participant was asked a predetermined set of open questions in an initial semi-structured interview. In addition, each participant was encouraged to explore his own reflections on caring. A second interview allowed for member checking and for the participant to continue sharing his meaning of caring and being cared-for. Line-by-line analysis with open, axial, and selective coding was applied to interview transcripts, along with a constant comparative method. Results indicated that the core category integrating all other categories was attachment bonding. Participants' stories manifested characteristics of proximity seeking, secure base, safe haven, and distress upon involuntary separation from an attachment figure. Strategies facilitating attachment bonding were influenced by the power positions of the relational players. Participants responded positively to the one-caring when they felt cared-for. Results further indicated that participants did not need to feel a sense of belonging in order to feel cared-for. Teacher behaviors indicating openness to authentic connections with students were specific to the teacher's friendliness and professional competence. Teachers who nurtured feelings of being cared-for were uncommon in the participants' educational experience. The number of adolescent males leaving high school prematurely is both a personal problem and a social problem.
Despite a “mask” of indifference often exhibited by adolescent males at-risk for school failure, teachers might consider the social/emotional needs of these students when implementing the curriculum. In addition, policy makers might consider the social/emotional needs of this vulnerable population when developing programs meant to foster psychological well-being and connectedness for adolescent males at-risk for school failure.
Abstract:
NOTCH1 is a member of the NOTCH receptor family, a group of single-pass trans-membrane receptors. NOTCH signaling is highly conserved in evolution and mediates communication between adjacent cells. NOTCH receptors have been implicated in cell fate determination, as well as in the maintenance and differentiation of stem cells. In the mammalian testis, expression of NOTCH1 in somatic and germ cells has been demonstrated; however, its role in spermatogenesis was not clear. To study the significance of NOTCH1 in germ cells, we applied a cre/loxP approach in mice to induce NOTCH1 gain- or loss-of-function specifically in male germ cells. Using a Stra8-icre transgene, we produced mice with conditional activation of the NOTCH1 intracellular domain (NICD) in germ cells. Spermatogenesis in these mutants was progressively affected with age, resulting in decreased testis weight and sperm count. Analysis of downstream target genes of NOTCH1 signaling showed an increased expression of Hes5, with a reduction of expression of the spermatogonial differentiation marker Neurog3 in the mutant testis. Apoptosis was significantly increased in mouse germ cells, with a corresponding elevation of expression of the pro-apoptotic Trp53 and Trp63 genes. We also showed that the conditional germ cell-specific ablation of Notch1 had no effect on spermatogenesis or male fertility. Our data suggest the importance of NOTCH signaling regulation in male germ cells for their survival and differentiation.
Abstract:
Unequal improvements in processor and I/O speeds have caused many applications, such as databases and operating systems, to become increasingly I/O bound. Many schemes, such as disk caching and disk mirroring, have been proposed to address the problem. In this thesis we focus only on disk mirroring. In disk mirroring, a logical disk image is maintained on two physical disks, allowing a single disk failure to be transparent to application programs. Although disk mirroring improves data availability and reliability, it has two major drawbacks. First, writes are expensive because both disks must be updated. Second, load balancing during failure-mode operation is poor because all requests are serviced by the surviving disk. Distorted mirrors were proposed to address the write problem, and interleaved declustering to address the load-balancing problem. In this thesis we perform a comparative study of these two schemes under various operating modes. In addition, we also study traditional mirroring to provide a common basis for comparison.
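The two drawbacks named above, double-cost writes and one-sided load after a failure, show up directly in a toy request simulation. The workload mix is an assumption for illustration, not a trace from the thesis:

```python
import random

def service_requests(requests, disk_failed=False):
    """Count I/O operations per physical disk in a two-disk mirror."""
    load = [0, 0]
    for op in requests:
        if op == "write":
            load[0] += 1                 # every write updates the first copy...
            if not disk_failed:
                load[1] += 1             # ...and the mirror copy, doubling the cost
        elif disk_failed:
            load[0] += 1                 # all reads fall on the surviving disk
        else:
            load[random.randrange(2)] += 1   # reads balance across the mirror
    return load

reqs = ["read"] * 80 + ["write"] * 20
print(service_requests(reqs))                    # roughly balanced across both disks
print(service_requests(reqs, disk_failed=True))  # all 100 requests on disk 0
```

Distorted mirrors and interleaved declustering each attack one of these effects: the former cheapens writes, the latter spreads the failure-mode read load across more than one surviving disk.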
Abstract:
This research explores Bayesian updating as a tool for estimating parameters probabilistically by dynamic analysis of data sequences. Two distinct Bayesian updating methodologies are assessed. The first approach focuses on Bayesian updating of failure rates for primary events in fault trees. A Poisson Exponentially Weighted Moving Average (PEWMA) model is implemented to carry out Bayesian updating of failure rates for individual primary events in the fault tree. To provide a basis for testing of the PEWMA model, a fault tree is developed based on the Texas City Refinery incident which occurred in 2005. A qualitative fault tree analysis is then carried out to obtain a logical expression for the top event. A dynamic fault tree analysis is carried out by evaluating the top event probability at each Bayesian updating step by Monte Carlo sampling from the posterior failure rate distributions. It is demonstrated that PEWMA modeling is advantageous over conventional conjugate Poisson-Gamma updating techniques when failure data are collected over long time spans. The second approach focuses on Bayesian updating of parameters in non-linear forward models. Specifically, the technique is applied to the hydrocarbon material balance equation. In order to test the accuracy of the implemented Bayesian updating models, a synthetic data set is developed using the Eclipse reservoir simulator. Both structured-grid and MCMC sampling based solution techniques are implemented and are shown to model the synthetic data set with good accuracy. Furthermore, a graphical analysis shows that the implemented MCMC model displays good convergence properties. A case study demonstrates that the likelihood variance affects the rate at which the posterior assimilates information from the measured data sequence. Error in the measured data significantly affects the accuracy of the posterior parameter distributions.
Increasing the likelihood variance mitigates random measurement errors but causes the overall variance of the posterior to increase. Bayesian updating is shown to be advantageous over deterministic regression techniques, as it allows for the incorporation of prior belief and full modeling of uncertainty over the parameter ranges. As such, the Bayesian approach to estimation of parameters in the material balance equation shows utility for incorporation into reservoir engineering workflows.
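The conventional conjugate Poisson-Gamma update that PEWMA is benchmarked against reduces to two additions. The prior and failure counts below are assumed for illustration:

```python
# With a Gamma(alpha, beta) prior on a failure rate and k failures observed
# over exposure time t, the Poisson likelihood gives the conjugate posterior
# Gamma(alpha + k, beta + t).
def poisson_gamma_update(alpha, beta, failures, exposure):
    return alpha + failures, beta + exposure

alpha, beta = 2.0, 10.0                       # assumed prior: mean rate 0.2/yr
for k, t in [(1, 5.0), (0, 5.0), (3, 5.0)]:   # assumed observation windows
    alpha, beta = poisson_gamma_update(alpha, beta, k, t)

posterior_mean = alpha / beta                 # (2 + 4) / (10 + 15)
print(round(posterior_mean, 3))               # 0.24
```

Roughly speaking, PEWMA differs by exponentially discounting older counts rather than weighting all history equally, which is why it tracks drifting failure rates better over long time spans.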
Abstract:
The exploration and development of oil and gas reserves located in harsh offshore environments are characterized by high risk. Some of these reserves would be uneconomical if produced using conventional drilling technology, due to increased drilling problems and prolonged non-productive time. Seeking new ways to reduce drilling cost and minimize risks has led to the development of managed pressure drilling techniques. Managed pressure drilling methods address the drawbacks of conventional overbalanced and underbalanced drilling techniques. As managed pressure drilling techniques are evolving, there are many unanswered questions related to safety and operating pressure regimes. Quantitative risk assessment techniques are often used to answer these questions. Quantitative risk assessment is conducted for the various stages of drilling operations: drilling ahead, tripping, casing, and cementing. A diagnostic model for analyzing the rotating control device, the main component of managed pressure drilling techniques, is also studied. The logic concept of Noisy-OR is explored to capture the unique relationship between casing and cementing operations in leading to well integrity failure, as well as to model the critical components of the constant bottom-hole pressure technique of managed pressure drilling during tripping operations. Relevant safety functions and inherent safety principles are utilized to improve well integrity operations. A loss function modelling approach enabling dynamic consequence analysis is adopted to study blowout risk for real-time decision making. The aggregation of the blowout loss categories, comprising production, asset, human health, environmental response, and reputation losses, leads to risk estimation using a dynamically determined probability of occurrence.
Lastly, various sub-models developed for the stages/sub-operations of drilling operations and the consequence modelling approach are integrated for a holistic risk analysis of drilling operations.
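The aggregation step described above, combining the five blowout loss categories with a dynamically updated probability of occurrence, can be sketched as follows. All monetary values and the probability are assumptions for illustration:

```python
# Assumed loss categories for a blowout event (illustrative dollar values).
losses = {
    "production": 12e6,
    "asset": 30e6,
    "human_health": 8e6,
    "environmental_response": 25e6,
    "reputation": 10e6,
}

p_blowout = 2.4e-4           # assumed dynamically updated probability of occurrence
risk = p_blowout * sum(losses.values())   # expected loss = probability x consequence
print(f"expected loss: ${risk:,.0f}")
```

In a real-time setting, p_blowout would be re-estimated as drilling data arrive, so the expected-loss figure updates continuously for decision making.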
Abstract:
As sustainability becomes an integral design driver for current civil structures, new materials and forms are investigated. The aim of this study is to investigate analytically and numerically the mechanical behavior of monolithic domes composed of mycological fungi. The study focuses on hemispherical and elliptical forms, as the most typical solutions for domes. The influence of different types of loading, geometrical parameters, material properties, and boundary conditions is investigated in this study. For the cases covered by classical shell theory, a comparison between the analytical and the finite element solution is given. Two case studies regarding the dome of the basilica of “San Luca” (Bologna, Italy) and the dome of the sanctuary of “Vicoforte” (Vicoforte, Italy) are included. After the linear analysis under loading, buckling is also investigated as a critical type of failure through a parametric study using a finite element model. Since shells rely on their shape, form-found domes are also investigated, and a comparison between the behavior of the form-found domes and the hemispherical domes under linear and buckling analysis is conducted. From the analysis it emerges that form-finding can enhance the structural response of mycelium-based domes, although buckling becomes even more critical for their design. Furthermore, an optimal height-to-span ratio for the buckling of form-found domes is identified. This study highlights the importance of investigating appropriate forms for the design of novel biomaterial-based structures.
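As context for why buckling governs low-stiffness domes, the classical shell-theory benchmark is Zoelly's critical external pressure for a complete thin spherical shell (a standard textbook result, not a formula taken from this study), with Young's modulus $E$, Poisson's ratio $\nu$, thickness $t$, and radius $R$:

```latex
p_{cr} = \frac{2E}{\sqrt{3\,(1-\nu^{2})}} \left(\frac{t}{R}\right)^{2}
```

Because $p_{cr}$ scales linearly with $E$, the low stiffness of mycelium-based materials lowers the buckling threshold directly, consistent with the finding that buckling becomes the critical failure mode.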
Abstract:
Acknowledgments The authors would like to thank the participants of the EPIC-Norfolk cohort. We thank the nutritionist team and data management team of the EPIC-Norfolk cohort. The EPIC-Norfolk study was supported by grants from the Medical Research Council and Cancer Research UK. Funders had no role in study design or interpretation of the findings.