991 results for Multiple failure


Relevance:

100.00%

Publisher:

Abstract:

This article analyzes the effect, on the design of composite structures, of devising a new failure envelope by combining the most commonly used failure criteria for composite laminates. The failure criteria considered are the maximum stress and Tsai-Wu criteria. In addition to these popular phenomenological failure criteria, a micromechanics-based criterion called the failure mechanism-based failure criterion is also considered. The failure envelopes obtained from these criteria are superimposed over one another, and a new envelope is constructed from the lowest absolute values of the strengths they predict; the envelope so obtained is therefore named the most conservative failure envelope. A minimum-weight design of composite laminates is performed using genetic algorithms. In addition, the effect of stacking sequence on the minimum weight of the laminate is studied. Results are compared across the different failure envelopes, and the conservative design is evaluated against the designs obtained using a single failure criterion. The design approach is recommended for structures in which composites are the key load-carrying members, such as helicopter rotor blades.
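
As an illustration of how such a conservative envelope can be assembled, the sketch below sweeps load directions in the (sigma1, sigma2) plane and keeps, for each direction, the smaller of the failure scale factors predicted by the maximum stress and Tsai-Wu criteria. The strength values and the F12 interaction term are typical carbon/epoxy numbers assumed for illustration only; the failure mechanism-based criterion from the article is omitted.

```python
# Sketch: most conservative failure envelope as the pointwise minimum of
# maximum-stress and Tsai-Wu failure scale factors (tau12 = 0 assumed).
import numpy as np

Xt, Xc = 1500.0, 1200.0   # longitudinal tensile/compressive strengths (MPa), assumed
Yt, Yc = 50.0, 250.0      # transverse strengths (MPa), assumed
S = 70.0                  # in-plane shear strength (MPa); unused here since tau12 = 0

F1, F2 = 1/Xt - 1/Xc, 1/Yt - 1/Yc
F11, F22 = 1/(Xt*Xc), 1/(Yt*Yc)
F12 = -0.5 * np.sqrt(F11 * F22)   # a common empirical choice for the interaction term

def r_max_stress(s1, s2):
    # scale factor at which the first stress component reaches its allowable
    r1 = Xt/s1 if s1 > 0 else (Xc/-s1 if s1 < 0 else np.inf)
    r2 = Yt/s2 if s2 > 0 else (Yc/-s2 if s2 < 0 else np.inf)
    return min(r1, r2)

def r_tsai_wu(s1, s2):
    # positive root of a*r^2 + b*r = 1 along the loading ray r*(s1, s2)
    a = F11*s1**2 + F22*s2**2 + 2*F12*s1*s2
    b = F1*s1 + F2*s2
    return (-b + np.sqrt(b*b + 4*a)) / (2*a)

for theta in np.linspace(0, 2*np.pi, 9)[:-1]:
    s1, s2 = np.cos(theta), np.sin(theta)
    r = min(r_max_stress(s1, s2), r_tsai_wu(s1, s2))  # most conservative of the two
    print(f"theta={np.degrees(theta):5.1f}  envelope point: "
          f"({r*s1:8.1f}, {r*s2:8.1f}) MPa")
```

Each printed point lies on the most conservative envelope; repeating the sweep with a finer angular grid traces the full curve.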

Relevance:

60.00%

Publisher:

Abstract:

The uncertainty in material properties and traffic characterization in the design of flexible pavements has led to significant efforts in recent years to incorporate reliability methods and probabilistic design procedures for the design, rehabilitation, and maintenance of pavements. In the mechanistic-empirical (ME) design of pavements, despite the fact that there are multiple failure modes, the design criteria applied in the majority of analytical pavement design methods guard only against fatigue cracking and subgrade rutting, which are usually treated as independent failure events. This study carries out a reliability analysis of a flexible pavement section for these failure criteria using the first-order reliability method (FORM), the second-order reliability method (SORM), and crude Monte Carlo simulation. Through a sensitivity analysis, the surface layer thickness was identified as the most critical parameter affecting the design reliability for both the fatigue and rutting failure criteria. However, reliability analysis in pavement design is most useful if it can be efficiently and accurately applied to the components of pavement design and to the combination of these components in an overall system analysis. The study shows that, for the pavement section considered, there is a high degree of dependence between the two failure modes, and demonstrates that the probability of simultaneous occurrence of failures can be almost as high as the probability of the component failures. This highlights the need to consider system reliability in pavement analysis, and indicates that improving pavement performance should be tackled by reducing this undesirable event of simultaneous failure, not merely by considering the more critical failure mode. Furthermore, this probability of simultaneous occurrence of failures increases considerably with small increments in the mean traffic loads, which also results in wider system reliability bounds. The study also advocates the use of narrow bounds on the probability of failure, which provide a better estimate of the probability of failure, as validated against the results obtained from Monte Carlo simulation (MCS).
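
The dependence between the two failure modes and the resulting system probability can be illustrated with a crude Monte Carlo sketch. The limit-state functions below are invented linear stand-ins, not the paper's mechanistic-empirical transfer functions, and the distributions are assumed; the shared random variables, including the surface layer thickness, are what induce the correlation between fatigue and rutting failures.

```python
# Sketch: Monte Carlo estimate of component, joint, and series-system
# failure probabilities for two dependent failure modes.
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
h = rng.normal(150.0, 15.0, n)     # surface layer thickness (mm), assumed stats
E = rng.normal(3000.0, 450.0, n)   # asphalt modulus (MPa), assumed stats
N = rng.normal(1.0, 0.25, n)       # traffic load factor, assumed stats

# hypothetical limit states: g < 0 means failure
g_fatigue = 0.9 * h / 150.0 + 0.4 * E / 3000.0 - 1.1 * N
g_rutting = 1.1 * h / 150.0 + 0.2 * E / 3000.0 - 1.0 * N

f1, f2 = g_fatigue < 0, g_rutting < 0
p1, p2, p12 = f1.mean(), f2.mean(), (f1 & f2).mean()
p_sys = (f1 | f2).mean()           # series system: either mode fails
print(f"P(fatigue)={p1:.4f}  P(rutting)={p2:.4f}  P(both)={p12:.4f}")
print(f"P(system)={p_sys:.4f}  bounds: [{max(p1, p2):.4f}, {min(p1 + p2, 1):.4f}]")
```

Because both limit states share the same random inputs, the estimated joint probability comes out close to the component probabilities, mirroring the dependence effect reported above.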

Relevance:

60.00%

Publisher:

Abstract:

The recent evolution of wavelength selective switches (WSS) favors the development of multi-degree, colorless and directionless reconfigurable optical add/drop multiplexers (ROADM), regarded as highly promising equipment for future wavelength division multiplexing (WDM) mesh networks. However, their asymmetric switching property complicates the routing and wavelength assignment (RWA) problem, and most existing RWA algorithms do not take this asymmetry into account. Service interruptions caused by equipment failures along lightpaths (the outcome of solving the RWA problem) lead to the loss of large amounts of data. Research on the survivability of optical networks, that is, on maintaining services, particularly in the event of equipment failure, is therefore essential. Most previous publications have focused on protection schemes guaranteeing that traffic is rerouted in the event of a single link failure. However, protection designed against single link failures is not always sufficient for the survivability of WDM networks, given the many other types of failures that are becoming common nowadays, such as equipment failures and simultaneous failures of two or three links. In addition, considerable challenges arise in protecting large multi-domain optical networks, composed of single-domain networks interconnected by inter-domain links, where the internal topological details of a domain are generally not shared with the outside. The objective of this thesis is to propose large-scale optimization models and solutions to the problems mentioned above. These models generate optimal or near-optimal solutions with mathematically proven optimality gaps. To this end, we rely on the column generation technique to solve the underlying large-scale linear programs. Concerning provisioning in optical networks, we propose a new integer linear programming (ILP) model for the RWA problem that maximizes the number of granted requests (GoS - Grade of Service). The resulting large-scale ILP model yields exact solutions for fairly large RWA instances, assuming that all nodes are asymmetric and endowed with a given switching connectivity matrix. We then modify the model and propose a solution to the RWA problem that finds the best switching matrix for a given number of ports and switching connections, while satisfying/maximizing the grade of service (GoS). Concerning the protection of single-domain networks, we propose solutions for protection against multiple failures. Indeed, we develop the protection of a single-domain network against multiple failures using failure-independent path-protecting (FIPP) p-cycles and failure-dependent path-protecting (FDPP) p-cycles. We then propose a new flow formulation for FDPP p-cycles subject to multiple failures. The new model raises a scalability issue, as it has an exponential number of constraints owing to certain subtour elimination constraints. Consequently, in order to solve this problem efficiently, we examine: (i) a hierarchical decomposition of the auxiliary (pricing) problem in the decomposition model; (ii) heuristics to handle the large number of constraints efficiently. Concerning protection in multi-domain networks, we propose protection schemes against single link failures. First, an optimization model is proposed for a centralized protection scheme, under the assumption that network management is aware of all the details of the physical topologies of the domains. We then propose a distributed optimization model for protection in multi-domain optical networks, a much more realistic formulation since it rests on the assumption of distributed network management. Next, we add shared bandwidth in order to reduce the cost of protection. More precisely, the bandwidth of each intra-domain link is shared between FIPP p-cycles and p-cycles in a first study, and then between link-protection and path-protection paths in a second study. Finally, we recommend parallel strategies for solving large multi-domain optical networks. The results of this study lead to the efficient design of a protection scheme for a very large multi-domain network (45 domains), the largest examined in the literature, with both centralized and distributed schemes.
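
As a toy illustration of the provisioning side, the sketch below states a miniature RWA instance as an ILP that maximizes the number of granted requests (GoS), with the usual wavelength-continuity and link-clash constraints. It is a deliberately small stand-in for the thesis's large-scale column-generation models: the candidate paths, topology, and requests are invented, and the asymmetric switching-connectivity constraints are omitted.

```python
# Sketch: toy RWA ILP maximizing grade of service, using the PuLP modeler.
import pulp

W = 2  # wavelengths per fiber (assumed)
links = [("A", "B"), ("B", "C"), ("A", "C")]   # toy directed links
# candidate paths per request (assumed precomputed, e.g. k-shortest paths)
requests = {
    0: [[("A", "B"), ("B", "C")], [("A", "C")]],
    1: [[("A", "C")]],
    2: [[("A", "B")]],
}

prob = pulp.LpProblem("RWA_GoS", pulp.LpMaximize)
# x[r, p, w] = 1 if request r uses candidate path p on wavelength w
x = {
    (r, p, w): pulp.LpVariable(f"x_{r}_{p}_{w}", cat="Binary")
    for r, paths in requests.items()
    for p in range(len(paths))
    for w in range(W)
}
# objective: grade of service = number of accepted requests
prob += pulp.lpSum(x.values())
# each request is granted at most one (path, wavelength) pair
for r, paths in requests.items():
    prob += pulp.lpSum(x[r, p, w] for p in range(len(paths)) for w in range(W)) <= 1
# wavelength clash: each (link, wavelength) carries at most one lightpath
for link in links:
    for w in range(W):
        prob += pulp.lpSum(
            x[r, p, w]
            for r, paths in requests.items()
            for p in range(len(paths))
            if link in paths[p]
        ) <= 1

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("accepted requests:", int(pulp.value(prob.objective)))
for (r, p, w), var in x.items():
    if var.value() == 1:
        print(f"request {r}: path {requests[r][p]} on wavelength {w}")
```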

Relevance:

40.00%

Publisher:

Abstract:

It has been 21 years since the decision in Rogers v Whitaker, and the legal principles concerning informed consent and liability for negligence remain strongly grounded in this landmark High Court decision. This paper considers more recent developments in the law concerning the failure to disclose inherent risks in medical procedures, focusing on the decision in Wallace v Kam [2013] HCA 19. In this case, the appellant underwent a surgical procedure that carried a number of risks. The surgery itself was not performed in a sub-standard way, but the surgeon failed to disclose two risks to the patient, a failure that constituted a breach of the surgeon’s duty of care in negligence. One of the undisclosed risks was considered to be less serious than the other, and this lesser risk eventuated, causing injury to the appellant. The more serious risk did not eventuate, but the appellant argued that if it had been disclosed, he would have avoided his injuries completely because he would have refused to undergo the procedure. Liability was disputed by the surgeon, with particular reference to causation principles. The High Court of Australia held that the appellant should not be compensated for harm that resulted from a risk he would have been willing to run. We examine the policy reasons underpinning the law of negligence in this specific context and consider some of the issues raised by this unusual case. We question whether some of the judicial reasoning adopted in this case represents a significant shift in traditional causation principles.

Relevance:

40.00%

Publisher:

Abstract:

Two different spatial levels are involved in damage accumulation to eventual failure: the macroscopic level of the loaded sample (size L) and the mesoscopic level of microdamage (size c*), characterized by the nucleation and growth rates of microdamage, n_N* and V*. It is found that the trans-scale length ratio c*/L does not directly affect the process. Instead, two independent dimensionless numbers play the key role in the process of damage accumulation to failure: a trans-scale one, De* = a c*/(sigma V*), combining macroscopic and mesoscopic parameters, and an intrinsic one, D* = n_N* c*^5/V*, involving mesoscopic parameters only. This implies that three time scales are involved in the process: the macroscopic imposed time scale t_im = sigma/a (with a the imposed stress rate) and two mesoscopic time scales, for nucleation and growth of damage, t_N = 1/(n_N* c*^4) and t_V = c*/V*. Clearly, the dimensionless number De* = t_V/t_im is the ratio of the microdamage growth time scale to the macroscopically imposed time scale, analogous to the Deborah number in rheology, defined as the ratio of the relaxation time to the external one. We therefore call De* the imposed Deborah number; it represents the competition and coupling between microdamage growth and the macroscopically imposed wave loading. In stress-wave-induced tensile failure (spallation), De* < 1, which means that microdamage has enough time to grow during the macroscopic wave loading; thus, microdamage growth appears to be the predominant mechanism governing the failure. Moreover, the dimensionless number D* = t_V/t_N characterizes the ratio of the two intrinsic mesoscopic time scales: growth over nucleation. Both time scales are related to intrinsic relaxation rather than to the imposed loading, so we call D* the intrinsic Deborah number. Furthermore, the intrinsic Deborah number D* implies a certain characteristic damage. In particular, it is derived that D* is a proper indicator of the macroscopic critical damage to damage localization, with D* ~ 10^-3 to 10^-2 in spallation. More importantly, we find that this small intrinsic Deborah number D* indicates the energy partition of microdamage dissipation over bulk plastic work. This explains why spallation cannot be formulated by a macroscopic energy criterion and must be treated by multi-scale analysis.
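
Collected in one place, the reconstructed time scales and dimensionless groups read as follows (the forms are inferred from the definitions in the abstract, which was garbled in extraction, and should be checked against the original paper):

```latex
t_{\mathrm{im}} = \frac{\sigma}{a}, \qquad
t_N = \frac{1}{n_N^{*}\, c^{*4}}, \qquad
t_V = \frac{c^{*}}{V^{*}}, \qquad
De^{*} = \frac{t_V}{t_{\mathrm{im}}} = \frac{a\, c^{*}}{\sigma\, V^{*}}, \qquad
D^{*} = \frac{t_V}{t_N} = \frac{n_N^{*}\, c^{*5}}{V^{*}}
```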

Relevance:

40.00%

Publisher:

Abstract:

Objectives: This study sought to investigate the effect of a multiple micronutrient supplement on left ventricular ejection fraction (LVEF) in patients with heart failure. Background: Observational studies suggest that patients with heart failure have reduced intake and lower concentrations of a number of micronutrients. However, there have been very few intervention studies investigating the effect of micronutrient supplementation in patients with heart failure. Methods: This was a randomized, double-blind, placebo-controlled, parallel-group study involving 74 patients with chronic stable heart failure that compared multiple micronutrient supplementation taken once daily versus placebo for 12 months. The primary endpoint was LVEF assessed by cardiovascular magnetic resonance imaging or 3-dimensional echocardiography. Secondary endpoints were Minnesota Living With Heart Failure Questionnaire score, 6-min walk test distance, blood concentrations of N-terminal prohormone of brain natriuretic peptide, C-reactive protein, tumor necrosis factor alpha, interleukin-6, interleukin-10, and urinary levels of 8-iso-prostaglandin F2 alpha. Results: Blood concentrations of a number of micronutrients increased significantly in the micronutrient supplement group, indicating excellent compliance with the intervention. There was no significant difference in mean LVEF at 12 months between treatment groups after adjusting for baseline (mean difference: 1.6%, 95% confidence interval: -2.6 to 5.8, p = 0.441). There was also no significant difference in any of the secondary endpoints at 12 months between treatment groups. Conclusions: This study provides no evidence to support the routine treatment of patients with chronic stable heart failure with a multiple micronutrient supplement. (Micronutrient Supplementation in Patients With Heart Failure [MINT-HF]; NCT01005303).

Relevance:

40.00%

Publisher:

Abstract:

Background: Chronic Heart Failure (CHF) has high mortality and morbidity. Large-scale randomised controlled trials have proven the benefits of beta blockers and ACE inhibitors in reducing mortality in patients with CHF, and expert guidelines mandate their use. In spite of these recommendations, these important therapies are under-prescribed and under-utilised.

Method: 1015 consecutive patients enrolled in CHF management programs across Australia were surveyed during 2005-2006 to determine prescribing patterns for heart failure medications. These patients were followed up for a period of 6 months.

Results: The survey revealed that beta blockers were prescribed to 80% of patients (more than 85% of whom were on sub-optimal doses) and that 70% were prescribed angiotensin-converting enzyme (ACE) inhibitors (approximately 50% on sub-optimal doses); 19% of patients were prescribed angiotensin receptor blockers (ARBs). By 6 months, fewer than 25% of the patients who were on sub-optimal doses of beta blockers or ACE inhibitors at baseline had been up-titrated to the maximum dose (p<0.0001). In CHF programs where nurses were able to titrate medications, 75% of patients reached the optimal dose of beta blockers, compared with only 25% in programs without nurse-led medication titration (p<0.004). When examining optimal dosage for any two of these mandated medications, fewer patients were on optimal therapy. Beta blockers and ACE inhibitors were prescribed in combination in 60% of patients, while beta blockers and ARBs were prescribed to 15% of patients.

Conclusion: Whilst prescribing rates for a single-medication strategy of beta blockers or ACE inhibitors were greater than 70%, up-titration of these medications and utilisation of proven combination therapy were poor. It is suggested that clinical outcomes for this cohort of patients could be further improved by adherence to evidence-based practice and ESC guidelines, and by optimisation of these medications by heart failure nurses in a CHF program. On the basis of these findings, and in the absence of ready access to a polypill, focussing on evidence-based practice to increase the utilisation and optimal dosage of combination medication therapy is critical.

Relevance:

40.00%

Publisher:

Abstract:

We experimentally examine posted pricing and directed search. In one treatment, capacity-constrained sellers post fixed prices, which buyers observe before choosing whom to visit. In the other, firms post both “single-buyer” (applied when one buyer visits) and “multibuyer” (when multiple buyers visit) prices. We find, based on a 2 × 2 (two buyers and two sellers) market and a follow-up experiment with 3 and 2 × 3 markets, that multibuyer prices can be lower than single-buyer prices or prices in the one-price treatment. Also, allowing the multibuyer price does not affect seller profits and increases market frictions.

Relevance:

40.00%

Publisher:

Abstract:

BACKGROUND Pyogenic tonsillitis is often observed in the general Western population. In severe cases it may require antibiotic treatment or even hospitalization, and a prompt clinical response will usually be noted. Here we present an unusual case of progressive multiple organ failure, including fulminant liver failure, following acute tonsillitis initially mistaken for "classic" pyogenic (that is, bacterial) tonsillitis. CASE PRESENTATION A 68-year-old previously healthy white man was referred with suspicion of pyogenic angina. After tonsillectomy, he developed acute liver failure and consecutive multiple organ failure, including acute hemodynamic, pulmonary and dialysis-dependent renal failure. Immunohistopathological analysis of his tonsils and liver, as well as serum polymerase chain reaction analyses, revealed herpes simplex virus-2 to be the causative pathogen. Treatment included high-dose acyclovir and multiorgan supportive intensive care therapy. His final outcome was favorable. CONCLUSIONS Fulminant herpes simplex virus-2-induced multiple organ failure is rarely observed in the Western hemisphere and should be considered a potential diagnosis in patients with tonsillitis and multiple organ failure including acute liver failure. From a clinical perspective, it is important to note that fulminant herpes simplex virus-2 infection may masquerade as "routine" severe bacterial sepsis/septic shock. This condition should be diagnosed early and treated in a goal-directed manner in order to gain control of this life-threatening disease.

Relevance:

40.00%

Publisher:

Abstract:

We study the dynamical states of a small-world network of recurrently coupled excitable neurons, through both numerical and analytical methods. The dynamics of this system depend mostly on both the number of long-range connections, or "shortcuts", and the delay associated with neuronal interactions. We find that persistent activity emerges at low density of shortcuts, and that the system undergoes a transition to failure as their density reaches a critical value. The state of persistent activity below this transition consists of multiple stable periodic attractors, whose number increases at least as fast as the number of neurons in the network. At large shortcut density and for long enough delays, the network dynamics exhibit exceedingly long chaotic transients, whose failure times follow a stretched exponential distribution. We show that this functional form arises for the ensemble-averaged activity if the failure time for each individual network realization is exponentially distributed.
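
A minimal numerical sketch of the kind of model described (the dynamics and all parameter values below are assumptions, not the paper's exact equations): excitable units on a Watts-Strogatz small-world graph with a fixed interaction delay, where we record how long activity persists after a single initial kick. Sweeping p_shortcut and delay over many network realizations would probe the qualitative transition and failure-time statistics discussed above.

```python
# Sketch: failure time of persistent activity in a delayed excitable
# network on a small-world graph.
import numpy as np
import networkx as nx

def failure_time(n=200, k=4, p_shortcut=0.05, delay=3, refractory=5,
                 t_max=5000, seed=0):
    rng = np.random.default_rng(seed)
    g = nx.watts_strogatz_graph(n, k, p_shortcut, seed=seed)
    state = np.zeros(n, dtype=int)                       # 0 = rest, >0 = refractory countdown
    arrivals = [[] for _ in range(t_max + delay + 1)]    # spikes in transit
    arrivals[0].append(int(rng.integers(n)))             # kick one random unit
    for t in range(t_max):
        state[state > 0] -= 1                            # refractory recovery
        fired = [u for u in set(arrivals[t]) if state[u] == 0]
        for u in fired:
            state[u] = refractory                        # unit fires, becomes refractory
            for nb in g.neighbors(u):
                arrivals[t + delay].append(nb)           # spike arrives after `delay`
        if not fired and not any(arrivals[t + 1: t + delay + 1]):
            return t                                     # silent: persistent activity failed
    return t_max                                         # still active at the horizon

print(failure_time())
```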

Relevance:

40.00%

Publisher:

Abstract:

Case-based reasoning (CBR) is a unique tool for the evaluation of possible failure of firms (EOPFOF), owing to its ease of interpretation and implementation. Ensemble computing, a variation of group decision-making in society, provides a potential means of improving the predictive performance of CBR-based EOPFOF. This research aims to integrate bagging and proportion case-basing with CBR to generate a proportion bagging CBR method for EOPFOF. Diverse multiple case bases are first produced by multiple case-basing, in which a volume parameter is introduced to control the size of each case base. The classic case retrieval algorithm is then implemented to generate diverse member CBR predictors. Majority voting, the most frequently used mechanism in ensemble computing, is finally used to aggregate the outputs of the member CBR predictors to produce the final prediction of the CBR ensemble. In an empirical experiment, we statistically validated the results of the CBR ensemble from multiple case bases by comparing them with those of multivariate discriminant analysis, logistic regression, classic CBR, the best member CBR predictor and a bagging CBR ensemble. The results for Chinese EOPFOF three years in advance indicate that the new CBR ensemble, which significantly improved CBR's predictive ability, outperformed all the comparative methods.
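
The core of the method, as described, lends itself to a compact sketch. The version below is an assumed simplification: bootstrap several case bases whose size is a fraction (the volume parameter) of the full case base, use plain k-nearest-neighbour retrieval as the member CBR predictor, and aggregate by majority voting; feature weighting and the paper's exact retrieval algorithm are omitted.

```python
# Sketch: proportion bagging CBR with 1-NN retrieval and majority voting.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def proportion_bagging_cbr(X_train, y_train, X_test,
                           n_members=25, volume=0.7, k=1, seed=0):
    # X_train/X_test: numpy feature arrays; y_train: binary labels in {0, 1} (assumed)
    rng = np.random.default_rng(seed)
    m = int(volume * len(X_train))          # volume parameter controls case-base size
    votes = np.zeros((len(X_test), 2), dtype=int)
    for _ in range(n_members):
        idx = rng.choice(len(X_train), size=m, replace=True)   # one bootstrap case base
        member = KNeighborsClassifier(n_neighbors=k).fit(X_train[idx], y_train[idx])
        pred = member.predict(X_test)       # each member CBR predictor votes
        votes[np.arange(len(X_test)), pred] += 1
    return votes.argmax(axis=1)             # majority voting over members

# usage (synthetic stand-in data):
# X, y = np.random.rand(500, 10), np.random.randint(0, 2, 500)
# print(proportion_bagging_cbr(X[:400], y[:400], X[400:]))
```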

Relevance:

40.00%

Publisher:

Abstract:

Although the incidence of Gram-positive sepsis has risen strongly, it is unclear how Gram-positive organisms (without endotoxin) initiate septic shock. We investigated whether two cell wall components from Staphylococcus aureus, peptidoglycan (PepG) and lipoteichoic acid (LTA), can induce the inflammatory response and multiple organ dysfunction syndrome (MODS) associated with septic shock caused by Gram-positive organisms. In cultured macrophages, LTA (10 micrograms/ml), but not PepG (100 micrograms/ml), induces the release of nitric oxide measured as nitrite. PepG, however, caused a 4-fold increase in the production of nitrite elicited by LTA. Furthermore, PepG antibodies inhibited the release of nitrite elicited by killed S. aureus. Administration of both PepG (10 mg/kg; i.v.) and LTA (3 mg/kg; i.v.) in anesthetized rats resulted in the release of tumor necrosis factor alpha and interferon gamma and MODS, as indicated by a decrease in arterial oxygen pressure (lung) and an increase in plasma concentrations of bilirubin and alanine aminotransferase (liver), creatinine and urea (kidney), lipase (pancreas), and creatine kinase (heart or skeletal muscle). There was also the expression of inducible nitric oxide synthase in these organs, circulatory failure, and 50% mortality. These effects were not observed after administration of PepG or LTA alone. Even a high dose of LTA (10 mg/kg) causes only circulatory failure but no MODS. Thus, our results demonstrate that the two bacterial wall components, PepG and LTA, work together to cause systemic inflammation and multiple systems failure associated with Gram-positive organisms.

Relevance:

30.00%

Publisher:

Abstract:

When complex projects go wrong they can go horribly wrong, with severe financial consequences. We are undertaking research to develop leading performance indicators for complex projects: metrics to provide early warning of potential difficulties. The assessment of the success of complex projects can be made by a range of stakeholders over different time scales, against different levels of project results: the project’s outputs at the end of the project; the project’s outcomes in the months following project completion; and the project’s impact in the years following completion. We aim to identify leading performance indicators, which may include both success criteria and success factors, and which can be measured by the project team during project delivery to forecast success as assessed by key stakeholders in the days, months and years following the project. The hope is that the leading performance indicators will act as alarm bells, showing whether a project is deviating from plan so that early corrective action can be taken. Different combinations of the leading performance indicators may be appropriate depending on the nature of project complexity. In this paper we develop a new model of project success, whereby success is assessed by different stakeholders over different time frames against different levels of project results. We then relate this to measurements that can be taken during project delivery. A methodology is described to evaluate the early parts of this model, and its implications and limitations are discussed. This paper describes work in progress.