871 results for Dependent Failures, Interactive Failures, Interactive Coefficients, Reliability, Complex System


Relevance:

100.00%

Publisher:

Summary:

Background: The aim of this study was to examine minor physical anomalies and quantitative measures of the head and face in patients with psychosis vs healthy controls. Methods: Based on a comprehensive prevalence study of psychosis, we recruited 310 individuals with psychosis and 303 controls. From this sample, we matched 180 case-control pairs for age and sex. Individual minor physical anomalies and quantitative measures related to head size and facial height and depth were compared within the matched pairs. Based on all subjects, we examined the specificity of the findings by comparing craniofacial summary scores in patients with nonaffective or affective psychosis and controls. Results: The odds of having a psychotic disorder were increased in those with wider skull bases (odds ratio [OR], 1.40; 95% confidence interval [CI], 1.02-1.17), smaller lower-facial heights (glabella to subnasale) (OR, 0.57; 95% CI, 0.44-0.75), protruding ears (OR, 1.72; 95% CI, 1.05-2.82), and shorter (OR, 2.29; 95% CI, 1.37-3.82) and wider (OR, 2.28; 95% CI, 1.43-3.65) palates. Compared with controls, those with psychotic disorder had skulls that were more brachycephalic. These differences were found to distinguish patients with both nonaffective and affective psychoses from controls. Conclusions: Several of the features that differentiate patients from controls relate to the development of the neuro-basicranial complex and the adjacent temporal and frontal lobes. Future research should examine both the temporal lobe and the middle cranial fossa to reconcile our anthropometric findings with the literature showing smaller temporal lobes in patients with schizophrenia. Closer attention to the skull base may provide clues to the nature and timing of altered brain development in patients with psychosis.

Relevance:

100.00%

Publisher:

Summary:

This paper proposes a new methodology to reduce the probability of occurrence of states that cause load curtailment, while minimizing the costs involved in achieving that reduction. The methodology is supported by a hybrid method based on fuzzy sets and Monte Carlo simulation to capture both the randomness and the fuzziness of component outage parameters of the transmission power system. The novelty of this research work consists in two fundamental approaches: 1) a global steady approach, which builds the model of a faulted transmission power system by minimizing the unavailability corresponding to each faulted component; this results in the minimal global investment cost for the faulted components in a sample of system states of the transmission network; 2) a dynamic iterative approach, which checks individually the effect of each investment on the transmission network. A case study using the IEEE 24-bus Reliability Test System (RTS-1996) is presented to illustrate the application of the proposed methodology in detail.
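As a rough illustration of the hybrid sampling idea, the sketch below layers fuzziness on top of Monte Carlo randomness: each trial draws a random alpha-cut of a triangular fuzzy unavailability and then samples the component's up/down state from it. All names and numbers are hypothetical, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical triangular fuzzy unavailabilities (low, mode, high) per component
FUZZY_U = {
    "line_1": (0.010, 0.020, 0.045),
    "line_2": (0.005, 0.012, 0.030),
    "xfmr_1": (0.002, 0.006, 0.015),
}

def alpha_cut(tri, alpha):
    """Interval of a triangular membership function at membership level alpha."""
    lo, mode, hi = tri
    return lo + alpha * (mode - lo), hi - alpha * (hi - mode)

def sample_state():
    """One Monte Carlo system state, with fuzziness layered on the randomness."""
    alpha = rng.uniform()                 # random membership level for this trial
    state = {}
    for comp, tri in FUZZY_U.items():
        a, b = alpha_cut(tri, alpha)
        u = rng.uniform(a, b)             # crisp unavailability for this trial
        state[comp] = rng.uniform() < u   # True means the component is out
    return state

# Estimate the probability that at least two components are out at once
n = 100_000
p = sum(sum(sample_state().values()) >= 2 for _ in range(n)) / n
print(f"P(>=2 components out) ~ {p:.5f}")
```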

Relevance:

100.00%

Publisher:

Summary:

This thesis presents the Fuzzy Monte Carlo Model for Transmission Power Systems Reliability studies (FMC-TRel), a methodology based on statistical failure and repair data of transmission power system components that uses fuzzy-probabilistic modelling of component outage parameters. The statistical records are used to develop the fuzzy membership functions of the component outage parameters. The proposed hybrid method of fuzzy sets and Monte Carlo simulation, built on these fuzzy-probabilistic models, captures both the randomness and the fuzziness of component outage parameters. Once the system states are obtained, a network contingency analysis identifies any overloading or voltage violation in the network. A remedial action algorithm based on Optimal Power Flow then reschedules generation to alleviate constraint violations while avoiding load curtailment if possible or, otherwise, minimizing the total load curtailment for the states identified by the contingency analysis. For the system states that still cause load curtailment, an optimization approach reduces the probability of occurrence of these states while minimizing the costs of achieving that reduction. This methodology is of great importance for supporting the transmission system operator's decision making, namely in identifying critical components and in planning future investments in the transmission power system. A case study based on the IEEE 24-bus Reliability Test System (RTS-1996) is presented to illustrate the application of the proposed methodology in detail.
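The remedial action step, rescheduling generation and curtailing load only as a last resort, can be pictured as a small linear program. The toy below uses a hypothetical two-generator system with one constrained line, not the RTS-1996 network or the thesis's actual OPF formulation; curtailment is priced far above redispatch, so the solver sheds load only when redispatch alone cannot satisfy the constraints.

```python
import numpy as np
from scipy.optimize import linprog

load = 150.0                       # MW of demand at the load bus (hypothetical)
# Decision variables: [g1, g2, c] = generator outputs and load curtailment
cost = np.array([1.0, 2.0, 1e3])   # curtailment priced far above redispatch
# Power balance: g1 + g2 + c == load
A_eq, b_eq = np.array([[1.0, 1.0, 1.0]]), np.array([load])
# The line carrying g1 to the load bus is limited to 80 MW: g1 <= 80
A_ub, b_ub = np.array([[1.0, 0.0, 0.0]]), np.array([80.0])
bounds = [(0.0, 100.0), (0.0, 60.0), (0.0, load)]  # generator limits, c <= load

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
g1, g2, c = res.x
print(f"g1={g1:.0f} MW, g2={g2:.0f} MW, curtailment={c:.0f} MW")
# With these numbers: g1=80, g2=60 and 10 MW must still be curtailed.
```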

Relevance:

100.00%

Publisher:

Summary:

This work presents solutions for applying stochastic simulation in an asset management context, applied to a water supply system, taking advantage of the information available about the maintenance carried out over the years. It also describes how these methodologies can be applied to other cases in the future, further benefiting from information gathered from company employees with experience in the role and deep knowledge of how the infrastructure operates. Stochastic simulation is a field whose tools can be of great help in the decision-making process. At the same time, organisations are increasingly concerned with asset management and its associated costs, and have begun to invest more time and money in this area with the goal of outlining strategies to extend the useful life of their assets and to optimise their renewal investments. In this context, an adequate plan of maintenance and operation interventions is shown to be a good methodology to reduce failures in the supply system of a company in this sector and to ensure that the infrastructure remains in working condition. However, this traditional approach is not sufficient to guarantee best practice and the objectives of modern asset management. The work also includes a case study in which the studied tools were applied to a real pumping group at one of the company's pumping stations.
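As one way such a simulation could be set up, the sketch below runs a Monte Carlo renewal simulation of a pump's failures over a planning horizon, assuming Weibull-distributed times to failure and as-good-as-new repairs; the shape and scale values are placeholders, not parameters fitted to the company's maintenance records.

```python
import numpy as np

rng = np.random.default_rng(7)
SHAPE, SCALE = 1.8, 4000.0   # hypothetical Weibull parameters (hours)

def failures_over(horizon_h: float, n_runs: int = 20_000):
    """Monte Carlo estimate of failure counts, assuming as-good-as-new repairs."""
    counts = np.zeros(n_runs)
    for i in range(n_runs):
        t, k = 0.0, 0
        while True:
            t += SCALE * rng.weibull(SHAPE)   # time to the next failure
            if t > horizon_h:
                break
            k += 1
        counts[i] = k
    return counts.mean(), counts.std()

mean, std = failures_over(5 * 8760)           # five years of continuous operation
print(f"expected failures ~ {mean:.2f} +/- {std:.2f}")
```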

Relevance:

100.00%

Publisher:

Summary:

Master's dissertation in Informatics Engineering, 2nd semester, 2011/2012

Relevance:

100.00%

Publisher:

Summary:

Kidneys are the main regulators of salt homeostasis and blood pressure. In the distal region of the tubule, active Na+ transport is finely tuned. This transport is regulated by various hormonal pathways, including aldosterone, which regulates reabsorption at the level of the ASDN, comprising the late DCT, the CNT and the CCD. In the ASDN, the amiloride-sensitive epithelial Na+ channel (ENaC) plays a major role in Na+ homeostasis, as evidenced by gain-of-function mutations in the genes encoding ENaC that cause Liddle's syndrome, a severe form of salt-sensitive hypertension. In this disease, regulation of ENaC is compromised by mutations that delete or mutate a PY-motif in ENaC. Such mutations interfere with Nedd4-2-dependent ubiquitylation of ENaC, leading to reduced endocytosis of the channel and consequently to increased channel activity at the cell surface. After endocytosis, ENaC is targeted to the lysosome and rapidly degraded. As with other ubiquitylated and endocytosed plasma membrane proteins (such as the EGFR), it is likely that the ESCRT multi-protein complex system is involved. To investigate the involvement of this system, we tested the role of one of the ESCRT proteins, Tsg101. Here we show that Tsg101 interacts with all three ENaC subunits, both endogenously and in transfected HEK-293 cells. Furthermore, mutation of the cytoplasmic lysines of the ENaC subunits disrupts this interaction, indicating a potential involvement of ubiquitin in the Tsg101/ENaC interaction. Tsg101 knockdown in renal epithelial cells increases the total and cell-surface pools of ENaC, implicating Tsg101, and consequently the ESCRT system, in the degradation of ENaC by the endosomal/lysosomal system.
- The human body is composed of organs, each specialised in a precise function. Each organ is composed of cells, which carry out the function of the organ in question. These cells are characterised by: a membrane, which isolates their internal compartment (the intracellular medium, or cytoplasm) from the external liquid (the extracellular medium); a nucleus, where the DNA is located; and proteins, kinds of functional units with well-defined roles in the cell. The separation between the outside and the inside of the cell is essential for maintaining the composition of these media and for the proper function of the organism and its cells. Among these components, sodium plays an essential role because it conditions the maintenance of blood volume by contributing to the maintenance of the extracellular volume. An increase of sodium in the body therefore increases blood volume and thus causes hypertension. Controlling the amount of sodium present in the body is consequently essential to its proper functioning. Sodium is supplied by the diet, and it is in the kidney that the amount of sodium retained by the body is controlled, so as to maintain a normal sodium concentration in the extracellular medium. The kidney reabsorbs all kinds of solutes needed by the body before excreting the waste or surplus of these solutes in the urine. The kidney reabsorbs sodium through different proteins; among them, we focused on a protein called ENaC. This protein plays an important role in sodium reabsorption, and when it malfunctions, as observed in certain genetic diseases, problems of hypo- or hypertension result. The problems arising from the malfunction of this protein oblige the cell to regulate ENaC efficiently by different mechanisms, notably by decreasing its expression and degrading the "surplus". In this thesis work, we investigated the mechanism involved in the degradation of ENaC, and more precisely a set of proteins, called ESCRT, which "escorts" a protein to a sub-compartment inside the cell where it is degraded.

Relevance:

100.00%

Publisher:

Summary:

Peroxynitrite is a potent oxidant and nitrating species formed by the reaction between the free radicals nitric oxide and superoxide. Excessive formation of peroxynitrite is an important mechanism contributing to cell death and dysfunction in multiple cardiovascular pathologies, such as myocardial infarction, heart failure and atherosclerosis. Whereas initial work focused on direct oxidative biomolecular damage as the main route of peroxynitrite toxicity, more recent evidence, mainly obtained in vitro, indicates that peroxynitrite also behaves as a potent modulator of various cell signal transduction pathways. Through its ability to nitrate tyrosine residues, peroxynitrite affects cellular processes that depend on tyrosine phosphorylation. Peroxynitrite also exerts complex effects on the activity of various kinases and phosphatases, resulting in the up- or down-regulation of signalling cascades in a concentration- and cell-dependent manner. These roles of peroxynitrite in the redox regulation of key signalling pathways for cardiovascular homeostasis, including protein kinases B and C, the MAP kinases and nuclear factor kappa B, as well as signalling dependent on insulin and the sympatho-adrenergic system, are presented in detail in this review.

Relevance:

100.00%

Publisher:

Summary:

The motivation for this research originated in the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications because they cost significantly less than their predecessors, the mainframes. Industrial automation later developed its own vertically integrated hardware and software to address the application needs of uninterrupted operations, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA and robot control systems. This industry today employs over 200,000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into one information and communication (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption System-on-Chip (SoC) architecture. Unlike the CISC world, the RISC processor architecture business is an industry separate from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which give customers more choice thanks to hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other kind of computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An additional underlying element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominant closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries.
Industrial automation incumbents continue to supply systems based on vertically integrated stacks of proprietary software and proprietary, mainly microprocessor-based, hardware. They enjoy admirable profitability on a very narrow customer base thanks to strong technology-enabled customer lock-in and customers' high risk exposure, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately the Internet of Things (IoT) and Weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will in due course face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation subject to the competition between the incumbents, first through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and second through research on process re-engineering in the case of complex-system global software support. Third, we investigate the views of the industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and conclude with our assessment of the possible routes industrial automation could take, considering the looming rise of the Internet of Things (IoT) and Weightless networks. Industrial automation is an industry dominated by a handful of global players, each focused on maintaining its own proprietary solutions. The rise of de facto standards like the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created the new markets of personal computers, smartphones and tablets, and will eventually also impact industrial automation through game-changing commoditization and related control-point and business-model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.

Relevance:

100.00%

Publisher:

Summary:

As wireless local area network technologies have become widespread, demand for wireless networks and services has grown rapidly. In particular, the IEEE 802.11b standard, published in 1997, has enabled the rapid development of wireless network technologies. This work presents a design for the structure and implementation of an interface between two networks, called an interconnection point. Its main task is to act as the node for all traffic between the networks and to manage both the internal network with its users and the operators connected on the external network side. The interconnection point identifies the users of the internal network, authorises them in cooperation with the operators, manages the network addresses of the internal network's users, and relays network traffic. It can route a user to the correct operator and ensures that the user has access to the services he or she is authorised to use. The work defines the interconnection point's interfaces both for the operators connected to it and for the basic services offered to the internal network, and additionally defines its internal interfaces. The interconnection point does not restrict the network technology used, but this work focuses on WLAN networks compliant with the IEEE 802.11b standard. Networks with one or more operators exist in both wired and wireless environments. In these networks, however, each Internet operator serves only its own customers: the internal network is closed, and only the operator's own customers may join it. The interconnection point produced as the result of this work is a solution with which a multi-operator regional network, open to all of its users, can be built.
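The routing of a user to the correct operator can be illustrated with a simple realm-based lookup, in the spirit of roaming AAA systems; this is only a schematic reading of the interconnection point's role, with made-up operator names, not the design actually specified in the thesis.

```python
# Hypothetical roaming agreements: user realm -> operator AAA backend
OPERATORS = {
    "opera.example": "radius.opera.example",
    "operb.example": "radius.operb.example",
}

def route_auth(user_id: str):
    """Split user@realm and pick the operator backend that must authorise it."""
    user, sep, realm = user_id.partition("@")
    if not sep or realm not in OPERATORS:
        raise PermissionError(f"no roaming agreement for realm {realm!r}")
    return OPERATORS[realm], user

backend, user = route_auth("alice@opera.example")
print(f"forwarding authorisation of {user!r} to {backend}")
```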

Relevance:

100.00%

Publisher:

Summary:

The primary objective is to identify the critical factors that naturally impact the performance measurement system. It is important to make correct decisions about measurement systems, which operate within a complex business environment. The performance measurement system combines many complex, non-linear factors. The Six Sigma methodology is seen as one potential approach at every organisational level. It is linked to performance and financial measurement, as well as to the analytical thinking on which the management viewpoint depends. The study of complex systems is connected to the customer relationship study. The primary throughput is a new, well-defined performance measurement structure, which an analytical multifactor system will also facilitate. At the same time, these critical factors should be seen as a business innovation opportunity. This master's thesis is divided into two theoretical parts. The empirical part combines action-oriented and constructive research approaches with an empirical case study. The secondary objective is to seek a competitive advantage factor with a new analytical tool and Six Sigma thinking. Process and product capabilities are linked to the contribution of the complex system. The critical barriers are identified by the performance measurement system. The secondary throughput is the product and process cost efficiencies, which are achieved through the advantage of management. The performance measurement potential is related to different productivity analyses. Productivity can be seen as one essential part of the competitive advantage factor.
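Since the thesis ties Six Sigma thinking to process and product capabilities, the textbook capability indices give a concrete flavour of the analytics involved. The sketch below computes the standard Cp and Cpk; the specification limits and measurement data are invented for the example.

```python
import numpy as np

def capability(samples, lsl: float, usl: float):
    """Textbook process capability indices from a sample of measurements."""
    mu, sigma = np.mean(samples), np.std(samples, ddof=1)
    cp = (usl - lsl) / (6.0 * sigma)                # potential capability
    cpk = min(usl - mu, mu - lsl) / (3.0 * sigma)   # actual, centring-aware
    return cp, cpk

rng = np.random.default_rng(3)
data = rng.normal(10.2, 0.15, size=200)   # invented measurements
cp, cpk = capability(data, lsl=9.5, usl=10.5)
print(f"Cp={cp:.2f}, Cpk={cpk:.2f}")      # a Six Sigma process targets Cpk >= 1.5
```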

Relevance:

100.00%

Publisher:

Summary:

The present study reports details of the stoichiometric characterization of the mixed complex system V(H2O2)PAR, formed when vanadium reacts with hydrogen peroxide and with 4-(2-pyridylazo)resorcinol (PAR). The presence of polynuclear species was also investigated, in order to allow an unambiguous assignment of the molar absorptivity, stability constant and composition of the complex. Two mathematical methods were employed to treat the experimental results. From the results it can be concluded that this system corresponds to a mononuclear complex with 1:1:1 stoichiometry.
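One standard mathematical treatment for confirming a 1:1 metal-to-ligand ratio from spectrophotometric data is Job's method of continuous variations, sketched below with invented absorbance data; this is offered as a generic illustration and is not necessarily one of the two treatments the study employed.

```python
import numpy as np

# Invented Job's-plot data: mole fraction of metal vs corrected absorbance
x = np.linspace(0.1, 0.9, 9)
A = np.array([0.12, 0.25, 0.37, 0.46, 0.50, 0.45, 0.36, 0.24, 0.11])

# Fit a parabola near the maximum to locate it more precisely than the grid
a, b, c = np.polyfit(x, A, 2)
x_max = -b / (2.0 * a)
print(f"absorbance peaks at x ~ {x_max:.2f}")  # ~0.5 is consistent with 1:1
```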

Relevance:

100.00%

Publisher:

Summary:

Brazil is making a major effort to find alternatives to diesel oil as a fuel. Some lines of study are oriented to the use of raw vegetable oils as fuel, since they are cheaper, have a higher energy density than converted vegetable oils, and pose less risk of environmental contamination. The aim of this study was to evaluate the performance, the useful life of the lubricant, and some components of a diesel-cycle engine with an electronic injection system, in a long-term test operating on a preheated (65°C) blend of 50% (v v-1) soybean oil in petrodiesel. There was a reduction in the useful life of the injectors, which failed because of high wear at 264 hours of operation, and an increase in emissions of particulate matter (opacity), which may be attributed to the failures in the injection system. An increase in the useful life of the lubricant compared with the literature was also observed. The electronic injection system may favour the burning of the tested fuel. The test was interrupted at 264 hours because of the failures in the injection system.

Relevance:

100.00%

Publisher:

Summary:

The brain is a complex system that produces emergent properties, such as those associated with activity-dependent plasticity in learning and memory. Understanding the integrated structures and functions of the brain is therefore well beyond the scope of either superficial or extremely reductionistic approaches. Although a combination of zoom-in and zoom-out strategies is desirable when studying the brain, constructing the appropriate interfaces to connect all levels of analysis is one of the most difficult challenges of contemporary neuroscience. Is it possible to build appropriate models of brain function and dysfunction with computational tools? Among the best-known brain dysfunctions, the epilepsies are neurological syndromes that involve a variety of networks, from widespread anatomical brain circuits to local molecular environments. One logical question is: do those complex brain networks always produce maladaptive emergent properties compatible with epileptogenic substrates? The present review deals with this question and tries to answer it by illustrating several points from the literature and from our laboratory data, with examples at the behavioral, electrophysiological, cellular and molecular levels. We conclude that, because the brain is a complex system capable of producing emergent properties, including plasticity, its functions should be approached from an integrated view. Concepts such as brain networks, graph theory, neuroinformatics and e-neuroscience are discussed as new transdisciplinary approaches for dealing with the continuous growth of information about brain physiology and its dysfunctions. The epilepsies are discussed as neurobiological models of complex systems displaying maladaptive plasticity.

Relevance:

100.00%

Publisher:

Summary:

An enterprise can be viewed as a complex system that is engineered to accomplish organisational objectives. Systems analysis and modelling enable the planning and development of the enterprise and its IT systems. Many IT systems design methods focus on the functional and non-functional requirements of the IT systems; most handle one well but leave out the other. Analysing and modelling both business and IT systems may often call on techniques from various suites of methods, which may rest on different philosophical and methodological underpinnings, and coherence and consistency between the analyses are hard to ensure. This paper introduces the Problem Articulation Method (PAM), which facilitates the design of the enterprise system infrastructure on which an IT system is built. The outcomes of this analysis represent requirements that can be further used for planning and designing a technical system. As a case study, a finance system for e-procurement, Agresso, is used in this paper to illustrate the applicability of PAM in modelling complex systems.

Relevance:

100.00%

Publisher:

Summary:

Modelling the interaction of terahertz (THz) radiation with biological tissue poses many interesting problems. THz radiation is not obviously described either by an electric field distribution or by an ensemble of photons, and biological tissue is an inhomogeneous medium with an electric permittivity that is both spatially and frequency dependent, making it a complex system to model. A three-layer system of parallel-sided slabs has been used as the system through which the passage of THz radiation is simulated. Two modelling approaches have been developed: a thin-film matrix model and a Monte Carlo model. The source data for each of these methods, taken at the same time as the data recorded to experimentally verify them, was a THz spectrum that had passed through air only. Experimental verification of the two models was carried out using a three-layered in vitro phantom. Simulated transmission spectra were compared with experimental transmission spectra, first to determine and then to compare the accuracy of the two methods. Good agreement was found, with typical results having a correlation coefficient of 0.90 for the thin-film matrix model and 0.78 for the Monte Carlo model over the full THz spectrum. Further work is underway to improve the models above 1 THz.
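A minimal sketch of the thin-film (characteristic) matrix approach for normal-incidence transmission through parallel-sided slabs is given below. The layer indices and thicknesses are placeholders rather than the phantom's measured, frequency-dependent properties, and the formulation is the standard optical characteristic-matrix one, not necessarily the exact one used in this work.

```python
import numpy as np

C0 = 299_792_458.0  # speed of light in vacuum, m/s

def slab_matrix(n: complex, d: float, freq: float) -> np.ndarray:
    """Characteristic matrix of one homogeneous slab at normal incidence."""
    delta = 2.0 * np.pi * n * d * freq / C0          # phase thickness
    return np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                     [1j * n * np.sin(delta), np.cos(delta)]])

def transmittance(layers, freq: float, n_in=1.0, n_out=1.0) -> float:
    """Power transmittance of a stack of (index, thickness) layers."""
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        M = M @ slab_matrix(n, d, freq)
    B, C = M @ np.array([1.0, n_out])
    t = 2.0 * n_in / (n_in * B + C)                  # amplitude transmission
    return (n_out / n_in) * abs(t) ** 2

# Placeholder three-layer phantom evaluated at 0.5 THz
stack = [(1.5 + 0.05j, 2e-3), (2.1 + 0.30j, 1e-3), (1.5 + 0.05j, 2e-3)]
print(f"T ~ {transmittance(stack, 0.5e12):.3f}")
```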