871 results for Dependent Failures, Interactive Failures, Interactive Coefficients, Reliability, Complex System
Abstract:
The uses of Information and Communication Technologies (ICT) and Web environments for the creation, treatment and availability of information have supported the emergence of new socio-cultural patterns, represented by convergences of textual, image and audio languages. This paper describes and analyzes the National Archives Experience Digital Vaults as a digital publishing web environment and as cultural heritage. It is a complex system that synthesizes information design options in the information setting, provides new aesthetic aspects and, above all, enlarges the cognition of the subjects who interact with the environment. It also enlarges the institutional spaces that guard collective memory beyond their role of keeping the physical patrimony collected there. Digital Vaults works as a mix of guide and interactive catalogue meant to be explored in a ludic way. The publishing design of the information held in the Archives is meant to facilitate access to knowledge. The documents are organized in a dynamic rather than chronological way: they are not divided into fonds or distinct categories, but into controlled interactions of documents previously indexed and linked by the software. The software creates an information design and a view of documental content that can be considered a new paradigm in Information Science and is part of the post-custodial regime, independent of physical spaces and institutions. Information professionals must be prepared to understand and work with the paradigmatic changes described and represented by the new hybrid digital environments; hence the importance of this paper. In cyberspace, interactivity between the user and the content provided by the environment design fosters cooperation, collaboration and knowledge-sharing actions, all features of networks, transforming culture globally.
Market failures and networks in public policy: challenges and possibilities for the Sistema Único de Saúde (Brazilian Unified Health System)
Abstract:
The principles and guidelines of the Sistema Único de Saúde (SUS, the Brazilian Unified Health System) impose a care structure based on public policy networks which, combined with the adopted financing model, leads to market failures. This imposes barriers to the management of the public health system and to the achievement of the SUS objectives. The institutional characteristics and the heterogeneity of the actors, together with the existence of different health care networks, generate analytical complexity in the study of the global dynamics of the SUS network. There are limitations to the use of quantitative methods based on static analysis of retrospective data from the public health system. We therefore propose approaching the SUS as a complex system, using an innovative quantitative methodology based on computer simulation. This article analyzes the challenges and potential of combining cellular automata modeling with agent-based modeling to simulate the evolution of the SUS service network. Such an approach should allow a better understanding of the organization, heterogeneity and structural dynamics of the SUS service network and make it possible to minimize the effects of market failures in the Brazilian health system.
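As an illustration of the kind of modeling this abstract proposes, the following Python sketch runs a toy cellular automaton in which each cell holds a hypothetical level of installed health-service capacity and invests when the capacity in its neighborhood cannot meet demand. The grid size, demand model and update rule are invented for illustration and are not the authors' model.

```python
import numpy as np

# Minimal cellular-automaton sketch (illustrative only): each cell holds a
# hypothetical level of installed health-service capacity, and a cell adds
# capacity when random demand exceeds the capacity available in its
# neighborhood. All parameters are assumptions, not the authors' model.
rng = np.random.default_rng(42)
size, steps = 50, 100
capacity = rng.integers(0, 2, (size, size)).astype(float)

for _ in range(steps):
    demand = rng.random((size, size))
    # Capacity available in the Moore neighborhood (toroidal boundary).
    neigh = sum(
        np.roll(np.roll(capacity, dx, 0), dy, 1)
        for dx in (-1, 0, 1) for dy in (-1, 0, 1)
    )
    # Cells whose neighborhood cannot meet demand invest in new capacity.
    capacity += 0.1 * (demand > neigh / 9.0)

print("mean capacity after simulation:", capacity.mean())
```

An agent-based layer (patients choosing among services, managers deciding investments) could then be coupled to such a grid, which is the combination the article discusses.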
Abstract:
Large power transformers, an aging and vulnerable part of our energy infrastructure, are at choke points in the grid and are key to reliability and security. Damage or destruction due to vandalism, misoperation, or other unexpected events is of great concern, given replacement costs upward of $2M and lead times of 12 months. Transient overvoltages can cause great damage, and there is much interest in improving computer simulation models to correctly predict and avoid the consequences. EMTP (the Electromagnetic Transients Program) has been developed for computer simulation of power system transients. Component models for most equipment have been developed and benchmarked. Power transformers would appear to be simple. However, due to their nonlinear and frequency-dependent behaviors, they can be among the most complex system components to model. It is imperative that the applied models be appropriate for the range of frequencies and excitation levels that the system experiences. Thus, transformer modeling is not a mature field, and new, improved models must be made available. In this work, improved topologically correct duality-based models are developed for three-phase autotransformers having five-legged, three-legged, and shell-form cores. The main problem in the implementation of detailed models is the lack of complete and reliable data, as no international standard suggests how to measure and calculate parameters. Therefore, parameter estimation methods are developed here to determine the parameters of a given model in cases where the available information is incomplete. The transformer nameplate data are required, and the relative physical dimensions of the core are estimated. The models include a separate representation of each segment of the core, including hysteresis of the core, the λ-i saturation characteristic, capacitive effects, and the frequency dependency of winding resistance and core loss. Steady-state excitation, de-energization, and re-energization transients are simulated and compared with an earlier-developed BCTRAN-based model. Black-start energization cases are also simulated as a means of model evaluation and compared with actual event records. The simulated results using the model developed here are reasonable and more correct than those of the BCTRAN-based model. Simulation accuracy is dependent on the accuracy of the equipment model and its parameters. This work is significant in that it advances existing parameter estimation methods in cases where the available data and measurements are incomplete. The accuracy of EMTP simulation for power systems including three-phase autotransformers is thus enhanced. Theoretical results obtained from this work provide a sound foundation for the development of transformer parameter estimation methods using engineering optimization. In addition, it should be possible to refine which information and measurement data are necessary for complete duality-based transformer models. To further refine and develop the models and transformer parameter estimation methods developed here, iterative full-scale laboratory tests using high-voltage, high-power three-phase transformers would be helpful.
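One small ingredient of such parameter estimation can be illustrated by fitting a λ-i saturation characteristic to a few excitation measurements by least squares. The functional form and the data points in the Python sketch below are assumptions chosen for illustration, not the dissertation's procedure or data.

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative sketch only: fit a smooth lambda-i saturation characteristic
#   lambda(i) = L_sat*i + (L_unsat - L_sat)*i_knee*arctan(i / i_knee)
# to hypothetical excitation measurements. Model form, data and units are
# assumptions, not the dissertation's method.
def flux_linkage(i, L_unsat, L_sat, i_knee):
    return L_sat * i + (L_unsat - L_sat) * i_knee * np.arctan(i / i_knee)

i_meas = np.array([0.1, 0.5, 1.0, 2.0, 5.0, 10.0])      # current (A), hypothetical
lam_meas = np.array([0.30, 1.40, 2.40, 3.30, 4.20, 5.00])  # flux linkage (Wb-t), hypothetical

params, _ = curve_fit(flux_linkage, i_meas, lam_meas, p0=[3.0, 0.1, 1.0])
L_unsat, L_sat, i_knee = params
print(f"unsaturated L ~ {L_unsat:.2f} H, saturated L ~ {L_sat:.2f} H, knee ~ {i_knee:.2f} A")
```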
Abstract:
The combination of global and local stressors is leading to a decline in coral reef health globally. In the case of eutrophication, increased concentrations of dissolved inorganic nitrogen (DIN) and phosphorus (DIP) are largely attributed to local land use changes. From the global perspective, increased atmospheric CO2 levels are contributing not only to global warming but also to ocean acidification (OA). Both eutrophication and OA have serious implications for calcium carbonate production and dissolution among calcifying organisms. In particular, benthic foraminifera precipitate the most soluble form of mineral calcium carbonate (high-Mg calcite), potentially making them more sensitive to dissolution. In this study, a manipulative orthogonal two-factor experiment was conducted to test the effects of dissolved inorganic nutrients and OA on the growth, respiration and photophysiology of the large photosymbiont-bearing benthic foraminifer, Marginopora rossi. This study found that the growth rate of M. rossi was inhibited by the interaction of eutrophication and acidification. The relationship between M. rossi and its photosymbionts became destabilized due to the photosymbionts' release from nutrient limitation in the nitrate-enriched treatment, as shown by an increase in zooxanthellae cells per host surface area. Foraminifers from the OA treatments had an increased amount of Chl a per cell, suggesting a greater potential to harvest light energy; however, there was no net benefit to foraminifer growth. Overall, this study demonstrates that the impacts of OA and eutrophication are dose-dependent and interactive. This research indicates that an OA threshold of pH 7.6, alone or in combination with eutrophication, will lead to a decline in M. rossi calcification. The decline in foraminifera calcification associated with pollution and OA will have broad ecological implications across their ubiquitous range and suggests that, without mitigation, there could be serious consequences for the future of coral reefs.
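For readers unfamiliar with crossed two-factor designs, a minimal Python sketch of a nutrient x pH analysis with an interaction term is shown below. The data are simulated and the effect sizes, pH levels and replication are invented for the example; this is not the study's dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Illustrative two-factor analysis only: simulated growth data for a crossed
# nutrient x pH design, testing main effects and their interaction, in the
# spirit of the orthogonal experiment described above. All values invented.
rng = np.random.default_rng(1)
rows = []
for nutrient in ("ambient", "enriched"):
    for ph in ("8.1", "7.9", "7.6"):
        interaction = -0.4 * (nutrient == "enriched") * (ph == "7.6")
        base = 1.0 - 0.1 * (ph == "7.6") - 0.1 * (nutrient == "enriched")
        for _ in range(6):  # six replicates per treatment
            rows.append({"nutrient": nutrient, "ph": ph,
                         "growth": base + interaction + rng.normal(0, 0.1)})

df = pd.DataFrame(rows)
model = ols("growth ~ C(nutrient) * C(ph)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # main effects and interaction term
```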
Abstract:
The accurate identification of T-cell epitopes remains a principal goal of bioinformatics within immunology. As the immunogenicity of peptide epitopes is dependent on their binding to major histocompatibility complex (MHC) molecules, the prediction of binding affinity is a prerequisite to the reliable prediction of epitopes. The iterative self-consistent (ISC) partial-least-squares (PLS)-based additive method is a recently developed bioinformatic approach for predicting class II peptide−MHC binding affinity. The ISC−PLS method overcomes many of the conceptual difficulties inherent in the prediction of class II peptide−MHC affinity, such as the binding of a mixed population of peptide lengths due to the open-ended class II binding site. The method has applications in both the accurate prediction of class II epitopes and the manipulation of affinity for heteroclitic and competitor peptides. The method is applied here to six class II mouse alleles (I-Ab, I-Ad, I-Ak, I-As, I-Ed, and I-Ek) and included peptides up to 25 amino acids in length. A series of regression equations highlighting the quantitative contributions of individual amino acids at each peptide position was established. The initial model for each allele exhibited only moderate predictivity. Once the set of selected peptide subsequences had converged, the final models exhibited a satisfactory predictive power. Convergence was reached between the 4th and 17th iterations, and the leave-one-out cross-validation statistical terms - q2, SEP, and NC - ranged between 0.732 and 0.925, 0.418 and 0.816, and 1 and 6, respectively. The non-cross-validated statistical terms r2 and SEE ranged between 0.98 and 0.995 and 0.089 and 0.180, respectively. The peptides used in this study are available from the AntiJen database (http://www.jenner.ac.uk/AntiJen). The PLS method is available commercially in the SYBYL molecular modeling software package. The resulting models, which can be used for accurate T-cell epitope prediction, will be made freely available online (http://www.jenner.ac.uk/MHCPred).
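A minimal sketch of the general workflow named in this abstract (a PLS regression evaluated by leave-one-out cross-validation, reporting q2, SEP and the number of components NC) could look like the following Python snippet. It uses synthetic indicator data rather than the AntiJen peptide set, and the component count is an assumption.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

# Illustrative sketch only: PLS regression with leave-one-out cross-validation
# on synthetic data standing in for amino-acid indicator variables; q2 and SEP
# are computed as in standard QSAR practice, not from the paper's data.
rng = np.random.default_rng(0)
n_peptides, n_features = 60, 40
X = rng.integers(0, 2, (n_peptides, n_features)).astype(float)
y = X @ rng.normal(0, 0.5, n_features) + rng.normal(0, 0.3, n_peptides)

n_components = 6  # 'NC' in the abstract's terminology (assumed value)
pls = PLSRegression(n_components=n_components)
y_loo = cross_val_predict(pls, X, y, cv=LeaveOneOut()).ravel()

press = np.sum((y - y_loo) ** 2)
q2 = 1.0 - press / np.sum((y - y.mean()) ** 2)   # cross-validated r-squared
sep = np.sqrt(press / len(y))                    # standard error of prediction
print(f"q2 = {q2:.3f}, SEP = {sep:.3f}, NC = {n_components}")
```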
Abstract:
This work presents the research and development of a mathematical model for the optimal distribution of (primarily financial) resources to achieve a new, higher level of quality (reliability) in a complex system for which a restructuring decision has been made. The final model, together with its calculation algorithm, answers the following questions: how many system elements should be allocated for modernization, which elements, and to what depth each allocated element should be modernized; the answers are optimal by the criterion of minimizing financial cost.
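A toy version of such an allocation problem can be solved by brute force: choose a modernization depth for each element of a small series system so that system reliability reaches a target at minimum cost. The reliabilities, costs and series structure in the Python sketch below are invented for illustration; the paper's actual model and algorithm may differ.

```python
from itertools import product

# Illustrative brute-force sketch only: pick a modernization depth (0-2) for
# each element of a small series system so that system reliability reaches a
# target at minimum cost. All numbers are assumptions, not the paper's model.
upgrade_rel = [[0.90, 0.96, 0.99],   # element reliability at depth 0, 1, 2
               [0.85, 0.92, 0.97],
               [0.95, 0.97, 0.99]]
upgrade_cost = [[0, 4, 9],           # cost of reaching each depth, per element
                [0, 3, 7],
                [0, 5, 11]]
target = 0.93

best = None
for depths in product(range(3), repeat=len(upgrade_rel)):
    rel, cost = 1.0, 0
    for elem, d in enumerate(depths):
        rel *= upgrade_rel[elem][d]    # series system: reliabilities multiply
        cost += upgrade_cost[elem][d]
    if rel >= target and (best is None or cost < best[0]):
        best = (cost, depths, rel)

cost, depths, rel = best
print(f"cheapest plan: depths={depths}, cost={cost}, system reliability={rel:.4f}")
```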
Abstract:
Statistical mechanics of two coupled vector fields is studied in the tight-binding model that describes propagation of polarized light in discrete waveguides in the presence of four-wave mixing. The energy and power conservation laws enable the formulation of the equilibrium properties of the polarization state in terms of the Gibbs measure with positive temperature. The transition line T = ∞ is established, beyond which discrete vector solitons are created. Also, in the limit of large nonlinearity, an analytical expression for the distribution of the Stokes parameters is obtained; it is found to depend only on the statistical properties of the initial polarization state and not on the strength of the nonlinearity. The evolution of the system to the final equilibrium state is shown to pass through an intermediate stage in which the energy exchange between the waveguides is still negligible. The distribution of the Stokes parameters in this regime has a complex multimodal structure strongly dependent on the nonlinear coupling coefficients and the initial conditions.
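For reference, the Stokes parameters discussed above are the standard quadratic combinations of the two field components. The short Python sketch below computes them for a randomly chosen ensemble of amplitudes; the random ensemble is an assumption made only for illustration and is not the Gibbs-measure ensemble derived in the paper.

```python
import numpy as np

# Illustrative sketch only: Stokes parameters for an ensemble of randomly
# polarized field amplitudes (psi1, psi2). The Gaussian ensemble here is an
# assumption, not the equilibrium distribution derived in the paper.
rng = np.random.default_rng(3)
n = 10_000
psi1 = rng.normal(size=n) + 1j * rng.normal(size=n)
psi2 = rng.normal(size=n) + 1j * rng.normal(size=n)

S0 = np.abs(psi1) ** 2 + np.abs(psi2) ** 2          # total power
S1 = np.abs(psi1) ** 2 - np.abs(psi2) ** 2          # linear polarization balance
S2 = 2 * np.real(np.conj(psi1) * psi2)
S3 = 2 * np.imag(np.conj(psi1) * psi2)              # circular polarization
print("mean normalized S3:", float(np.mean(S3 / S0)))
```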
Abstract:
As users continually request additional functionality, software systems will continue to grow in complexity, as well as in their susceptibility to failures. Particularly for sensitive systems requiring higher levels of reliability, faulty system modules may increase development and maintenance costs. Hence, identifying them early would support the development of reliable systems through improved scheduling and quality control. Research effort to predict which software modules are likely to contain faults has, as a consequence, been substantial. Although a wide range of fault prediction models have been proposed, we remain far from having reliable tools that can be widely applied to real industrial systems. For projects with known fault histories, numerous research studies show that statistical models can provide reasonable estimates at predicting faulty modules using software metrics. However, as context-specific metrics differ from project to project, predicting across projects is difficult. Prediction models obtained from one project's experience are ineffective at predicting fault-prone modules when applied to other projects. Hence, taking full benefit of the existing work in the software development community has been substantially limited. As a step towards solving this problem, in this dissertation we propose a fault prediction approach that exploits existing prediction models, adapting them to improve their ability to predict faulty system modules across different software projects.
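A minimal sketch of cross-project prediction, assuming synthetic metric data for a source project A and a target project B and a simple per-project rescaling as the adaptation step (not the dissertation's actual approach), could look like this in Python:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import roc_auc_score

# Illustrative cross-project sketch only: train a fault-proneness classifier on
# synthetic metrics from "project A" and evaluate it on "project B" after
# rescaling each project's metrics separately (a simple adaptation step).
# The data and the adaptation step are assumptions, not the dissertation's method.
rng = np.random.default_rng(7)

def make_project(n, shift):
    X = rng.normal(loc=shift, scale=1.0, size=(n, 4))  # e.g. size, complexity, churn, coupling
    logits = 0.8 * X[:, 0] + 0.5 * X[:, 1] - 1.0
    y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)
    return X, y

X_a, y_a = make_project(400, shift=0.0)   # source project with known fault history
X_b, y_b = make_project(300, shift=2.0)   # target project with different metric scales

# Adaptation: standardize each project with its own statistics before reuse.
clf = LogisticRegression().fit(StandardScaler().fit_transform(X_a), y_a)
pred = clf.predict_proba(StandardScaler().fit_transform(X_b))[:, 1]
print("cross-project AUC:", round(roc_auc_score(y_b, pred), 3))
```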
Abstract:
Master's dissertation, Universidade de Brasília, Instituto de Artes, Programa de Pós-Graduação em Arte, 2016.
Abstract:
In the last decades, the air traffic system has been changing to adapt itself to new social demands, mainly the safe growth of worldwide traffic capacity. Those changes are ruled by the Communication, Navigation, Surveillance/Air Traffic Management (CNS/ATM) paradigm, based on digital communication technologies (mainly satellites) as a way of improving communication, surveillance, navigation and air traffic management services. However, CNS/ATM poses new challenges and needs, mainly related to the safety assessment process. In face of these new challenges, and considering the main characteristics of CNS/ATM, this work proposes a methodology that combines the "absolute" and "relative" safety assessment methods adopted by the International Civil Aviation Organization (ICAO) in ICAO Doc. 9689 [14], using Fluid Stochastic Petri Nets (FSPN) as the modeling formalism, and compares the safety metrics estimated from the simulation of both the proposed (under analysis) and the legacy system models. To demonstrate its usefulness, the proposed methodology was applied to the "Automatic Dependent Surveillance-Broadcast" (ADS-B) based air traffic control system. In conclusion, the proposed methodology was able to assess CNS/ATM system safety properties; the FSPN formalism provides important modeling capabilities, and discrete-event simulation allows the estimation of the desired safety metric. (C) 2011 Elsevier Ltd. All rights reserved.
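To illustrate the kind of quantity such a simulation estimates, the following Python sketch uses plain Monte Carlo, rather than FSPNs, to estimate a toy safety-related metric (the probability that no valid surveillance report is delivered within a required update interval) for an ADS-B-like chain. All rates and probabilities are invented; this is not the paper's model.

```python
import random

# Illustrative Monte Carlo sketch only: estimate a simple safety-related metric
# (probability that no valid surveillance report arrives within a required
# update interval) for a toy ADS-B-like chain. Failure rates, update rates and
# the metric itself are assumptions; the paper models this with FSPNs instead.
random.seed(0)

REQUIRED_INTERVAL_S = 10.0     # a report must arrive within this window
BROADCAST_PERIOD_S = 1.0       # nominal ADS-B broadcast period
P_MESSAGE_LOST = 0.05          # per-message loss probability (assumption)
P_GROUND_STATION_DOWN = 0.001  # probability the station is down in a window

def window_missed():
    """One update window: True if no report gets through."""
    if random.random() < P_GROUND_STATION_DOWN:
        return True
    n_broadcasts = int(REQUIRED_INTERVAL_S / BROADCAST_PERIOD_S)
    return all(random.random() < P_MESSAGE_LOST for _ in range(n_broadcasts))

trials = 1_000_000
misses = sum(window_missed() for _ in range(trials))
print(f"estimated P(missed update window) ~ {misses / trials:.2e}")
```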
Abstract:
This paper presents a reliability-based analysis for calculating critical tool life in machining processes. It is possible to determine the running time for each tool involved in the process by obtaining the sequence of operations for the machining procedure. Usually, the reliability of an operation depends on three independent factors: the operator, the machine tool and the cutting tool. The reliability of a part manufacturing process is mainly determined by the cutting time of each job and by the sequence of operations, defined by the series configuration. An algorithm is presented to define when the cutting tool must be changed. The proposed algorithm is used to evaluate the reliability of a manufacturing process composed of turning and drilling operations. The reliability of the turning operation is modeled based on data presented in the literature; from experimental results, a statistical distribution of drilling tool wear was defined and the reliability of the drilling process was modeled. (C) 2010 Elsevier Ltd. All rights reserved.
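A minimal sketch of such an algorithm, assuming Weibull tool-life models and a reliability threshold for tool replacement (both invented for illustration, not taken from the paper), is shown below; the process reliability of each part is the product of the operation reliabilities, reflecting the series configuration.

```python
import math

# Illustrative sketch only: series-system reliability for a sequence of
# machining operations with Weibull tool-life models, plus a simple rule that
# flags a cutting-tool change when the tool's cumulative reliability falls
# below a threshold. All parameters and the threshold are assumptions.
def weibull_reliability(t, beta, eta):
    """Probability the tool survives a cumulative cutting time t (minutes)."""
    return math.exp(-((t / eta) ** beta))

# (operation, cutting time per part in min, Weibull beta, Weibull eta)
operations = [("turning", 2.5, 2.0, 90.0),
              ("drilling", 1.0, 1.8, 60.0)]
R_MIN = 0.90                 # change the tool before reliability drops below this
cumulative = {name: 0.0 for name, *_ in operations}

for part in range(1, 31):                      # simulate a batch of 30 parts
    process_reliability = 1.0
    for name, t_cut, beta, eta in operations:  # series configuration
        cumulative[name] += t_cut
        r = weibull_reliability(cumulative[name], beta, eta)
        if r < R_MIN:
            print(f"part {part}: change {name} tool (R={r:.3f})")
            cumulative[name] = t_cut           # a fresh tool finishes this part
            r = weibull_reliability(cumulative[name], beta, eta)
        process_reliability *= r
print(f"last part's process reliability: {process_reliability:.3f}")
```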
Abstract:
The brain is a complex system that, in the normal condition, has emergent properties like those associated with activity-dependent plasticity in learning and memory, and, in pathological situations, manifests abnormal long-term phenomena like the epilepsies. Data from our laboratory and from the literature were classified qualitatively as sources of complexity and emergent properties from the behavioral to the electrophysiological, cellular, molecular, and computational levels. We used such models as brainstem-dependent acute audiogenic seizures and forebrain-dependent kindled audiogenic seizures. Additionally, we used chemical or electrical experimental models of temporal lobe epilepsy that induce status epilepticus with behavioral, anatomical, and molecular sequelae such as spontaneous recurrent seizures and long-term plastic changes. Current computational neuroscience tools will help the interpretation, storage, and sharing of the exponential growth of information derived from those studies. These strategies are considered solutions to deal with the complexity of brain pathologies such as the epilepsies. (C) 2008 Elsevier Inc. All rights reserved.
Abstract:
Central chemoreception, the detection of CO2/H+ within the brain and the resultant effect on ventilation, was initially localized at two areas on the ventrolateral medulla, one rostral (rVLM, Mitchell's area), the other caudal (cVLM, Loeschcke's area), by surface application of acidic solutions in anesthetized animals. Focal dialysis of a high-CO2/H+ artificial cerebrospinal fluid (aCSF) that produced a milder local pH change in unanesthetized rats (like that with an approximately 6.6 mm Hg increase in arterial PCO2) delineated putative chemoreceptor regions for the rVLM at the retrotrapezoid nucleus and the rostral medullary raphe, which function predominantly in wakefulness and sleep, respectively. Here we ask whether chemoreception in the cVLM can be detected by mild focal stimulation and whether it functions in a state-dependent manner. At responsive sites just beneath Loeschcke's area, ventilation was increased by, on average, 17% (P < 0.01) only in wakefulness. These data support our hypothesis that central chemoreception is a distributed property, with some sites functioning in a state-dependent manner. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
Many business-oriented software applications are subject to frequent changes in requirements. This paper shows that, ceteris paribus, increases in the volatility of system requirements decrease the reliability of software. Further, systems that exhibit high volatility during the development phase are likely to have lower reliability during their operational phase. In addition to the typically higher volatility of requirements, end-users who specify the requirements of business-oriented systems are usually less technically oriented than people who specify the requirements of compilers, radar tracking systems or medical equipment. Hence, the characteristics of software reliability problems for business-oriented systems are likely to differ significantly from those of more technically oriented systems.