60 results for Distributed computer-controlled systems
in Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)
Abstract:
In this paper we study the approximate controllability of control systems with states and controls in Hilbert spaces, and described by a second-order semilinear abstract functional differential equation with infinite delay. Initially we establish a characterization for the approximate controllability of a second-order abstract linear system and, in the last section, we compare the approximate controllability of a semilinear abstract functional system with the approximate controllability of the associated linear system.
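For orientation, the class of systems studied typically takes the following form (a hedged sketch of the standard setting; the paper's exact hypotheses on the operators and the phase space are not reproduced here):

```latex
% Second-order semilinear abstract functional differential equation
% with infinite delay and control (standard form; hypotheses omitted)
x''(t) = A\,x(t) + B\,u(t) + f(t, x_t), \qquad t \in [0, b],
\qquad x_0 = \varphi \in \mathcal{B}, \quad x'(0) = \xi,
```

where, in the usual setting, A generates a strongly continuous cosine family on the state space X, B maps the control space into X, x_t(θ) = x(t+θ) for θ ≤ 0 is the history segment in the phase space B, and approximate controllability means the set of states reachable at time b is dense in X.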
Abstract:
Background: Various neuroimaging studies, both structural and functional, have provided support for the proposal that a distributed brain network is likely to be the neural basis of intelligence. The theory of Distributed Intelligent Processing Systems (DIPS), first developed in the field of Artificial Intelligence, was proposed to adequately model distributed neural intelligent processing. In addition, the neural efficiency hypothesis suggests that individuals with higher intelligence display more focused cortical activation during cognitive performance, resulting in lower total brain activation when compared with individuals who have lower intelligence. This may be understood as a property of the DIPS. Methodology and Principal Findings: In our study, a new EEG brain mapping technique, based on the neural efficiency hypothesis and the notion of the brain as a Distributed Intelligent Processing System, was used to investigate the correlations between IQ, evaluated with the WAIS (Wechsler Adult Intelligence Scale) and the WISC (Wechsler Intelligence Scale for Children), and the brain activity associated with visual and verbal processing, in order to test the validity of a distributed neural basis for intelligence. Conclusion: The present results support these claims and the neural efficiency hypothesis.
Abstract:
This paper applies the concepts and methods of complex networks to the development of models and simulations of master-slave distributed real-time systems, introducing an upper bound on the allowable delivery time of the packets carrying computation results. Two representative interconnection models are taken into account: uniformly random and scale-free (Barabási-Albert), including the presence of background packet traffic. The obtained results include the identification of the uniformly random interconnectivity scheme as substantially more efficient than its scale-free counterpart. Also, increased latency tolerance of the application provides no help under congestion.
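A minimal sketch of this kind of topology comparison, using networkx (the graph sizes, the placement of the master at the highest-degree node, and the use of average master-to-slave distance as an efficiency proxy are illustrative assumptions, not the paper's exact simulation):

```python
# Compare uniformly random vs. scale-free (Barabasi-Albert) interconnections
# for a master-slave system, using the average shortest-path distance from
# the master to the slaves as a rough proxy for result-packet delivery time.
import networkx as nx

N, m = 1000, 3                    # illustrative sizes: N nodes, BA attachment m
p = 2 * m / (N - 1)               # edge probability matching the BA average degree

random_net = nx.gnp_random_graph(N, p, seed=1)
scale_free = nx.barabasi_albert_graph(N, m, seed=1)

for name, g in [("uniformly random", random_net), ("scale-free (BA)", scale_free)]:
    master = max(g.nodes, key=g.degree)   # place the master at the best-connected node
    dist = nx.single_source_shortest_path_length(g, master)
    avg = sum(dist.values()) / len(dist)  # averaged over the master's component
    print(f"{name}: average master-slave distance = {avg:.2f}")
```

One plausible reading of the reported result is that the hubs a scale-free topology relies on concentrate traffic and turn into bottlenecks once background load is added.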
Abstract:
Tooth shade results from the interaction between enamel color, enamel translucency and dentine color. A change in any of these parameters will change a tooth's color. The objective of this study was to evaluate the changes occurring in enamel translucency during a tooth whitening process. Fourteen human tooth enamel fragments, with a mean thickness of 0.96 mm (± 0.3 mm), were subjected to a bleaching agent (10% carbamide peroxide) 8 hours per day for 28 days. The enamel fragment translucency was measured by a computer-controlled spectrophotometer before and after the bleaching agent applications, in accordance with ANSI Z80.3-1986 (American National Standard for Ophthalmics: nonprescription sunglasses and fashion eyewear requirements). The measurements were statistically compared by the Mann-Whitney non-parametric test. A decrease was observed in the translucency of all specimens and, consequently, in the transmittance values of all samples. It was observed that the bleaching procedure significantly changes the enamel translucency, making it more opaque.
Abstract:
The aim of this work was to evaluate the performance of femtosecond laser-induced breakdown spectroscopy (fs-LIBS) for the determination of elements in animal tissues. Sample pellets were prepared from certified reference materials, such as liver, kidney, muscle, hepatopancreas, and oyster, after cryogenic grinding assisted homogenization. Individual samples were placed in a two-axis computer-controlled translation stage that moved in the plane orthogonal to a beam originating from a Ti:Sapphire chirped-pulse amplification (CPA) laser system operating at 800 nm and producing a train of 840 µJ, 40 fs pulses at 90 Hz. The plasma emission was coupled into the optical fiber of a high-resolution intensified charge-coupled device (ICCD)-echelle spectrometer. Time-resolved characteristics of the laser-produced plasmas showed that the best results were obtained with delay times between 80 and 120 ns. The data obtained indicate both that the sampling process is matrix-independent and that fs-LIBS can be used for the determination of Ca, Cu, Fe, K, Mg, Na, and P, but efforts must be made to obtain more appropriate detection limits for Al, Sr, and Zn.
Abstract:
A multi-pumping flow system exploiting prior assay is proposed for the sequential turbidimetric determination of sulphate and chloride in natural waters. Both methods are implemented in the same manifold, which provides facilities for: in-line sample clean-up with a Bio-Rex 70 mini-column with fluidized beads; addition of low amounts of sulphate or chloride ions to the reaction medium to improve supersaturation; analyte precipitation with Ba(2+) or Ag(+); and real-time decision on the need for the next assay. The sample is initially run for chloride determination, and the analytical signal is compared with a preset value. If higher, the sample is run again, now for sulphate determination. The strategy may lead to an increased sample throughput. The proposed system is computer-controlled and presents enhanced figures of merit. About 10 samples are run per hour (about 60 measurements), and results are reproducible and unaffected by the presence of potentially interfering ions at the concentration levels usually found in natural waters. Accuracy was assessed against ion chromatography.
Abstract:
Acoustic resonances are observed in high-pressure discharge lamps operated with AC input power modulated at frequencies in the kilohertz range. This paper describes an optical resonance detection method for high-intensity discharge lamps using computer-controlled cameras and image processing software. Experimental results showing acoustic resonances in high-pressure sodium lamps are presented.
Abstract:
Mutation testing has been used to assess the quality of test case suites by analyzing their ability to distinguish the artifact under testing from a set of alternative artifacts, the so-called mutants. The mutants are generated from the artifact under testing by applying a set of mutant operators, which produce artifacts with simple syntactical differences. The mutant operators are usually based on typical errors that occur during software development and can be related to a fault model. In this paper, we propose a language, named MuDeL (MUtant DEfinition Language), for the definition of mutant operators, aiming not only at automating mutant generation, but also at providing precision and formality to the operator definitions. The proposed language is based on concepts from the transformational and logical programming paradigms, as well as from context-free grammar theory. The denotational semantics formal framework is employed to define the semantics of the MuDeL language. We also describe a system, named mudelgen, developed to support the use of this language. An executable representation of the denotational semantics of the language is used to check the correctness of the mudelgen implementation. At the very end, a mutant generator module is produced, which can be incorporated into a specific mutation tool/environment.
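MuDeL's own syntax is not reproduced here; as a hedged illustration of what a mutant operator does, the sketch below applies a classic arithmetic-operator-replacement operator to a source fragment, producing one mutant per syntactic occurrence (the fragment and the operator set are illustrative):

```python
# Illustrative mutant generation: an "arithmetic operator replacement" mutant
# operator applied to a source fragment, yielding one mutant per occurrence.
import re

SOURCE = "total = price * quantity + shipping"
REPLACEMENTS = {"+": "-", "*": "/"}   # a tiny, illustrative operator set

def generate_mutants(source: str) -> list[str]:
    mutants = []
    for match in re.finditer(r"[+*]", source):
        swapped = REPLACEMENTS[match.group()]
        mutants.append(source[:match.start()] + swapped + source[match.end():])
    return mutants

for i, mutant in enumerate(generate_mutants(SOURCE), 1):
    print(f"mutant {i}: {mutant}")
```

Each mutant differs from the original by one simple syntactic change; a test suite is then judged by how many such mutants it manages to distinguish ("kill").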
Abstract:
Distributed control systems consist of sensors, actuators and controllers interconnected by communication networks, and are characterized by a high number of concurrent processes. This work presents a procedure to model and analyze communication networks for distributed control systems in intelligent buildings. The approach is based on characterizing the control system as a discrete event system and applying coloured Petri nets as a formal method for the specification, analysis and verification of control solutions. With this approach, we develop the models that compose the communication networks for the control systems of intelligent buildings, taking into account the relationships between the various building systems. This procedure provides a structured development of models, facilitating the process of specifying the control algorithm. An application example is presented in order to illustrate the main features of this approach.
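A minimal sketch of the discrete-event view underlying this approach: a plain place/transition Petri net with the standard firing rule (coloured Petri nets add typed tokens on top of this). The sensor-controller-actuator loop modeled below is an illustrative assumption, not a model from the paper:

```python
# Minimal place/transition Petri net sketching a sensor -> network ->
# controller -> actuator loop as a discrete event system.
marking = {"sensor_ready": 1, "msg_on_net": 0, "ctrl_done": 0, "actuated": 0}

# transition name -> (input places consumed, output places produced)
transitions = {
    "send_measurement": ({"sensor_ready"}, {"msg_on_net"}),
    "compute_control":  ({"msg_on_net"},   {"ctrl_done"}),
    "actuate":          ({"ctrl_done"},    {"actuated"}),
}

def enabled(name):
    pre, _ = transitions[name]
    return all(marking[p] >= 1 for p in pre)

def fire(name):
    pre, post = transitions[name]
    for p in pre:
        marking[p] -= 1
    for p in post:
        marking[p] += 1

for t in ["send_measurement", "compute_control", "actuate"]:
    assert enabled(t), f"{t} is not enabled"
    fire(t)
    print(f"fired {t}: {marking}")
```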
Abstract:
Nowadays, digital computer systems and networks are the main engineering tools, being used in the planning, design, operation, and control of all sizes of building, transportation, machinery, business, and life-maintaining devices. Consequently, computer viruses have become one of the most important sources of uncertainty, contributing to decrease the reliability of vital activities. Many antivirus programs have been developed, but they are limited to detecting and removing infections based on previous knowledge of the virus code. In spite of having good adaptation capability, these programs work just as vaccines against diseases and are not able to prevent new infections based on the network state. Here, computer virus propagation dynamics is modeled and related to other notable events occurring in the network, permitting preventive policies to be established for network management. Data from three different viruses were collected on the Internet, and two different identification techniques, autoregressive and Fourier analyses, were applied, showing that it is possible to forecast the dynamics of a new virus propagation by using data collected from other viruses that formerly infected the network.
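A hedged sketch of the autoregressive half of that idea: fit AR coefficients by least squares to one virus's infection time series and reuse them to forecast another (the synthetic series, the AR order, and all names are illustrative assumptions):

```python
# Fit an AR(p) model to one virus's propagation series by least squares,
# then forecast another series with the same coefficients, mimicking the
# idea that past epidemics help predict a new virus's dynamics.
import numpy as np

def fit_ar(series, p):
    # rows of X are length-p lagged windows; solve X a ~= y by least squares
    X = np.column_stack([series[i:len(series) - p + i] for i in range(p)])
    y = series[p:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def forecast(history, coeffs, steps):
    h = list(history)
    for _ in range(steps):
        h.append(float(np.dot(coeffs, h[-len(coeffs):])))
    return h[len(history):]

rng = np.random.default_rng(0)
t = np.arange(200)
old_virus = 1000 / (1 + np.exp(-(t - 100) / 12)) + rng.normal(0, 5, t.size)
new_virus = 800 / (1 + np.exp(-(t - 90) / 10))      # logistic-like growth curves

coeffs = fit_ar(old_virus, p=5)                     # learn from the old epidemic
print(forecast(new_virus[:120], coeffs, steps=5))   # predict the new one
```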
Abstract:
Wireless Sensor Networks (WSNs) have a vast field of applications, including deployment in hostile environments. Thus, the adoption of security mechanisms is fundamental. However, the extremely constrained nature of sensors and the potentially dynamic behavior of WSNs hinder the use of key management mechanisms commonly applied in modern networks. For this reason, many lightweight key management solutions have been proposed to overcome these constraints. In this paper, we review the state of the art of these solutions and evaluate them based on metrics adequate for WSNs. We focus on pre-distribution schemes well-adapted for homogeneous networks (since this is a more general network organization), thus identifying generic features that can improve some of these metrics. We also discuss some challenges in the area and future research directions.
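As a hedged illustration of the pre-distribution idea such surveys cover, the sketch below implements a random key pre-distribution scheme in the style of Eschenauer and Gligor (the pool, ring, and network sizes are illustrative): each node receives a random key ring drawn from a common pool, and two neighbors can secure their link if their rings intersect.

```python
# Random key pre-distribution (Eschenauer-Gligor style): each sensor gets a
# random key ring from a shared pool; nodes sharing at least one key can
# establish a secure link. We estimate the probability of sharing a key.
import random

POOL_SIZE, RING_SIZE, NODES = 10_000, 100, 500   # illustrative parameters
random.seed(42)

rings = [frozenset(random.sample(range(POOL_SIZE), RING_SIZE)) for _ in range(NODES)]

pairs = linked = 0
for i in range(NODES):
    for j in range(i + 1, NODES):
        pairs += 1
        linked += bool(rings[i] & rings[j])      # nonempty intersection => link

print(f"fraction of node pairs sharing a key: {linked / pairs:.3f}")
# Theory: P(share) ~= 1 - (1 - RING_SIZE/POOL_SIZE)**RING_SIZE ~= 0.63 here
```

The appeal for constrained sensors is that only RING_SIZE keys are stored per node and no online key server is needed, traded against probabilistic rather than guaranteed connectivity.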
Abstract:
The TCP/IP architecture has been consolidated as the standard for distributed systems. However, there is considerable research and discussion about alternatives for the evolution of this architecture, and in this area this work presents the Title Model, which aims to support application needs through cross-layer ontology use and horizontal addressing in a next-generation Internet. From a practical viewpoint, the network cost reduction is shown for a distributed programming example in networks with layer 2 connectivity. To demonstrate the enhancement provided by the Title Model, a network analysis is presented for a message passing interface application that sends a vector of integers and returns its sum. This analysis confirms that the proposal allows, in this environment, a reduction of 15.23% in the total network traffic, in bytes.
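The benchmark described (sending a vector of integers and returning its sum) can be sketched with mpi4py as below; this is a generic MPI reduction for context, not the Title Model's addressing scheme (the vector length and file name are illustrative):

```python
# Generic MPI version of the benchmark: scatter a vector of integers across
# processes and reduce the partial sums back to rank 0.
# Run with, e.g.: mpiexec -n 4 python sum_vector.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

N = 1_000_000                                 # illustrative vector length
vector = np.arange(N, dtype=np.int64) if rank == 0 else None

chunk = np.empty(N // size, dtype=np.int64)   # assumes size divides N evenly
comm.Scatter(vector, chunk, root=0)           # distribute the vector

total = comm.reduce(chunk.sum(), op=MPI.SUM, root=0)
if rank == 0:
    print(f"sum = {total} (expected {N * (N - 1) // 2})")
```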
Abstract:
In this paper the continuous Verhulst dynamic model is used to synthesize a new distributed power control algorithm (DPCA) for use in direct sequence code division multiple access (DS-CDMA) systems. The Verhulst model was originally designed to describe the population growth of biological species under food and physical space restrictions. The discretization of the corresponding differential equation is accomplished via the Euler numeric integration (ENI) method. Analytical convergence conditions for the proposed DPCA are also established. Several properties of the proposed recursive algorithm, such as the Euclidean distance from the optimum vector after convergence, convergence speed, normalized mean squared error (NSE), average power consumption per user, performance under dynamic channels, and implementation complexity aspects, are analyzed through simulations. The simulation results are compared with two other DPCAs: the classic algorithm derived by Foschini and Miljanic and the sigmoidal algorithm of Uykan and Koivo. Under estimation error conditions, the proposed DPCA exhibits smaller discrepancy from the optimum power vector solution and better convergence (under both fixed and adaptive convergence factors) than the classic and sigmoidal DPCAs.
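A hedged sketch of the core idea: the Verhulst (logistic) equation, Euler-discretized, becomes a distributed power update whose fixed point is reached exactly when each user's measured SINR equals its target (the update form, gains, and parameters below are illustrative assumptions, not the paper's exact algorithm):

```python
# Verhulst-inspired distributed power control: a logistic, Euler-discretized
# update per user, with its fixed point where measured SINR == target SINR.
import numpy as np

G = np.array([[1.00, 0.10],      # illustrative link gains G[i, j]:
              [0.05, 1.00]])     # gain from transmitter j to receiver i
noise = 1e-2                     # receiver noise power
target = 5.0                     # common target SINR
alpha = 0.5                      # Euler step / convergence factor

def sinr(p):
    interference = G @ p - np.diag(G) * p + noise
    return np.diag(G) * p / interference

p = np.array([1.0, 1.0])         # initial power vector
for _ in range(60):
    # logistic-style update: p is a fixed point exactly when sinr(p) == target
    p = (1 + alpha) * p - alpha * (sinr(p) / target) * p

print("powers:", p, "SINRs:", sinr(p))
```

At the fixed point the update multiplies p by (1 + alpha) - alpha = 1, so any power vector achieving the target SINR is stationary, mirroring the role of the carrying capacity in the original population model.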
Abstract:
Background: Heart failure and diabetes often occur simultaneously in patients, but the prognostic value of glycemia in chronic heart failure is debatable. We evaluated the role of glycemia in the prognosis of heart failure. Methods: Outpatients with chronic heart failure from the Long-term Prospective Randomized Controlled Study Using Repetitive Education at Six-Month Intervals and Monitoring for Adherence in Heart Failure Outpatients (REMADHE) trial were grouped according to the presence of diabetes and level of glycemia. All-cause mortality/heart transplantation and unplanned hospital admission were evaluated. Results: Four hundred fifty-six patients were included (135 [29.5%] female, 124 [27.2%] with diabetes mellitus, age of ~50.2 ± 11.4 years, and left ventricular ejection fraction of 34.7% ± 10.5%). During follow-up (3.6 ± 2.2 years), 27 (5.9%) patients underwent heart transplantation and 202 (44.2%) died; survival was similar in patients with and without diabetes mellitus. When patients with and without diabetes were categorized according to glucose range (glycemia ≤ 100 mg/dL [5.5 mmol/L]), as well as when distributed in quintiles of glucose, survival was significantly worse among patients with lower levels of glycemia. This finding persisted in a Cox proportional hazards regression model that included gender, etiology, left ventricular ejection fraction, left ventricular diastolic diameter, creatinine level, beta-blocker therapy, and functional status (hazard ratio 1.45, 95% CI 1.09-1.69, P = .039). No difference regarding unplanned hospital admission was found. Conclusion: We report an inverse association between glycemia and mortality in outpatients with chronic heart failure. These results point to a new pathophysiologic understanding of the interactions between diabetes mellitus, hyperglycemia, and heart disease. (Am Heart J 2010; 159: 90-7.)
Abstract:
Purpose: The objective of this study was to evaluate the blood glucose (BG) control efficacy and safety of 3 insulin protocols in medical intensive care unit (MICU) patients. Methods: This was a multicenter randomized controlled trial involving 167 MICU patients with at least one BG measurement ≥ 150 mg/dL and one or more of the following: mechanical ventilation, systemic inflammatory response syndrome, trauma, or burns. The interventions were: a computer-assisted insulin protocol (CAIP), with insulin infusion maintaining BG between 100 and 130 mg/dL; the Leuven protocol, with insulin maintaining BG between 80 and 110 mg/dL; or conventional treatment (subcutaneous insulin if glucose > 150 mg/dL). The main efficacy outcome was the mean of patients' median BG, and the safety outcome was the incidence of hypoglycemia (≤ 40 mg/dL). Results: The mean of patients' median BG was 125.0, 127.1, and 158.5 mg/dL for CAIP, Leuven, and conventional treatment, respectively (P = .34, CAIP vs Leuven; P < .001, CAIP vs conventional). In CAIP, 12 patients (21.4%) had at least one episode of hypoglycemia vs 24 (41.4%) in Leuven and 2 (3.8%) in conventional treatment (P = .02, CAIP vs Leuven; P = .006, CAIP vs conventional). Conclusions: The CAIP is safer than, and as effective as, the standard strict protocol for controlling glucose in MICU patients. Hypoglycemia was rare under conventional treatment; however, BG levels were higher than with the IV insulin protocols.