65 results for 005 Computer programming, programs
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)
Abstract:
Architectures based on Coordinated Atomic action (CA action) concepts have been used to build concurrent fault-tolerant systems. This conceptual model combines concurrent exception handling with action nesting to provide a general mechanism both for enclosing interactions among system components and for coordinating forward error recovery measures. This article presents an architectural model to guide the formal specification of concurrent fault-tolerant systems. The architecture provides built-in processes, specified in CSP (Communicating Sequential Processes), and predefined channels to coordinate exception handling of the user-defined components. Hence, safety properties concerning action scoping and concurrent exception handling can be proved using the FDR (Failures-Divergences Refinement) verification tool. As a result, a formal and general architecture supporting software fault tolerance is ready to be used and proved once users define components with normal and exceptional behaviors.
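As an illustration only (not the paper's CSP specification), the sketch below models the flavor of a CA action in Python: concurrent participants run inside a common scope, and if any participant raises an exception, all of them converge on a coordinated handler. The names `CAAction`, `deposit`, and `withdraw` are hypothetical.

```python
# Hypothetical sketch: a CA-action-like scope in which concurrent participants
# either all complete normally or all switch to a coordinated handler phase
# (forward error recovery).
import threading

class CAAction:
    def __init__(self):
        self._lock = threading.Lock()
        self.raised = None            # first exception signalled by a participant

    def signal(self, exc):
        with self._lock:
            if self.raised is None:
                self.raised = exc     # concurrent exceptions: keep the first one

    def _wrap(self, role):
        def body():
            try:
                role()
            except Exception as exc:
                self.signal(exc)
        return body

    def run(self, roles, handlers):
        threads = [threading.Thread(target=self._wrap(r)) for r in roles]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        if self.raised is not None:   # all participants enter the handler phase
            for h in handlers:
                h(self.raised)

def deposit():  print("deposit ok")
def withdraw(): raise RuntimeError("insufficient funds")

action = CAAction()
action.run([deposit, withdraw],
           [lambda e: print("coordinated recovery for:", e)])
```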
Abstract:
This paper proposes a simple high-level programming language endowed with resources that help encode self-modifying programs. To this end, a conventional imperative language syntax (not explicitly stated in this paper) is augmented with special commands and statements forming an adaptive layer, designed with a focus on the dynamic changes to be applied to the code at run time. The resulting language allows programmers to easily specify dynamic changes to their own program's code, and thus to describe the dynamic logic of their adaptive applications with little effort. This paper describes the most important aspects of the design and implementation of such a language; a small example is presented for illustration purposes.
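Since the paper does not state its syntax, here is a minimal Python sketch of the underlying idea: a program that rewrites one of its own functions at run time, playing the role of the adaptive layer. The `source_v1`/`source_v2` snippets are invented placeholders.

```python
# Hypothetical sketch: emulating an "adaptive action" that replaces one of the
# program's own functions while the program is running.
source_v1 = "def greet():\n    return 'hello'\n"
source_v2 = "def greet():\n    return 'hello, adapted world'\n"

namespace = {}
exec(source_v1, namespace)      # install the initial code
print(namespace['greet']())     # -> hello

# adaptive layer: rewrite the function body at run time
exec(source_v2, namespace)
print(namespace['greet']())     # -> hello, adapted world
```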
Abstract:
Aspect-oriented programming (AOP) is a promising technology that supports separation of crosscutting concerns (i.e., functionality that tends to be tangled with, and scattered through, the rest of the system). In AOP, a method-like construct named advice is applied to join points in the system through a special construct named pointcut. This mechanism supports the modularization of crosscutting behavior; however, since the added interactions are not explicit in the source code, it is hard to ensure their correctness. To tackle this problem, this paper presents a rigorous coverage analysis approach to ensure exercising the logic of each advice - statements, branches, and def-use pairs - at each affected join point. To make this analysis possible, a structural model based on Java bytecode - called PointCut-based Def-Use Graph (PCDU) - is proposed, along with three integration testing criteria. Theoretical, empirical, and exploratory studies involving 12 aspect-oriented programs and several fault examples present evidence of the feasibility and effectiveness of the proposed approach.
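As a loose, hypothetical analogy (PCDU itself works on Java bytecode, not Python), the sketch below applies one advice at two join points and records which advice branch each test exercises per join point, the kind of per-join-point coverage information the proposed criteria require.

```python
# Hypothetical sketch: a toy pointcut/advice mechanism with per-join-point
# branch coverage bookkeeping.
covered = set()

def logging_advice(join_point, fn):
    def wrapper(*args, **kwargs):
        if args:                                  # branch 1 of the advice
            covered.add((join_point, "args-branch"))
        else:                                     # branch 2 of the advice
            covered.add((join_point, "no-args-branch"))
        return fn(*args, **kwargs)
    return wrapper

def transfer(amount): return amount
def audit():          return "ok"

# "pointcut": the same advice woven at two join points
transfer = logging_advice("transfer", transfer)
audit    = logging_advice("audit", audit)

transfer(10); audit()
print(covered)  # which (join point, advice branch) pairs the tests exercised
```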
Abstract:
Nowadays, digital computer systems and networks are the main engineering tools, used in the planning, design, operation, and control of buildings, transportation systems, machinery, businesses, and life-support devices of all sizes. Consequently, computer viruses have become one of the most important sources of uncertainty, reducing the reliability of vital activities. Many antivirus programs have been developed, but they are limited to detecting and removing infections based on prior knowledge of the virus code. Despite their good adaptation capability, these programs work only as vaccines against known diseases and are not able to prevent new infections based on the network state. Here, computer virus propagation dynamics are modeled and related to other notable events occurring in the network, permitting preventive policies to be established in network management. Data from three different viruses were collected on the Internet, and two identification techniques, autoregressive and Fourier analyses, were applied, showing that it is possible to forecast the dynamics of a new virus propagation by using data collected from other viruses that formerly infected the network.
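A minimal sketch of the autoregressive part of the idea, with invented data: fit AR coefficients by least squares to one virus's infection time series and use them to forecast the next step.

```python
# Hypothetical sketch (assumed data, assumed AR order): least-squares fit of an
# autoregressive model to a virus-propagation time series, then a one-step forecast.
import numpy as np

series = np.array([1., 3., 7., 12., 20., 31., 45., 60., 74., 85.])  # infections/day
p = 2  # AR order (an assumption)

# lagged design matrix: x[t] ~ a1*x[t-1] + a2*x[t-2]
X = np.column_stack([series[p - k - 1:len(series) - k - 1] for k in range(p)])
y = series[p:]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# one-step-ahead forecast from the last p observations
forecast = coef @ series[-1:-p - 1:-1]
print("AR coefficients:", coef, "next-step forecast:", forecast)
```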
Abstract:
The TCP/IP architecture has been consolidated as the standard for distributed systems. However, there is considerable research and discussion about alternatives for the evolution of this architecture. In this area, this work presents the Title Model, which contributes to supporting application needs through cross-layer ontology use and horizontal addressing in a next-generation Internet. From a practical viewpoint, the network cost reduction is shown for a distributed programming example in networks with layer 2 connectivity. To demonstrate the Title Model's improvement, a network analysis is presented for a message passing interface example that sends a vector of integers and returns its sum. This analysis confirms that, in this environment, the current proposal yields a reduction of 15.23% in total network traffic, in bytes.
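The benchmarked operation can be sketched with mpi4py (an assumption; the paper does not name its MPI implementation): rank 0 scatters a vector of integers and the partial sums are reduced back to rank 0.

```python
# Hypothetical sketch of the benchmarked operation: scatter a vector of
# integers, sum the chunks locally, reduce the partial sums to rank 0.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# assumes size divides 100 evenly (illustration only)
vector = np.arange(1, 101, dtype='i') if rank == 0 else None
chunk = np.empty(100 // size, dtype='i')

comm.Scatter(vector, chunk, root=0)             # distribute the vector
partial = np.array(chunk.sum(), dtype='i')
total = np.array(0, dtype='i')
comm.Reduce(partial, total, op=MPI.SUM, root=0)  # collect the partial sums

if rank == 0:
    print("sum =", int(total))                  # 5050
# run with e.g.: mpirun -n 4 python sum_vector.py
```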
Abstract:
Since computer viruses pose a serious problem to individual and corporate computer systems, a lot of effort has been dedicated to studying how to avoid their deleterious actions, trying to create antivirus programs that act as vaccines in personal computers or in strategic network nodes. Another way to combat virus propagation is to establish preventive policies based on the whole operation of a system, which can be modeled with population models similar to those used in epidemiological studies. Here, a modified version of the SIR (Susceptible-Infected-Removed) model is presented, and it is explained how its parameters relate to network characteristics. Then, disease-free and endemic equilibrium points are calculated, stability and bifurcation conditions are derived, and some numerical simulations are shown. The relations among the model parameters in the several bifurcation conditions allow a network design that minimizes virus risks.
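A minimal numerical sketch of a basic SIR model (the paper's modified version and its parameter values are not reproduced here; the rates below are invented):

```python
# Hypothetical sketch: integrating a basic SIR model of virus propagation.
import numpy as np
from scipy.integrate import odeint

def sir(y, t, beta, gamma):
    S, I, R = y
    dS = -beta * S * I               # susceptible machines becoming infected
    dI = beta * S * I - gamma * I    # net change in infected machines
    dR = gamma * I                   # machines disinfected/removed
    return [dS, dI, dR]

beta, gamma = 0.3, 0.1               # infection and removal rates (assumptions)
y0 = [0.99, 0.01, 0.0]               # initial fractions of the network
t = np.linspace(0, 100, 1000)

S, I, R = odeint(sir, y0, t, args=(beta, gamma)).T
print("peak infected fraction: %.3f" % I.max())
# basic reproduction number R0 = beta/gamma; an endemic regime needs R0 > 1
```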
Abstract:
Purpose: The objective of this study was to evaluate the blood glucose (BG) control efficacy and safety of 3 insulin protocols in medical intensive care unit (MICU) patients. Methods: This was a multicenter randomized controlled trial involving 167 MICU patients with at least one BG measurement >= 150 mg/dL and one or more of the following: mechanical ventilation, systemic inflammatory response syndrome, trauma, or burns. The interventions were the computer-assisted insulin protocol (CAIP), with insulin infusion maintaining BG between 100 and 130 mg/dL; the Leuven protocol, with insulin maintaining BG between 80 and 110 mg/dL; or conventional treatment, with subcutaneous insulin if glucose > 150 mg/dL. The main efficacy outcome was the mean of patients' median BG, and the safety outcome was the incidence of hypoglycemia (<= 40 mg/dL). Results: The mean of patients' median BG was 125.0, 127.1, and 158.5 mg/dL for CAIP, Leuven, and conventional treatment, respectively (P = .34, CAIP vs Leuven; P < .001, CAIP vs conventional). In CAIP, 12 patients (21.4%) had at least one episode of hypoglycemia vs 24 (41.4%) in Leuven and 2 (3.8%) in conventional treatment (P = .02, CAIP vs Leuven; P = .006, CAIP vs conventional). Conclusions: The CAIP is safer than and as effective as the standard strict protocol for controlling glucose in MICU patients. Hypoglycemia was rare under conventional treatment; however, BG levels were higher than with IV insulin protocols.
Abstract:
Substance dependence is highly associated with executive cognitive function (ECF) impairments. However, considering that it is difficult to assess ECF clinically, the aim of the present study was to examine the feasibility of a brief neuropsychological tool (the Frontal Assessment Battery, FAB) for detecting specific ECF impairments in a sample of substance-dependent individuals (SDI). Sixty-two subjects participated in this study. Thirty DSM-IV-diagnosed SDI, after 2 weeks of abstinence, and 32 healthy individuals (control group) were evaluated with the FAB and other ECF-related tasks: digits forward (DF), digits backward (DB), the Stroop Color Word Test (SCWT), and the Wisconsin Card Sorting Test (WCST). SDI did not differ from the control group on sociodemographic variables or IQ. However, SDI performed below the controls in DF, DB, and the FAB. The SDI were cognitively impaired in 3 of the 6 cognitive domains assessed by the FAB: abstract reasoning, motor programming, and cognitive flexibility. The FAB correlated with DF, SCWT, and WCST. In addition, some neuropsychological measures were correlated with the amount of alcohol, cannabis, and cocaine use. In conclusion, SDI performed more poorly than the comparison group on the FAB, and the FAB's results were associated with other ECF-related tasks. The results suggest a negative impact of alcohol, cannabis, and cocaine use on ECF. The FAB may be useful in assisting professionals as an instrument to screen for ECF-related deficits in SDI.
Abstract:
Vector field formulation based on the Poisson theorem allows an automatic determination of rock physical properties (the magnetization-to-density ratio, MDR, and the magnetization inclination, MI) from combined processing of gravity and magnetic geophysical data. The basic assumptions (i.e., Poisson conditions) are that gravity and magnetic fields share common sources, and that these sources have a uniform magnetization direction and MDR. In addition, the previously existing formulation was restricted to profile data and assumed sufficiently elongated (2-D) sources. For sources that violate Poisson conditions or have a 3-D geometry, the apparent values of MDR and MI generated in this way have an unclear relationship to the actual properties in the subsurface. We present Fortran programs that estimate MDR and MI values for 3-D sources through processing of gridded gravity and magnetic data. Tests with simple geophysical models indicate that magnetization polarity can be successfully recovered by MDR-MI processing, even in cases where juxtaposed bodies cannot be clearly distinguished on the basis of anomaly data. These results may be useful in crustal studies, especially in mapping magnetization polarity from marine-based gravity and magnetic data.
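For reference, a standard statement of Poisson's relation under these conditions (the notation is ours, not the paper's): with magnetic potential V, gravity component g along the magnetization direction, magnetization intensity M, density rho, and gravitational constant gamma,

```latex
V \;=\; \frac{M}{\gamma\,\rho}\, g_{\hat{m}} \;=\; \frac{\mathrm{MDR}}{\gamma}\, g_{\hat{m}}
```

Roughly speaking, this is why MDR and MI can be estimated by linearly relating transformed magnetic grids to gravity grids wherever the Poisson conditions hold.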
Abstract:
This paper describes a new module of the expert system SISTEMAT used for the prediction of the skeletons of neolignans from 13C NMR, 1H NMR, and botanical data obtained from the literature. SISTEMAT is composed of the MACRONO, SISCONST, C13MACH, H1MACH, and SISOCBOT programs, each analyzing data of the neolignan in question to predict the carbon skeleton of the compound. From these results, the global probability is computed and the most probable skeleton is predicted. SISTEMAT predicted the skeletons of 75% of the 20 neolignans tested in a rapid and simple procedure, demonstrating its advantage for the structural elucidation of new compounds.
Abstract:
This paper reports an expert system (SISTEMAT) developed for structural determination of diverse chemical classes of natural products, including lignans, based mainly on 13C NMR and 1H NMR data of these compounds. The system is composed of five programs that analyze specific data of a lignan and show a skeleton probability for the compound. At the end of the analyses, the results are grouped, the global probability is computed, and the most probable skeleton is exhibited to the user. SISTEMAT properly predicted the skeletons of 80% of the 30 lignans tested, demonstrating its value for rapid structural elucidation.
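The combination step might look like the sketch below; the abstracts do not give SISTEMAT's actual formula, so the naive product-and-normalize rule and the skeleton labels are assumptions.

```python
# Hypothetical sketch: combining per-program skeleton probabilities into a
# normalized global probability and picking the most probable skeleton.
def global_probability(votes):
    # votes: one dict {skeleton: probability} per program
    scores = {}
    for v in votes:
        for skeleton, p in v.items():
            scores[skeleton] = scores.get(skeleton, 1.0) * p  # naive product rule
    total = sum(scores.values())
    return {s: p / total for s, p in scores.items()}

votes = [{"skeleton_A": 0.6, "skeleton_B": 0.4},   # e.g. one program's output
         {"skeleton_A": 0.7, "skeleton_B": 0.3}]   # e.g. another program's output
ranked = global_probability(votes)
print(max(ranked, key=ranked.get), ranked)
```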
Abstract:
We present a computer program developed for estimating penetrance rates in autosomal dominant diseases by means of family kinship and phenotype information contained within the pedigrees. The program also determines the exact 95% credibility interval for the penetrance estimate. Both executable (PenCalc for Windows) and web versions (PenCalcWeb) of the software are available. The web version enables further calculations, such as heterozygosity probabilities and assessment of offspring risks for all individuals in the pedigrees. Both programs can be accessed and downloaded freely at the home page http://www.ib.usp.br/~otto/software.htm.
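As a simplified illustration (not PenCalc's pedigree-based algorithm), a 95% credibility interval for a penetrance rate treated as a binomial proportion with a uniform prior can be computed as below; the counts are invented.

```python
# Hypothetical sketch: Bayesian credibility interval for penetrance as a
# binomial proportion with a Beta(1, 1) (uniform) prior.
from scipy.stats import beta

carriers = 40   # obligate heterozygous carriers in the pedigrees (assumed)
affected = 28   # carriers expressing the phenotype (assumed)

posterior = beta(affected + 1, carriers - affected + 1)  # Beta posterior
estimate = affected / carriers
lo, hi = posterior.ppf(0.025), posterior.ppf(0.975)
print("penetrance ~ %.2f, 95%% credibility interval (%.2f, %.2f)"
      % (estimate, lo, hi))
```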