952 results for Run-Time Code Generation, Programming Languages, Object-Oriented Programming


Relevance:

100.00%

Publisher:

Abstract:

Chemical process optimization and control are strongly correlated with the quantity of information that can be obtained from the system. In biotechnological processes, where the transforming agent is a cell, many variables can interfere with the process, leading to changes in the microorganism's metabolism and affecting the quantity and quality of the final product. Therefore, continuous monitoring of the variables that interfere with the bioprocess is crucial in order to act on certain variables of the system and keep it under desirable operational conditions and control. In general, during a fermentation process, the analysis of important parameters such as substrate, product and cell concentrations is done off-line, requiring sampling, pretreatment and analytical procedures. These steps demand significant run time and the use of high-purity chemical reagents. In order to implement a real-time monitoring system for a benchtop bioreactor, this study was conducted in two steps: (i) the development of software providing a communication interface between the bioreactor and a computer, based on data acquisition and the recording of process variables, namely pH, temperature, dissolved oxygen, level, foam level, agitation frequency and the input setpoints of the operational parameters of the bioreactor control unit; (ii) the development of an analytical method using near-infrared spectroscopy (NIRS) to enable the monitoring of substrate, product and cell concentrations during a fermentation process for ethanol production using the yeast Saccharomyces cerevisiae. Three fermentation runs (F1, F2 and F3) were conducted, monitored by NIRS with subsequent sampling for analytical characterization. The data obtained were used for calibration and validation, with pre-treatments, combined or not with smoothing filters, applied to the spectral data. The most satisfactory results were obtained when the calibration models were built from real samples of culture medium taken from the fermentation assays F1, F2 and F3, showing that the NIRS-based analytical method can be used as a fast and effective way to quantify cell, substrate and product concentrations, which enables the implementation of in situ real-time monitoring of fermentation processes.
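
As an illustration of the kind of spectral pre-treatment the abstract mentions, the sketch below applies a simple moving-average smoothing filter to an absorbance spectrum before calibration. It is a minimal sketch under stated assumptions: the window size, class name and sample values are invented for illustration, since the study does not specify which smoothing filter was used.

```java
// Minimal sketch of a moving-average smoothing filter for NIR spectra.
// The 5-point window and all sample values are assumptions for illustration.
public final class SpectrumSmoothing {

    /** Returns a smoothed copy of the spectrum using a centered moving average. */
    public static double[] movingAverage(double[] absorbance, int window) {
        if (window < 1 || window % 2 == 0) {
            throw new IllegalArgumentException("window must be a positive odd number");
        }
        int half = window / 2;
        double[] smoothed = new double[absorbance.length];
        for (int i = 0; i < absorbance.length; i++) {
            // Clamp the window at the edges of the spectrum.
            int from = Math.max(0, i - half);
            int to = Math.min(absorbance.length - 1, i + half);
            double sum = 0.0;
            for (int j = from; j <= to; j++) {
                sum += absorbance[j];
            }
            smoothed[i] = sum / (to - from + 1);
        }
        return smoothed;
    }

    public static void main(String[] args) {
        double[] spectrum = {0.12, 0.15, 0.11, 0.18, 0.16, 0.14, 0.19};
        System.out.println(java.util.Arrays.toString(movingAverage(spectrum, 5)));
    }
}
```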

Relevance:

100.00%

Publisher:

Abstract:

An enantioselective micromethod for the simultaneous analysis of verapamil (VER) and norverapamil (NOR) in plasma was developed, validated and applied to the study of the kinetic disposition of VER and NOR after the administration of a single oral dose of racemic VER to rats. VER, NOR and the internal standard (paroxetine) were extracted from only 100-μL plasma samples using n-hexane, and the enantiomers were resolved on a Chiralpak AD column using n-hexane:isopropanol:ethanol:diethylamine (88:6:6:0.1) as the mobile phase. The analyses were performed in the selected reaction monitoring mode. The transitions 456 > 166 for the VER enantiomers, 441 > 166 for the NOR enantiomers and 330 > 193 for the internal standard were monitored, and the method had a total chromatographic run time of 12 min. The method allows the determination of VER and NOR enantiomers at plasma levels as low as 1.0 ng/mL. Racemic VER hydrochloride (10 mg/kg) was given to male Wistar rats by gavage and blood samples were collected from 0 to 6.0 h (n = 6 at each time point). The concentration of (-)-(S)-VER was three-fold higher than that of (+)-(R)-VER, with an AUC (-)/(+) ratio of 2.66. Oral clearance values were 12.17 and 28.77 L/h/kg for (-)-(S)-VER and (+)-(R)-VER, respectively. The pharmacokinetic parameters of NOR were not shown to be enantioselective. (c) 2007 Elsevier B.V. All rights reserved.
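
The reported parameters follow from standard non-compartmental calculations: AUC by the trapezoidal rule over the concentration-time curve, and apparent oral clearance as CL/F = dose / AUC. The sketch below shows those two formulas; the concentration values are hypothetical, not the study's data.

```java
// Minimal sketch of the standard non-compartmental calculations behind the
// reported parameters: AUC by the linear trapezoidal rule and apparent oral
// clearance CL/F = dose / AUC. All concentration values are hypothetical.
public final class EnantiomerPk {

    /** Linear trapezoidal AUC over the sampled interval (time in h, conc in ng/mL). */
    static double trapezoidalAuc(double[] timeH, double[] concNgMl) {
        double auc = 0.0;
        for (int i = 1; i < timeH.length; i++) {
            auc += (timeH[i] - timeH[i - 1]) * (concNgMl[i] + concNgMl[i - 1]) / 2.0;
        }
        return auc; // ng*h/mL
    }

    public static void main(String[] args) {
        double[] t  = {0.0, 0.5, 1.0, 2.0, 4.0, 6.0};        // sampling times, h
        double[] cS = {0.0, 80.0, 120.0, 90.0, 40.0, 15.0};  // (-)-(S)-VER, ng/mL (hypothetical)
        double[] cR = {0.0, 30.0, 45.0, 34.0, 15.0, 6.0};    // (+)-(R)-VER, ng/mL (hypothetical)

        double aucS = trapezoidalAuc(t, cS);
        double aucR = trapezoidalAuc(t, cR);
        System.out.printf("AUC ratio (-)/(+): %.2f%n", aucS / aucR);

        // Apparent oral clearance per enantiomer: half of the 10 mg/kg racemic dose.
        double doseNgPerKg = 5.0e6;                  // 5 mg/kg expressed in ng/kg
        double clMlPerHKg = doseNgPerKg / aucS;      // (ng/kg) / (ng*h/mL) = mL/h/kg
        System.out.printf("CL/F (-)-(S): %.2f L/h/kg%n", clMlPerHKg / 1000.0);
    }
}
```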

Relevance:

100.00%

Publisher:

Abstract:

Gait speed has been described as a predictive indicator of important adverse outcomes in older populations. Among the criteria used to evaluate frailty, gait speed has been identified as the most reliable predictor, as well as a practical and low-cost one. Objective: This study assesses the discriminating capability of gait speed in determining the presence of frailty in a community-dwelling elderly population in northeast Brazil. Method: We performed an observational, analytic, cross-sectional study with a sample of 391 community-dwelling elders, aged 65 years or older, of both sexes, in the city of Santa Cruz-RN. Participants were interviewed using a multidimensional questionnaire to obtain sociodemographic, physical and mental health-related information. Unintentional weight loss, muscle weakness, self-reported exhaustion, slow gait and low physical activity were the criteria used to evaluate the frailty syndrome. Gait speed was measured as the time taken to walk the middle 4.6 meters of an 8.6-meter course (excluding 2 meters for the warm-up phase and 2 meters for the deceleration phase). We calculated the sensitivity and specificity of the gait speed test at different cutoff points for the test run time, from which a ROC curve was constructed as a measure of the test's predictive value in identifying frail elders. The prevalence of frailty in Santa Cruz-RN was 17.1%. The gait speed test accuracy was 71% for speeds below 0.91 m/s. Among women, the test accuracy was 80% (gait speed below 0.77 m/s) and among men it was 86% (gait speed below 0.82 m/s) (p < 0.0001). Conclusion: Our findings have clinical relevance when we consider that the presence of frailty can be detected in elderly men and women by the gait speed test, a simple, cheap and efficient exam.
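
The screening step the abstract describes reduces to a short calculation: speed over the timed middle 4.6 m, compared against the sex-specific cutoffs the study reports (0.77 m/s for women, 0.82 m/s for men). The sketch below shows that arithmetic; the class and method names, and the example timing, are invented.

```java
// Minimal sketch of the gait speed screening described above. Cutoffs are the
// ones reported in the study; identifiers and the sample time are invented.
public final class GaitSpeedScreen {

    static final double TIMED_DISTANCE_M = 4.6; // middle section of the 8.6 m course

    static double gaitSpeed(double timeSeconds) {
        return TIMED_DISTANCE_M / timeSeconds; // m/s
    }

    static boolean flaggedAsFrail(double speedMs, boolean female) {
        double cutoff = female ? 0.77 : 0.82;  // sex-specific cutoffs from the study
        return speedMs < cutoff;
    }

    public static void main(String[] args) {
        double speed = gaitSpeed(6.2); // e.g. 6.2 s to cover the middle 4.6 m
        System.out.printf("speed = %.2f m/s, frail flag = %b%n",
                speed, flaggedAsFrail(speed, true));
    }
}
```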

Relevance:

100.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

100.00%

Publisher:

Abstract:

With the increasing complexity of software systems, there is also increased concern about their faults. These faults can cause financial losses and even loss of life. Therefore, we propose in this paper the minimization of software faults through the use of formally specified tests. The combination of testing and formal specifications has been gaining strength in research, mainly through Model-Based Testing (MBT). The development of software from formal specifications, when the whole refinement process is done rigorously, ensures that what is specified will be implemented in the application. Thus, the implementation generated from these specifications would accurately reflect what was specified. However, the specification is not always refined to the level of implementation and code generation, and in these cases the tests generated from the specification tend to find faults. Additionally, the generation of so-called "invalid tests", i.e., tests that exercise application scenarios not addressed in the specification, complements the formal development process more significantly. Therefore, this paper proposes a method for generating tests from B formal specifications. The method was structured in pseudo-code and is based on the systematization of the black-box testing techniques of boundary value analysis and equivalence partitioning, as well as the orthogonal pairs technique. The method was applied to a B specification, and B test machines that generate test cases independent of the implementation language were produced. To validate the method, the test cases were manually translated into JUnit test cases, and the application, created from the B specification and developed in Java, was tested. Faults were found by executing the JUnit test cases.
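
To illustrate the black-box techniques the method systematizes, the sketch below shows JUnit test cases derived by boundary value analysis from a hypothetical specified precondition: an input assumed valid on the interval [1, 100]. The class under test and the interval are invented for the example, not taken from the paper's B specification.

```java
// Hypothetical boundary value analysis example: the specification is assumed
// to constrain the input to [1, 100], so the tests exercise both boundaries
// and their immediate neighbors. The class under test is a stand-in.
import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;

import org.junit.Test;

public class WithdrawBoundaryTest {

    // Stand-in for code developed from the B specification.
    static boolean accepts(int amount) {
        return amount >= 1 && amount <= 100;
    }

    @Test public void justBelowLowerBoundIsInvalid() { assertFalse(accepts(0)); }
    @Test public void lowerBoundIsValid()            { assertTrue(accepts(1)); }
    @Test public void upperBoundIsValid()            { assertTrue(accepts(100)); }
    @Test public void justAboveUpperBoundIsInvalid() { assertFalse(accepts(101)); }
}
```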

Relevance:

100.00%

Publisher:

Abstract:

Aspect-Oriented Software Development (AOSD) is a technique that complements Object-Oriented Software Development (OOSD) by modularizing several concerns that OOSD approaches do not modularize appropriately. However, the current state of the art in AOSD suffers under software evolution, mainly because aspect definitions can stop working correctly when base elements evolve. A promising approach to this problem is the definition of model-based pointcuts, in which pointcuts are defined in terms of a conceptual model. That strategy makes pointcuts less fragile under software evolution than pointcuts bound directly to base elements. Building on that strategy, this work defines a high-level conceptual model in which software patterns and architectures can be specified and then, through Model-Driven Development (MDD) techniques, instantiated and composed in an architecture description language that supports aspect modeling at the architectural level. Our MDD approach allows concepts at the architectural level to be propagated to other abstraction levels (the design level, for example) through MDA transformation rules. This work also presents a plug-in for the Eclipse platform, called AOADLwithCM, created to support our development process. The AOADLwithCM plug-in was used to describe a case study based on the MobileMedia system, which shows step by step how the conceptual-model approach can minimize pointcut fragility problems caused by software evolution. The MobileMedia case study was used as input to analyze software evolution according to the software metrics proposed by Khatchadourian, Greenwood and Rashid. We also analyze how evolution in the base model affects maintenance of the aspectual model with and without the conceptual-model approach.
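
The fragile pointcut problem the abstract targets can be illustrated in plain AspectJ. The contrast below, between a name-based pointcut and one bound to a stable marker annotation, is only an analogy for the conceptual-model idea at code level; all identifiers are invented and this is not the paper's actual mechanism, which operates at the architectural level.

```java
// Illustrative AspectJ-style contrast (all names invented). The first pointcut
// is fragile: renaming or adding methods in the base code silently changes
// what it matches. The second binds to a declared concept (here a marker
// annotation), loosely analogous to the conceptual-model element the
// abstract's approach would bind to instead.
public aspect MediaAuditing {

    // Fragile: coupled to a naming convention in the evolving base code.
    pointcut fragileMediaOps():
        execution(* MediaController.show*(..));

    // Model-based style: coupled to a stable concept, not to method names.
    pointcut stableMediaOps():
        execution(@MediaOperation * *(..));

    before(): stableMediaOps() {
        System.out.println("media operation: " + thisJoinPoint.getSignature());
    }
}

// The marker annotation playing the role of a conceptual-model element.
@interface MediaOperation {}
```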

Relevance:

100.00%

Publisher:

Abstract:

Traceability between the models produced by the requirements and architecture activities is a strategy that aims to prevent loss of information, reducing the gap between these two initial activities of the software life cycle. In the context of Software Product Lines (SPL), it is important to have this support, which establishes the correspondence between these two activities while managing variability. To address this issue, this paper presents a bidirectional mapping process, defining transformation rules between elements of a goal-oriented requirements model (described in PL-AOVgraph) and elements of an architectural description (defined in PL-AspectualACME). These mapping rules are evaluated using a case study: the GingaForAll SPL. To automate the transformation, we developed the MaRiPLA tool (Mapping Requirements to Product Line Architecture) using Model-Driven Development (MDD) techniques, including the Atlas Transformation Language (ATL) with Ecore metamodel specifications, together with Xtext, a DSL definition framework, and Acceleo, a code generation tool, in the Eclipse environment. Finally, the generated models are evaluated against quality attributes such as variability, derivability, reusability, correctness, traceability, completeness, evolvability and maintainability, extracted from the CAFÉ Quality Model.
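
MaRiPLA writes its rules in ATL; the Java sketch below only illustrates the general shape of one such mapping rule, where an element of the requirements model is turned into an element of the architectural description while its variability is preserved. Every type, field and value here is invented for illustration and does not reproduce the actual PL-AOVgraph or PL-AspectualACME metamodels.

```java
// Invented sketch of one mapping rule: a goal in a requirements model maps to
// a component in an architectural description, carrying its variability flag.
import java.util.List;
import java.util.stream.Collectors;

record Goal(String name, boolean optional) {}       // PL-AOVgraph-like element (invented)
record Component(String name, boolean optional) {}  // PL-AspectualACME-like element (invented)

public final class GoalToComponentRule {

    /** One direction of the bidirectional mapping: goal -> component. */
    static Component apply(Goal goal) {
        return new Component(goal.name(), goal.optional());
    }

    public static void main(String[] args) {
        List<Goal> goals = List.of(new Goal("PlayMedia", false),
                                   new Goal("SortPhotos", true));
        List<Component> architecture = goals.stream()
                                            .map(GoalToComponentRule::apply)
                                            .collect(Collectors.toList());
        architecture.forEach(c ->
                System.out.println(c.name() + " optional=" + c.optional()));
    }
}
```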

Relevance:

100.00%

Publisher:

Abstract:

The widespread growth in the use of smart cards (by banks, transport services, cell phone operators, etc.) has brought up an important fact that must be addressed: the need for tools that can be used to verify such cards, so as to guarantee the correctness of their software. As the vast majority of cards being developed nowadays use the JavaCard technology as their software layer, the use of the Java Modeling Language (JML) to specify their programs appears as a natural solution. JML is a formal language tailored to Java. It was inspired by methodologies from Larch and Eiffel, and has been widely adopted as the de facto language for specifying any Java-related program. Various tools that make use of JML have already been developed, covering a wide range of functionalities, such as run-time and static checking. But the tools existing so far for static checking are not fully automated, and those that are do not offer an adequate level of soundness and completeness. Our objective is to contribute a set of techniques that can be used to accomplish fully automated and confident verification of JavaCard applets. In this work we present the first steps toward this goal. Using a software platform comprised of Krakatoa, Why and haRVey, we developed a set of techniques to reduce the size of the theory necessary to verify the specifications. These techniques have yielded very good results, with gains of almost 100% in all tested cases, and have proved valuable, not only here, but in most real-world problems related to automatic verification.
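
To make the setting concrete, the sketch below shows a JML-annotated method of the kind such verification tools check: JML contracts live in structured Java comments, so the file remains ordinary Java. The purse is a common JavaCard-style illustration; the class itself is invented here, not taken from the dissertation.

```java
// Minimal JML-annotated method of the kind static checkers verify.
// The Purse class is an invented illustration, not the dissertation's code.
public class Purse {

    private /*@ spec_public @*/ short balance;

    /*@ requires amount > 0 && amount <= balance;
      @ ensures balance == \old(balance) - amount;
      @ assignable balance;
      @*/
    public void debit(short amount) {
        balance -= amount;
    }
}
```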

Relevance:

100.00%

Publisher:

Abstract:

The way information assets are handled is nowadays the main factor not only in the success of companies but also in their survival in the global market. The number of information security incidents has grown over recent years. Establishing information security policies that keep the security requirements of assets at the desired levels is a major priority for companies. This dissertation proposes a unified process for the elaboration, maintenance and development of information security policies, the Processo Unificado para Políticas de Segurança da Informação (PUPSI). The elaboration of this proposal started with the construction of a body of knowledge based on official documents and standards about security policies and information security published over the last two decades. Based on the examined documents, the model defines the security policies that need to be established in the organization, their workflow, and the hierarchical sequence among them. A model of the entities participating in the process is also provided. Because the problem treated by the model is complex, involving all the security policies a company must have, PUPSI takes an iterative and incremental approach. This approach was obtained by instantiating the RUP (Rational Unified Process) model, an object-oriented software development platform from Rational Software (an IBM company) that incorporates best practices known by the market. From RUP, PUPSI inherited a process structure that offers functionality, ease of dissemination and comprehension, and performance and agility in process adjustment, as well as the capacity to adapt to technological and structural changes in the market and in the company.

Relevance:

100.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

100.00%

Publisher:

Abstract:

MODSI is a multi-model tool for information systems modeling. A modeling process in MODSI can be driven according to three different approaches: informal, semi-formal and formal. The MODSI tool is therefore based on the linked usage of these three modeling approaches. It can be employed at two different levels: the meta-modeling of a method and the modeling of an information system. In this paper we begin by presenting the different types of modeling and analyzing their particular features. Then, we introduce the meta-model defined in our tool, as well as the tool's functional architecture. Finally, we describe and illustrate the various usage levels of this tool.

Relevance:

100.00%

Publisher:

Abstract:

This paper describes an environment for constructing multimedia applications that present different multimedia database objects, drawn from different sources, in accordance with spatio-temporal constraints. The main contribution of this paper is to propose an environment that integrates both a modeling CASE tool and an object-oriented database system.

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a technique for sharing the data stored in an object-oriented database aimed at design environments. The technique shares data between two related databases, called the Original and Product databases, and is composed of three processes: data separation, evolution and integration. Whenever a block of data needs to be shared, it is spread across both databases, resulting in one block in the Original database and another in the Product database, with special links between them controlled by the Object Manager. These blocks do not need to be kept identical during the evolution phase of the sharing process. Six types of links were defined, and by choosing one the designer controls the evolution and reintegration of the block in both databases. The process uses the composite object concept as the unit of control. The presented concepts can be applied to any data model that supports composite objects.
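
The sketch below gives a rough shape to the structure just described: a composite-object block split into paired copies in the Original and Product databases, joined by a typed link that governs evolution and reintegration. The abstract does not name the six link types, so the enum below uses explicitly labeled placeholders; every identifier here is invented.

```java
// Invented sketch of the sharing structure: paired blocks joined by a typed
// link. The paper defines six link types whose names are not given in the
// abstract, so only labeled placeholders appear here.
import java.util.HashMap;
import java.util.Map;

enum LinkType { PLACEHOLDER_A, PLACEHOLDER_B /* the paper defines six types */ }

final class SharedBlock {
    final String compositeObjectId;  // the composite object is the unit of control
    final LinkType link;             // chosen by the designer at separation time
    final Map<String, String> originalState = new HashMap<>();
    final Map<String, String> productState  = new HashMap<>();

    SharedBlock(String compositeObjectId, LinkType link) {
        this.compositeObjectId = compositeObjectId;
        this.link = link;
    }

    /** During the evolution phase the two copies may legitimately diverge. */
    void evolveProduct(String field, String value) {
        productState.put(field, value);
    }

    /** The reintegration behavior depends on the chosen link type. */
    void reintegrate() {
        switch (link) {
            case PLACEHOLDER_A -> originalState.putAll(productState); // propagate back
            case PLACEHOLDER_B -> { /* keep the copies independent */ }
        }
    }
}
```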

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a mathematical model and a methodology to solve the transmission network expansion planning problem with security constraints in a fully competitive market, assuming that all generation programming plans present in the system operation are known. The methodology allows us to find an optimal transmission network expansion plan under which the power system operates adequately in each of the generation programming plans specified for the fully competitive market, including single-contingency situations with generation rescheduling under the (n-1) security criterion. In this context, centralized expansion planning with security constraints and expansion planning in a fully competitive market are both subsets of the proposal presented in this paper. The model is solved using a genetic algorithm designed to efficiently handle reliable expansion planning in a fully competitive market. The results obtained for several well-known systems from the literature show the excellent performance of the proposed methodology.
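
The sketch below shows a generic genetic-algorithm skeleton of the kind applied to such planning problems: one bit per candidate line, and a fitness combining build cost with a penalty for infeasible plans. The encoding, the stand-in feasibility check and all numbers are assumptions for illustration, not the paper's actual formulation, which would evaluate the (n-1) criterion over real network data.

```java
// Generic GA skeleton for expansion planning (all parameters are assumptions).
// Fitness = investment cost + penalty standing in for constraint violations.
import java.util.Arrays;
import java.util.Random;

public final class ExpansionGa {
    static final Random RNG = new Random(42);
    static final int CANDIDATE_LINES = 12, POP = 30, GENERATIONS = 100;

    /** Lower is better: build cost plus a stand-in penalty for infeasible plans. */
    static double cost(boolean[] plan) {
        int built = 0;
        for (boolean line : plan) if (line) built++;
        double buildCost = built;                                // unit cost per line
        double penalty = built < 4 ? 100.0 * (4 - built) : 0.0;  // stand-in feasibility check
        return buildCost + penalty;
    }

    static boolean[] mutate(boolean[] parent) {
        boolean[] child = parent.clone();
        child[RNG.nextInt(child.length)] ^= true; // flip one candidate line
        return child;
    }

    public static void main(String[] args) {
        boolean[][] pop = new boolean[POP][CANDIDATE_LINES];
        for (boolean[] ind : pop)
            for (int i = 0; i < ind.length; i++) ind[i] = RNG.nextBoolean();

        for (int g = 0; g < GENERATIONS; g++) {
            boolean[][] next = new boolean[POP][];
            for (int i = 0; i < POP; i++) {
                // Binary tournament selection; crossover omitted to keep the sketch short.
                boolean[] a = pop[RNG.nextInt(POP)], b = pop[RNG.nextInt(POP)];
                next[i] = mutate(cost(a) <= cost(b) ? a : b);
            }
            pop = next;
        }
        boolean[] best = Arrays.stream(pop)
                               .min((x, y) -> Double.compare(cost(x), cost(y)))
                               .orElseThrow();
        System.out.println("best plan cost = " + cost(best));
    }
}
```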

Relevance:

100.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)