952 results for Run-Time Code Generation, Programming Languages, Object-Oriented Programming


Relevance:

100.00%

Publisher:

Abstract:

A rapid, sensitive and specific LC-MS/MS method was developed and validated for quantifying chlordesmethyldiazepam (CDDZ or delorazepam), the active metabolite of cloxazolam, in human plasma. In the analytical assay, bromazepam (internal standard) and CDDZ were extracted by liquid-liquid extraction (diethyl ether/hexane, 80/20, v/v). The LC-MS/MS method, on an RP-C18 column, had an overall run time of 5.0 min and was linear (1/x weighted) over the range 0.5-50 ng/mL (R > 0.999). The between-run precision was 8.0% (1.5 ng/mL), 7.6% (9 ng/mL), 7.4% (40 ng/mL) and 10.9% at the lower limit of quantification (LLOQ, 0.500 ng/mL). The between-run accuracies were 0.1, -1.5, -2.7 and 8.7% for the above concentrations, respectively. All current bioanalytical method validation requirements (FDA and ANVISA) were met, and the method was applied to a bioequivalence study (cloxazolam test formulation, Eurofarma Lab. Ltda, vs. Olcadil(R) reference, Novartis Biociencias S/A). The relative bioavailability of the two formulations was assessed by calculating individual test/reference ratios for Cmax, AUClast and AUC0-inf. The pharmacokinetic profiles indicated bioequivalence, since all ratios were within the limits proposed by FDA and ANVISA. Copyright (C) 2009 John Wiley & Sons, Ltd.
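The "linear (1/x weighted)" calibration mentioned above refers to weighted least squares with weights inversely proportional to concentration, which keeps the low end of a wide range (0.5-50 ng/mL) from being swamped by the high calibrators. A minimal sketch with synthetic data (the concentrations and responses below are illustrative, not from the paper):

```python
# Weighted (1/x) linear least squares, as commonly used for LC-MS/MS
# calibration curves spanning a wide concentration range.
def weighted_linear_fit(x, y, w):
    """Return (slope, intercept) minimizing sum w_i*(y_i - a*x_i - b)^2."""
    sw = sum(w)
    sx = sum(wi * xi for wi, xi in zip(w, x))
    sy = sum(wi * yi for wi, yi in zip(w, y))
    sxx = sum(wi * xi * xi for wi, xi in zip(w, x))
    sxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    slope = (sw * sxy - sx * sy) / (sw * sxx - sx * sx)
    intercept = (sy - slope * sx) / sw
    return slope, intercept

# Synthetic calibrators over the validated range; response = 0.2*conc exactly
concs = [0.5, 1, 2, 5, 10, 25, 50]        # ng/mL
areas = [0.2 * c for c in concs]          # peak-area ratio analyte/IS
weights = [1 / c for c in concs]          # 1/x weighting favors the low end
slope, intercept = weighted_linear_fit(concs, areas, weights)
```

With perfectly linear synthetic data the fit recovers the true slope and a zero intercept regardless of weighting; the weighting matters once real, heteroscedastic noise is present.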

Relevance:

100.00%

Publisher:

Abstract:

A rapid, sensitive and specific method for quantifying ciprofibrate in human plasma using bezafibrate as the internal standard (IS) is described. The sample was acidified prior to extraction with formic acid (88%). The analyte and the IS were extracted from plasma by liquid-liquid extraction using an organic solvent (diethyl ether/dichloromethane, 70/30, v/v). The extracts were analyzed by high-performance liquid chromatography coupled with electrospray tandem mass spectrometry (HPLC-MS/MS). Chromatography was performed using a Genesis C18 4 μm analytical column (4.6 x 150 mm i.d.) and a mobile phase consisting of acetonitrile/water (70/30, v/v) with 1 mM acetic acid. The method had a chromatographic run time of 3.4 min and a linear calibration curve over the range 0.1-60 μg/mL (r > 0.99). The limit of quantification was 0.1 μg/mL. The intra- and interday accuracy and precision values of the assay were less than 13.5%. The stability tests indicated no significant degradation. The recovery of ciprofibrate was 81.2%, 73.3% and 76.2% for the 0.3, 5.0 and 48.0 ng/mL standard concentrations, respectively. For ciprofibrate, the optimized declustering potential, collision energy and collision exit potential were -51 V, -16 eV and -5 V, respectively. The method was also validated without the use of the internal standard. This HPLC-MS/MS procedure was used to assess the bioequivalence of two ciprofibrate 100 mg tablet formulations in healthy volunteers of both sexes. The following pharmacokinetic parameters were obtained from the ciprofibrate plasma concentration vs. time curves: AUC(last), AUC(0-168 h), C(max) and T(max). The geometric means with corresponding 90% confidence intervals (CI) for the test/reference percent ratios were 93.80% (90% CI = 88.16-99.79%) for C(max), 98.31% (90% CI = 94.91-101.83%) for AUC(last) and 97.67% (90% CI = 94.45-101.01%) for AUC(0-168 h).
Since the 90% CIs for the AUC(last), AUC(0-168 h) and C(max) ratios were within the 80-125% interval proposed by the US FDA, it was concluded that the ciprofibrate (Lipless(R) 100 mg tablet) formulation manufactured by Biolab Sanus Farmaceutica Ltda. is bioequivalent to the Oroxadin(R) (100 mg tablet) formulation with respect to both the rate and the extent of absorption. (C) 2011 Published by Elsevier B.V.
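The bioequivalence criterion above is evaluated on the log scale: the geometric mean of the individual test/reference ratios is compared against the 80-125% window. A minimal sketch of the point estimate (the ratios below are invented for illustration; a full assessment also needs the 90% CI from an ANOVA on log-transformed data, whose t-quantile is not in the Python standard library and is omitted here):

```python
import math

# Geometric mean of individual test/reference ratios (e.g. for Cmax),
# checked against the FDA 80-125% bioequivalence window.
ratios = [0.91, 0.95, 1.02, 0.88, 1.10, 0.97, 0.93, 1.05]  # illustrative

log_mean = sum(math.log(r) for r in ratios) / len(ratios)
gmr_percent = 100 * math.exp(log_mean)     # geometric mean ratio, %

bioequivalent_point = 80.0 <= gmr_percent <= 125.0
```

Averaging logs (rather than the raw ratios) is what makes the estimate symmetric: a formulation pair with ratios 0.8 and 1.25 averages to exactly 100%.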

Relevance:

100.00%

Publisher:

Abstract:

This paper addresses the one-dimensional cutting stock problem when demand is a random variable. The problem is formulated as a two-stage stochastic nonlinear program with recourse. The first-stage decision variables are the numbers of objects to be cut according to each cutting pattern. The second-stage decision variables are the numbers of held or backordered items resulting from the first-stage decisions. The objective is to minimize the total expected cost incurred in both stages, due to waste and to holding or backordering penalties. A Simplex-based method with column generation is proposed for solving a linear relaxation of the resulting optimization problem. The proposed method is evaluated using two well-known measures of the effects of uncertainty in stochastic programming: the value of the stochastic solution (VSS) and the expected value of perfect information (EVPI). The optimal two-stage solution is shown to be more effective than the alternative wait-and-see and expected-value approaches, even under small variations in the parameters of the problem.
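The VSS and EVPI measures mentioned above can be made concrete on a toy instance in the spirit of the paper: choose how many objects to cut (first stage); leftovers are held and shortages backordered (second stage). All numbers below are illustrative, not from the paper:

```python
# Toy two-stage model: single pattern yielding ITEMS_PER_OBJ items per object.
COST_OBJ, ITEMS_PER_OBJ, HOLD, BACKORDER = 10, 5, 1, 4
SCENARIOS = [(8, 0.3), (12, 0.4), (20, 0.3)]   # (demand, probability)

def cost(x, d):
    """First-stage cost plus holding/backordering recourse for demand d."""
    produced = ITEMS_PER_OBJ * x
    return (COST_OBJ * x + HOLD * max(produced - d, 0)
            + BACKORDER * max(d - produced, 0))

def expected_cost(x):
    return sum(p * cost(x, d) for d, p in SCENARIOS)

X = range(0, 11)                                # small enough to enumerate
rp = min(expected_cost(x) for x in X)           # here-and-now (recourse) optimum
ws = sum(p * min(cost(x, d) for x in X) for d, p in SCENARIOS)  # wait-and-see
mean_d = sum(p * d for d, p in SCENARIOS)
x_ev = min(X, key=lambda x: cost(x, mean_d))    # expected-value solution
eev = expected_cost(x_ev)                       # its true expected cost

evpi = rp - ws          # value of knowing demand in advance
vss = eev - rp          # value of modelling uncertainty explicitly
```

For a minimization problem WS <= RP <= EEV always holds, so both measures are nonnegative; here the expected-value policy over-cuts for the mean demand and pays for it in the tail scenarios.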

Relevance:

100.00%

Publisher:

Abstract:

Security administrators face the challenge of designing, deploying and maintaining a variety of configuration files related to security systems, especially in large-scale networks. These files have heterogeneous syntaxes and follow differing semantic concepts. Nevertheless, they are interdependent, since security services must cooperate and their configurations must be consistent with each other so that global security policies are completely and correctly enforced. To tackle this problem, our approach supports the convenient definition of an abstract high-level security policy and provides automated derivation of the desired configuration files. It extends policy-based management and policy hierarchies, combining model-based management (MBM) with system modularization. MBM employs an object-oriented model of the managed system to obtain the details needed for automated policy refinement. The modularization into abstract subsystems (ASs) segments the system (and the model) into units that more closely encapsulate related system components and provide focused abstract views. As a result, scalability is achieved and even comprehensive IT systems can be modelled in a unified manner. The associated tool, MoBaSeC (Model-Based Service Configuration), supports interactive graphical modelling, automated model analysis and policy refinement with derivation of configuration files. We describe the MBM and AS approaches, outline the tool's functions and exemplify their application and the results obtained. Copyright (C) 2010 John Wiley & Sons, Ltd.
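The core idea of automated policy refinement can be sketched in miniature: an abstract "allow" policy, combined with an object-oriented model of the managed system, is refined into concrete, syntax-specific configuration lines for each affected device. Everything below (class names, the iptables-like syntax, the path model) is a hypothetical illustration, not the MoBaSeC implementation:

```python
from dataclasses import dataclass

@dataclass
class Host:
    name: str
    ip: str

@dataclass
class Firewall:
    name: str

    def render(self, src, dst, port):
        # Device-specific syntax (illustrative, iptables-like)
        return (f"-A FORWARD -s {src.ip} -d {dst.ip} "
                f"-p tcp --dport {port} -j ACCEPT")

def refine(policy, model):
    """Derive one configuration line per firewall on the src->dst path."""
    src, dst, port = policy
    return {fw.name: fw.render(src, dst, port) for fw in model["path"]}

# System model: two firewalls sit between the web host and the database
web = Host("web", "10.0.0.5")
db = Host("db", "10.0.1.7")
model = {"path": [Firewall("fw-dmz"), Firewall("fw-core")]}

# Abstract policy: "web may reach db on port 5432"
configs = refine((web, db, 5432), model)
```

The point mirrored from the abstract: the administrator states one high-level intent, and the per-device files (with their heterogeneous syntaxes, here reduced to one) fall out of the model automatically and consistently.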

Relevance:

100.00%

Publisher:

Abstract:

The aim of this study was to develop a fast capillary electrophoresis method for the determination of benzoate and sorbate ions in commercial beverages. During method development, the pH and the constituents of the background electrolyte were selected using effective mobility versus pH curves. As the high resolution obtained experimentally for sorbate and benzoate in studies reported in the literature is not in agreement with that expected from the published ionic mobility values, a procedure to determine these values was carried out. The salicylate ion was used as the internal standard. The background electrolyte was composed of 25 mmol L(-1) tris(hydroxymethyl)aminomethane and 12.5 mmol L(-1) 2-hydroxyisobutyric acid, at pH 8.1. Separation was conducted in a fused-silica capillary (32 cm total length, 8.5 cm effective length, 50 μm i.d.) with a short-end injection configuration and direct UV detection at 200 nm for benzoate and salicylate and 254 nm for sorbate. The run time was only 28 s. Figures of merit of the proposed method include good linearity (R(2) > 0.999), limits of detection of 0.9 and 0.3 mg L(-1) for benzoate and sorbate, respectively, inter-day precision better than 2.7% (n = 9) and recovery in the range 97.9-105%. Beverage samples were prepared by simple dilution with deionized water (1:11, v/v). Concentrations in the range of 197-401 mg L(-1) for benzoate and 28-144 mg L(-1) for sorbate were found in soft drinks and tea. (c) 2008 Elsevier B.V. All rights reserved.
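The effective-mobility-versus-pH curves used above follow from the degree of ionization of each weak acid: the effective mobility is the fully ionized mobility scaled by the Henderson-Hasselbalch fraction. A short sketch (the pKa values are approximate literature values for 25 degrees C, not taken from the paper):

```python
# Degree of ionization of a weak acid HA as a function of pH.
def ionized_fraction(pH, pKa):
    return 1.0 / (1.0 + 10 ** (pKa - pH))

PKA = {"benzoate": 4.20, "sorbate": 4.76}   # approximate aqueous pKa values
run_pH = 8.1                                 # BGE pH used in the method

fractions = {ion: ionized_fraction(run_pH, pKa) for ion, pKa in PKA.items()}
```

At the working pH of 8.1, several units above both pKa values, both analytes are essentially fully ionized, so their effective mobilities are at their plateau values and the separation depends only on the difference between those ionic mobilities.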

Relevance:

100.00%

Publisher:

Abstract:

A method for the direct analysis, with minimal sample pretreatment, of the antidepressant drugs fluoxetine, imipramine, desipramine, amitriptyline and nortriptyline in biofluids was developed, with a total run time of 8 min. The setup consists of two HPLC pumps, an injection valve, a capillary RAM-ADS-C18 pre-column and a capillary analytical C18 column, connected by means of a six-port valve in backflush mode. Detection was performed by ESI-MS/MS, and only 1 μL of sample was injected. Validation was adequately carried out using FLU-d(5) as internal standard. Calibration curves were linear over the range 1-250 ng mL(-1) in plasma, with a limit of quantification (LOQ) of 1 ng mL(-1) for all analytes. With the described approach it was possible to reach a quantified mass sensitivity of 0.3 pg for each analyte (equivalent to 1.1-1.3 fmol), translating into lower sample consumption (on the order of 10(3) times less sample than conventional methods). (C) 2008 Elsevier B.V. All rights reserved.

Relevance:

100.00%

Publisher:

Abstract:

The possibility of compressing analyte bands at the beginning of CE runs has many advantages. Analytes at low concentration can be analyzed with high signal-to-noise ratios by using so-called sample stacking methods. Moreover, sample injections with very narrow initial band widths (small initial standard deviations) are sometimes useful, especially if high resolution among the bands is required in the shortest run time. In the present work, a method of sample stacking is proposed and demonstrated. It is based on BGEs with highly thermally sensitive pH (high dpH/dT) and analytes with low dpK(a)/dT. High thermal sensitivity means that the working pK(a) of the BGE has a high dpK(a)/dT in modulus. For instance, Tris and ethanolamine have dpH/dT = -0.028/degrees C and -0.029/degrees C, respectively, whereas carboxylic acids have low dpK(a)/dT values, i.e. in the -0.002/degrees C to +0.002/degrees C range. The action of cooling and heating sections along the capillary during the runs also affects the local viscosity, conductivity and electric field strength. The effect of these variables on electrophoretic velocity and band compression is calculated theoretically using a simple model. Finally, this stacking method was demonstrated for amino acids derivatized with naphthalene-2,3-dicarboxaldehyde and fluorescamine, using a temperature difference of 70 degrees C between two neighboring sections and Tris as the separation buffer. In this case, the BGE has a high pH thermal coefficient, whereas the carboxylic groups of the analytes have low pK(a) thermal coefficients. The application of these dynamic thermal gradients increased the peak heights of aspartic acid and glutamic acid derivatized with naphthalene-2,3-dicarboxaldehyde and of serine derivatized with fluorescamine by a factor of two (and decreased the standard deviations of the peaks by the same factor).
The effect of thermal compression of bands was not observed when runs were performed using phosphate buffer at pH 7 (negative control). Phosphate has a low dpH/dT in this pH range, similar to the dpK(a)/dT of the analytes. It is shown that |dpK(a)/dT - dpH/dT| >> 0 is one determining factor for significant stacking produced by dynamic thermal junctions.
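The magnitude of the mismatch |dpK(a)/dT - dpH/dT| can be checked with the coefficients quoted in the abstract: across a 70 degrees C junction, the Tris BGE pH shifts by almost two units while an analyte carboxylate pK(a) barely moves. A back-of-the-envelope sketch (the starting pH and pK(a) values are illustrative):

```python
# pH/pKa shifts across a thermal junction, using the coefficients
# quoted in the abstract.
DT = 70.0                       # temperature difference between sections, C
DPH_DT_TRIS = -0.028            # Tris BGE, per degree C (from the text)
DPKA_DT_ACID = 0.002            # carboxylic acid upper bound (from the text)

pH_cold, pKa_cold = 8.1, 4.4    # illustrative starting values
pH_hot = pH_cold + DPH_DT_TRIS * DT       # BGE pH in the heated section
pKa_hot = pKa_cold + DPKA_DT_ACID * DT    # analyte pKa in the heated section

shift_bge = abs(pH_hot - pH_cold)         # ~1.96 pH units
shift_analyte = abs(pKa_hot - pKa_cold)   # ~0.14 pH units
```

The BGE pH moves more than an order of magnitude farther than the analyte pK(a), so the local pH-pK(a) gap, and hence the analyte's local effective mobility, differs sharply between the hot and cold sections, which is what drives the compression. With phosphate (low dpH/dT) both shifts are comparably small and no stacking is seen, matching the negative control.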

Relevance:

100.00%

Publisher:

Abstract:

A decision support system (DSS) based on a fuzzy logic inference system (FIS) was implemented to assist with dose alteration of Duodopa infusion in patients with advanced Parkinson's disease, using data from motor state assessments and dosage. A three-tier architecture with an object-oriented approach was used. The DSS has a web-enabled graphical user interface that presents alerts indicating non-optimal dosage and states, new recommendations (namely typical advice with a typical dose) and statistical measurements. One data set was used for design and tuning of the FIS, and another was used for evaluating its performance against the actually given dose. Overall goodness-of-fit was 0.65 for the new patients (design data) and 0.98 for the ongoing patients (evaluation data). User evaluation is now ongoing. The system could work as an assistant to clinical staff for Duodopa treatment in advanced Parkinson's disease.
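The fuzzy inference step can be sketched with a minimal zero-order Sugeno-style rule base. This is purely illustrative: the paper's actual rules, membership functions and scales are not given in the abstract. Assume a motor-state score from -3 (severely bradykinetic, "off") to +3 (severely dyskinetic), mapped to a suggested relative change of the infusion rate:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# (membership function over the motor-state score, dose-change consequent)
RULES = [
    (lambda s: tri(s, -3.5, -2, -0.5), +0.15),  # too "off"  -> raise dose 15%
    (lambda s: tri(s, -1.0,  0,  1.0),  0.00),  # on target  -> keep dose
    (lambda s: tri(s,  0.5,  2,  3.5), -0.15),  # dyskinetic -> lower dose 15%
]

def suggest_dose_change(score):
    """Weighted average of rule consequents (Sugeno defuzzification)."""
    weights = [mu(score) for mu, _ in RULES]
    if sum(weights) == 0:
        return 0.0
    return sum(w * out for w, (_, out) in zip(weights, RULES)) / sum(weights)

change = suggest_dose_change(-2.0)   # clearly bradykinetic assessment
```

Between rule peaks the output interpolates smoothly, which is the practical appeal of a FIS over a hard threshold table for dose advice.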

Relevance:

100.00%

Publisher:

Abstract:

Agent-oriented software engineering and software product lines are two promising software engineering techniques. Recent research work has been exploring their integration, namely multi-agent systems product lines (MAS-PLs), to promote reuse and variability management in the context of complex software systems. However, current product derivation approaches do not provide specific mechanisms to deal with MAS-PLs. This is essential because they typically encompass several concerns (e.g., trust, coordination, transaction, state persistence) that are constructed on the basis of heterogeneous technologies (e.g., object-oriented frameworks and platforms). In this paper, we propose the use of multi-level models to support the configuration knowledge specification and automatic product derivation of MAS-PLs. Our approach provides an agent-specific architecture model that uses abstractions and instantiation rules that are relevant to this application domain. In order to evaluate the feasibility and effectiveness of the proposed approach, we have implemented it as an extension of an existing product derivation tool, called GenArch. The approach has also been evaluated through the automatic instantiation of two MAS-PLs, demonstrating its potential and benefits to product derivation and configuration knowledge specification.

Relevance:

100.00%

Publisher:

Abstract:

Dynamic composition of services provides the ability to build complex distributed applications at run time by combining existing services, thus coping with a large variety of complex requirements that cannot be met by individual services alone. However, with the increasing amount of available services that differ in granularity (amount of functionality provided) and qualities, selecting the best combination of services becomes very complex. In response, this paper addresses the challenges of service selection, and makes a twofold contribution. First, a rich representation of compositional planning knowledge is provided, allowing the expression of multiple decompositions of tasks at arbitrary levels of granularity. Second, two distinct search space reduction techniques are introduced, the application of which, prior to performing service selection, results in significant improvement in selection performance in terms of execution time, which is demonstrated via experimental results.
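One classic way to shrink the selection search space before combining services, consistent in spirit with the reduction techniques described above (the abstract does not detail the paper's specific methods), is to discard Pareto-dominated candidates: services that are worse or equal in every quality dimension than some alternative can never appear in an optimal composition, so exhaustive combination only needs the survivors:

```python
from itertools import product

def dominated(a, b):
    """True if service b is at least as good as a in every dimension and
    strictly better in at least one. Services are (cost, latency) tuples;
    lower is better in both dimensions."""
    return all(y <= x for x, y in zip(a, b)) and any(y < x for x, y in zip(a, b))

def prune(candidates):
    """Keep only Pareto-optimal services."""
    return [s for s in candidates if not any(dominated(s, t) for t in candidates)]

# Candidate services per abstract task: (cost, latency) -- illustrative values
task_a = [(5, 30), (4, 25), (6, 40), (4, 50)]
task_b = [(2, 10), (3, 9), (2, 12), (7, 80)]

pruned_a, pruned_b = prune(task_a), prune(task_b)
full_space = len(task_a) * len(task_b)          # 16 combinations
reduced_space = len(pruned_a) * len(pruned_b)   # far fewer

# Select the cheapest total-cost combination from the reduced space
best = min(product(pruned_a, pruned_b), key=lambda c: c[0][0] + c[1][0])
```

Because pruning is per-task, its cost is linear in the number of tasks, while the savings multiply across the combination space, which is why applying reduction before selection pays off in execution time.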

Relevance:

100.00%

Publisher:

Abstract:

The development of software artifacts is an engineering process and, like every engineering process, involves a series of steps that must be conducted through an appropriate methodology. For a piece of software to achieve its goals, its conceptual and architectural characteristics must be well defined before implementation. Hyperdocument-based applications have one specific characteristic: the definition of their navigational aspects. Navigation is a critical step in the specification of hyperdocument-based software, since it guides the user during a visit to a site's content. A flaw in the navigation specification causes a loss of context, disorienting the user in the application space. Several methodologies exist for handling the navigational characteristics of hyperdocument-based applications. The main methodologies found in the literature were studied and analyzed in this work, and a comparative analysis of their approaches and steps was carried out. This study of hyperdocument specification approaches was a preliminary step serving as the foundation for the goal of this work: the construction of a graphical tool for the conceptual specification of hyperdocuments, following a methodology for modelling hyperdocument-based software. The method adopted was OOHDM (Object-Oriented Hypermedia Design Model), as it covers all the steps of an application development process, with particular attention to navigation. The tool implements a graphical interface in which the user can model the application by creating models. The specification process comprises three models: conceptual modelling, navigational modelling and interface modelling. The characteristics of the application are defined in an incremental process that begins with the conceptual definition and ends with the interface characteristics.
The tool generates a prototype of the application in XML. To present the pages in a web browser, XSLT is used to convert the XML information into HTML. The models created through the abstract specification steps are exported in OOHDM-ML. A case study was implemented to validate the tool. The main contributions of this work are a graphical environment for the abstract specification of hyperdocuments and an environment for prototype implementation and model export. The aim is to guide, conduct and discipline the user's work during the application specification process.
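The prototype-generation step described above (conceptual model, then XML, then HTML for the browser) can be sketched as follows. The tool itself uses XSLT for the XML-to-HTML conversion; XSLT is not in Python's standard library, so a direct transformation stands in for it here, and the model contents are invented for illustration:

```python
import xml.etree.ElementTree as ET

# Minimal conceptual model: classes with attributes (illustrative names)
model = {"Paper": ["title", "abstract"], "Author": ["name"]}

# Serialize the conceptual model to XML
root = ET.Element("conceptual-model")
for cls, attrs in model.items():
    node = ET.SubElement(root, "class", {"name": cls})
    for attr in attrs:
        ET.SubElement(node, "attribute", {"name": attr})
xml_text = ET.tostring(root, encoding="unicode")

# Stand-in for the XSLT step: render each class as an HTML section
html_parts = ["<html><body>"]
for cls in root.iter("class"):
    html_parts.append(f"<h2>{cls.get('name')}</h2><ul>")
    html_parts += [f"<li>{a.get('name')}</li>" for a in cls.iter("attribute")]
    html_parts.append("</ul>")
html_parts.append("</body></html>")
html_text = "".join(html_parts)
```

Keeping the XML intermediate is what makes the approach open-ended: the same exported model can be rendered to HTML, validated, or re-imported, exactly as the OOHDM-ML export path suggests.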

Relevance:

100.00%

Publisher:

Abstract:

The work described in this thesis aims to support the distributed design of integrated systems and considers specifically the need for collaborative interaction among designers. Particular emphasis was given to issues that were only marginally considered in previous approaches, such as the abstraction of the distribution of design automation resources over the network, the possibility of both synchronous and asynchronous interaction among designers, and the support for extensible design data models. Such issues demand a rather complex software infrastructure, as possible solutions must encompass a wide range of software modules: from user interfaces to middleware to databases. To build such a structure, several engineering techniques were employed and some original solutions were devised. The core of the proposed solution is based on the joint application of two homonymic technologies: CAD Frameworks and object-oriented frameworks. The former concept was coined in the late 1980s within the electronic design automation community and comprises a layered software environment that aims to support CAD tool developers, CAD administrators/integrators and designers. The latter, developed during the last decade by the software engineering community, is a software architecture model for building extensible and reusable object-oriented software subsystems. In this work, we propose an object-oriented framework that includes extensible sets of design data primitives and design tool building blocks. This object-oriented framework is included within a CAD Framework, where it plays important roles in typical CAD Framework services such as design data representation and management, versioning, user interfaces, design management and tool integration.
The implemented CAD Framework, named Cave2, follows the classical layered architecture presented by Barnes, Harrison, Newton and Spickelmier, but the possibilities granted by the object-oriented framework foundations allowed a series of improvements that were not available in previous approaches: - object-oriented frameworks are extensible by design, and this also holds for the implemented sets of design data primitives and design tool building blocks. This means that both the design representation model and the software modules dealing with it can be upgraded or adapted to a particular design methodology, and that such extensions and adaptations still inherit the architectural and functional aspects implemented in the object-oriented framework foundation; - the design semantics and the design visualization are both part of the object-oriented framework, but in clearly separated models. This allows different visualization strategies for a given design data set, which gives collaborating parties the flexibility to choose individual visualization settings; - the control of consistency between semantics and visualization (a particularly important issue in a design environment with multiple views of a single design) is also included in the foundations of the object-oriented framework. This mechanism is generic enough to be used by further extensions of the design data model as well, as it is based on the inversion of control between view and semantics. The view receives the user input and propagates the event to the semantic model, which evaluates whether a state change is possible. If so, it triggers the change of state of both semantics and view.
Our approach took advantage of this inversion of control and included a layer between semantics and view to account for multi-view consistency; - to optimize the consistency control mechanism between views and semantics, we propose an event-based approach that captures each discrete interaction of a designer with his/her respective design views. The information about each interaction is encapsulated inside an event object, which may be propagated to the design semantics (and thus to other possible views) according to the consistency policy in use. Furthermore, the use of event pools allows late synchronization between view and semantics when a network connection between them is unavailable; - the use of proxy objects significantly raised the level of abstraction of the integration of design automation resources, as both remote and local tools and services are accessed through method calls on a local object. The connection to remote tools and services through a look-up protocol also completely abstracts the network location of such resources, allowing resources to be added and removed at runtime; - the implemented CAD Framework is entirely based on Java technology, relying on the Java Virtual Machine as the layer that grants independence between the CAD Framework and the operating system. All these improvements contributed to a higher abstraction of the distribution of design automation resources and also introduced a new paradigm for remote interaction between designers. The resulting CAD Framework is able to support fine-grained collaboration based on events, so every single design update performed by a designer can be propagated to the rest of the design team regardless of their location in the distributed environment.
This can increase group awareness and allow a richer transfer of experiences among team members, significantly improving the collaboration potential compared with previously proposed file-based or record-based approaches. Three case studies were conducted to validate the proposed approach, each focusing on a subset of the contributions of this thesis. The first uses the proxy-based resource distribution architecture to implement a prototyping platform using reconfigurable hardware modules. The second extends the foundations of the implemented object-oriented framework to support interface-based design. These extensions (design representation primitives and tool blocks) are used to implement a design entry tool named IBlaDe, which allows the collaborative creation of functional and structural models of integrated systems. The third case study concerns the integration of multimedia metadata into the design data model, explored in the context of an online educational and training platform.
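The inversion of control between view and semantics described above can be sketched in a few lines: a view never mutates state directly; it sends an event to the semantic model, which validates it and, only if it is legal, commits the change and pushes the new state to every registered view. All names and the example constraint below are illustrative, not taken from Cave2:

```python
class SemanticModel:
    """Owns the design state; views are passive observers."""
    def __init__(self):
        self.value = 0
        self.views = []

    def attach(self, view):
        self.views.append(view)

    def handle(self, event):
        """Validate a proposed change; commit and notify all views if legal."""
        proposed = self.value + event["delta"]
        if proposed < 0:            # example of a semantic constraint
            return False            # rejected: no view is updated
        self.value = proposed
        for v in self.views:        # multi-view consistency by construction
            v.refresh(self.value)
        return True

class View:
    def __init__(self, model):
        self.shown = model.value
        model.attach(self)

    def refresh(self, value):       # called only by the model
        self.shown = value

    def user_input(self, model, delta):
        # The view forwards the interaction as an event instead of
        # changing state itself -- the inversion of control.
        return model.handle({"delta": delta})

model = SemanticModel()
v1, v2 = View(model), View(model)
accepted = v1.user_input(model, +3)    # propagates to both views
rejected = v2.user_input(model, -10)   # violates the constraint, no update
```

Because every state change flows through `handle`, all views stay consistent by construction, and queuing the event objects (the "event pools" of the thesis) is enough to support late synchronization over an unreliable network.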

Relevance:

100.00%

Publisher:

Abstract:

This thesis presents a JML-based strategy that incorporates formal specifications into the software development process of object-oriented programs. The strategy evolves functional requirements into a "semi-formal" requirements form and then expresses them as JML formal specifications. The strategy is implemented as a formal-specification pseudo-phase that runs in parallel with the other phases of software development. What makes our strategy different from other software development strategies in the literature is the particular use of JML specifications all along the way from requirements to validation and verification.
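JML annotates Java code with `requires`/`ensures` clauses checked against the implementation. As a language-neutral analogy of that design-by-contract idea (this is an illustration, not the thesis's actual JML strategy), the same pre/postcondition discipline can be sketched in Python with a decorator:

```python
import functools

def contract(requires=lambda *a: True, ensures=lambda r, *a: True):
    """Attach a precondition and a postcondition to a function."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args):
            assert requires(*args), "precondition violated"
            result = fn(*args)
            assert ensures(result, *args), "postcondition violated"
            return result
        return inner
    return wrap

# JML-like intent: //@ requires b != 0;
#                  //@ ensures \result * b + (a % b) == a;
@contract(requires=lambda a, b: b != 0,
          ensures=lambda r, a, b: r * b + a % b == a)
def int_div(a, b):
    return a // b

q = int_div(17, 5)
```

The point mirrored from the abstract: once requirements are phrased as checkable pre/postconditions, the same specification artifact serves both validation (runtime checking, as here) and verification (static proof, which is what JML tooling adds on top).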

Relevance:

100.00%

Publisher:

Abstract:

VALENTIM, R. A. M.; SOUZA NETO, Plácido Antônio de. O impacto da utilização de design patterns nas métricas e estimativas de projetos de software: a utilização de padrões tem alguma influência nas estimativas? [The impact of using design patterns on software project metrics and estimates: does the use of patterns influence the estimates?] Revista da FARN, Natal, v. 4, p. 63-74, 2006.