976 results for Standard method


Relevance: 20.00%

Abstract:

Objective To evaluate drug interaction software programs and determine their accuracy in identifying drug-drug interactions that may occur in intensive care units. Setting The study was conducted in Brazil. Method Drug interaction software programs were identified through a bibliographic search in PUBMED and in LILACS (a health sciences database covering literature published in Latin American and Caribbean countries). The programs' sensitivity, specificity, and positive and negative predictive values were determined to assess their accuracy in detecting drug-drug interactions. The accuracy of the software programs identified was determined using 100 clinically important interactions and 100 clinically unimportant ones. Stockley's Drug Interactions, 8th edition, was employed as the gold standard for the identification of drug-drug interactions. Main outcome Sensitivity, specificity, and positive and negative predictive values. Results The programs studied were Drug Interaction Checker (DIC), Drug-Reax (DR), and Lexi-Interact (LI). DR displayed the highest sensitivity (0.88) and DIC the lowest (0.69). A close similarity was observed among the programs regarding specificity (0.88-0.92) and positive predictive values (0.88-0.89). DIC had the lowest negative predictive value (0.75) and DR the highest (0.91). Conclusion The DR and LI programs displayed appropriate sensitivity and specificity for identifying drug-drug interactions of interest in intensive care units. Drug interaction software programs help pharmacists and health care teams prevent and recognize drug-drug interactions and optimize the safety and quality of care delivered in intensive care units.
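
The following is an illustrative sketch, not taken from the study, of how the reported metrics are obtained once each program's answers are compared against the gold standard; the counts passed in the example call are hypothetical.

```python
# Illustrative sketch: sensitivity, specificity and predictive values computed
# from a 2x2 confusion matrix after comparing a program's output with the gold
# standard (Stockley's Drug Interactions).

def accuracy_metrics(tp, fp, tn, fn):
    """Return sensitivity, specificity, PPV and NPV from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)   # detected clinically important interactions
    specificity = tn / (tn + fp)   # correctly rejected unimportant pairs
    ppv = tp / (tp + fp)           # positive predictive value
    npv = tn / (tn + fn)           # negative predictive value
    return sensitivity, specificity, ppv, npv

# Hypothetical counts for a program screened against 100 important and
# 100 unimportant interactions (numbers invented for illustration).
print(accuracy_metrics(tp=88, fn=12, tn=91, fp=9))
```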

Relevance: 20.00%

Abstract:

Optical monitoring systems are necessary to manufacture multilayer thin-film optical filters with tight tolerances on the spectral specification. Furthermore, for better accuracy in the measurement of film thickness, direct monitoring is a must. Direct monitoring implies acquiring spectral data, in real time, from the optical component undergoing the film deposition itself. The high-vacuum evaporation chamber is the most popular equipment for depositing films on the surfaces of optical components. Inside the evaporator, at the top of the chamber, there is a metallic support with several holes in which the optical components are mounted. This support rotates to promote film homogenization. To measure the spectrum of the film being deposited, it is necessary to pass a light beam through a witness glass undergoing the film deposition process and collect a sample of the light beam with a spectrometer. As both the light beam and the light collector are stationary, a synchronization system is required to identify the moment at which the optical component passes through the light beam.
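
A minimal timing sketch of the synchronization idea, under assumed geometry (rotation speed, angular width of the hole, angular offset from a reference pulse are all hypothetical values, not the paper's implementation):

```python
# Given the support's rotation speed and the angular width of the witness-glass
# hole, estimate when to open the spectrometer acquisition gate after each
# once-per-revolution reference pulse.
rpm = 30.0             # hypothetical rotation speed of the metallic support
hole_angle_deg = 12.0  # hypothetical angular width seen by the light beam
offset_deg = 90.0      # hypothetical angle from the reference pulse to the hole

period = 60.0 / rpm                          # one full turn, in seconds
t_start = (offset_deg / 360.0) * period      # delay after the reference pulse
t_open = (hole_angle_deg / 360.0) * period   # time the witness glass stays in the beam

print(f"open gate {t_start*1e3:.1f} ms after the pulse, "
      f"for {t_open*1e3:.1f} ms, every {period*1e3:.0f} ms")
```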

Relevance: 20.00%

Abstract:

Motivation: Understanding the patterns of association between polymorphisms at different loci in a population (linkage disequilibrium, LD) is of fundamental importance in various genetic studies. Many coefficients have been proposed for measuring the degree of LD, but they provide only a static view of the current LD structure. Generative models (GMs) were proposed to go beyond these measures, giving not only a description of the actual LD structure but also a tool to help understand the process that generated that structure. GMs based on coalescent theory have been the most appealing because they link LD to evolutionary factors. Nevertheless, inference and parameter estimation for such models are still computationally challenging. Results: We present a more practical method to build GMs that describe LD. The method is based on learning weighted Bayesian network structures from haplotype data, extracting equivalence structure classes and using them to model LD. The results obtained on public data from the HapMap database show that the method is a promising tool for modeling LD. The associations represented by the learned models are correlated with the traditional LD measure D'. The method was able to represent LD blocks found by standard tools. The granularity of the association blocks and the readability of the models can be controlled in the method. The results suggest that the causality information gained by our method can be useful for assessing the conservation of genetic markers and for guiding the selection of a subset of representative markers.
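
For reference, a small sketch of the classical coefficient D' that the learned models are compared against; it is computed here from phased haplotype counts at two biallelic loci and is not the Bayesian-network procedure itself (counts in the example are hypothetical):

```python
# Classical LD coefficient D' for two biallelic loci from haplotype counts.

def d_prime(n_AB, n_Ab, n_aB, n_ab):
    n = n_AB + n_Ab + n_aB + n_ab
    p_AB = n_AB / n
    p_A = (n_AB + n_Ab) / n
    p_B = (n_AB + n_aB) / n
    D = p_AB - p_A * p_B                       # raw disequilibrium
    if D >= 0:
        d_max = min(p_A * (1 - p_B), (1 - p_A) * p_B)
    else:
        d_max = min(p_A * p_B, (1 - p_A) * (1 - p_B))
    return D / d_max if d_max > 0 else 0.0     # normalised to [-1, 1]

# Hypothetical haplotype counts AB, Ab, aB, ab
print(d_prime(400, 50, 60, 490))
```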

Relevance: 20.00%

Abstract:

This research presents a method for frequency estimation in power systems using an adaptive filter based on the least mean square (LMS) algorithm. In order to analyze a power system, the three-phase voltages were converted into a complex signal by applying the alpha-beta transform, and the result was used in an adaptive filtering algorithm. Although the use of the complex LMS algorithm is described in the literature, this paper deals with some practical aspects of its implementation. In order to reduce computing time, a coefficient generator was implemented. For the algorithm validation, a computer simulation of a power system was carried out using the ATP software. Many different situations were simulated for the performance analysis of the proposed methodology. The results were compared with a commercial relay for validation, showing the advantages of the new method. (C) 2009 Elsevier Ltd. All rights reserved.
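
A hedged sketch of the general idea (a one-tap complex LMS predictor on the alpha-beta signal); the sampling rate, step size and normalised update below are illustrative choices, not the parameters or the coefficient generator of the paper:

```python
import numpy as np

fs, f0 = 1920.0, 60.2            # sampling rate and (unknown) system frequency
t = np.arange(0, 0.5, 1 / fs)
va = np.cos(2 * np.pi * f0 * t)
vb = np.cos(2 * np.pi * f0 * t - 2 * np.pi / 3)
vc = np.cos(2 * np.pi * f0 * t + 2 * np.pi / 3)

# alpha-beta transform -> complex rotating phasor v = v_alpha + j*v_beta
v_alpha = (2 / 3) * (va - 0.5 * vb - 0.5 * vc)
v_beta = (2 / 3) * (np.sqrt(3) / 2) * (vb - vc)
v = v_alpha + 1j * v_beta

w, mu = 1.0 + 0j, 0.5            # one-tap predictor weight and NLMS step size
for n in range(1, len(v)):
    e = v[n] - w * v[n - 1]                      # prediction error
    w += mu * e * np.conj(v[n - 1]) / (abs(v[n - 1]) ** 2 + 1e-12)

f_est = np.angle(w) * fs / (2 * np.pi)           # w converges to exp(j*2*pi*f/fs)
print(f"estimated frequency: {f_est:.3f} Hz")
```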

Relevance: 20.00%

Abstract:

This paper presents a novel graphical approach to adjust and evaluate frequency-based relays employed in anti-islanding protection schemes of distributed synchronous generators, in order to meet the anti-islanding and abnormal frequency variation requirements simultaneously. The proposed method defines a region in the power mismatch space inside which the relay non-detection zone should be located if the above-mentioned requirements are to be met. This region is called the power imbalance application region. Results show that the method can help protection engineers adjust frequency-based relays to improve anti-islanding capability and to minimize false operation occurrences, while keeping the utility's abnormal frequency variation requirements satisfied. Moreover, the proposed method can be employed to coordinate different types of frequency-based relays, aiming at improving the overall performance of the distributed generator frequency protection scheme. (C) 2011 Elsevier B.V. All rights reserved.
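
As a simplified illustration of why a non-detection zone exists (constant power imbalance, no load damping assumed), the swing equation links an active-power mismatch to the frequency excursion the relay sees after islanding; this is only background reasoning, not the paper's graphical power imbalance application region, and the machine data are hypothetical:

```python
def min_detectable_dp(h, s_mva, df_set, t_detect, f0=60.0):
    """Smallest |DeltaP| (MW) whose frequency drift reaches the relay pickup
    df_set (Hz) within t_detect (s), from df/dt = f0*DeltaP/(2*H*S)."""
    return 2.0 * h * s_mva * df_set / (f0 * t_detect)

# Hypothetical 30 MVA synchronous DG, H = 1.5 s, 0.5 Hz pickup, 0.5 s detection window
print(f"{min_detectable_dp(h=1.5, s_mva=30.0, df_set=0.5, t_detect=0.5):.2f} MW")
```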

Relevance: 20.00%

Abstract:

The design of supplementary damping controllers to mitigate the effects of electromechanical oscillations in power systems is a highly complex and time-consuming process, which requires a significant amount of knowledge on the part of the designer. In this study, the authors propose an automatic technique that takes the burden of tuning the controller parameters away from the power engineer and places it on the computer. Unlike other approaches that do the same based on robust control theories or evolutionary computing techniques, our proposed procedure uses an optimisation algorithm that works over a formulation of the classical tuning problem in terms of bilinear matrix inequalities. Using this formulation, it is possible to apply linear matrix inequality solvers to find a solution to the tuning problem via an iterative process, with the advantage that these solvers are widely available and have well-known convergence properties. The proposed algorithm is applied to tune the parameters of supplementary controllers for thyristor-controlled series capacitors placed in the New England/New York benchmark test system, aiming at the improvement of the damping factor of inter-area modes under several different operating conditions. The results of the linear analysis are validated by non-linear simulation and demonstrate the effectiveness of the proposed procedure.
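
A small sketch of the quantity the tuning procedure tries to improve: the damping factor of an electromechanical mode computed from a closed-loop eigenvalue lambda = sigma + j*omega (the eigenvalue below is hypothetical, chosen near a typical 0.55 Hz inter-area frequency):

```python
import numpy as np

def damping_factor(eig):
    """zeta = -Re(lambda) / |lambda| for a complex eigenvalue."""
    return -eig.real / abs(eig)

inter_area_mode = complex(-0.15, 2 * np.pi * 0.55)   # hypothetical inter-area mode
print(f"zeta = {damping_factor(inter_area_mode):.3f}")
```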

Relevance: 20.00%

Abstract:

The nature of the molecular structure of plastics makes the properties of such materials markedly temperature dependent. In addition, the continuous increase in the use of polymeric materials in many specific applications has demanded knowledge of their physical properties, both during their processing as raw material and over the working temperature range of the final polymer product. Thermal conductivity, thermal diffusivity and specific heat, namely the thermal properties, are the three most important physical properties of a material needed for heat transfer calculations. Recently, among several different methods for the determination of thermal diffusivity and thermal conductivity, transient techniques have become the preferred way of measuring the thermal properties of materials. In this work, a very simple and low-cost variation of the well-known Angstrom method is employed in the experimental determination of the thermal diffusivity of some selected polymers. Cylindrical samples, 3 cm in diameter and 7 cm high, were prepared by cutting long cylindrical commercial bars. The reproducibility is very good, and the results obtained were checked against results obtained by the hot wire and laser flash techniques and, when possible, were also compared with data found in the literature. Thermal conductivity may then be derived from the thermal diffusivity with knowledge of the bulk density and the specific heat, easily obtained by differential scanning calorimetry. (C) 2009 Elsevier Ltd. All rights reserved.
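
A hedged sketch of the classical Angstrom relation behind the measurement: with a periodic heat wave, the diffusivity follows from the amplitude attenuation and the phase lag measured between two points a distance dx apart along the sample (the numbers in the call are invented for illustration, not the paper's data):

```python
import numpy as np

def angstrom_diffusivity(dx, period, amp_ratio, phase_lag):
    """alpha = omega*dx^2 / (2 * phase_lag * ln(amp_ratio)), phase_lag in rad."""
    omega = 2 * np.pi / period
    return omega * dx**2 / (2.0 * phase_lag * np.log(amp_ratio))

# e.g. measurement points 2 cm apart, 600 s heating period, amplitude ratio A1/A2
alpha = angstrom_diffusivity(dx=0.02, period=600.0, amp_ratio=40.0, phase_lag=3.7)
print(f"alpha = {alpha:.2e} m^2/s")
```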

Relevance: 20.00%

Abstract:

This article presents an extensive investigation carried out in two technology-based companies of the São Carlos technological pole in Brazil. Based on this multiple case study and a literature review, a method applying agile project management (APM) principles, hereafter called IVPM2, was developed. After the method's implementation, a qualitative evaluation was carried out through document analysis and the application of a questionnaire. This article shows that the application of this method at the companies under investigation evidenced the benefits of using simple, iterative, visual, and agile techniques to plan and control innovative product projects, combined with traditional project management best practices such as standardization.

Relevance: 20.00%

Abstract:

In this paper, the Galerkin method and the Askey-Wiener scheme are used to obtain approximate solutions for the stochastic displacement response of Kirchhoff plates with uncertain parameters. Theoretical and numerical results are presented. The Lax-Milgram lemma is used to express the conditions for existence and uniqueness of the solution. Uncertainties in plate and foundation stiffness are modeled respecting these conditions, hence using Legendre polynomials indexed in uniform random variables. The space of approximate solutions is built using density results between the space of continuous functions and Sobolev spaces. Approximate Galerkin solutions are compared with results of Monte Carlo simulation in terms of first and second order moments and histograms of the displacement response. Numerical results for two example problems show very fast convergence to the exact solution, at excellent accuracy. The Askey-Wiener Galerkin scheme developed herein is able to reproduce the histogram of the displacement response, and is shown to be a theoretically sound and efficient method for the solution of stochastic problems in engineering. (C) 2009 Elsevier Ltd. All rights reserved.
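
A minimal sketch of the Askey-Wiener idea for a uniform random variable: the response of a toy problem (a spring with uncertain stiffness, not the plate of the paper) is expanded in Legendre polynomials by projection, and its first two moments are compared with Monte Carlo simulation:

```python
import numpy as np
from numpy.polynomial import legendre as leg

F, k0, delta, order = 1.0, 100.0, 0.3, 8
u = lambda xi: F / (k0 * (1.0 + delta * xi))      # exact response, xi ~ U(-1, 1)

# Projection coefficients c_n = (2n+1)/2 * int_{-1}^{1} u(xi) P_n(xi) dxi
x, wq = leg.leggauss(32)
c = np.array([(2 * n + 1) / 2.0
              * np.sum(wq * u(x) * leg.legval(x, np.eye(order + 1)[n]))
              for n in range(order + 1)])

mean_pc = c[0]
var_pc = np.sum(c[1:] ** 2 / (2 * np.arange(1, order + 1) + 1))

xi_mc = np.random.uniform(-1.0, 1.0, 200_000)     # plain Monte Carlo reference
print(mean_pc, var_pc)
print(u(xi_mc).mean(), u(xi_mc).var())
```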

Relevance: 20.00%

Abstract:

This paper proposes a physically non-linear formulation to deal with steel fiber reinforced concrete by the finite element method. The proposed formulation allows the consideration of short or long fibers placed arbitrarily inside a continuum domain (matrix). The most important feature of the formulation is that no additional degree of freedom is introduced into the pre-existing finite element numerical system to account for any distribution or quantity of fiber inclusions; in other words, the size of the system of equations used to solve a non-reinforced medium is the same as the one used to solve the reinforced counterpart. Another important characteristic of the formulation is the reduced effort required from the user to introduce reinforcements, avoiding "rebar" elements, node-by-node geometric definitions, or complex mesh generation. Bonded connection between long fibers and the continuum is considered; for short fibers, a simplified approach is proposed to account for splitting. Non-associative plasticity is adopted for the continuum, and one-dimensional plasticity is adopted to model the fibers. Examples are presented in order to show the capabilities of the formulation.
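
A conceptual sketch (a strong simplification, not the paper's full formulation) of the key feature: a fiber segment crossing a linear triangular element contributes stiffness through the element shape functions only, so no extra degree of freedom is added; element geometry and fiber data below are hypothetical:

```python
import numpy as np

nodes = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 3.0]])   # CST element vertices
p1, p2 = np.array([0.5, 0.5]), np.array([2.0, 1.0])      # fiber endpoints (inside)
Ef, Af = 210e9, 5e-5                                     # hypothetical steel fiber data

def shape(p):
    """Linear shape functions N1..N3 of the triangle evaluated at point p."""
    M = np.hstack([np.ones((3, 1)), nodes])
    return np.array([1.0, p[0], p[1]]) @ np.linalg.inv(M)

L = np.linalg.norm(p2 - p1)
c, s = (p2 - p1) / L                                     # fiber axial direction

# Map the 6 element DOFs (ux1, uy1, ..., uy3) to the 2 fiber end axial displacements
H = np.zeros((2, 6))
for end, p in enumerate((p1, p2)):
    N = shape(p)
    H[end, 0::2] = c * N
    H[end, 1::2] = s * N

k_fiber = (Ef * Af / L) * np.array([[1.0, -1.0], [-1.0, 1.0]])
K_extra = H.T @ k_fiber @ H      # 6x6 contribution added to the existing element matrix
print(K_extra.shape, np.allclose(K_extra, K_extra.T))
```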

Relevance: 20.00%

Abstract:

Inverse analysis is currently an important subject of study in several fields of science and engineering. The identification of physical and geometric parameters from experimental measurements is required in many applications. In this work, a boundary element formulation to identify boundary and interface values, as well as material properties, is proposed. In particular, the proposed formulation is dedicated to identifying material parameters when a cohesive crack model is assumed for 2D problems. A computer code is developed and implemented using the BEM multi-region technique and regularisation methods to perform the inverse analysis. Several examples are shown to demonstrate the efficiency of the proposed model. (C) 2010 Elsevier Ltd. All rights reserved.
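
A generic sketch of the regularisation step in such an identification: the paper couples it with a multi-region BEM model, whereas here the "model" is just a linear map G with synthetic noisy data, so the Tikhonov-regularised least-squares solve can be seen in isolation:

```python
import numpy as np

rng = np.random.default_rng(0)
G = rng.normal(size=(40, 3))                    # hypothetical sensitivity matrix
p_true = np.array([2.0, -1.0, 0.5])             # parameters to be identified
d = G @ p_true + 0.05 * rng.normal(size=40)     # noisy synthetic "measurements"

lam = 1e-2                                      # Tikhonov regularisation weight
# minimise ||G p - d||^2 + lam*||p||^2  ->  (G^T G + lam I) p = G^T d
p_est = np.linalg.solve(G.T @ G + lam * np.eye(3), G.T @ d)
print(p_est)
```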

Relevance: 20.00%

Abstract:

In this work, a new boundary element formulation for the analysis of plate-beam interaction is presented. This formulation uses boundary elements with three nodal values, and each beam element is replaced by its actions on the plate, i.e., a distributed load and end-of-element forces. From the solution of the differential equation of a beam with a linearly distributed load, the plate-beam interaction tractions can be written as functions of the nodal values of the beam. With this transformation, a final system of equations in the nodal displacement values of the plate boundary and beam nodes is obtained, and from it all unknowns of the plate-beam system are determined. Many examples are analyzed, and the results show excellent agreement with those from the analytical solution and other numerical methods. (C) 2009 Elsevier Ltd. All rights reserved.
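
An illustrative sketch of the ingredient mentioned above: the Euler-Bernoulli beam equation under a linearly distributed load solved symbolically, with end forces recovered from the deflection. The clamped-clamped boundary conditions and the sign convention are assumptions made only for this example, not the paper's coupling conditions:

```python
import sympy as sp

x = sp.symbols('x')
L, EI, q1, q2 = sp.symbols('L EI q1 q2', positive=True)
w = sp.Function('w')

q = q1 + (q2 - q1) * x / L                    # linearly varying distributed load
ode = sp.Eq(EI * w(x).diff(x, 4), q)
bcs = {w(0): 0, w(x).diff(x).subs(x, 0): 0,   # clamped at x = 0
       w(L): 0, w(x).diff(x).subs(x, L): 0}   # clamped at x = L
wx = sp.dsolve(ode, w(x), ics=bcs).rhs

M = sp.simplify(EI * wx.diff(x, 2))           # bending moment (up to sign convention)
V = sp.simplify(EI * wx.diff(x, 3))           # shear force (up to sign convention)
print(M.subs(x, 0), V.subs(x, 0))             # end-of-element values at x = 0
```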

Relevance: 20.00%

Abstract:

The Generalized Finite Element Method (GFEM) is employed in this paper for the numerical analysis of three-dimensional solids under nonlinear behavior. A brief summary of the GFEM, as well as a description of the formulation of the hexahedral element based on the proposed enrichment strategy, is initially presented. Next, in order to introduce the nonlinear analysis of solids, two constitutive models are briefly reviewed: Lemaitre's model, in which damage and plasticity are coupled, and Mazars's damage model, suitable for concrete under increasing loading. Both models are employed in the framework of a nonlocal approach to ensure solution objectivity. In the numerical analyses carried out, a selective enrichment of the approximation at regions of concern in the domain (mainly those with high strain and damage gradients) is exploited. This possibility makes the three-dimensional analysis less expensive and more practicable, since re-meshing resources, characteristic of h-adaptivity, can be minimized. Moreover, the combination of three-dimensional analysis and selective enrichment provides a valuable tool for a better description of both damage and plastic strain spreading.
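
A hedged sketch of the tensile branch of Mazars's damage law, the kind of scalar damage evolution driven by an equivalent strain that is coupled with the nonlocal analysis; the material constants below are typical textbook values, not the ones used in the paper:

```python
import math

def mazars_tension_damage(eps_eq, eps_d0=1.0e-4, A_t=0.8, B_t=1.0e4):
    """Damage D in [0, 1) as a function of the equivalent strain eps_eq."""
    if eps_eq <= eps_d0:
        return 0.0       # below the damage threshold, the material stays intact
    return 1.0 - eps_d0 * (1.0 - A_t) / eps_eq - A_t * math.exp(-B_t * (eps_eq - eps_d0))

for eps in (5e-5, 2e-4, 1e-3, 5e-3):
    print(eps, round(mazars_tension_damage(eps), 3))
```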

Relevance: 20.00%

Abstract:

In this paper, a formulation for the representation of stiffeners in plane stress by the boundary element method (BEM) in linear analysis is presented. The strategy is to adopt approximations for the displacements along the central line of the stiffener. With this simplification, spurious oscillations in the stress along thin stiffeners are prevented. Worked examples are analyzed to show the efficiency of these techniques, especially in the insertion of very narrow sub-regions, in which quasi-singular integrals are calculated, with stiffeners that are much stiffer than the main domain. The results obtained with this formulation are very close to those obtained with other formulations. (C) 2007 Elsevier Ltd. All rights reserved.
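
A quadrature sketch of the quasi-singular difficulty mentioned above: when the source point lies very close to a narrow sub-region, a plain Gauss rule over the whole element loses accuracy, while sub-dividing the element recovers it. The kernel 1/r used here is a stand-in, not the actual BEM kernel of the paper:

```python
import numpy as np

d = 1e-3                                  # distance from source point to the element
f = lambda x: 1.0 / np.hypot(x, d)        # quasi-singular kernel along the element [0, 1]

xg, wg = np.polynomial.legendre.leggauss(8)

def gauss(a, b):
    """8-point Gauss-Legendre rule mapped to the interval [a, b]."""
    return 0.5 * (b - a) * np.sum(wg * f(0.5 * (b - a) * xg + 0.5 * (a + b)))

plain = gauss(0.0, 1.0)                                    # one rule over the element
subdivided = sum(gauss(a, a + 0.05) for a in np.arange(0.0, 1.0, 0.05))  # 20 sub-elements
exact = np.arcsinh(1.0 / d)                                # analytical value
print(plain, subdivided, exact)
```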

Relevance: 20.00%

Abstract:

This paper presents a domain boundary element formulation for inelastic saturated porous media with rate-independent behavior for the solid skeleton. The formulation is then applied to elastic-plastic behavior of the solid. Biot's consolidation theory, extended to include irreversible phenomena, is considered, and the direct boundary element technique is used for the numerical solution after time discretization by the implicit backward Euler algorithm. The associated nonlinear algebraic problem is solved by the Newton-Raphson procedure, whereby the loading/unloading conditions are fully taken into account and the consistent tangent operator is defined. Only domain nodes (nodes defined inside the domain) are used to represent all domain values, and the corresponding integrals are computed using an accurate sub-elementation scheme. The developments are illustrated through the Drucker-Prager elastic-plastic model for the solid skeleton, and various examples are analyzed with the proposed algorithms. (c) 2008 Elsevier B.V. All rights reserved.
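
A small sketch of the Drucker-Prager yield check that decides between an elastic and a plastic step; the friction-like and cohesion-like constants alpha and k are placeholders, and the paper's calibration and return mapping are not reproduced:

```python
import numpy as np

def drucker_prager(stress, alpha=0.2, k=15.0e6):
    """Yield function f = sqrt(J2) + alpha*I1 - k for a 3x3 stress tensor [Pa]."""
    I1 = np.trace(stress)
    s = stress - I1 / 3.0 * np.eye(3)            # deviatoric part
    J2 = 0.5 * np.tensordot(s, s)                # second deviatoric invariant
    return np.sqrt(J2) + alpha * I1 - k

trial = np.diag([-30e6, -10e6, -5e6])            # hypothetical trial stress state
f = drucker_prager(trial)
print("plastic step" if f > 0 else "elastic step", f)
```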