967 results for "Editor of flow analysis methods"


Relevance: 100.00%

Publisher:

Abstract:

The article proposes a model for managing information about program flow analysis for conducting computer experiments with program transformations. It considers the architecture and context of the flow analysis subsystem within the framework of the Specialized Knowledge Bank on Program Transformations and describes the language used to present flow analysis methods in the knowledge bank.
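Although the knowledge bank's own representation language is not reproduced in the abstract, a record for a flow analysis method might be sketched along the following lines (the `FlowAnalysisMethod` structure and all of its field names are hypothetical illustrations, not the bank's actual format):

```python
from dataclasses import dataclass, field

@dataclass
class FlowAnalysisMethod:
    """One record in a (hypothetical) knowledge bank of flow analysis methods."""
    name: str                # e.g. "reaching definitions"
    direction: str           # "forward" or "backward" dataflow
    lattice: str             # description of the abstract domain
    transfer_function: str   # textual form of the per-block transfer function
    references: list = field(default_factory=list)

# A classic forward dataflow analysis, encoded as a record:
reaching_defs = FlowAnalysisMethod(
    name="reaching definitions",
    direction="forward",
    lattice="powerset of definition sites",
    transfer_function="out[B] = gen[B] ∪ (in[B] − kill[B])",
)
```

Storing the transfer function as text (or as a small DSL) is what lets an experiment-management system present, compare and instantiate analyses without hard-coding each one.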

Relevance: 100.00%

Publisher:

Abstract:

Because the biomechanical behavior of dental implants differs from that of natural teeth, clinical problems may occur. The mechanism of stress distribution and load transfer at the implant/bone interface is a critical issue affecting the success rate of implants. Therefore, the aim of this study was to conduct a brief literature review of the available stress analysis methods for studying implant-supported prosthesis loading and to discuss their contributions to the biomechanical evaluation of oral rehabilitation with implants. Several studies have used experimental, analytical, and computational models, by means of finite element models (FEM), photoelasticity, strain gauges and combinations of these methods, to evaluate the biomechanical behavior of dental implants. FEM has been used to evaluate new components, configurations, materials, and shapes of implants. The greatest advantage of the photoelastic method is the ability to visualize the stresses in complex structures, such as oral structures, and to observe the stress patterns in the whole model, allowing the researcher to localize and quantify the stress magnitude. Strain gauges can be used to assess in vivo and in vitro stress in prostheses, implants, and teeth. Some authors combine the strain gauge technique with photoelasticity or FEM. These methodologies can be widely applied in dentistry, mainly in research. They can thus guide further research and clinical studies by predicting some disadvantages and streamlining clinical time.
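As a minimal illustration of the strain-gauge approach, axial stress can be recovered from a measured strain via Hooke's law in the linear elastic range; the modulus and strain values below are assumed for illustration, not taken from any of the reviewed studies:

```python
# Hooke's law: sigma = E * epsilon (valid in the linear elastic range).
# Young's modulus for a Ti-6Al-4V titanium alloy, common in implants,
# is roughly 110 GPa -- an assumed textbook value.
E_TITANIUM_PA = 110e9      # Young's modulus in pascals
measured_strain = 500e-6   # 500 microstrain, an assumed gauge reading

stress_pa = E_TITANIUM_PA * measured_strain
print(f"stress = {stress_pa / 1e6:.1f} MPa")  # 110e9 * 500e-6 = 55.0 MPa
```

In practice a gauge bonded to a prosthesis or abutment reports strain in microstrain, and this conversion is how the studies above quantify load transfer.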

Relevance: 100.00%

Publisher:

Abstract:

Background: Several meta-analysis methods can be used to quantitatively combine the results of a group of experiments, including the weighted mean difference (WMD), statistical vote counting (SVC), the parametric response ratio (RR) and the non-parametric response ratio (NPRR). The software engineering community has focused on the weighted mean difference method. However, other meta-analysis methods have distinct strengths, such as being usable when variances are not reported. There are as yet no guidelines indicating which method is best in each case. Aim: Compile a set of rules that SE researchers can use to ascertain which aggregation method is best for the synthesis phase of a systematic review. Method: Monte Carlo simulation varying the number of experiments in the meta-analyses, the number of subjects they include, their variance and the effect size. We empirically calculated the reliability and statistical power in each case. Results: WMD is generally reliable if the variance is low, whereas its power depends on the effect size and the number of subjects per meta-analysis; the reliability of RR is generally unaffected by changes in variance, but it requires more subjects than WMD to be powerful; NPRR is the most reliable method, but it is not very powerful; SVC behaves well when the effect size is moderate, but is less reliable at other effect sizes. Detailed tables of results are annexed. Conclusions: Before undertaking statistical aggregation in software engineering, it is worthwhile checking whether there is any appreciable difference in the reliability and power of the candidate methods. If there is, software engineers should select the method that optimizes both parameters.
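As a sketch of the first of these methods, the fixed-effect weighted mean difference pools per-experiment mean differences by inverse-variance weighting. The experiments and numbers below are invented for illustration; this is not the authors' simulation code:

```python
import math

def weighted_mean_difference(diffs, variances):
    """Fixed-effect inverse-variance pooling of per-experiment mean differences.

    Each experiment contributes weight 1/variance; the pooled estimate's
    standard error is sqrt(1 / sum of weights).
    """
    weights = [1.0 / v for v in variances]
    pooled = sum(w * d for w, d in zip(weights, diffs)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se

# Three hypothetical experiments: (mean difference, variance of that difference)
diffs = [0.4, 0.6, 0.5]
variances = [0.04, 0.09, 0.05]

est, se = weighted_mean_difference(diffs, variances)
z = est / se   # |z| > 1.96 -> significant at the 5% level (two-sided)
```

Note how the lowest-variance experiments dominate the pooled estimate; this dependence on reported variances is exactly why alternatives such as the response ratio matter when variances are missing.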

Relevance: 100.00%

Publisher:

Abstract:

This article first explains why a knowledge of statistics is necessary and describes the role that statistics plays in an experimental investigation. Second, it introduces the normal distribution, which describes the natural variability shown by many measurements in optometry and the vision sciences. Third, it describes the application of the normal distribution to some common statistical problems, including how to determine whether an individual observation is a typical member of a population and how to determine the confidence interval for a sample mean.
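Both normal-distribution tasks mentioned above can be illustrated with Python's standard library; the population parameters and sample values below are made up for the example:

```python
import math
from statistics import NormalDist

std = NormalDist()   # the standard normal distribution, mean 0, SD 1

# Task 1: is an individual observation typical of a population?
mu, sigma = 16.0, 2.0                      # assumed population mean and SD
x = 21.0                                   # one individual's measurement
z = (x - mu) / sigma                       # standard score: 2.5
p_two_sided = 2 * (1 - std.cdf(abs(z)))    # ~0.012 -> atypical at the 5% level

# Task 2: 95% confidence interval for a sample mean
sample_mean, sample_sd, n = 16.4, 2.1, 50
sem = sample_sd / math.sqrt(n)             # standard error of the mean
z_crit = std.inv_cdf(0.975)                # ~1.96 for a 95% interval
ci = (sample_mean - z_crit * sem, sample_mean + z_crit * sem)
```

With a large sample the normal critical value is a reasonable approximation; for small samples the article's logic would use Student's t in place of `z_crit`.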

Relevance: 100.00%

Publisher:

Abstract:

The increased demands placed on solution propulsion by programmed flow systems, such as sequential injection analysis, lab-on-valve technology, bead injection and multi-commutation, have highlighted the inability of many conventional pumps to generate a smooth, consistent flow. A number of researchers have examined ways to overcome the inadvertent, uncontrolled pulsation caused by the mechanical action of peristaltic pumps. In contrast, we have developed instruments that exploit the characteristics of a reproducible pulsed flow of solution. In this paper, we discuss our instrumental approaches and some applications that have benefited from the use of a reproducible pulsed flow rather than the traditional linear flow approach. To place our approach in the context of the continuously developing field of flow analysis, an overview of other programmed flow systems is also presented.

Relevance: 100.00%

Publisher:

Abstract:

The reduction of fluvastatin (FLV) at a hanging mercury-drop electrode (HMDE) was studied by square-wave adsorptive-stripping voltammetry (SWAdSV). FLV can be accumulated and reduced at the electrode, with a maximum peak current intensity at a potential of approximately −1.26 V vs. AgCl/Ag, in an aqueous electrolyte solution of pH 5.25. The method shows linearity between peak current intensity and FLV concentration from 1.0 × 10⁻⁸ to 2.7 × 10⁻⁶ mol L⁻¹. The limits of detection (LOD) and quantification (LOQ) were 9.9 × 10⁻⁹ mol L⁻¹ and 3.3 × 10⁻⁸ mol L⁻¹, respectively. Furthermore, FLV oxidation at a glassy carbon electrode surface was used for its hydrodynamic monitoring by amperometric detection in a flow-injection system. The amperometric signal was linear with FLV concentration over the range 1.0 × 10⁻⁶ to 1.0 × 10⁻⁵ mol L⁻¹, with an LOD of 2.4 × 10⁻⁷ mol L⁻¹ and an LOQ of 8.0 × 10⁻⁷ mol L⁻¹. A sample rate of 50 injections per hour was achieved. Both methods were validated, shown to be precise and accurate, and satisfactorily applied to the determination of FLV in a commercial pharmaceutical.
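LOD and LOQ figures of this kind are conventionally estimated from the calibration line as 3.3·σ/slope and 10·σ/slope respectively (the ICH convention). The sketch below uses an invented calibration data set and an assumed blank standard deviation, not the paper's measurements:

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept of a calibration line."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical calibration: concentration (mol/L) vs. peak current (arb. units)
concs   = [1e-8, 5e-8, 1e-7, 5e-7, 1e-6]
signals = [0.11, 0.52, 1.01, 5.05, 9.98]   # invented, roughly linear

slope, intercept = fit_line(concs, signals)

sigma_blank = 0.03                  # assumed SD of blank responses
lod = 3.3 * sigma_blank / slope     # limit of detection
loq = 10  * sigma_blank / slope     # limit of quantification
```

With these invented numbers the slope is about 1 × 10⁷ units per mol L⁻¹, giving an LOD on the order of 10⁻⁸ mol L⁻¹, i.e. the same order of magnitude as the voltammetric method above.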

Relevance: 100.00%

Publisher:

Abstract:

Semi-automated flow injection instrumentation, incorporating a small anion exchange column coupled with tris(2,2′-bipyridyl)ruthenium(II) (Ru(bipy)₃²⁺) chemiluminescence detection, was configured and utilised to develop a rapid methodology for the determination of sodium oxalate in Bayer liquors. The elimination of both negative and positive interferences, from aluminium(III) and from as yet unknown concomitant organic species respectively, is discussed. The robustness of the methodology was considerably enhanced by using the temporally stable form of the chemiluminescence reagent, tris(2,2′-bipyridyl)ruthenium(III) perchlorate in dry acetonitrile. Real Bayer process samples were analysed and the results compared well with those obtained using standard methods in industrial laboratories.

Relevance: 100.00%

Publisher:

Abstract:

Event-related potential (ERP) analysis is one of the most widely used methods in cognitive neuroscience for studying the physiological correlates of the sensory, perceptual and cognitive activity associated with information processing. To this end, information flow, or dynamic effective connectivity, analysis is a vital technique for understanding higher cognitive processing under different events. In this paper we present a Granger causality (GC)-based connectivity estimation applied to ERP data analysis. In contrast to the generally used strictly causal multivariate autoregressive model, we use an extended multivariate autoregressive (eMVAR) model, which also accounts for any instantaneous interaction among the variables under consideration. The experimental data used in the paper are based on a single-subject data set of erroneous button-press responses from a two-back-with-feedback continuous performance task (CPT). To demonstrate the feasibility of applying eMVAR models in source-space connectivity studies, we use cortical source time series estimated for this data set using blind source separation, or independent component analysis (ICA).
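For intuition, the core of Granger causality can be sketched in its simplest strictly causal bivariate form: x Granger-causes y if x's past improves the prediction of y beyond what y's own past provides. This is a deliberately simplified illustration on synthetic data, not the eMVAR model used in the paper:

```python
import numpy as np

def granger_causality(x, y, p=2):
    """Log ratio of residual variances: y predicted from its own past
    (restricted model) vs its own past plus x's past (full model).
    A clearly positive value suggests x Granger-causes y."""
    n = len(y)
    Y = y[p:]
    own  = np.column_stack([y[p - k:n - k] for k in range(1, p + 1)])
    both = np.column_stack([own] + [x[p - k:n - k] for k in range(1, p + 1)])
    r_res  = Y - own  @ np.linalg.lstsq(own,  Y, rcond=None)[0]
    r_full = Y - both @ np.linalg.lstsq(both, Y, rcond=None)[0]
    return np.log(np.var(r_res) / np.var(r_full))

# Synthetic system where y is driven by the past of x, but not vice versa
rng = np.random.default_rng(0)
x = rng.standard_normal(500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.8 * x[t - 1] + 0.1 * rng.standard_normal()

gc_xy = granger_causality(x, y)   # large: x's past predicts y
gc_yx = granger_causality(y, x)   # near zero: y's past does not predict x
```

The eMVAR extension additionally models zero-lag (instantaneous) terms, which matters for ERP data where volume conduction mixes sources within a single sample.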

Relevance: 100.00%

Publisher:

Abstract:

Three-dimensional flow visualization plays an essential role in many areas of science and engineering, such as the aero- and hydrodynamic systems that underlie many physical and natural phenomena. For popular methods such as streamline visualization to be effective, they should capture the underlying flow features while helping the user observe and understand the flow field clearly. My research focuses on the analysis and visualization of flow fields using techniques such as information-theoretic measures and graph-based representations. Since streamline visualization is a popular technique in flow field visualization, how to select good streamlines that capture flow patterns and how to pick good viewpoints from which to observe the flow field become critical questions. We treat streamline selection and viewpoint selection as symmetric problems and solve them simultaneously using a dual information channel [81]. To the best of my knowledge, this is the first attempt in flow visualization to combine these two selection problems in a unified approach. This work selects streamlines in a view-independent manner, so the selected streamlines do not change with the viewpoint. Another work of mine [56] uses an information-theoretic approach to evaluate the importance of each streamline under various sample viewpoints and presents a solution for view-dependent streamline selection that guarantees coherent streamline updates as the view changes gradually. When 3D streamlines are projected to 2D images for viewing, occlusion and clutter become inevitable. To address this challenge, we designed FlowGraph [57, 58], a novel compound graph representation that organizes field line clusters and spatiotemporal regions hierarchically for occlusion-free and controllable visual exploration. It enables observation and exploration of the relationships among field line clusters, spatiotemporal regions and their interconnections in the transformed space.
Most viewpoint selection methods consider only external viewpoints outside the flow field, which does not convey a clear observation when the flow field is cluttered near the boundary. We therefore propose a new way to explore flow fields: selecting several internal viewpoints around the flow features inside the field and then generating a B-spline curve path traversing these viewpoints, providing users with close-up views for detailed observation of hidden or occluded internal flow features [54]. This work was also extended to handle unsteady flow fields. Beyond flow field visualization, some other visualization topics also attract my attention. In iGraph [31], we leverage a distributed system along with a tiled display wall to provide users with high-resolution visual analytics of big image and text collections in real time. Developing pedagogical visualization tools is my other research focus. Since most cryptography algorithms rely on sophisticated mathematics, it is difficult for beginners to understand both what an algorithm does and how it does it. We therefore developed a set of visualization tools that give users an intuitive way to learn and understand these algorithms.
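The camera-path idea in [54] can be sketched as sampling a smooth B-spline through chosen internal viewpoints. The uniform cubic B-spline evaluation below and the viewpoint coordinates are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def cubic_bspline_path(viewpoints, samples_per_segment=20):
    """Sample a uniform cubic B-spline guided by successive viewpoints.
    The curve approximates (rather than interpolates) the interior control
    points, which is acceptable for a smooth camera path; the end points are
    repeated so the path starts and ends at the first and last viewpoint."""
    P = np.asarray(viewpoints, dtype=float)
    P = np.vstack([P[0], P[0], P, P[-1], P[-1]])
    path = []
    for i in range(len(P) - 3):
        for t in np.linspace(0.0, 1.0, samples_per_segment, endpoint=False):
            # Uniform cubic B-spline basis functions (they sum to 1)
            b = np.array([(1 - t) ** 3,
                          3 * t ** 3 - 6 * t ** 2 + 4,
                          -3 * t ** 3 + 3 * t ** 2 + 3 * t + 1,
                          t ** 3]) / 6.0
            path.append(b @ P[i:i + 4])
    return np.array(path)

# Hypothetical internal viewpoints (x, y, z) inside a flow field
vps = [(0, 0, 0), (1, 2, 0), (2, 1, 1), (3, 3, 2)]
path = cubic_bspline_path(vps)
```

A camera animated along `path` then yields the close-up traversal of internal flow features described above; by the convex hull property of B-splines the path never leaves the region spanned by the chosen viewpoints.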

Relevance: 100.00%

Publisher:

Abstract:

Mode of access: Internet.

Relevance: 100.00%

Publisher:

Abstract:

Mode of access: Internet.