909 results for flow-based
Abstract:
This article describes a new design for a paper-based electrochemical system for flow injection analysis. Capillary wicking facilitates a gravity-driven flow of buffer solution continuously through paper and nitrocellulose, from a buffer reservoir at one end of the device to a sink at the other. A difference in height between the reservoir and the sink leads to a continuous and constant flow. The nitrocellulose lies horizontally on a working electrode, which consists of a thin platinum layer deposited on a solid support. The counter and reference electrodes are strategically positioned upstream in the buffer reservoir. A simple pipetting device was developed for reliable application of (sub)microliter volumes of sample without the need for commercial micropipets; this device did not damage the nitrocellulose membrane. Demonstration of the system for the determination of the concentration of glucose in urine resulted in a noninvasive, quantitative assay that could be used for diagnosis and monitoring of diabetes. This method does not require disposable test strips, with enzyme and electrodes, that are thrown away after each measurement. Because of its low cost, this system could be used in medical environments that are resource-limited.
Abstract:
In this work, an LED (light emitting diode) based photometer for solid-phase photometry is described. The photometer was designed to permit direct coupling of a light source (LED) and a photodiode to a flow cell with an optical pathlength of 4 mm. The flow cell was filled with an adsorbing solid-phase material (C-18), which was used to immobilize the chromogenic reagent 1-(2-thiazolylazo)-2-naphthol (TAN). To assess accuracy, samples were also analyzed by ICP OES (inductively coupled plasma optical emission spectrometry). Applying the paired t-test at the 95% confidence level, no significant difference was observed. Other useful features were also achieved: linear response ranging from 0.05 to 0.85 mg L⁻¹ Zn, limit of detection of 9 µg L⁻¹ Zn (3σ criterion), standard deviation of 1.4% (n = 10), sampling throughput of 36 determinations per hour, and waste generation of 1.7 mL and reagent consumption of 0.03 µg per determination.
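The detection limit quoted above follows the 3σ convention: three times the standard deviation of replicate blank readings divided by the calibration slope. A minimal sketch in Python, with made-up blank readings and an assumed calibration slope:

```python
import numpy as np

# Hypothetical blank readings (absorbance) and calibration sensitivity;
# only the 3-sigma arithmetic reflects the criterion cited in the abstract.
blank_readings = np.array([0.0021, 0.0019, 0.0024, 0.0020, 0.0018,
                           0.0022, 0.0023, 0.0019, 0.0021, 0.0020])
slope = 0.21  # assumed sensitivity, absorbance per mg L-1 Zn

s_blank = blank_readings.std(ddof=1)   # standard deviation of the blank
lod_mg_per_l = 3 * s_blank / slope     # 3-sigma limit of detection
print(f"LOD = {lod_mg_per_l * 1000:.1f} ug L-1 Zn")
```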
Abstract:
We propose a novel methodology to generate realistic network flow traces to enable systematic evaluation of network monitoring systems in various traffic conditions. Our technique uses a graph-based approach to model the communication structure observed in real-world traces and to extract traffic templates. By combining extracted and user-defined traffic templates, realistic network flow traces that comprise normal traffic and customized conditions are generated in a scalable manner. A proof-of-concept implementation demonstrates the utility and simplicity of our method to produce a variety of evaluation scenarios. We show that the extraction of templates from real-world traffic leads to a manageable number of templates that still enable accurate re-creation of the original communication properties on the network flow level.
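As a rough illustration of the graph-based idea (not the authors' actual implementation), the sketch below treats each observed source-destination pair as an edge of a communication graph, keeps a per-edge template of flow sizes, and resamples those templates into a synthetic trace; the record fields and template format are assumptions:

```python
import random
from collections import defaultdict

# Toy flow records standing in for a real-world trace.
observed_flows = [
    {"src": "10.0.0.1", "dst": "10.0.0.9", "bytes": 1200,  "packets": 10},
    {"src": "10.0.0.1", "dst": "10.0.0.9", "bytes": 900,   "packets": 8},
    {"src": "10.0.0.2", "dst": "10.0.0.9", "bytes": 64000, "packets": 450},
]

# 1) Extract templates: one per (src, dst) edge of the communication graph.
templates = defaultdict(list)
for f in observed_flows:
    templates[(f["src"], f["dst"])].append((f["bytes"], f["packets"]))

# 2) Generate a synthetic trace by resampling each edge's template;
#    user-defined templates could simply be added to the same dictionary.
def generate(n_flows):
    edges = list(templates)
    trace = []
    for _ in range(n_flows):
        src, dst = random.choice(edges)
        size, pkts = random.choice(templates[(src, dst)])
        trace.append({"src": src, "dst": dst, "bytes": size, "packets": pkts})
    return trace

print(generate(5))
```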
Abstract:
We show that the variation of flow stress with strain rate and grain size in a magnesium alloy deformed at a constant strain rate and 450 °C can be predicted by a crystal plasticity model that includes grain boundary sliding and diffusion. The model predicts the grain size dependence of the critical strain rate that will cause a transition in deformation mechanism from dislocation creep to grain boundary sliding, and yields estimates for grain boundary fluidity and diffusivity.
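For readers unfamiliar with how such a transition emerges, a generic power-law sketch (with assumed stress exponents and grain-size exponent, not the authors' crystal plasticity formulation) shows where the grain size dependence of the critical strain rate comes from:

```latex
% Two parallel mechanisms written as generic power laws (assumed forms):
\dot{\varepsilon}_{\mathrm{gbs}}  = A_1 \left(\frac{\sigma}{E}\right)^{n_1}\left(\frac{b}{d}\right)^{p},
\qquad
\dot{\varepsilon}_{\mathrm{disl}} = A_2 \left(\frac{\sigma}{E}\right)^{n_2}.
% Equating the two rates at the same stress gives the transition point:
\left(\frac{\sigma_c}{E}\right)^{n_1-n_2} = \frac{A_2}{A_1}\left(\frac{d}{b}\right)^{p}
\quad\Rightarrow\quad
\dot{\varepsilon}_c = A_2\left[\frac{A_2}{A_1}\left(\frac{d}{b}\right)^{p}\right]^{\frac{n_2}{n_1-n_2}}
\;\propto\; d^{\,p\,n_2/(n_1-n_2)}.
```

Because the grain boundary sliding exponent n₁ is smaller than the dislocation creep exponent n₂, the exponent on d is negative: coarser grains shift the transition to slower strain rates, so grain boundary sliding dominates only at low strain rates or fine grain sizes.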
Abstract:
Despite the increased use of intracranial neuromonitoring during experimental subarachnoid hemorrhage (SAH), coordinates for probe placement in rabbits are lacking. This study evaluates the safety and reliability of using outer skull landmarks to identify locations for placement of cerebral blood flow (CBF) and intraparenchymal intracranial pressure (ICP) probes. Experimental SAH was performed in 17 rabbits using an extracranial-intracranial shunt model. ICP probes were placed in the frontal lobe and compared to measurements recorded from the olfactory bulb. CBF probes were placed in various locations in the frontal cortex anterior to the coronal suture. Insertion depth, relation to the ventricular system, and ideal placement location were determined by post-mortem examination. ICP recordings at the time of SAH from the frontal lobe did not differ significantly from those obtained from the right olfactory bulb. Ideal coordinates for intraparenchymal CBF probes in the left and right frontal lobe were found to be 4.6±0.9 mm and 4.5±1.2 mm anterior to the bregma, 4.7±0.7 mm and 4.7±0.5 mm parasagittal, and at depths of 4.0±0.5 mm and 3.9±0.5 mm, respectively. The results demonstrate that the presented coordinates based on skull landmarks allow reliable placement of intraparenchymal ICP and CBF probes in rabbit brains without the use of a stereotactic frame.
Abstract:
Independent component analysis (ICA) and seed-based approaches (SBA) applied to functional magnetic resonance imaging blood oxygenation level dependent (BOLD) data have become widely used tools to identify functionally connected, large-scale brain networks. Differences between task conditions, as well as specific alterations of the networks in patients compared to healthy controls, have been reported. However, BOLD lacks the possibility of quantifying absolute network metabolic activity, which is of particular interest in the case of pathological alterations. In contrast, arterial spin labeling (ASL) techniques allow quantification of absolute cerebral blood flow (CBF) at rest and in task-related conditions. In this study, we explored the feasibility of identifying networks in ASL data using ICA and of quantifying network activity in terms of absolute CBF values. Moreover, we compared the results to SBA and performed a test-retest analysis. Twelve healthy young subjects performed a finger-tapping block-design experiment. During the task, pseudo-continuous ASL was measured. After CBF quantification, the individual datasets were concatenated and subjected to the ICA algorithm. ICA proved capable of identifying the somato-motor and the default mode networks. Moreover, absolute network CBF within the separate networks during either condition could be quantified. We demonstrated that functional connectivity analysis using ICA and SBA is feasible and robust in ASL-CBF data. CBF functional connectivity is a novel approach that opens a new strategy to evaluate differences in network activity in terms of absolute network CBF and thus allows quantification of inter-individual differences in the resting state and in task-related activations and deactivations.
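A minimal sketch of the temporal-concatenation group ICA step, with hypothetical data dimensions and scikit-learn's FastICA as a stand-in for the ICA algorithm used in the study:

```python
import numpy as np
from sklearn.decomposition import FastICA

# Placeholder CBF data: 12 subjects x 60 time points x 5000 voxels (ml/100 g/min).
n_subjects, n_timepoints, n_voxels = 12, 60, 5000
cbf = np.random.rand(n_subjects, n_timepoints, n_voxels)

# Temporal concatenation across subjects: (subjects * time) x voxels.
group_data = cbf.reshape(n_subjects * n_timepoints, n_voxels)

ica = FastICA(n_components=20, random_state=0, max_iter=1000)
timecourses = ica.fit_transform(group_data)   # (subjects * time) x components
spatial_maps = ica.components_                # components x voxels

# Absolute network CBF: threshold a component's spatial map and average the
# original CBF values inside the resulting mask.
m = spatial_maps[0]
network_mask = np.abs(m) > np.abs(m).mean() + 2 * m.std()
print("mean network CBF:", cbf[:, :, network_mask].mean())
```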
Abstract:
AIM: To test whether quantitative stress echocardiography using contrast-based myocardial blood flow (MBF, ml·min⁻¹·g⁻¹) measurements can detect coronary artery disease in humans. METHODS: 48 patients eligible for pharmacological stress testing by myocardial contrast echocardiography (MCE) and willing to undergo subsequent coronary angiography were prospectively enrolled in the study. Baseline and adenosine-induced (140 µg·kg⁻¹·min⁻¹) hyperaemic MBF was analysed according to a three-coronary-artery-territory model. Vascular territories were categorised into three groups with increasing stenosis severity defined as percentage diameter reduction by quantitative coronary angiography. RESULTS: Myocardial blood flow reserve (MBFR), that is, the ratio of hyperaemic to baseline MBF, was obtained in 128 (89%) territories. Mean (SD) baseline MBF was 1.073 (0.395) ml·min⁻¹·g⁻¹ and did not differ between territories supplied by coronary arteries with mild (<50% stenosis), moderate (50%-74% stenosis) or severe (≥75% stenosis) disease. Mean (SD) hyperaemic MBF and MBFR were 2.509 (1.078) ml·min⁻¹·g⁻¹ and 2.54 (1.03), respectively, and decreased linearly (r² = 0.21 and r² = 0.39) with stenosis severity. ROC analysis revealed that a territorial MBFR <1.94 detected ≥50% stenosis with 89% sensitivity and 92% specificity. CONCLUSION: Quantitative stress testing based on MBF measurements derived from contrast echocardiography is a new method for the non-invasive and reliable assessment of coronary artery disease in humans.
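The reserve and the reported cut-off are straightforward to reproduce once territorial MBF values are available; a toy sketch with invented values, using scikit-learn's roc_curve as a stand-in for the ROC analysis:

```python
import numpy as np
from sklearn.metrics import roc_curve

# Invented per-territory values for illustration only.
baseline_mbf   = np.array([1.00, 1.10, 0.90, 1.20, 1.05])  # ml min^-1 g^-1
hyperemic_mbf  = np.array([2.90, 1.80, 2.60, 1.60, 3.10])  # ml min^-1 g^-1
stenosis_ge_50 = np.array([0, 1, 0, 1, 0])                 # 1 = >=50% diameter stenosis

mbfr = hyperemic_mbf / baseline_mbf        # myocardial blood flow reserve
predicted = mbfr < 1.94                    # cut-off from the ROC analysis above

# Lower MBFR indicates disease, so negate it to use as a score.
fpr, tpr, thresholds = roc_curve(stenosis_ge_50, -mbfr)
print(mbfr.round(2), predicted)
```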
Abstract:
Three-dimensional flow visualization plays an essential role in many areas of science and engineering, such as aero- and hydrodynamical systems, which dominate various physical and natural phenomena. For popular methods such as streamline visualization to be effective, they should capture the underlying flow features while facilitating user observation and understanding of the flow field in a clear manner. My research mainly focuses on the analysis and visualization of flow fields using various techniques, e.g. information-theoretic techniques and graph-based representations. Since streamline visualization is a popular technique in flow field visualization, how to select good streamlines to capture flow patterns and how to pick good viewpoints to observe flow fields become critical. We treat streamline selection and viewpoint selection as symmetric problems and solve them simultaneously using the dual information channel [81]. To the best of my knowledge, this is the first attempt in flow visualization to combine these two selection problems in a unified approach. This work selects streamlines in a view-independent manner, and the selected streamlines do not change across viewpoints. Another work of mine [56] uses an information-theoretic approach to evaluate the importance of each streamline under various sample viewpoints and presents a solution for view-dependent streamline selection that guarantees coherent streamline updates when the view changes gradually. When projecting 3D streamlines to 2D images for viewing, occlusion and clutter become inevitable. To address this challenge, we design FlowGraph [57, 58], a novel compound graph representation that organizes field line clusters and spatiotemporal regions hierarchically for occlusion-free and controllable visual exploration. It enables observation and exploration of the relationships among field line clusters, spatiotemporal regions, and their interconnections in the transformed space. Most viewpoint selection methods only consider external viewpoints outside of the flow field, which do not convey a clear observation when the flow field is cluttered near the boundary. Therefore, we propose a new way to explore flow fields by selecting several internal viewpoints around the flow features inside the flow field and then generating a B-spline curve path traversing these viewpoints to provide users with close-up views of the flow field for detailed observation of hidden or occluded internal flow features [54]. This work has also been extended to deal with unsteady flow fields. Besides flow field visualization, some other topics relevant to visualization also attract my attention. In iGraph [31], we leverage a distributed system along with a tiled display wall to provide users with high-resolution visual analytics of big image and text collections in real time. Developing pedagogical visualization tools forms my other research focus. Since most cryptography algorithms use sophisticated mathematics, it is difficult for beginners to understand both what an algorithm does and how it does it. Therefore, we develop a set of visualization tools to provide users with an intuitive way to learn and understand these algorithms.
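The internal-viewpoint exploration mentioned above ultimately reduces to fitting a smooth curve through a handful of 3D viewpoints; a small sketch using SciPy's B-spline routines with hypothetical viewpoint coordinates:

```python
import numpy as np
from scipy.interpolate import splprep, splev

# Hypothetical internal viewpoints placed near flow features (normalized coords).
viewpoints = np.array([
    [0.2, 0.3, 0.5],
    [0.4, 0.5, 0.6],
    [0.6, 0.4, 0.4],
    [0.8, 0.6, 0.5],
]).T  # shape (3, n_points), as splprep expects

tck, u = splprep(viewpoints, s=0, k=3)          # cubic B-spline through the viewpoints
path = splev(np.linspace(0.0, 1.0, 100), tck)   # 100 camera positions along the path
camera_positions = np.column_stack(path)
print(camera_positions.shape)                   # (100, 3)
```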
Abstract:
We reconstruct the timing of ice flow reconfiguration and deglaciation of the Central Alpine Gotthard Pass, Switzerland, using cosmogenic ¹⁰Be and in situ ¹⁴C surface exposure dating. Combined with mapping of glacial erosional markers, exposure ages of bedrock surfaces reveal progressive glacier downwasting from the maximum LGM ice volume and a gradual reorganization of the paleoflow pattern with a southward migration of the ice divide. Exposure ages of ∼16–14 ka (snow corrected) give evidence for continuous early Lateglacial ice cover and indicate that the first deglaciation was contemporaneous with the decay of the large Gschnitz glacier system. In agreement with published ages from other Alpine passes, these data support the concept of large transection glaciers that persisted in the high Alps after the breakdown of the LGM ice masses in the foreland and possibly decayed as late as the onset of the Bølling warming. A younger group of ages around ∼12–13 ka records the timing of deglaciation following local glacier readvance during the Egesen stadial. Glacial erosional features and the distribution of exposure ages consistently imply that Egesen glaciers were of comparatively small volume and were following a topographically controlled paleoflow pattern. Dating of a boulder close to the pass elevation gives a minimum age of 11.1 ± 0.4 ka for final deglaciation by the end of the Younger Dryas. In situ ¹⁴C data are overall in good agreement with the ¹⁰Be ages and confirm continuous exposure throughout the Holocene. However, in situ ¹⁴C demonstrates that partial surface shielding, e.g. by snow, has to be incorporated in the exposure age calculations and the model of deglaciation.
Abstract:
A rapid, economic and sensitive chemiluminescent method involving flow-injection analysis was developed for the determination of dipyrone in pharmaceutical preparations. The method is based on the chemiluminescent reaction between quinolinic hydrazide and hydrogen peroxide in a strongly alkaline medium, in which vanadium(IV) acts as a catalyst. Principal chemical and physical variables involved in the flow-injection system were optimized using a modified simplex method. The variations in the quantum yield observed when dipyrone was present in the reaction medium were used to determine the concentration of this compound. The proposed method requires no preconcentration steps and reliably quantifies dipyrone over the linear range 1–50 µg/mL. In addition, a sample throughput of 85 samples/h is possible.
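As an illustration of simplex optimization of the flow-injection variables, the sketch below uses SciPy's Nelder-Mead simplex as a stand-in for the modified simplex method, with a synthetic response surface in place of the measured chemiluminescence signal:

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic, smooth response surface with an optimum near (2.0, 1.5);
# in practice the objective would be the measured chemiluminescence signal.
def negative_cl_intensity(x):
    reagent_conc, flow_rate = x
    return -np.exp(-((reagent_conc - 2.0) ** 2 + (flow_rate - 1.5) ** 2))

result = minimize(negative_cl_intensity, x0=[1.0, 1.0], method="Nelder-Mead")
print("optimal conditions:", result.x.round(3), "signal:", -result.fun)
```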
Abstract:
Regionalization of natural flow regime types in the Ebro basin and biological validation of the natural regime types.
Abstract:
Traffic flow time series data are usually high dimensional and very complex. They are also sometimes imprecise and distorted due to data collection sensor malfunction. Additionally, events like congestion caused by traffic accidents add more uncertainty to real-time traffic conditions, making traffic flow forecasting a complicated task. This article presents a new data preprocessing method targeting multidimensional time series with a very high number of dimensions and shows its application to real traffic flow time series from the California Department of Transportation (PEMS web site). The proposed method consists of three main steps. First, based on mTESL, a language for defining events in multidimensional time series, we identify a number of event types in the time series that correspond to either incorrect data or data affected by interference. Second, each event type is restored using an original method that combines real observations, locally forecasted values, and historical data. Third, an exponential smoothing procedure is applied globally to eliminate noise interference and other random errors so as to provide good-quality source data for future work.
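The third step is ordinary exponential smoothing; a compact sketch with an assumed smoothing factor and made-up flow counts:

```python
import numpy as np

def exponential_smoothing(series, alpha=0.3):
    """Simple exponential smoothing: s_t = alpha * x_t + (1 - alpha) * s_(t-1)."""
    smoothed = np.empty(len(series), dtype=float)
    smoothed[0] = series[0]
    for t in range(1, len(series)):
        smoothed[t] = alpha * series[t] + (1 - alpha) * smoothed[t - 1]
    return smoothed

# Made-up vehicle counts per 5-minute interval, including two obvious artifacts.
flow_counts = np.array([320, 335, 0, 350, 900, 342, 338, 331])
print(exponential_smoothing(flow_counts).round(1))
```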