973 results for diffuse-interface method


Relevance:

30.00%

Publisher:

Abstract:

Performing experiments with transactinide elements demands highly sensitive detection methods due to the extremely low production rates (one-atom-at-a-time conditions). Preseparation with a physical recoil separator is a powerful way to significantly reduce the background in experiments with sufficiently long-lived isotopes (t1/2 ≥ 0.5 s). In recent years, the new gas-filled TransActinide Separator and Chemistry Apparatus (TASCA) was installed and successfully commissioned at GSI. Here, we report on the design and performance of a Recoil Transfer Chamber (RTC) for TASCA—an interface to connect various chemistry and counting setups with the separator. Nuclear reaction products recoiling out of the target are separated according to their magnetic rigidity within TASCA, and the desired products are guided to the focal plane of TASCA. In the focal plane, they pass through a thin Mylar window that separates the ∼1 mbar atmosphere in TASCA from the RTC, which is kept at ∼1 bar. The ions are stopped in the RTC and transported by a continuous gas flow from the RTC to the ancillary setup. In this paper, we report on measurements of the transportation yields under various conditions and on the first chemistry experiments at TASCA—an electrochemistry experiment with osmium and an ion-exchange experiment with the transactinide element rutherfordium.

Relevance:

30.00%

Publisher:

Abstract:

A main field in biomedical optics research is diffuse optical tomography, in which intensity variations of the transmitted light traversing through tissue are detected. Mathematical models and reconstruction algorithms based on finite element methods and Monte Carlo simulations describe the light transport inside the tissue and determine differences in absorption and scattering coefficients. Precise knowledge of the sample's surface shape and orientation is required to provide boundary conditions for these techniques. We propose an integrated method based on structured-light three-dimensional (3-D) scanning that provides detailed surface information about the object, which is usable for volume mesh creation and allows normalization of the intensity dispersion between surface and camera. The experimental setup is complemented by polarization difference imaging to suppress artifacts caused by inter-reflections and multiple scattering in semitransparent tissue.

Relevance:

30.00%

Publisher:

Abstract:

Clay minerals have a fundamental importance in many processes in soils and sediments, such as the bioavailability of nutrients, water retention, the adsorption of common pollutants, and the formation of an impermeable barrier upon swelling. Many of the properties of clay minerals are due to the unique environment present at the clay mineral/water interface. Traditional techniques such as X-ray diffraction (XRD) and absorption isotherms have provided a wealth of information about this interface but have suffered from limitations. The methods and results presented herein are designed to yield new experimental information about the clay mineral/water interface. A new method of studying the swelling dynamics of clay minerals was developed using in situ atomic force microscopy (AFM). The preliminary results presented here demonstrate that this technique allows one to study individual clay mineral unit layers, explore the natural heterogeneities of samples, and monitor the swelling dynamics of clay minerals in real time. Cation exchange experiments were conducted by monitoring the swelling change of individual nontronite quasi-crystals as the chemical composition of the surrounding environment was manipulated several times. A proof-of-concept study has shown that the changes in swelling are due to the exchange of interlayer cations and not to the mechanical force of replacing the solution in the fluid cell. A series of attenuated total internal reflection Fourier transform infrared spectroscopy (ATR-FTIR) experiments were performed to gain a better understanding of the organization of water within the interlayer region of two Fe-bearing clay minerals. These experiments made use of the subtractive Kramers-Kronig (SKK) transform and the calculation of difference spectra to obtain information about interfacial water hidden within the absorption bands of bulk water.
The results indicate that the reduction of structural iron disrupts the organization of water around a strongly hydrated cation such as sodium as the cation transitions from an outer-sphere complex with the mineral surface to an inner-sphere complex. In the case of a less strongly hydrated cation such as potassium, reduction of structural iron actually increases the ordering of water molecules at the mineral surface. These effects were only noticed with the reduction of iron in the tetrahedral sheet close to the basal surface where the increased charge density is localized closer to the cations in the interlayer.

Relevance:

30.00%

Publisher:

Abstract:

Drug release from a fluid-contacting biomaterial is simulated using a microfluidic device with a channel defined by solute-loaded hydrogel; as water is pumped through the channel, solute transfers from the hydrogel into the water. Optical analysis of the in-situ hydrogels, characterization of the microfluidic device effluent, and NMR methods were used to find diffusion coefficients of several dyes (model drugs) in poly(ethylene glycol) diacrylate (PEG-DA) hydrogels. Diffusion coefficients for methylene blue and sulforhodamine 101 in PEG-DA calculated using the three methods are in good agreement; both dyes are mobile in the hydrogel and elute from the hydrogel at the aqueous channel interface. However, the dye acid blue 22 deviates from typical diffusion behavior and does not release as expected from the hydrogel. Importantly, only the microfluidic method is capable of detecting this behavior. Characterizing solute diffusion with a combination of NMR, optical, and effluent methods offers greater insight into molecular diffusion in hydrogels than employing each technique individually. The NMR method provided precise measurements of solute diffusion in all cases. The microfluidic optical method was effective for visualizing diffusion of the optically active solutes. The optical and effluent methods show potential for screening solutes to determine whether they elute from a hydrogel in contact with flowing fluid. Our data suggest that when designing a drug delivery device, analyzing diffusion from the molecular level to the device level is important to establish a complete picture of drug elution, and microfluidic methods to study such diffusion can play a key role. (C) 2013 Elsevier B.V. All rights reserved.
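The diffusion behavior summarized above can be illustrated with a simple Fickian estimate. This is a minimal sketch, not the paper's analysis: it assumes early-time one-dimensional release from a slab-shaped hydrogel, and the half-thickness, diffusion coefficient, and time values are illustrative placeholders, not values from the study.

```python
import math

def release_fraction(D, t, L):
    """Early-time fractional release M_t/M_inf from a slab of
    half-thickness L, from Fick's second law (short-time solution
    2*sqrt(D*t/(pi*L^2)); valid roughly while the result is < ~0.6)."""
    return 2.0 * math.sqrt(D * t / (math.pi * L * L))

def estimate_D(frac, t, L):
    """Invert the same approximation: back out a diffusion coefficient
    from a single (release fraction, time) observation."""
    return (frac / 2.0) ** 2 * math.pi * L * L / t

# Illustrative numbers only (not from the paper): a 100-um half-thickness
# slab and a dye with D of order 1e-10 m^2/s.
f = release_fraction(1e-10, 10.0, 100e-6)
```

A single effluent or optical measurement of the released fraction at a known time would, under these assumptions, suffice to estimate D via `estimate_D`.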

Relevance:

30.00%

Publisher:

Abstract:

The existence, morphology, and dynamics of micro-scale gas-liquid interfaces are investigated numerically and experimentally. These studies can be used to assess liquid management issues in microsystems such as PEMFC gas flow channels, and are meant to open new research perspectives in two-phase flow, particularly in film deposition on non-wetting surfaces. For example, the critical plug volume data can be used to deliver plugs of a desired length, or to determine the plug formation frequency. The dynamics of gas-liquid interfaces, of interest for applications involving small passages (e.g., heat exchangers, phase separators, and filtration systems), was investigated using high-speed microscopy, a method that also proved useful for the study of film deposition processes. The existence limit for a liquid plug forming in a mixed-wetting channel is determined by numerical simulations using Surface Evolver. The plug model simulates actual conditions in the gas flow channels of PEM fuel cells, where the wetting of the gas diffusion layer (GDL) side of the channel differs from the wetting of the bipolar plate walls. The minimum plug volume, denoted the critical volume, is computed for a series of GDL and bipolar plate wetting properties. The critical volume data are meant to assist in the water management of PEMFCs when corroborated with experimental data. The effect of cross-section geometry is assessed by computing the critical volume in square and trapezoidal channels. Droplet simulations show that water can be passively removed from the GDL surface towards the bipolar plate by taking advantage of the differing wetting properties of the two surfaces, possibly avoiding blockage of gas transport through the GDL. High-speed microscopy was employed in two-phase and film deposition experiments with water in round and square capillary tubes.
Periodic interface destabilization was observed, and the existence of compression waves in the gas phase is discussed by considering a naturally occurring convergent-divergent nozzle formed by the flowing liquid phase. The effect of channel geometry and wetting properties was investigated through two-phase water-air flow in square and round microchannels having static contact angles of 20, 80, and 105 degrees. Four different flow regimes are observed for a fixed flow rate, a behavior attributed to the wetting of liquid flowing in the corners as well as to the stability of the liquid film. Film deposition experiments in wetting and non-wetting round microchannels show that a thicker film is deposited under wetting conditions departing from the ideal 0-degree contact angle. A dependence of the film thickness on the contact angle theta as well as the capillary number, of the form h_R ~ Ca^(2/3)/cos(theta), is inferred from scaling arguments for contact angles smaller than 36 degrees. Non-wetting film deposition experiments reveal that a film significantly thicker than the wetting Bretherton film is deposited. A hydraulic jump occurs if critical conditions are met, as given by a proposed nondimensional parameter similar to the Froude number. Film thickness correlations are also found by matching the measured velocity with the velocity derived from shock theory. The surface wetting as well as the presence of the shock cause morphological changes in the Taylor bubble flow.
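The quoted scaling can be turned into a small numerical helper. This is a hedged sketch under two assumptions not stated in the abstract: h_R is read as the film thickness scaled by the tube radius, and the prefactor is taken as the classical Bretherton value 1.34 (the abstract gives only the proportionality).

```python
import math

def film_thickness_ratio(Ca, theta_deg, prefactor=1.34):
    """Deposited film thickness scaled by tube radius,
    h_R ~ Ca^(2/3) / cos(theta), as inferred in the abstract for
    contact angles below ~36 degrees. The prefactor 1.34 is the
    classical Bretherton value for theta = 0 and is an assumption,
    not a number given in the abstract."""
    if not 0.0 <= theta_deg < 36.0:
        raise ValueError("scaling inferred only for 0 <= theta < 36 degrees")
    return prefactor * Ca ** (2.0 / 3.0) / math.cos(math.radians(theta_deg))

# At theta = 0 this reduces to the Bretherton film, h_R = 1.34 Ca^(2/3);
# a nonzero contact angle thickens the predicted film.
```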

Relevance:

30.00%

Publisher:

Abstract:

Images of an object under different illumination are known to provide strong cues about the object surface. A mathematical formalization of how to recover the normal map of such a surface leads to the so-called uncalibrated photometric stereo problem. In the simplest instance, this problem can be reduced to the task of identifying only three parameters: the so-called generalized bas-relief (GBR) ambiguity. The challenge is to find additional general assumptions about the object, that identify these parameters uniquely. Current approaches are not consistent, i.e., they provide different solutions when run multiple times on the same data. To address this limitation, we propose exploiting local diffuse reflectance (LDR) maxima, i.e., points in the scene where the normal vector is parallel to the illumination direction (see Fig. 1). We demonstrate several noteworthy properties of these maxima: a closed-form solution, computational efficiency and GBR consistency. An LDR maximum yields a simple closed-form solution corresponding to a semi-circle in the GBR parameters space (see Fig. 2); because as few as two diffuse maxima in different images identify a unique solution, the identification of the GBR parameters can be achieved very efficiently; finally, the algorithm is consistent as it always returns the same solution given the same data. Our algorithm is also remarkably robust: It can obtain an accurate estimate of the GBR parameters even with extremely high levels of outliers in the detected maxima (up to 80 % of the observations). The method is validated on real data and achieves state-of-the-art results.

Relevance:

30.00%

Publisher:

Abstract:

In 2010, more than 600 radiocarbon samples were measured with the gas ion source at the MIni CArbon DAting System (MICADAS) at ETH Zurich, and the number of measurements is rising quickly. While most samples currently contain less than 50 μg C, the gas ion source is also attractive for larger samples because the time-consuming graphitization step is omitted. Additionally, modern samples can now be measured down to 5 per mil counting statistics in less than 30 min with the recently improved gas ion source. In the versatile gas handling system, a stepping-motor-driven syringe presses a mixture of helium and sample CO2 into the gas ion source, allowing continuous and stable measurements of different kinds of samples. CO2 can be provided to the versatile gas interface in four different ways. As the primary method, CO2 is delivered in glass or quartz ampoules; in this case, the CO2 is released in an automated ampoule cracker with 8 positions for individual samples. Secondly, OX-1 and blank gas in helium can be provided to the syringe by directly connecting gas bottles to the gas interface at the stage of the cracker. Thirdly, solid samples can be combusted in an elemental analyzer or in a thermo-optical OC/EC aerosol analyzer, with the produced CO2 transferred to the syringe via a zeolite trap for gas concentration. As a fourth method, CO2 is released from carbonates with phosphoric acid in septum-sealed vials and loaded onto the same trap used for the elemental analyzer. All four methods allow complete automation of the measurement, even though minor user input is presently still required. Details on the setup, versatility, and applications of the gas handling system are given. (C) 2012 Elsevier B.V. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: Interruptions are known to have a negative impact on activity performance. Understanding how an interruption contributes to human error is limited because there is no standard method for analyzing and classifying interruptions. Qualitative data are typically analyzed by either a deductive or an inductive method; both methods have limitations. In this paper, a hybrid method was developed that integrates deductive and inductive methods for the categorization of activities and interruptions recorded during an ethnographic study of physicians and registered nurses in a Level One Trauma Center. Understanding the effects of interruptions is important for designing and evaluating informatics tools in particular, as well as for improving healthcare quality and patient safety in general. METHOD: The hybrid method was developed using a deductive a priori classification framework with the provision of adding new categories discovered inductively in the data. The inductive process utilized line-by-line coding and constant comparison, as described in grounded theory. RESULTS: The categories of activities and interruptions were organized into a three-tiered hierarchy of activity. Validity and reliability of the categories were tested by categorizing a medical error case external to the study. No new categories of interruptions were identified during analysis of the medical error case. CONCLUSIONS: Findings from this study provide evidence that the hybrid method of categorization is more complete than either a deductive or an inductive method alone. The hybrid method developed in this study provides methodical support for understanding, analyzing, and managing interruptions and workflow.

Relevance:

30.00%

Publisher:

Abstract:

Two new approaches to quantitatively analyze diffuse diffraction intensities from faulted layer stacking are reported. The parameters of a probability-based growth model are determined with two iterative global optimization methods: a genetic algorithm (GA) and particle swarm optimization (PSO). The results are compared with those from a third global optimization method, a differential evolution (DE) algorithm [Storn & Price (1997). J. Global Optim. 11, 341–359]. The algorithm efficiencies in the early and late stages of iteration are compared. The accuracy of the optimized parameters improves with increasing size of the simulated crystal volume. The wall-clock time for computing quite large crystal volumes can be kept within reasonable limits by the parallel calculation of many crystals (clones) generated for each model parameter set on a super- or grid computer. The faulted layer stacking in single crystals of trigonal three-pointed-star-shaped tris(bicyclo[2.1.1]hexeno)benzene molecules serves as an example for the numerical computations. Based on numerical values of seven model parameters (reference parameters), nearly noise-free reference intensities of 14 diffuse streaks were simulated from 1280 clones, each consisting of 96 000 layers (reference crystal). The parameters derived from the reference intensities with GA, PSO and DE were compared with the original reference parameters as a function of the simulated total crystal volume. The statistical distribution of structural motifs in the simulated crystals is in good agreement with that in the reference crystal. The results found with the growth model for layer stacking disorder are applicable to other disorder types and modeling techniques, Monte Carlo in particular.
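As a rough illustration of the third optimizer mentioned above, here is a generic textbook DE/rand/1/bin loop in Python. It is not the authors' implementation and does not model diffuse scattering; the demo objective is a simple quadratic with a known minimum.

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=200, seed=1):
    """Minimal DE/rand/1/bin minimizer in the spirit of Storn & Price.
    f: objective taking a list of floats; bounds: list of (lo, hi)."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    cost = [f(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # pick three distinct donors, all different from i
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)  # guarantees at least one mutated gene
            trial = []
            for j in range(dim):
                if j == j_rand or rng.random() < CR:
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                    lo, hi = bounds[j]
                    v = min(max(v, lo), hi)  # clip to the search box
                else:
                    v = pop[i][j]
                trial.append(v)
            tc = f(trial)
            if tc <= cost[i]:  # greedy one-to-one selection
                pop[i], cost[i] = trial, tc
    best = min(range(pop_size), key=lambda k: cost[k])
    return pop[best], cost[best]

# Smoke test on a quadratic whose minimum sits at (1, -2).
x_best, f_best = differential_evolution(
    lambda v: (v[0] - 1.0) ** 2 + (v[1] + 2.0) ** 2,
    [(-5.0, 5.0), (-5.0, 5.0)])
```

In the paper's setting, `f` would be the misfit between simulated and observed diffuse streak intensities as a function of the seven growth-model parameters.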

Relevance:

30.00%

Publisher:

Abstract:

A fast and automatic method for radiocarbon analysis of aerosol samples is presented. This type of analysis requires a high number of measurements on low carbon masses, but tolerates lower precision than radiocarbon dating. The method is based on trapping CO2 online, coupling an elemental analyzer to a MICADAS AMS by means of a gas interface. It gives results similar to those of a previously validated reference method for the same set of samples. The method is fast and automatic, typically provides uncertainties of 1.5-5% for representative aerosol samples, proves to be robust and reliable, and allows for overnight and unattended measurements. A constant and cross-contamination correction is included, which indicates a constant contamination of 1.4 ± 0.2 μg C with 70 ± 7 pMC and a cross contamination of (0.2 ± 0.1)% from the previous sample. A real-time online coupling version of the method was also investigated. It shows promising results for standard materials, with slightly higher uncertainties than the trapping approach.
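The two-part contamination correction can be sketched with simple mass-balance formulas. Only the numeric constants (1.4 μg C at 70 pMC; 0.2% carryover) come from the abstract; the functional forms below are the standard two-component isotope mass-balance expressions and are an assumption on our part, not the authors' published code.

```python
def correct_constant(F_meas, m_meas, m_c=1.4, F_c=70.0):
    """Remove a constant contamination of m_c micrograms C at F_c pMC
    from a measured value F_meas (pMC) of total carbon mass m_meas (ug).
    Standard two-component mass balance; constants from the abstract."""
    return (m_meas * F_meas - m_c * F_c) / (m_meas - m_c)

def correct_cross(F_meas, F_prev, r=0.002):
    """Remove a fractional carryover r of the previous sample's pMC
    value (0.2% in the abstract) from the current measurement."""
    return (F_meas - r * F_prev) / (1.0 - r)

# Example: a 20 ug C sample at a true 100 pMC, measured together with
# the 1.4 ug constant contamination, reads slightly low; the constant
# correction recovers the true value.
```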

Relevance:

30.00%

Publisher:

Abstract:

Clinical text understanding (CTU) is of interest to health informatics because critical clinical information, frequently represented as unconstrained text in electronic health records, is extensively used by human experts to guide clinical practice and decision making and to document delivery of care, but is largely unusable by information systems for queries and computations. Recent initiatives advocating for translational research call for technologies that can integrate structured clinical data with unstructured data, provide a unified interface to all data, and contextualize clinical information for reuse in the multidisciplinary and collaborative environment envisioned by the CTSA program. This implies that technologies for the processing and interpretation of clinical text should be evaluated not only in terms of their validity and reliability in their intended environment, but also in light of their interoperability and their ability to support information integration and contextualization in a distributed and dynamic environment. This vision adds a new layer of information representation requirements that needs to be accounted for when conceptualizing the implementation or acquisition of clinical text processing tools and technologies for multidisciplinary research. On the other hand, electronic health records frequently contain unconstrained clinical text with high variability in the use of terms and documentation practices, and without commitment to the grammatical or syntactic structure of the language (e.g., triage notes, physician and nurse notes, chief complaints, etc.). This hinders the performance of natural language processing technologies, which typically rely heavily on the syntax and grammatical structure of the text.
This document introduces our method for transforming unconstrained clinical text found in electronic health information systems into a formal (computationally understandable) representation that is suitable for querying, integration, contextualization, and reuse, and is resilient to the grammatical and syntactic irregularities of clinical text. We present our design rationale, method, and results of an evaluation in processing chief complaints and triage notes from 8 different emergency departments in Houston, Texas. Finally, we discuss the significance of our contribution in enabling the use of clinical text in a practical bio-surveillance setting.

Relevance:

30.00%

Publisher:

Abstract:

We present GrafLab (GRAvity Field LABoratory), a novel graphical-user-interface program for spherical harmonic synthesis (SHS) created in MATLAB®. The program can conveniently compute 38 different functionals of the geopotential up to ultra-high degrees and orders of spherical harmonic expansion. For the most difficult part of the SHS, namely the evaluation of the fully normalized associated Legendre functions (fnALFs), we used three different approaches according to the required maximum degree: (i) the standard forward column method (up to maximum degree 1800, in some cases up to degree 2190); (ii) the modified forward column method combined with Horner's scheme (up to maximum degree 2700); and (iii) extended-range arithmetic (up to an arbitrary maximum degree). For maximum degree 2190, SHS with fnALFs evaluated using the extended-range arithmetic approach takes only approximately 2-3 times longer than its standard arithmetic counterpart, i.e., the standard forward column method. In GrafLab, the functionals of the geopotential can be evaluated on a regular grid or point-wise, and the input coordinates can either be read from a data file or entered manually. For computation on a regular grid we apply the lumped-coefficients approach because of its significant time efficiency. Furthermore, if a full variance-covariance matrix of the spherical harmonic coefficients is available, the commission errors of the functionals can be computed. When computing on a regular grid, the output functionals or their commission errors may be depicted on a map using an automatically selected cartographic projection.
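For illustration, the first of the three fnALF approaches, the standard forward column method, can be sketched as follows. This is a hedged Python transcription of the well-known recursion under geodetic (4π) normalization, not GrafLab's MATLAB code; as the abstract notes, this variant is numerically stable only up to degrees of roughly 1800 in double precision.

```python
import math

def fnalf(nmax, theta):
    """Fully normalized associated Legendre functions Pnm(cos(theta))
    up to degree/order nmax via the standard forward column method.
    Returns a dict {(n, m): value}; theta is the polar angle in radians."""
    t, u = math.cos(theta), math.sin(theta)
    P = {(0, 0): 1.0}
    if nmax == 0:
        return P
    P[(1, 1)] = math.sqrt(3.0) * u
    for m in range(2, nmax + 1):  # sectorial seeds P_mm
        P[(m, m)] = u * math.sqrt((2.0 * m + 1.0) / (2.0 * m)) * P[(m - 1, m - 1)]
    for m in range(0, nmax):      # forward recursion in degree n, column by column
        P[(m + 1, m)] = math.sqrt(2.0 * m + 3.0) * t * P[(m, m)]
        for n in range(m + 2, nmax + 1):
            a = math.sqrt((2.0 * n - 1.0) * (2.0 * n + 1.0)
                          / ((n - m) * (n + m)))
            b = math.sqrt((2.0 * n + 1.0) * (n + m - 1.0) * (n - m - 1.0)
                          / ((n - m) * (n + m) * (2.0 * n - 3.0)))
            P[(n, m)] = a * t * P[(n - 1, m)] - b * P[(n - 2, m)]
    return P
```

For example, the recursion reproduces the closed forms P10 = sqrt(3)·cos(theta) and P20 = sqrt(5)·(3·cos²(theta) − 1)/2.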

Relevance:

30.00%

Publisher:

Abstract:

In the last decade, the aquatic eddy correlation (EC) technique has proven to be a powerful approach for non-invasive measurements of oxygen fluxes across the sediment-water interface. Fundamental to the EC approach is the correlation of turbulent velocity and oxygen concentration fluctuations measured at high frequency in the same sampling volume. Oxygen concentrations are commonly measured with fast-responding electrochemical microsensors. However, due to their own oxygen consumption, electrochemical microsensors are sensitive to changes in the diffusive boundary layer surrounding the probe and thus to changes in the ambient flow velocity. This so-called stirring sensitivity of microsensors constitutes an inherent correlation between flow velocity and oxygen sensing, and thus an artificial flux that can confound the benthic flux determination. To assess the artificial flux, we measured the correlation between the turbulent flow velocity and the signal of oxygen microsensors in a sealed annular flume without any oxygen sinks or sources. Experiments revealed significant correlations, even for sensors designed to have low stirring sensitivities of ~0.7%. The artificial fluxes depended on ambient flow conditions and, counterintuitively, increased at higher velocities because of the nonlinear contribution of turbulent velocity fluctuations. The measured artificial fluxes ranged from 2 to 70 mmol m**-2 d**-1 for weak and very strong turbulent flow, respectively. Furthermore, the stirring sensitivity depended on the sensor orientation towards the flow. Optical microsensors (optodes), which should not exhibit a stirring sensitivity, were tested in parallel and did not show any significant correlation between O2 signals and turbulent flow. In conclusion, EC data obtained with electrochemical sensors can be affected by artificial flux, and we recommend using optical microsensors in future EC studies.
Flume experiments were conducted in February 2013 at the Institute for Environmental Sciences, University of Koblenz-Landau, in Landau. Experiments were performed in a closed oval-shaped acrylic glass flume with a cross-sectional width of 4 cm, a height of 10 cm, and a total length of 54 cm. The fluid flow was induced by a propeller driven by a motor, and mean flow velocities of up to 20 cm s**-1 were generated by applying voltages between 0 V and 4 V DC. The flume was completely sealed with an acrylic glass cover. Oxygen sensors were inserted through rubber seal fittings, which allowed positioning the sensors with inclinations to the main flow direction of ~60°, ~95°, and ~135°. A Clark-type electrochemical O2 microsensor with a low stirring sensitivity (0.7%) was tested, and a fast-responding needle-type O2 optode (PyroScience GmbH, Germany) was used as a reference, as optodes should not be stirring sensitive. Instantaneous three-dimensional flow velocities were measured at 7.4 Hz using stereoscopic particle image velocimetry (PIV), and the velocity at the sensor tip was extracted. The correlation of the fluctuating O2 sensor signals and the fluctuating velocities was quantified with a cross-correlation analysis; a significant cross-correlation is equivalent to a significant artificial flux. For a total of 18 experiments, the flow velocity was adjusted between 1.7 and 19.2 cm s**-1, and 3 different orientations of the electrochemical sensor were tested, with inclination angles of ~60°, ~95°, and ~135° with respect to the main flow direction. In experiments 16-18, wavelike flow was induced, whereas in all other experiments the motor was driven at constant voltage. In 7 experiments, O2 was additionally measured by optodes.
Although performed simultaneously with the electrochemical sensor, the optode measurements are listed as separate experiments (denoted by the suffix 'op' in the filename), because the velocity time series was extracted at the optode tip, located at a different position in the flume.
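The cross-correlation test described above can be sketched at zero lag. This is a minimal illustration, not the study's analysis code, which presumably also examined nonzero lags and significance levels.

```python
def cross_correlation(x, y):
    """Normalized zero-lag cross-correlation of two equally long
    fluctuation records: subtract the means, then correlate.
    In the dataset's sense, a value significantly different from
    zero between velocity and O2 signals indicates an artificial
    flux (stirring sensitivity)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    sx = (sum((a - mx) ** 2 for a in x) / n) ** 0.5
    sy = (sum((b - my) ** 2 for b in y) / n) ** 0.5
    return cov / (sx * sy)
```

Applied to the PIV velocity component at the sensor tip and the microsensor signal, a near-zero result is the behavior expected of the optode reference.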

Relevance:

30.00%

Publisher:

Abstract:

The Filchner-Ronne ice shelf, which drains most of the marine-based portions of the West Antarctic ice sheet, is the largest ice shelf on Earth by volume. The origin and properties of the ice that constitutes this shelf are poorly understood, because a strong reflecting interface within the ice and the diffuse nature of the ice-ocean interface make seismic and radio echo sounding data difficult to interpret. Ice in the upper part of the shelf is of meteoric origin, but it has been proposed that a basal layer of saline ice accumulates from below. Here we present the results of an analysis of the physical and chemical characteristics of an ice core drilled almost to the bottom of the Ronne ice shelf. We observe a change in ice properties at about 150 m depth, which we ascribe to a change from meteoric ice to basal marine ice. The basal ice is very different from sea ice formed at the ocean surface, and we propose a formation mechanism in which ice platelets in the water column accrete to the bottom of the ice shelf.