84 results for Static Mixer


Relevance:

10.00%

Publisher:

Abstract:

Multisensory interactions are a fundamental feature of brain organization. Principles governing multisensory processing have been established by varying stimulus location, timing and efficacy independently. Determining whether and how such principles operate when stimuli vary dynamically in their perceived distance (as when looming/receding) provides an assay for synergy among the above principles and also a means for linking multisensory interactions between rudimentary stimuli with higher-order signals used for communication and motor planning. Human participants indicated movement of looming or receding versus static stimuli that were visual, auditory, or multisensory combinations while 160-channel EEG was recorded. Multivariate EEG analyses and distributed source estimations were performed. Nonlinear interactions between looming signals were observed at early poststimulus latencies (∼75 ms) in analyses of voltage waveforms, global field power, and source estimations. These looming-specific interactions positively correlated with reaction time facilitation, providing direct links between neural and performance metrics of multisensory integration. Statistical analyses of source estimations identified looming-specific interactions within the right claustrum/insula extending inferiorly into the amygdala and also within the bilateral cuneus extending into the inferior and lateral occipital cortices. Multisensory effects common to all conditions, regardless of perceived distance and congruity, followed (∼115 ms) and manifested as faster transitions between temporally stable brain networks (vs summed responses to unisensory conditions). We demonstrate the early-latency, synergistic interplay between existing principles of multisensory interactions. Such findings change the manner in which multisensory interactions should be modeled at neural and behavioral/perceptual levels. We also provide neurophysiologic backing for the notion that looming signals receive preferential treatment during perception.
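The core additive-model logic (a nonlinear interaction is a multisensory response that deviates from the sum of the unisensory responses) can be sketched as follows; array shapes, names and the analysis window are illustrative assumptions, not the authors' exact pipeline:

# Minimal sketch of the additive-model test for nonlinear multisensory
# interactions: the multisensory (AV) evoked response is compared against
# the sum of the unisensory (A, V) responses.
import numpy as np

def interaction_term(av, a, v):
    """Return AV - (A + V) per channel x time sample of the evoked data."""
    return av - (a + v)

def global_field_power(evoked):
    """GFP: spatial standard deviation across channels at each time point."""
    return evoked.std(axis=0)

# Toy data: 160 channels, 300 samples (-100..498 ms at 2 ms steps).
rng = np.random.default_rng(0)
a, v, av = (rng.normal(size=(160, 300)) for _ in range(3))
times = np.arange(-100, 500, 2)

inter = interaction_term(av, a, v)
gfp = global_field_power(inter)
# Inspect the early post-stimulus window (~75 ms) where the abstract reports
# looming-specific nonlinear interactions.
win = (times >= 60) & (times <= 90)
print("mean interaction GFP in 60-90 ms window:", gfp[win].mean())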

Relevance:

10.00%

Publisher:

Abstract:

There are far-reaching conceptual similarities between bi-static surface georadar and post-stack, "zero-offset" seismic reflection data, which are expressed in largely identical processing flows. One important difference is, however, that standard deconvolution algorithms routinely used to enhance the vertical resolution of seismic data are notoriously problematic or even detrimental to the overall signal quality when applied to surface georadar data. We have explored various options for alleviating this problem and have tested them on a geologically well-constrained surface georadar dataset. Standard stochastic and direct deterministic deconvolution approaches proved to be largely unsatisfactory. While least-squares-type deterministic deconvolution showed some promise, the inherent uncertainties involved in estimating the source wavelet introduced some artificial "ringiness". In contrast, we found spectral balancing approaches to be an effective, practical and robust means of enhancing the vertical resolution of surface georadar data, particularly, but not exclusively, in the uppermost part of the georadar section, which is notoriously plagued by the interference of the direct air- and groundwaves. For the data considered in this study, it can be argued that band-limited spectral blueing may provide somewhat better results than standard band-limited spectral whitening, particularly in the uppermost part of the section affected by the interference of the air- and groundwaves. Interestingly, this finding is consistent with the fact that the amplitude spectrum resulting from least-squares-type deterministic deconvolution is characterized by a systematic enhancement of higher frequencies at the expense of lower frequencies and hence is blue rather than white. It is also consistent with increasing evidence that spectral "blueness" is a seemingly universal, albeit enigmatic, property of the distribution of reflection coefficients in the Earth. Our results therefore indicate that spectral balancing techniques in general, and spectral blueing in particular, represent simple yet effective means of enhancing the vertical resolution of surface georadar data and, in many cases, could turn out to be a preferable alternative to standard deconvolution approaches.
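For illustration, a minimal sketch of band-limited spectral whitening of a single georadar trace is given below (function name, band edges and the stabilization constant are illustrative assumptions, not the processing flow actually used in the study); spectral blueing would instead shape the in-band spectrum toward a gentle high-frequency ramp rather than a flat one.

# Band-limited spectral whitening: flatten the amplitude spectrum inside
# the signal band while preserving the phase.
import numpy as np

def spectral_whiten(trace, dt, f_lo, f_hi, eps=1e-3):
    """Equalize spectral amplitudes between f_lo and f_hi (Hz)."""
    n = len(trace)
    spec = np.fft.rfft(trace)
    freqs = np.fft.rfftfreq(n, d=dt)
    amp = np.abs(spec)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    whitened = spec.copy()
    # Divide by the (stabilized) amplitude inside the band -> flat spectrum.
    whitened[band] /= (amp[band] + eps * amp.max())
    # Suppress energy outside the signal band to keep the result band-limited.
    whitened[~band] = 0.0
    return np.fft.irfft(whitened, n)

# Toy usage: 512-sample trace at 0.2 ns sampling, balanced between
# 50 and 250 MHz.
rng = np.random.default_rng(1)
trace = rng.normal(size=512)
out = spectral_whiten(trace, dt=0.2e-9, f_lo=50e6, f_hi=250e6)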

Relevance:

10.00%

Publisher:

Abstract:

Game theory describes and analyzes strategic interaction. A distinction is usually drawn between static games, which are strategic situations in which the players choose only once and simultaneously, and dynamic games, which are strategic situations involving sequential choices. In addition, dynamic games can be further classified according to perfect and imperfect information. Indeed, a dynamic game is said to exhibit perfect information whenever, at any point of the game, every player has full informational access to all choices that have been made so far. In the case of imperfect information, by contrast, some players are not fully informed about some choices. Game-theoretic analysis proceeds in two steps. Firstly, games are modelled by so-called form structures which extract and formalize the significant parts of the underlying strategic interaction. The basic and most commonly used models of games are the normal form, which rather sparsely describes a game merely in terms of the players' strategy sets and utilities, and the extensive form, which models a game in a more detailed way as a tree. In fact, it is standard to formalize static games with the normal form and dynamic games with the extensive form. Secondly, solution concepts are developed to solve models of games in the sense of identifying the choices that should be taken by rational players. Indeed, the ultimate objective of the classical approach to game theory, which is of normative character, is the development of a solution concept that is capable of identifying a unique choice for every player in an arbitrary game. However, given the large variety of games, it is not at all certain whether it is possible to devise a solution concept with such universal capability. Alternatively, interactive epistemology provides an epistemic approach to game theory of descriptive character. This rather recent discipline analyzes the relation between knowledge, belief and choice of game-playing agents in an epistemic framework. The description of the players' choices in a given game relative to various epistemic assumptions constitutes the fundamental problem addressed by an epistemic approach to game theory. In a general sense, the objective of interactive epistemology consists in characterizing existing game-theoretic solution concepts in terms of epistemic assumptions as well as in proposing novel solution concepts by studying the game-theoretic implications of refined or new epistemic hypotheses. Intuitively, an epistemic model of a game can be interpreted as representing the reasoning of the players. Indeed, before making a decision in a game, the players reason about the game and their respective opponents, given their knowledge and beliefs. Precisely these epistemic mental states on which players base their decisions are explicitly expressible in an epistemic framework. In this PhD thesis, we consider an epistemic approach to game theory from a foundational point of view. In Chapter 1, basic game-theoretic notions as well as Aumann's epistemic framework for games are expounded and illustrated. Also, Aumann's sufficient conditions for backward induction are presented and his conceptual views discussed. In Chapter 2, Aumann's interactive epistemology is conceptually analyzed. In Chapter 3, which is based on joint work with Conrad Heilmann, a three-stage account for dynamic games is introduced and a type-based epistemic model is extended with a notion of agent connectedness. Then, sufficient conditions for backward induction are derived. In Chapter 4, which is based on joint work with Jérémie Cabessa, a topological approach to interactive epistemology is initiated. In particular, the epistemic-topological operator limit knowledge is defined and some of its implications for games are considered. In Chapter 5, which is based on joint work with Jérémie Cabessa and Andrés Perea, Aumann's impossibility theorem on agreeing to disagree is revisited and weakened, in the sense that possible contexts are provided in which agents can indeed agree to disagree.
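As an illustration of the kind of solution concept at stake, here is a minimal sketch of backward induction on a finite perfect-information game tree (the tree encoding and payoffs are illustrative assumptions, not the thesis's formalism):

# A node is either a tuple of utilities (one per player: a leaf) or a pair
# (player, [children]) naming the player who moves at that node.
def backward_induction(node):
    """Return the utility vector reached under backward induction."""
    if isinstance(node, tuple):          # leaf: payoff vector
        return node
    player, children = node
    # The active player picks the child maximizing her own payoff.
    return max((backward_induction(c) for c in children),
               key=lambda u: u[player])

# Two-player example: player 0 moves first, then player 1.
game = (0, [
    (1, [(3, 1), (0, 0)]),   # subtree after player 0's first action
    (1, [(2, 2), (1, 3)]),   # subtree after player 0's second action
])
print(backward_induction(game))  # -> (3, 1)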

Relevance:

10.00%

Publisher:

Abstract:

Static incubation tests, where microcapsules and beads are contacted with polymer and protein solutions, have been developed for the characterization of permselective materials applied for bioartificial organs and drug delivery. A combination of polymer ingress, detected by size-exclusion chromatography, and protein ingress/egress, assessed by gel electrophoresis, provides information regarding the diffusion kinetics, molar mass cutoff (MMCO) and permeability. This represents an improvement over existing permeability measurements that are based on the diffusion of a single type of solute. Specifically, the permeability of capsules based on alginate, cellulose sulfate and poly(methylene-co-guanidine) was characterized as a function of membrane thickness. Solid alginate beads were also evaluated. The MMCO of these capsules was estimated to be between 80 and 90 kDa using polymers, and between 116 and 150 kDa with proteins. Apparently, the globular shape of the proteins (radius of gyration (Rg) of 4.2-4.6 nm) facilitates their passage through the membrane, compared to the polysaccharide coil conformation (Rg of 6.5-8.3 nm). An increase in the capsule membrane thickness reduced these values. The MMCO of the beads, which do not have a membrane limiting their permselective properties, was higher: between 110 and 200 kDa with dextrans, and between 150 and 220 kDa with proteins. Therefore, although the permeability estimated with biologically relevant molecules is generally higher due to their lower radius of gyration, the MMCO values obtained with synthetic and natural water-soluble polymers correlate well, and both can be used as in vitro metrics for the immune protection ability of microcapsules and microbeads. This article shows, to the authors' knowledge, the first reported concordance between permeability measures based on model natural and biological macromolecules.

Relevance:

10.00%

Publisher:

Abstract:

Objective: It was the aim of this study to investigate facial emotion recognition (FER) in the elderly with cognitive impairment. Method: Twelve patients with Alzheimer's disease (AD) and 12 healthy control subjects were asked to name dynamic or static pictures of basic facial emotions using the Multimodal Emotion Recognition Test and to assess the degree of their difficulty in the recognition task, while their electrodermal conductance was registered as an unconscious processing measure. Results: AD patients had lower objective recognition performances for disgust and fear, but only disgust was accompanied by decreased subjective FER in AD patients. The electrodermal response was similar in both groups. No significant effect of dynamic versus static emotion presentation on FER was found. Conclusion: Selective impairment in recognizing facial expressions of disgust and fear may indicate a nonlinear decline in FER capacity with increasing cognitive impairment and result from progressive though specific damage to neural structures engaged in emotional processing and facial emotion identification. Although our results suggest unchanged unconscious FER processing with increasing cognitive impairment, further investigations on unconscious FER and self-awareness of FER capacity in neurodegenerative disorders are required.

Relevance:

10.00%

Publisher:

Abstract:

Quartz veins ranging in size from less than 50 cm in length and 5 cm in width to greater than 10 m in length and 5 m in width are found throughout the Central Swiss Alps. In some cases, the veins are completely filled with milky quartz, while in others, sometimes spectacular void-filling quartz crystals are found. The style of vein filling and size is controlled by host rock composition and deformation history. Temperatures of vein formation, estimated using stable isotope thermometry and mineral equilibria, cover a range from 450 degrees C down to 150 degrees C. Vein formation started at 18 to 20 Ma and continued for over 10 My. The oxygen isotope values of quartz veins range from 10 to 20 permil, and in almost all cases are equal to those of the hosting lithology. The strongly rock-buffered veins imply a low fluid/rock ratio and minimal fluid flow. In order to explain massive, nearly monomineralic quartz formation without exceptionally large fluid fluxes, a mechanism of differential pressure and silica diffusion, combined with pressure solution, is proposed for early vein formation. Fluid inclusions and hydrous minerals in late-formed veins have extremely low delta D values, consistent with meteoric water infiltration. The change from a rock-buffered, static fluid to infiltration from above can be explained in terms of changes in the large-scale deformation style occurring between 20 and 15 Ma. The rapid cooling of the Central Alps identified in previous studies may be explained, in part, by infiltration of cold meteoric waters along fracture systems down to depths of 10 km or more. An average water flux of 0.15 cm(3) cm(-2) yr(-1) entering the rock and reemerging heated by 40 degrees C is sufficient to cool rock at 10 km depth by 100 degrees C in 5 million years. The very negative delta D values of < -130 permil for the late-stage fluids are well below the annual average values measured in meteoric water in the region today. The low fossil delta D values indicate that the Central Alps were at a higher elevation in the Neogene. Such a conclusion is supported by an earlier work, where a paleoaltitude of 5000 meters was proposed on the basis of large erratic boulders found at low elevations far from their origin.
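A rough back-of-the-envelope check of that water-flux figure can be sketched as follows (a sketch only, with assumed textbook values for rock and water properties; it is not the authors' calculation and recovers only a column-averaged order of magnitude):

# Can 0.15 cm^3 cm^-2 yr^-1 of water, reemerging 40 C warmer, cool a
# 10-km rock column substantially in 5 Myr?
RHO_W, C_W = 1000.0, 4186.0   # water density (kg/m3), heat capacity (J/kg/K)
RHO_R, C_R = 2700.0, 1000.0   # assumed rock density and heat capacity

flux = 0.15e-2                # m^3 of water per m^2 per yr (= 0.15 cm/yr)
dT_water, years = 40.0, 5e6
depth = 10e3                  # m

heat_removed = flux * RHO_W * C_W * dT_water * years   # J per m^2
column_capacity = depth * RHO_R * C_R                  # J per m^2 per K
print("column-average cooling: %.0f K" % (heat_removed / column_capacity))
# -> roughly 45-50 K averaged over the whole column, i.e. the right order
# of magnitude for ~100 degrees C of cooling concentrated at depth.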

Relevance:

10.00%

Publisher:

Abstract:

Cannula design is of prime importance for venous drainage during cardiopulmonary bypass (CPB). To evaluate cannulas intended for CPB, an in vitro circuit was set up with silicone tubing between the test cannula, encased in a movable preload reservoir, and another static reservoir. The pressure-drop (DeltaP) value (P-drainage - P-preload) was measured using Millar pressure transducers. Flow rate (Q) was measured using an ultrasound flowmeter. Data display and data recording were controlled using a LabView application custom made for our experiments. Our results demonstrated that the DeltaP, Q, and cannula resistance (DeltaP/Q) values were significantly decreased when the cannula diameter was increased for the Smart and Medtronic cannulas. The Smartcanula showed 36% and 43% less resistance compared to the Medtronic venous and Medtronic femoral cannulas, respectively. The cannula shape (straight or curved tip) did not affect the DLP cannula resistance. Of the five cannulas tested, the Smartcanula outperformed the other commercially available cannulas. The mean (DeltaP/Q) values were 3.3 +/- 0.08, 4.07 +/- 0.08, 5.58 +/- 0.10, 5.74 +/- 0.15, and 6.45 +/- 0.15 for the Smart, Medtronic, Edwards, Sarns, and Gambro cannulas, respectively (two-way ANOVA, p < 0.0001). In conclusion, the present assay allows discrimination between different forms of cannula with high or low lumen resistance.
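The comparison metric itself is a simple ratio; a minimal sketch follows (units and example numbers are illustrative assumptions, not measured values from the study):

# Cannula resistance R = DeltaP / Q, with DeltaP = P_drainage - P_preload.
def cannula_resistance(p_drainage, p_preload, flow):
    """Return (P_drainage - P_preload) / Q."""
    return (p_drainage - p_preload) / flow

# Example: -30 mmHg in the drainage line, 5 mmHg preload, 4 L/min flow.
print(cannula_resistance(-30.0, 5.0, 4.0))  # -> -8.75 mmHg per L/min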

Relevance:

10.00%

Publisher:

Abstract:

Objective: Biomonitoring of solvents using the unchanged substance in urine as an exposure indicator is still relatively scarce due to some discrepancies between the results reported in the literature. Based on the assessment of toluene exposure, the aim of this work was to evaluate the effects of some steps likely to bias the results and to measure urinary toluene both in experimentally exposed volunteers and in workers of rotogravure factories. Methods: Static headspace was used for toluene analysis. o-Cresol was also measured for comparison. Urine collection, storage and conservation conditions were studied to evaluate possible loss or contamination of toluene in controlled situations applied to six volunteers in an exposure chamber according to four scenarios with exposure at stable levels from 10 to 50 ppm. Kinetics of elimination of toluene were determined over 24 h. A field study was then carried out in a total of 29 workers from two rotogravure printing facilities. Results: Potential contamination during urine collection in the field is confirmed to be a real problem, but technical precautions for sampling, storage and analysis can easily be followed to control the situation. In the volunteers at rest, urinary toluene showed a rapid increase after 2 h, with a steady level after about 3 h. At 47.1 ppm, the mean cumulated excretion was about 0.005% of the amount of toluene ventilated. The correlation between toluene levels in air and in the end-of-exposure urinary sample was excellent (r = 0.965). In the field study, the median personal exposure to toluene was 32 ppm (range 3.6-148). According to the correlations between environmental and biological monitoring data, the post-shift urinary toluene (r = 0.921) and o-cresol (r = 0.873) concentrations were, respectively, 75.6 mu g/l and 0.76 mg/g creatinine for a 50 ppm toluene personal exposure. The corresponding urinary toluene concentration before the next shift was 11 mu g/l (r = 0.883). Conclusion: Urinary toluene was shown once more to be a very promising alternative to o-cresol and could be recommended as a biomarker of choice for solvent exposure. [Authors]

Relevance:

10.00%

Publisher:

Abstract:

Recognition by the T-cell receptor (TCR) of immunogenic peptides (p) presented by Class I major histocompatibility complexes (MHC) is the key event in the immune response against virus-infected cells or tumor cells. A study of the 2C TCR/SIYR/H-2K(b) system using computational alanine scanning and a much faster binding free energy decomposition based on the Molecular Mechanics-Generalized Born Surface Area (MM-GBSA) method is presented. The results show that the TCR-p-MHC binding free energy decomposition using this approach, including entropic terms, provides a detailed and reliable description of the interactions between the molecules at an atomistic level. Comparison of the decomposition results with experimentally determined activity differences for alanine mutants yields a correlation of 0.67 when the entropy is neglected and 0.72 when the entropy is taken into account. Similarly, comparison of experimental activities with variations in binding free energies determined by computational alanine scanning yields correlations of 0.72 and 0.74 when the entropy is neglected or taken into account, respectively. Some key interactions for TCR-p-MHC binding are analyzed and some possible side-chain replacements are proposed in the context of TCR protein engineering. In addition, a comparison of the two theoretical approaches for estimating the role of each side chain in the complexation is given, and a new ad hoc approach to decompose the vibrational entropy term into atomic contributions, the linear decomposition of the vibrational entropy (LDVE), is introduced. The latter allows the rapid calculation of the entropic contribution of interesting side chains to the binding. This new method is based on the idea that the most important contributions to the vibrational entropy of a molecule originate from residues that contribute most to the vibrational amplitude of the normal modes. The LDVE approach is shown to provide results very similar to those of the exact but highly computationally demanding method.
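In generic form (a sketch of the standard MM-GBSA decomposition; the paper's exact protocol and entropy treatment may differ in detail), the binding free energy, its per-residue decomposition and the alanine-scanning comparison can be written as:

\Delta G_{\mathrm{bind}} = \langle \Delta E_{\mathrm{MM}} \rangle + \langle \Delta G_{\mathrm{GB}} \rangle + \langle \Delta G_{\mathrm{SA}} \rangle - T \langle \Delta S \rangle, \qquad \Delta G_{\mathrm{bind}} \approx \sum_{r} \Delta G_{r},

\Delta\Delta G_{X \to \mathrm{Ala}} = \Delta G_{\mathrm{bind}}^{\mathrm{Ala\ mutant}} - \Delta G_{\mathrm{bind}}^{\mathrm{wild\ type}},

where the averages are taken over molecular dynamics snapshots and \Delta G_{r} is the contribution attributed to residue r.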

Relevance:

10.00%

Publisher:

Abstract:

The study of wave propagation at sonic frequencies in soil allows the determination of elasticity parameters. These parameters are consistent with those measured simultaneously by static loading. The acquisition of in situ elasticity parameters, combined with a laboratory description of the elastoplastic behaviour, can lead to in situ elastoplastic curves.
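The step from wave speeds to elasticity parameters uses the standard isotropic-elasticity relations; a minimal sketch follows (the soil values in the example are illustrative assumptions):

# Elastic moduli from P- and S-wave velocities: G = rho*Vs^2,
# nu = (Vp^2 - 2Vs^2) / (2(Vp^2 - Vs^2)), E = 2G(1 + nu).
def elastic_moduli(vp, vs, rho):
    """Return (E, nu, G) from wave velocities (m/s) and density (kg/m3)."""
    G = rho * vs ** 2                                    # shear modulus (Pa)
    nu = (vp ** 2 - 2 * vs ** 2) / (2 * (vp ** 2 - vs ** 2))
    E = 2 * G * (1 + nu)                                 # Young's modulus (Pa)
    return E, nu, G

# Example: soil with Vp = 400 m/s, Vs = 200 m/s, rho = 1800 kg/m3.
E, nu, G = elastic_moduli(400.0, 200.0, 1800.0)
print(f"E = {E/1e6:.0f} MPa, nu = {nu:.2f}, G = {G/1e6:.0f} MPa")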

Relevance:

10.00%

Publisher:

Abstract:

High-field (≥3 T) cardiac MRI is challenged by inhomogeneities of both the static magnetic field (B(0)) and the transmit radiofrequency field (B(1)+). The inhomogeneous B fields not only demand improved shimming methods but also impede the correct determination of the zero-order terms, i.e., the local resonance frequency f(0) and the radiofrequency power to generate the intended local B(1)+ field. In this work, dual echo time B(0)-map and dual flip angle B(1)+-map acquisition methods are combined to acquire multislice B(0)- and B(1)+-maps simultaneously covering the entire heart in a single breath hold of 18 heartbeats. A previously proposed excitation pulse shape dependent slice profile correction is tested and applied to reduce systematic errors of the multislice B(1)+-map. Localized higher-order shim correction values including the zero-order terms for frequency f(0) and radiofrequency power can be determined based on the acquired B(0)- and B(1)+-maps. This method has been tested in 7 healthy adult human subjects at 3 T and improved the B(0) field homogeneity (standard deviation) from 60 Hz to 35 Hz and the average B(1)+ field from 77% to 100% of the desired B(1)+ field when compared to more commonly used preparation methods.
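The two mapping principles named in the abstract can be sketched per voxel as follows (a minimal sketch of dual-echo-time B0 mapping and the dual-flip-angle/double-angle B1+ method; the paper's acquisition and correction details are not reproduced, and the variable names are illustrative):

import numpy as np

def b0_map(phase1, phase2, te1, te2):
    """Off-resonance f0 (Hz) from phases (rad) at two echo times (s)."""
    dphi = np.angle(np.exp(1j * (phase2 - phase1)))  # wrap to (-pi, pi]
    return dphi / (2 * np.pi * (te2 - te1))

def b1_scale(s1, s2, alpha_nominal):
    """Relative B1+ from magnitude images at flip angles alpha and 2*alpha.

    For an ideal spoiled acquisition S(a) ~ sin(a), so S2/S1 = 2*cos(alpha)
    and alpha_actual = arccos(S2 / (2*S1)).
    """
    alpha_actual = np.arccos(np.clip(s2 / (2 * s1), -1.0, 1.0))
    return alpha_actual / alpha_nominal

# Toy voxel: 50 Hz off-resonance, 90% of the nominal B1+ field.
te1, te2, alpha = 2.0e-3, 4.0e-3, np.deg2rad(60.0)
f0_true, b1_true = 50.0, 0.9
ph1, ph2 = 2 * np.pi * f0_true * te1, 2 * np.pi * f0_true * te2
s1, s2 = np.sin(b1_true * alpha), np.sin(2 * b1_true * alpha)
print(b0_map(ph1, ph2, te1, te2), b1_scale(s1, s2, alpha))  # -> 50.0, 0.9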

Relevance:

10.00%

Publisher:

Abstract:

Whole-body (WB) planar imaging has long been one of the staple methods of dosimetry, and its quantification has been formalized by the MIRD Committee in Pamphlet No. 16. One of the issues not specifically addressed in the formalism occurs when the count rates reaching the detector are sufficiently high to result in camera count saturation. Camera dead-time effects have been extensively studied, but all of the developed correction methods assume static acquisitions. However, during WB planar (sweep) imaging, a variable amount of imaged activity exists in the detector's field of view as a function of time, and therefore the camera saturation is time dependent. A new time-dependent algorithm was developed to correct for dead-time effects during WB planar acquisitions that accounts for relative motion between the detector heads and the imaged object. Static camera dead-time parameters were acquired by imaging decaying activity in a phantom and obtaining a saturation curve. Using these parameters, an iterative algorithm akin to Newton's method was developed, which takes into account the variable count rate seen by the detector as a function of time. The algorithm was tested on simulated data as well as on a whole-body scan of high-activity Samarium-153 in an ellipsoid phantom. A complete set of parameters from unsaturated phantom data necessary for count-rate-to-activity conversion was also obtained, including build-up and attenuation coefficients, in order to convert corrected count rate values to activity. The algorithm proved successful in accounting for motion- and time-dependent saturation effects in both the simulated and measured data and converged to any desired degree of precision. The clearance half-life calculated from the ellipsoid phantom data was 45.1 h after dead-time correction and 51.4 h with no correction; the physical decay half-life of Samarium-153 is 46.3 h. Accurate WB planar dosimetry of high activities relies on successfully compensating for camera saturation in a way that takes into account the variable activity in the field of view, i.e. time-dependent dead-time effects. The algorithm presented here accomplishes this task.
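To illustrate the Newton-style inversion at the core of such corrections, here is a minimal sketch for the classic paralyzable dead-time model m = n*exp(-n*tau) (an assumption for illustration; the paper's exact camera model and the time-dependent motion bookkeeping are not reproduced):

import math

def true_rate(m, tau, tol=1e-9, max_iter=50):
    """Solve m = n*exp(-n*tau) for the true rate n (low-rate branch)."""
    n = m                                  # measured rate as starting guess
    for _ in range(max_iter):
        f = n * math.exp(-n * tau) - m     # residual
        fp = math.exp(-n * tau) * (1.0 - n * tau)  # derivative df/dn
        step = f / fp
        n -= step
        if abs(step) < tol * max(n, 1.0):
            break
    return n

# Example: tau = 5 microseconds, measured 50 kcps.
tau = 5e-6
n = true_rate(50e3, tau)
print(n, n * math.exp(-n * tau))  # ~71.5 kcps true rate; check gives 50e3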

Relevance:

10.00%

Publisher:

Abstract:

Solid-phase microextraction (SPME) has been widely used for many years in various applications, such as environmental and water samples, food and fragrance analysis, or biological fluids. The aim of this study was to suggest the SPME method as an alternative to conventional techniques used in the evaluation of worker exposure to benzene, toluene, ethylbenzene, and xylene (BTEX). Polydimethylsiloxane/Carboxen (PDMS/CAR) proved to be the most effective stationary phase material for sorbing BTEX among the materials tested (polyacrylate, PDMS, PDMS/divinylbenzene, Carbowax/divinylbenzene). Various experimental conditions were studied to apply SPME to BTEX quantitation in field situations. The uptake rate of the selected fiber (75 microm PDMS/CAR) was determined for each analyte at various concentrations, relative humidities, and airflow velocities from static (calm air) to dynamic (> 200 cm/s) conditions. The SPME method was also compared with the National Institute for Occupational Safety and Health (NIOSH) method 1501. Unlike the latter, the SPME approach fulfills the new requirement for the threshold limit value-short term exposure limit (TLV-STEL) of 2.5 ppm for benzene (8 mg/m(3)).
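With a known uptake rate, the diffusive-sampling conversion from sorbed mass to air concentration is a one-line formula; a minimal sketch follows (the numbers are illustrative assumptions, not calibration values from the study):

# Diffusive SPME sampling: C = m / (U * t), with m the sorbed mass,
# U the uptake rate and t the sampling time.
def air_concentration(mass_ng, uptake_ml_per_min, minutes):
    """Return concentration in ng/mL, numerically equal to mg/m3."""
    return mass_ng / (uptake_ml_per_min * minutes)

# Example: 120 ng benzene sorbed in 15 min at an uptake rate of 1.0 mL/min
# -> 8 mg/m3, i.e. about the 2.5 ppm TLV-STEL mentioned above.
print(air_concentration(120.0, 1.0, 15.0))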

Relevance:

10.00%

Publisher:

Abstract:

The large spatial inhomogeneity in the transmit B-1 field (B-1(+)) observable in human MR images at high static magnetic fields (B-0) severely impairs image quality. To overcome this effect in brain T-1-weighted images, the MPRAGE sequence was modified to generate two different images at different inversion times (MP2RAGE). By combining the two images in a novel fashion, it was possible to create T-1-weighted images free of proton density contrast, T-2* contrast, reception bias field and, to first order, transmit field inhomogeneity. MP2RAGE sequence parameters were optimized using Bloch equations to maximize the contrast-to-noise ratio per unit of time between brain tissues and to minimize the effect of B-1(+) variations through space. Images of high anatomical quality and excellent brain tissue differentiation, suitable for applications such as segmentation and voxel-based morphometry, were obtained at 3 and 7 T. From such T-1-weighted images, acquired within 12 min, high-resolution 3D T-1 maps were routinely calculated at 7 T with sub-millimeter voxel resolution (0.65-0.85 mm isotropic). T-1 maps were validated in phantom experiments. In humans, the T-1 values obtained at 7 T were 1.15 +/- 0.06 s for white matter (WM) and 1.92 +/- 0.16 s for grey matter (GM), in good agreement with literature values obtained at lower spatial resolution. At 3 T, where whole-brain acquisitions with 1 mm isotropic voxels were acquired in 8 min, the T-1 values obtained (0.81 +/- 0.03 s for WM and 1.35 +/- 0.05 s for GM) were once again found to be in very good agreement with values in the literature.
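The image combination at the heart of the method can be sketched as follows (a minimal sketch assuming the widely cited MP2RAGE ratio formula; variable names are illustrative and the optimized sequence parameters are not reproduced):

# Combine the two complex inversion-time volumes into a uniform,
# bias-free T1-weighted image:
#   S = Re(conj(S1) * S2) / (|S1|^2 + |S2|^2), bounded in [-0.5, 0.5].
# A common complex factor (reception bias, first-order B1 effects)
# multiplies both images and cancels in the ratio.
import numpy as np

def mp2rage_combine(gre_ti1, gre_ti2):
    num = np.real(np.conj(gre_ti1) * gre_ti2)
    den = np.abs(gre_ti1) ** 2 + np.abs(gre_ti2) ** 2
    return num / np.maximum(den, np.finfo(float).tiny)

# Toy voxel: a shared complex bias b cancels out of the combined value.
b = 3.0 * np.exp(1j * 0.7)
s1, s2 = -0.2 * b, 0.6 * b
print(mp2rage_combine(np.atleast_1d(s1), np.atleast_1d(s2)))  # -> [-0.3]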

Relevance:

10.00%

Publisher:

Abstract:

Linking the structural connectivity of brain circuits to their cooperative dynamics and emergent functions is a central aim of neuroscience research. Graph theory has recently been applied to study the structure-function relationship of networks, where dynamical similarity of different nodes has been turned into a "static" functional connection. However, the capability of the brain to adapt, learn and process external stimuli requires a constant dynamical functional rewiring between circuitries and cell assemblies. Hence, we must capture the changes of network functional connectivity over time. Multi-electrode array data present a unique challenge within this framework. We study the dynamics of gamma oscillations in acute slices of the somatosensory cortex from juvenile mice recorded by planar multi-electrode arrays. Bursts of gamma oscillatory activity lasting a few hundred milliseconds could be initiated only by brief trains of electrical stimulations applied at the deepest cortical layers and simultaneously delivered at multiple locations. Local field potentials were used to study the spatio-temporal properties and the instantaneous synchronization profile of the gamma oscillatory activity, combined with current source density (CSD) analysis. Pair-wise differences in the oscillation phase were used to determine the presence of instantaneous synchronization between the different sites of the circuitry during the oscillatory period. Despite variation in the duration of the oscillatory response over successive trials, they showed a constant average power, suggesting that the rate of expenditure of energy during the gamma bursts is consistent across repeated stimulations. Within each gamma burst, the functional connectivity map reflected the columnar organization of the neocortex. Over successive trials, an apparently random rearrangement of the functional connectivity was observed, with a more stable columnar than horizontal organization. This work reveals new features of evoked gamma oscillations in developing cortex.
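A minimal sketch of the pairwise instantaneous-synchronization idea follows (Hilbert-transform phases summarized as a phase-locking value; the band-pass filtering step and the exact synchronization statistic used in the study are assumptions for illustration):

# Pairwise phase synchronization between two band-passed LFP traces.
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(lfp_a, lfp_b):
    """PLV in [0, 1]: 1 means a perfectly constant phase difference."""
    phase_a = np.angle(hilbert(lfp_a))
    phase_b = np.angle(hilbert(lfp_b))
    return np.abs(np.mean(np.exp(1j * (phase_a - phase_b))))

# Toy gamma-band traces: two 40 Hz oscillations, one with phase jitter.
fs = 1000.0
t = np.arange(0, 0.5, 1 / fs)
rng = np.random.default_rng(2)
a = np.sin(2 * np.pi * 40 * t)
b = np.sin(2 * np.pi * 40 * t + 0.5 + 0.3 * rng.normal(size=t.size))
print(phase_locking_value(a, b))  # close to 1 -> strongly synchronized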