Abstract:
Speech melody, or prosody, subserves linguistic, emotional, and pragmatic functions in speech communication. Prosodic perception is based on the decoding of acoustic cues, with a predominant role for frequency-related information perceived as the speaker's pitch. Evaluation of prosodic meaning is a cognitive function implemented in cortical and subcortical networks that generate continuously updated affective or linguistic speaker impressions. Various brain-imaging methods allow delineation of the neural structures involved in prosody processing. In contrast to functional magnetic resonance imaging techniques, DC (direct current, slow) components of the EEG directly measure cortical activation without temporal delay. Activation patterns obtained with this method are highly task specific and intraindividually reproducible. The studies presented here investigated the topography of prosodic stimulus processing as a function of acoustic stimulus structure and of linguistic or affective task demands. Data obtained from DC potential measurements demonstrated that the right hemisphere has a predominant role in processing emotions from the tone of voice, irrespective of emotional valence. However, right hemisphere involvement is modulated by diverse speech- and language-related conditions that are associated with left hemisphere participation in prosody processing. The degree of left hemisphere involvement depends on several factors, such as (i) articulatory demands on the perceiver of prosody (possibly also the poser), (ii) a relative left hemisphere specialization in processing temporal cues mediating prosodic meaning, and (iii) the propensity of prosody to act on the segment level in order to modulate word or sentence meaning. The specific role of top-down effects, in terms of either linguistically or affectively oriented attention, on the lateralization of stimulus processing is not clear and requires further investigation.
Abstract:
A method for quantifying nociceptive withdrawal reflex receptive fields in human volunteers and patients is described. The reflex receptive field (RRF) for a specific muscle denotes the cutaneous area from which a muscle contraction can be evoked by a nociceptive stimulus. The method is based on random stimulations presented in a blinded sequence to 10 stimulation sites. The sensitivity map is derived by interpolating the reflex responses evoked from the 10 sites. A set of features describing the size and location of the RRF is presented, based on statistical analysis of the sensitivity map within every subject. The features include RRF area, volume, peak location, and center of gravity. The method was applied to 30 healthy volunteers. Electrical stimuli were applied to the sole of the foot, evoking reflexes in the ankle flexor tibialis anterior. The RRF area covered a fraction of 0.57+/-0.06 (S.E.M.) of the foot and was located on the medial, distal part of the sole. An intramuscular injection of capsaicin into the flexor digitorum brevis was performed in one spinal-cord-injured subject in an attempt to modulate the reflex receptive field. The RRF area, RRF volume, and location of the peak reflex response appear to be the most sensitive measures for detecting modulation of spinal nociceptive processing. This new method has important potential applications for exploring aspects of central plasticity in volunteers and patients. It may be utilized as a new diagnostic tool for central hypersensitivity and for quantification of therapeutic interventions.
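The RRF summary features named above (area, volume, peak location, center of gravity) can be sketched as follows. This is a hypothetical re-implementation operating on an interpolated sensitivity map, not the authors' code; the threshold parameter is an assumption.

```python
import numpy as np

def rrf_features(sensitivity, threshold):
    """Summarize a reflex receptive field from an interpolated
    sensitivity map (2D array of reflex amplitudes).

    Illustrative definitions (not the published ones):
    - area: fraction of the mapped surface responding above threshold
    - volume: summed supra-threshold response amplitude
    - peak: (row, col) of the maximal response
    - cog: amplitude-weighted center of gravity
    """
    mask = sensitivity > threshold
    area = mask.mean()                    # fraction of the mapped sole
    volume = sensitivity[mask].sum()      # integrated response
    peak = np.unravel_index(np.argmax(sensitivity), sensitivity.shape)
    rows, cols = np.indices(sensitivity.shape)
    w = np.clip(sensitivity, 0, None)     # negative responses carry no weight
    cog = (np.average(rows, weights=w), np.average(cols, weights=w))
    return area, volume, peak, cog
```

Such per-subject features reduce the full map to a few numbers that can be compared before and after an intervention such as the capsaicin injection.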
Abstract:
This study develops an automated analysis tool that combines total internal reflection fluorescence microscopy (TIRFM), an evanescent-wave microscopic imaging technique that captures time-sequential images, with corresponding MATLAB image-processing code that identifies the movements of individual particles. The developed code enables examination of two-dimensional hindered tangential Brownian motion of nanoparticles with sub-pixel (nanoscale) resolution. The measured mean square displacements of nanoparticles are compared with theoretical predictions to estimate particle diameters and fluid viscosity using a nonlinear regression technique. These estimates are then checked against the diameters and viscosities given by the manufacturers to validate the analysis tool. The nanoparticles used in these experiments are yellow-green polystyrene fluorescent nanospheres (200 nm, 500 nm, and 1000 nm in nominal diameter; 505 nm excitation and 515 nm emission wavelengths). The solutions used are de-ionized (DI) water, 10% d-glucose, and 10% glycerol. Mean square displacements obtained near the surface show significant deviation from theoretical predictions, attributed to DLVO forces in that region, but conform to theory beyond ~125 nm. The proposed automated analysis tool can be employed in bio-applications such as single-protein (DNA and/or vesicle) tracking, drug delivery, and cytotoxicity studies, unlike traditional measurement techniques that require fixing the cells. Furthermore, this tool can also be applied to microfluidic measurements such as non-invasive thermometry, particle tracking velocimetry (PTV), and non-invasive viscometry.
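The core of the comparison described above is the mean square displacement (MSD) and the Stokes-Einstein relation linking diffusivity to particle diameter. The sketch below illustrates the idea for free 2D diffusion far from the wall (the abstract's near-wall hindered case needs corrections); it is an assumed minimal implementation, not the study's MATLAB code, and uses a simple origin-constrained linear fit rather than the paper's nonlinear regression.

```python
import numpy as np

K_B = 1.380649e-23  # Boltzmann constant, J/K

def msd(track, max_lag):
    """Mean square displacement of one 2D trajectory.
    track: (N, 2) array of x, y positions in metres."""
    return np.array([
        np.mean(np.sum((track[lag:] - track[:-lag]) ** 2, axis=1))
        for lag in range(1, max_lag + 1)
    ])

def estimate_diameter(lags_s, msd_m2, temperature_k, viscosity_pa_s):
    """Fit MSD = 4*D*t (free 2D diffusion) by least squares through the
    origin, then invert Stokes-Einstein, D = kT / (3*pi*eta*d)."""
    lags_s = np.asarray(lags_s, dtype=float)
    msd_m2 = np.asarray(msd_m2, dtype=float)
    slope = np.sum(lags_s * msd_m2) / np.sum(lags_s ** 2)
    diffusivity = slope / 4.0
    return K_B * temperature_k / (3.0 * np.pi * viscosity_pa_s * diffusivity)
```

With the diameter known (from the manufacturer), the same relation can instead be solved for viscosity, mirroring the validation described in the abstract.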
Processing and characterization of PbSnTe-based thermoelectric materials made by mechanical alloying
Abstract:
The research reported in this dissertation investigates the processes required to mechanically alloy Pb1-xSnxTe and AgSbTe2 and a method of combining these two end compounds to produce (y)(AgSbTe2)–(1 - y)(Pb1-xSnxTe) thermoelectric materials for power generation applications. In general, traditional melt processing of these alloys has employed high-purity materials subjected to time- and energy-intensive processes that yield highly functional material that is not easily reproducible. This research reports the development of mechanical alloying processes using commercially available 99.9% pure elemental powders in order to provide a basis for the economical production of highly functional thermoelectric materials. Though there have been reports of high- and low-ZT materials fabricated by both melt alloying and mechanical alloying, the processing-structure-properties-performance relationship connecting how the material is made to its resulting functionality is poorly understood. This is particularly true for mechanically alloyed material, motivating an effort to investigate bulk material within the (y)(AgSbTe2)–(1 - y)(Pb1-xSnxTe) system using the mechanical alloying method. This research adds to the body of knowledge concerning how mechanical alloying can be used to efficiently produce high-ZT thermoelectric materials. The processes required to mechanically alloy elemental powders to form Pb1-xSnxTe and AgSbTe2 and to subsequently consolidate the alloyed powder are described. The composition, the phases present in the alloy, and the volume percent, size, and spacing of those phases are reported. The room-temperature electronic transport properties of electrical conductivity, carrier concentration, and carrier mobility are reported for each alloy, and the effect of any secondary phase on the electronic transport properties is described.
A mechanical mixing approach for combining the end compounds into (y)(AgSbTe2)–(1-y)(Pb1-xSnxTe) is described; when 5 vol.% AgSbTe2 was incorporated, it was found to form a solid solution with the Pb1-xSnxTe phase. An initial attempt to change the carrier concentration of the Pb1-xSnxTe phase was made by adding excess Te; the carrier density of the alloys in this work was found to be insensitive to excess Te. It has been demonstrated using the processing techniques reported in this research that this material system, when appropriately doped, has the potential to perform as a highly functional thermoelectric material.
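The "ZT" figure referenced throughout the abstract is the standard dimensionless thermoelectric figure of merit, ZT = S²σT/κ, combining the Seebeck coefficient S, electrical conductivity σ, absolute temperature T, and thermal conductivity κ. A one-line sketch (the example values are illustrative, not measurements from this work):

```python
def figure_of_merit(seebeck_v_per_k, sigma_s_per_m, kappa_w_per_mk, temp_k):
    """Dimensionless thermoelectric figure of merit ZT = S^2 * sigma * T / kappa."""
    return seebeck_v_per_k ** 2 * sigma_s_per_m * temp_k / kappa_w_per_mk

# Illustrative numbers: S = 200 uV/K, sigma = 1e5 S/m, kappa = 2 W/(m K), T = 500 K
zt = figure_of_merit(200e-6, 1e5, 2.0, 500.0)  # -> 1.0
```

This is why the transport properties listed in the abstract (conductivity, carrier concentration, mobility, which set S and σ) govern the material's functionality.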
Abstract:
Quantifying belowground dynamics is critical to our understanding of plant and ecosystem function and belowground carbon cycling, yet currently available tools for complex belowground image analyses are insufficient. We introduce novel techniques combining digital image processing tools and geographic information systems (GIS) analysis to permit semi-automated analysis of complex root and soil dynamics. We illustrate the methodologies with imagery from microcosms, minirhizotrons, and a rhizotron, in upland and peatland soils. We provide guidelines for correct image capture, a method that automatically stitches numerous minirhizotron images into one seamless image, and image analysis using image segmentation and classification in SPRING or change analysis in ArcMap. These methods facilitate spatial and temporal root and soil interaction studies, providing a framework for a more comprehensive understanding of belowground dynamics.
Abstract:
Methodological evaluation of the proteomic analysis of cardiovascular-tissue material has been performed, with special emphasis on establishing examinations that allow reliable quantitative analysis of silver-stained readouts. Reliability, reproducibility, robustness, and linearity were addressed and clarified. In addition, several types of normalization procedures were evaluated and new approaches are proposed. It has been found that the silver-stained readout offers a convenient approach for quantitation if a linear range for gel loading is defined. In addition, a broad range of 10-fold input (loading 20-200 microg per gel) fulfills the linearity criteria, although at the lowest input (20 microg) a portion of protein species will remain undetected. The method is reliable and reproducible within a range of 65-200 microg input. The normalization procedure using the sum of all spot intensities from a silver-stained 2D pattern has been shown to be less reliable than other approaches, namely normalization through the median or through the interquartile range. A special refinement, normalization through virtual segmentation of the pattern and calculation of a normalization factor for each stratum, provides highly satisfactory results. The presented results not only provide evidence for the usefulness of silver-stained gels for quantitative evaluation, but are also directly applicable to the research endeavor of monitoring alterations in cardiovascular pathophysiology.
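The normalization variants compared above (total spot intensity vs. median vs. interquartile range) can be sketched as follows. This is a hypothetical re-implementation of the general idea, not the authors' procedure; in particular, the stratified refinement they propose would apply such a factor per virtual segment of the gel pattern rather than per whole gel.

```python
import numpy as np

def normalize_spots(intensities, method="median"):
    """Scale one gel's spot intensities by a per-gel factor so that
    gels with different overall staining become comparable.

    method: "sum"    - divide by the total of all spot intensities
            "median" - divide by the median spot intensity
            "iqr"    - divide by the interquartile range of intensities
    """
    x = np.asarray(intensities, dtype=float)
    if method == "sum":
        factor = x.sum()
    elif method == "median":
        factor = np.median(x)
    elif method == "iqr":
        q75, q25 = np.percentile(x, [75, 25])
        factor = q75 - q25
    else:
        raise ValueError(f"unknown method: {method}")
    return x / factor
```

Median- and IQR-based factors are robust statistics, which is consistent with the abstract's finding that they behave more reliably than the sum when a few very intense spots dominate a pattern.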
Abstract:
In this paper, the well-known method of frames approach to the signal decomposition problem is reformulated as a certain bilevel goal-attainment linear least squares problem. As a consequence, a numerically robust variant of the method, named approximating method of frames, is proposed on the basis of a certain minimal Euclidean norm approximating splitting pseudo-iteration-wise method.
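For context, the classical method of frames that the paper reformulates picks, among all coefficient vectors reproducing the signal in an overcomplete dictionary, the one of minimal Euclidean norm; this is exactly the pseudoinverse solution. The sketch below shows that textbook baseline, not the paper's bilevel goal-attainment variant.

```python
import numpy as np

def method_of_frames(frame_vectors, signal):
    """Classical method-of-frames decomposition: among all c with
    F @ c == signal (the columns of F are the frame vectors), return
    the coefficient vector of minimal Euclidean norm."""
    F = np.asarray(frame_vectors, dtype=float)
    return np.linalg.pinv(F) @ np.asarray(signal, dtype=float)
```

For example, with the redundant frame F = [[1, 0, 1], [0, 1, 1]] in R², the signal (1, 1) decomposes as c = (1/3, 1/3, 2/3), spreading energy across all three frame vectors rather than using any two of them exclusively.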
Abstract:
BACKGROUND: Many patients with Posttraumatic Stress Disorder (PTSD) feel overwhelmed in situations with high levels of sensory input, as in crowded situations with complex sensory characteristics. These difficulties might be related to subtle sensory processing deficits similar to those that have been found for sounds in electrophysiological studies. METHOD: Visual processing was investigated with functional magnetic resonance imaging in trauma-exposed participants with (N = 18) and without PTSD (N = 21) employing a picture-viewing task. RESULTS: Activity observed in response to visual scenes was lower in PTSD participants 1) in the ventral stream of the visual system, including striate and extrastriate, inferior temporal, and entorhinal cortices, and 2) in dorsal and ventral attention systems (P < 0.05, FWE-corrected). These effects could not be explained by the emotional salience of the pictures. CONCLUSIONS: Visual processing was substantially altered in PTSD in the ventral visual stream, a component of the visual system thought to be responsible for object property processing. Together with previous reports of subtle auditory deficits in PTSD, these findings provide strong support for potentially important sensory processing deficits, whose origins may be related to dysfunctional attention processes.
Abstract:
Due to the ongoing trend towards increased product variety, fast-moving consumer goods such as food and beverages, pharmaceuticals, and chemicals are typically manufactured through so-called make-and-pack processes. These processes consist of a make stage, a pack stage, and intermediate storage facilities that decouple these two stages. In operations scheduling, complex technological constraints must be considered, e.g., non-identical parallel processing units, sequence-dependent changeovers, batch splitting, no-wait restrictions, material transfer times, minimum storage times, and finite storage capacity. The short-term scheduling problem is to compute a production schedule such that a given demand for products is fulfilled, all technological constraints are met, and the production makespan is minimised. A production schedule typically comprises 500–1500 operations. Due to the problem size and complexity of the technological constraints, the performance of known mixed-integer linear programming (MILP) formulations and heuristic approaches is often insufficient. We present a hybrid method consisting of three phases. First, the set of operations is divided into several subsets. Second, these subsets are iteratively scheduled using a generic and flexible MILP formulation. Third, a novel critical path-based improvement procedure is applied to the resulting schedule. We develop several strategies for the integration of the MILP model into this heuristic framework. Using these strategies, high-quality feasible solutions to large-scale instances can be obtained within reasonable CPU times using standard optimisation software. We have applied the proposed hybrid method to a set of industrial problem instances and found that the method outperforms state-of-the-art methods.
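The third phase above reasons about the critical path of the schedule: the chain of precedence-linked operations whose total duration equals the makespan, so only changes along that chain can shorten the schedule. A minimal sketch of computing it on an operation-precedence DAG (illustrative names only, not the authors' procedure):

```python
def critical_path_length(durations, predecessors):
    """Earliest possible makespan of a precedence-constrained schedule,
    i.e. the longest path through the operation DAG.

    durations:    {op: processing time}
    predecessors: {op: [ops that must finish before op starts]}
    """
    finish = {}  # memoized earliest finish time per operation

    def earliest_finish(op):
        if op not in finish:
            start = max((earliest_finish(p) for p in predecessors.get(op, [])),
                        default=0)
            finish[op] = start + durations[op]
        return finish[op]

    return max(earliest_finish(op) for op in durations)
```

In the hybrid method, an improvement step that leaves every critical operation where it is cannot reduce the makespan, which is why the procedure targets the critical path specifically.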
Abstract:
Histomorphometric evaluation of the buccal aspects of periodontal tissues in rodents requires reproducible alignment of maxillae and highly precise sections containing central sections of buccal roots; this is a cumbersome and technically sensitive process due to the small specimen size. The aim of the present report is to describe and analyze a method to transfer virtual sections of micro-computed tomographic (CT) image stacks to the microtome for undecalcified histological processing, and to describe the anatomy of the periodontium in rat molars. A total of 84 undecalcified sections of all buccal roots of seven untreated rats were analyzed. The accuracy of section coordinate transfer from virtual micro-CT slice to histological slice, right-left side differences, and the measurement error for linear and angular measurements on micro-CT and on histological micrographs were calculated using the Bland-Altman method, the interclass correlation coefficient, and the method of moments estimator. In addition, manual alignment of the micro-CT-scanned rat maxilla was compared with multiplanar computer-reconstructed alignment. The supra-alveolar anatomy of the rat is rather similar to human anatomy, whereas the alveolar bone is of the compact type and the keratinized gingival epithelium bends apically to join the junctional epithelium. The high methodological standardization presented herein ensures reproducible retrieval of histological slices with excellent display of anatomical microstructures, minimizes random errors, and thereby may contribute to reducing the number of animals needed.
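The Bland-Altman method named above assesses agreement between two measurement techniques via the mean difference (bias) and 95% limits of agreement. A minimal sketch of that standard computation (not the authors' analysis code):

```python
import statistics

def bland_altman(method_a, method_b):
    """Bland-Altman agreement statistics for paired measurements.

    Returns (bias, lower_limit, upper_limit), where the limits of
    agreement are bias +/- 1.96 * SD of the paired differences.
    """
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)  # sample SD of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd
```

Applied to paired micro-CT and histological measurements, narrow limits of agreement indicate that coordinates transfer accurately from the virtual slice to the physical section.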
Abstract:
Procurement of fresh tissue from prostate cancer is critical for biobanking and for the generation of xenograft models, an important preclinical step towards new therapeutic strategies in advanced prostate cancer. However, handling fresh radical prostatectomy specimens has been notoriously challenging given the distinctive physical properties of prostate tissue and the difficulty of identifying cancer foci on gross examination. Here, we have developed a novel approach using ceramic foam plates for processing freshly cut whole mount sections from radical prostatectomy specimens without compromising further diagnostic assessment. Forty-nine radical prostatectomy specimens were processed and sectioned from the apex to the base in whole mount slices. Putative carcinoma foci were morphologically verified by frozen section analysis. The fresh whole mount slices were then laid between two ceramic foam plates and fixed overnight. To test tissue preservation after this procedure, formalin-fixed and paraffin-embedded whole mount sections were stained with hematoxylin and eosin (H&E) and analyzed by immunohistochemistry and by fluorescence and silver in situ hybridization (FISH and SISH, respectively). There were no morphological artifacts on H&E-stained whole mount sections from slices that had been fixed between two plates of ceramic foam, and the histological architecture was fully retained. The quality of immunohistochemistry, FISH, and SISH was excellent. Fixing whole mount tissue slices between ceramic foam plates after frozen section examination is an excellent method for processing fresh radical prostatectomy specimens, allowing precise identification and collection of fresh tumor tissue without compromising further diagnostic analysis.
Abstract:
BACKGROUND Record linkage of existing individual health care data is an efficient way to answer important epidemiological research questions. Reuse of individual health-related data faces several problems: either a unique personal identifier, like a social security number, is not available, or non-unique person-identifiable information, like names, is privacy protected and cannot be accessed. A solution to protect privacy in probabilistic record linkage is to encrypt this sensitive information. Unfortunately, encrypted hash codes of two names differ completely if the plain names differ by only a single character, so standard encryption methods cannot be applied. To overcome these challenges, we developed the Privacy Preserving Probabilistic Record Linkage (P3RL) method. METHODS In the Privacy Preserving Probabilistic Record Linkage method we apply a three-party protocol, with two sites collecting individual data and an independent trusted linkage center as the third partner. Our method consists of three main steps: pre-processing, encryption, and probabilistic record linkage. Data pre-processing and encryption are done at the sites by local personnel. To guarantee similar quality and format of variables and an identical encryption procedure at each site, the linkage center generates semi-automated pre-processing and encryption templates. To retrieve the information (i.e. data structure) needed for the creation of templates without ever accessing plain person-identifiable information, we introduced a novel method of data masking. Sensitive string variables are encrypted using Bloom filters, which enables calculation of similarity coefficients. For date variables, we developed special encryption procedures to handle the most common date errors. The linkage center performs probabilistic record linkage with encrypted person-identifiable information and plain non-sensitive variables.
RESULTS In this paper we describe step by step how to link existing health-related data using encryption methods to preserve the privacy of the persons in the study. CONCLUSION Privacy Preserving Probabilistic Record Linkage expands record linkage facilities in settings where a unique identifier is unavailable and/or regulations restrict access to the non-unique person-identifiable information needed to link existing health-related data sets. Automated pre-processing and encryption fully protect sensitive information, ensuring participant confidentiality. This method is suitable not just for epidemiological research but for any setting with similar challenges.
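The key trick in the METHODS section, Bloom-filter encryption that still permits similarity computation, can be sketched as follows. Names are split into letter bigrams, each bigram sets a few hashed bit positions, and two encodings are compared with a Dice coefficient; the parameters below (64 bits, 2 hashes, SHA-256) are illustrative assumptions, not the P3RL production settings.

```python
import hashlib

def bloom_encode(name, num_bits=64, num_hashes=2):
    """Encode a string as the set of Bloom-filter bit positions of its
    letter bigrams, so two sites can compare names without exchanging
    them in plain text."""
    padded = f"_{name.lower()}_"          # pad so edge letters form bigrams
    bits = set()
    for i in range(len(padded) - 1):
        bigram = padded[i:i + 2]
        for k in range(num_hashes):
            digest = hashlib.sha256(f"{k}:{bigram}".encode()).hexdigest()
            bits.add(int(digest, 16) % num_bits)
    return bits

def dice_similarity(bits_a, bits_b):
    """Dice coefficient of two bit sets: 1.0 = identical, 0.0 = disjoint."""
    if not bits_a and not bits_b:
        return 1.0
    return 2 * len(bits_a & bits_b) / (len(bits_a) + len(bits_b))
```

Because similar names share most bigrams, their Bloom encodings overlap heavily, so "Meier" scores closer to "Meyer" than to "Smith" even though none of the plain names ever leave the site; this is what makes probabilistic (rather than exact-match) linkage possible on encrypted identifiers.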
Abstract:
XMapTools is a MATLAB-based graphical user interface program for electron microprobe X-ray image processing, which can be used to estimate the pressure-temperature conditions of crystallization of minerals in metamorphic rocks. This program (available online at http://www.xmaptools.com) provides a method to standardize raw electron microprobe data and includes functions to calculate the oxide weight percent compositions for various minerals. A set of external functions is provided to calculate structural formulae from the standardized analyses as well as to estimate pressure-temperature conditions of crystallization, using empirical and semi-empirical thermobarometers from the literature. Two graphical user interface modules, Chem2D and Triplot3D, are used to plot mineral compositions in binary and ternary diagrams. As an example, the software is used to study a high-pressure Himalayan eclogite sample from the Stak massif in Pakistan. The high-pressure paragenesis, consisting of omphacite and garnet, has been retrogressed to a symplectitic assemblage of amphibole, plagioclase, and clinopyroxene. Mineral compositions corresponding to ~165,000 analyses yield estimates for the eclogitic pressure-temperature retrograde path from 25 kbar to 9 kbar. Corresponding pressure-temperature maps were plotted and used to interpret the link between the equilibrium conditions of crystallization and the symplectitic microstructures. This example illustrates the usefulness of XMapTools for studying variations in the chemical composition of minerals and for retrieving information on metamorphic conditions at the microscale, towards computation of continuous pressure-temperature (and relative time) paths in zoned metamorphic minerals not affected by post-crystallization diffusion.
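The standardization step mentioned above rests on a simple idea: raw X-ray count maps are scaled using a spot analysis of known composition measured on the same phase. The sketch below shows only that basic linear-calibration idea under stated assumptions (one element, background-corrected counts, matrix effects ignored); XMapTools' actual algorithm is more involved.

```python
def standardize_map(raw_counts, standard_counts, standard_wtpct):
    """Scale a raw X-ray intensity map (list of rows of counts) to
    weight-percent units using one internal standard analysis:
    wt% = counts * (known wt% of standard / counts on standard).

    Simplified illustration only -- assumes a linear counts-to-
    concentration relation for a single element.
    """
    scale = standard_wtpct / standard_counts
    return [[c * scale for c in row] for row in raw_counts]
```

Once every pixel carries a composition in weight percent, structural formulae and thermobarometers can be evaluated pixel by pixel, which is what produces the pressure-temperature maps described in the abstract.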
Abstract:
Background: A prerequisite for high performance in motor tasks is the acquisition of egocentric sensory information that must be translated into motor actions. A phenomenon that supports this process is the Quiet Eye (QE), defined as the long final fixation before movement initiation. It is assumed that the QE facilitates information processing, particularly regarding movement parameterization. Aims: The question remains whether this facilitation also holds for the information-processing stage of response selection and for the perceptually crucial stage of stimulus identification. Method: In two experiments with sport science students, performance-enhancing effects of experimentally manipulated QE durations were tested as a function of target position predictability and target visibility, thereby selectively manipulating response selection and stimulus identification demands, respectively. Results: The results support the hypothesis of facilitated information processing through long QE durations, since in both experiments performance-enhancing effects of long QE durations were found only under increased processing demands. In Experiment 1, QE duration affected performance only if the target position was not predictable and positional information had to be processed over the QE period. In Experiment 2, in a full vs. no target visibility comparison with saccades to the upcoming target position induced by flicker cues, the functionality of a long QE duration depended on the visual stimulus identification period as soon as the interval fell below a certain threshold. Conclusions: The results corroborate earlier findings that QE efficiency depends on the demands placed on the visuomotor system, thereby furthering the assumption that the phenomenon supports the processes of sensorimotor integration.
Abstract:
An efficient and reliable automated model that can map physical Soil and Water Conservation (SWC) structures on cultivated land was developed using very high spatial resolution imagery obtained from Google Earth, together with ArcGIS, ERDAS IMAGINE, the SDC Morphology Toolbox for MATLAB, and statistical techniques. The model was developed using the following procedures: (1) a high-pass spatial filter algorithm was applied to detect linear features, (2) morphological processing was used to remove unwanted linear features, (3) the raster output was vectorized, (4) the vectorized linear features were split per hectare (ha) and each line was classified according to its compass direction, and (5) the sum of all vector lengths per direction class per ha was calculated. Finally, the direction class with the greatest length was selected from each ha to predict the physical SWC structures. The model was calibrated and validated on the Ethiopian Highlands. The model correctly mapped 80% of the existing structures. The developed model was then tested at different sites with different topography. The results show that the developed model is feasible for automated mapping of physical SWC structures. Therefore, the model is useful for predicting and mapping physical SWC structures across diverse areas.
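Steps (4) and (5) of the procedure, classifying each vectorized line by compass direction and selecting the direction class carrying the greatest total length per hectare, can be sketched as follows. This is an illustrative re-implementation in Python rather than the authors' MATLAB/GIS workflow; the eight-class orientation binning is an assumption.

```python
import math
from collections import defaultdict

def dominant_direction(segments, num_classes=8):
    """Return the direction class with the greatest summed length for
    one hectare's worth of line segments.

    segments: list of ((x1, y1), (x2, y2)) endpoint pairs.
    Orientation is undirected (0-180 degrees), split into num_classes bins.
    """
    totals = defaultdict(float)
    for (x1, y1), (x2, y2) in segments:
        length = math.hypot(x2 - x1, y2 - y1)
        angle = math.degrees(math.atan2(y2 - y1, x2 - x1)) % 180.0
        klass = int(angle // (180.0 / num_classes)) % num_classes
        totals[klass] += length
    return max(totals, key=totals.get)
```

Because SWC structures such as terraces and bunds tend to follow contour lines, the dominant orientation within a hectare is a strong cue that the detected linear features belong to a structure rather than to noise.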