954 results for thermal image segmentation
Abstract:
Report on the scientific sojourn carried out at the Albert Einstein Institut in Germany from April to July 2006.
Abstract:
OBJECTIVE: Local heating increases skin blood flow (SkBF), a response known as thermal hyperemia. In a previous study, we reported that a first local thermal stimulus could attenuate the hyperemic response to a second one applied later on the same skin spot, a phenomenon that we termed desensitization. However, other studies found no evidence for desensitization under similar conditions. The aim of the present work was to test whether this discrepancy was related to differences in instrumentation. METHODS: Twenty-eight healthy young males were studied. Two pairs of heating chambers, one custom-made (our study) and one commercial (other groups), were affixed to forearm skin. SkBF was measured with single-point laser-Doppler flowmetry (LDF, 780 nm) in one pair and laser-Doppler imaging (LDI, 633 nm) in the other. A temperature step from 34 to 41°C was applied for 30 minutes and repeated after two hours. RESULTS: During the second thermal challenge, the plateau SkBF was lower than during the first; this was observed with each of the four combinations of SkBF measurement techniques and heating equipment (p<0.05 for all conditions, range -9% to -16% of the initial value). CONCLUSION: Desensitization of thermal hyperemia is not specific to particular operating conditions.
Abstract:
Thermal systems exchanging heat and mass by conduction, convection, and radiation (solar and thermal) occur in many engineering applications, such as energy storage by solar collectors, window glazing in buildings, refrigeration of plastic moulds, and air handling units. Often these thermal systems are composed of various elements, for example a building with walls, windows, rooms, etc. It would be of particular interest to have a modular thermal system formed by connecting different modules for the elements, with the flexibility to use and change models for individual elements and to add or remove elements without changing the entire code. A numerical approach to the heat transfer and fluid flow in such systems saves the time and cost of full-scale experiments and also aids optimisation of the system parameters. The subsequent sections present a short summary of the work done so far on the orientation of the thesis in the field of numerical methods for heat transfer and fluid flow applications, the work in progress, and the future work.
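As a hedged illustration of the modular idea described in this abstract, the following Python fragment connects lumped thermal elements (nodes and conductive links) so that elements can be added or removed without touching the solver. The element names, parameter values, and the simple explicit time stepping are assumptions made for illustration, not the formulation developed in the thesis.

```python
# Minimal sketch of a modular thermal network, assuming a lumped-capacitance
# (resistor-capacitor) representation; names and values are illustrative only.

class Node:
    """A lumped thermal mass (e.g. a room or a wall layer)."""
    def __init__(self, name, capacity, temperature):
        self.name = name
        self.capacity = capacity        # J/K
        self.temperature = temperature  # degrees C
        self.heat_flow = 0.0            # W, accumulated each step

class Link:
    """A conductive/convective path between two nodes (e.g. a window)."""
    def __init__(self, node_a, node_b, conductance):
        self.a, self.b, self.ua = node_a, node_b, conductance  # W/K

def step(nodes, links, boundary, dt):
    """Advance the whole network by one explicit time step."""
    for n in nodes:
        n.heat_flow = 0.0
    for l in links:
        q = l.ua * (l.a.temperature - l.b.temperature)
        l.a.heat_flow -= q
        l.b.heat_flow += q
    for n in nodes:
        if n.name in boundary:          # fixed-temperature boundary nodes
            n.temperature = boundary[n.name]
        else:
            n.temperature += n.heat_flow * dt / n.capacity

# Elements can be added or removed without modifying the solver above.
outside = Node("outside", 1.0, -5.0)
wall    = Node("wall",    5e5, 10.0)
room    = Node("room",    2e5, 20.0)
nodes   = [outside, wall, room]
links   = [Link(outside, wall, 15.0), Link(wall, room, 25.0)]

for _ in range(3600):                   # one hour with a 1 s step
    step(nodes, links, {"outside": -5.0}, dt=1.0)
print(f"room temperature after 1 h: {room.temperature:.2f} C")
```

Each physical element is an object with its own parameters, so replacing a wall model or adding a second room only changes the lists passed to the solver.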
Abstract:
The long-term goal of this research is to develop a program able to produce an automatic segmentation and categorization of textual sequences into discourse types. In this preliminary contribution, we present the construction of an algorithm that takes a segmented text as input and attempts to categorize the sequences as narrative, argumentative, descriptive, and so on. This work also investigates a possible convergence between the typological approach developed in French text and discourse analysis, in particular by Adam (2008) and Bronckart (1997), and unsupervised statistical learning.
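A minimal sketch of how such an unsupervised categorization could look in Python is given below. The surface features (crude tense and connective cues), the toy sequences, and the use of k-means clustering are illustrative assumptions, not the algorithm actually constructed in this work.

```python
# Minimal sketch: cluster pre-segmented sequences into candidate discourse types,
# assuming simple, hypothetical surface features and k-means clustering.
import re
import numpy as np
from sklearn.cluster import KMeans

PAST  = re.compile(r"\b\w+ed\b")                              # crude narrative cue
ARGUM = re.compile(r"\b(because|therefore|however|thus)\b", re.I)
DESCR = re.compile(r"\b(is|are|was|were)\b", re.I)

def features(sequence: str) -> list:
    """Normalized counts of a few surface cues (placeholder feature set)."""
    n = max(len(sequence.split()), 1)
    return [len(PAST.findall(sequence)) / n,
            len(ARGUM.findall(sequence)) / n,
            len(DESCR.findall(sequence)) / n]

sequences = [
    "He walked home and opened the door.",
    "The house is small and the walls are white.",
    "Therefore the argument fails because the premise is false.",
]
X = np.array([features(s) for s in sequences])
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print(labels)   # cluster indices, to be interpreted as candidate discourse types
```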
Abstract:
This paper presents a semisupervised support vector machine (SVM) that efficiently integrates the information of both labeled and unlabeled pixels. The method's performance is illustrated on the relevant problem of very high resolution image classification of urban areas. The SVM is trained with a linear combination of two kernels: a base kernel working only with labeled examples is deformed by a likelihood kernel encoding similarities between labeled and unlabeled examples. Results obtained on very high resolution (VHR) multispectral and hyperspectral images show the relevance of the method in the context of urban image classification. Moreover, its simplicity and the few parameters involved make the method versatile and workable by inexperienced users.
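A minimal sketch of the composite-kernel idea follows, assuming a Gaussian-mixture construction for the likelihood kernel and an equal-weight linear combination; the toy data, the mixture model, and the weight alpha are placeholders, not the exact formulation of the paper.

```python
# Minimal sketch: SVM trained on a linear combination of a base kernel (labeled
# pixels only) and a likelihood kernel built from labeled + unlabeled pixels.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.mixture import GaussianMixture
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

# Toy stand-ins for labeled and unlabeled pixel spectra.
X, y = make_classification(n_samples=200, n_features=8, random_state=0)
X_lab, y_lab, X_unlab = X[:50], y[:50], X[50:]

# Likelihood kernel: posterior cluster memberships estimated on all pixels.
gmm = GaussianMixture(n_components=4, random_state=0).fit(np.vstack([X_lab, X_unlab]))
P = gmm.predict_proba(X_lab)                  # memberships of the labeled pixels
K_like = P @ P.T                              # similarity in "cluster space"

# Base kernel: a standard RBF kernel on the labeled pixels only.
K_base = rbf_kernel(X_lab, gamma=0.1)

alpha = 0.5                                   # mixing weight (assumed)
K = alpha * K_base + (1.0 - alpha) * K_like

clf = SVC(kernel="precomputed").fit(K, y_lab)
print("training accuracy:", clf.score(K, y_lab))
```

The weight alpha controls how strongly the unlabeled pixels reshape the decision function; alpha = 1 recovers the purely supervised SVM.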
Abstract:
We consider a general equilibrium model à la Bhaskar (Review of Economic Studies, 2002): there are complementarities across sectors, each of which comprises (many) heterogeneous monopolistically competitive firms. Bhaskar's model is extended in two directions: production requires capital, and labour markets are segmented. Labour market segmentation models the difficulties of labour migrating across international barriers (in a trade context) or from a poor region to a richer one (in a regional context), whilst the assumption of a single capital market means that capital flows freely between countries or regions. The model is solved analytically and a closed-form solution is provided. Adding labour market segmentation to Bhaskar's two-tier industrial structure allows us to study, inter alia, the impact of competition regulations on wages and financial flows in both the regional and the international context, and the output, welfare, and financial implications of relaxing immigration laws. The analytical approach adopted allows us not only to sign the effect of policies, but also to quantify their effects. Introducing capital as a factor of production improves the realism of the model and refines its empirically testable implications.
Abstract:
The thermostability of yellow fever vaccine, both in lyophilized form and after reconstitution, was assessed. Two commercial yellow fever vaccines were assayed for their thermal stability. Vaccines were exposed to test temperatures in the range of 8°C to 45°C. Residual infectivity was measured by a plaque assay using Vero cells. The titre values were used in an accelerated degradation test that follows the Arrhenius equation, and the minimum immunizing dose was assumed to be 10³ plaque-forming units (pfu)/dose. Some of the most relevant results are that (i) regular culture medium shows the same degradation pattern as a reconstituted 17D-204 vaccine; (ii) reconstituted YF-17D-204 showed a predicted half-life of more than six days if kept at 0°C; (iii) there are differences in thermostability between different products that are probably due both to the presence of stabilizers in the preparation and to the modernization of vaccine production; (iv) it is important to establish a proper correlation between the mouse infectivity test and the plaque assay, since the latter appears to be simpler, more economical, and more practical for small laboratories to assess the potency of the vaccine; and (v) the accelerated degradation test appears to be the best procedure to quantify the thermostability of biological products.
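An accelerated degradation test of this kind rests on the Arrhenius relation between loss rate and temperature. The sketch below, with assumed first-order loss rates at the elevated test temperatures, shows how such rates could be fitted and extrapolated to a storage temperature; the numerical values are placeholders, not the vaccine data reported above.

```python
# Minimal sketch of an Arrhenius-type accelerated degradation analysis:
# fit ln k = ln A - Ea/(R*T) to rates measured at elevated temperatures,
# then extrapolate to a storage temperature.  Rate values are illustrative.
import numpy as np

R = 8.314                                   # J/(mol*K)
T = np.array([37.0, 41.0, 45.0]) + 273.15   # test temperatures (K)
k = np.array([0.10, 0.22, 0.46])            # assumed first-order loss rates (1/day)

# Linear fit of ln k against 1/T gives -Ea/R (slope) and ln A (intercept).
slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)
Ea = -slope * R
print(f"estimated activation energy ~ {Ea / 1000:.0f} kJ/mol")

# Extrapolate the rate, and the half-life t1/2 = ln 2 / k, to 8 C storage.
T_store = 8.0 + 273.15
k_store = np.exp(intercept + slope / T_store)
print(f"predicted half-life at 8 C ~ {np.log(2) / k_store:.0f} days")
```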
Abstract:
"Vegeu el resum a l'inici del document del fitxer adjunt."
Abstract:
The investigation of perceptual and cognitive functions with non-invasive brain imaging methods critically depends on the careful selection of stimuli for use in experiments. For example, it must be verified that any observed effects follow from the parameter of interest (e.g. semantic category) rather than from other low-level physical features (e.g. luminance or spectral properties); otherwise, interpretation of the results is confounded. Often, researchers circumvent this issue by including additional control conditions or tasks, both of which are flawed and also prolong experiments. Here, we present some new approaches for controlling classes of stimuli intended for use in cognitive neuroscience; these methods can readily be extrapolated to other applications and stimulus modalities. Our approach comprises two levels. The first level equalizes individual stimuli in terms of their mean luminance: each data point in the stimulus is adjusted to a standardized value defined across the stimulus battery. The second level analyzes two populations of stimuli along their spectral properties (i.e. spatial frequency), using a dissimilarity metric that equals the root mean square of the distance between the two populations as a function of spatial frequency along the x- and y-dimensions of the image. Randomized permutations are used to obtain a minimal value of this metric and so minimize, in a completely data-driven manner, the spectral differences between the image sets. While another paper in this issue applies these methods to acoustic stimuli (Aeschlimann et al., Brain Topogr 2008), we illustrate the approach here in detail for complex visual stimuli.
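A minimal sketch of the two control levels follows, assuming grayscale images held as NumPy arrays; the target luminance, the simplified dissimilarity computation, and the placeholder image sets are illustrations of the general idea rather than the published procedure, and the permutation-based minimization over candidate image subsets is only noted in a comment.

```python
# Minimal sketch of the two stimulus-control levels, with placeholder images.
import numpy as np

def equalize_mean_luminance(images, target=None):
    """Level 1: shift every image so its mean equals a common target value."""
    if target is None:
        target = np.mean([img.mean() for img in images])   # battery-wide standard
    return [img - img.mean() + target for img in images]

def spectral_dissimilarity(set_a, set_b):
    """Level 2: RMS distance between the mean amplitude spectra of two sets."""
    spec_a = np.mean([np.abs(np.fft.fft2(img)) for img in set_a], axis=0)
    spec_b = np.mean([np.abs(np.fft.fft2(img)) for img in set_b], axis=0)
    return np.sqrt(np.mean((spec_a - spec_b) ** 2))

rng = np.random.default_rng(0)
faces   = [rng.random((64, 64)) for _ in range(10)]   # placeholder image sets
objects = [rng.random((64, 64)) for _ in range(10)]

all_imgs = equalize_mean_luminance(faces + objects)
faces, objects = all_imgs[:10], all_imgs[10:]
# The data-driven step of the method would repeat this measurement over
# randomized selections of candidate images to minimize the dissimilarity;
# that search is omitted from this sketch.
print("spectral dissimilarity:", spectral_dissimilarity(faces, objects))
```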
Abstract:
Defining an efficient training set is one of the most delicate phases for the success of remote sensing image classification routines. The complexity of the problem, limited temporal and financial resources, and high intraclass variance can make an algorithm fail if it is trained with a suboptimal dataset. Active learning aims at building efficient training sets by iteratively improving the model performance through sampling. A user-defined heuristic ranks the unlabeled pixels according to a function of the uncertainty of their class membership, and the user is then asked to provide labels for the most uncertain pixels. This paper reviews and tests the main families of active learning algorithms: committee-based, large-margin, and posterior-probability-based. For each family, the most recent advances in the remote sensing community are discussed and some heuristics are detailed and tested. Several challenging remote sensing scenarios are considered, including very high spatial resolution and hyperspectral image classification. Finally, guidelines for choosing a suitable architecture are provided for new and/or inexperienced users.
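A minimal sketch of one member of the large-margin family (margin sampling) is given below; the synthetic data, the SVM settings, and the batch size of five queries per round are placeholder assumptions rather than the benchmarks tested in the paper.

```python
# Minimal sketch of a margin-sampling active-learning loop with an SVM:
# at each round, the pixels closest to the decision boundary are queried.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# Small initial training set containing both classes.
labeled = list(np.where(y == 0)[0][:5]) + list(np.where(y == 1)[0][:5])
unlabeled = [i for i in range(len(y)) if i not in labeled]

for _ in range(5):                            # five user-interaction rounds
    clf = SVC(kernel="rbf", gamma="scale").fit(X[labeled], y[labeled])
    # Margin sampling: rank unlabeled pixels by distance to the boundary.
    margins = np.abs(clf.decision_function(X[unlabeled]))
    query = [unlabeled[i] for i in np.argsort(margins)[:5]]   # 5 most uncertain
    labeled += query                          # the "user" provides their labels
    unlabeled = [i for i in unlabeled if i not in query]

clf = SVC(kernel="rbf", gamma="scale").fit(X[labeled], y[labeled])
print("final training set size:", len(labeled))
print("accuracy on remaining pixels:", clf.score(X[unlabeled], y[unlabeled]))
```

Committee-based and posterior-probability heuristics follow the same loop; only the uncertainty score used to rank the unlabeled pixels changes.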