52 results for Paper-based


Relevance:

30.00%

Publisher:

Abstract:

Imaging mass spectrometry (IMS) is an emergent and innovative approach for measuring the composition, abundance and regioselectivity of molecules within an investigated area of fixed dimension. Although it provides unprecedented molecular information compared with conventional MS techniques, enhancing the protein signature obtained by IMS remains necessary and challenging. This paper demonstrates the combination of conventional organic washes with an optimized aqueous buffer for tissue section preparation before matrix-assisted laser desorption/ionization (MALDI) IMS of proteins. The buffer, 500 mM ammonium formate in water-acetonitrile (9:1, v/v, 0.1% trifluoroacetic acid, 0.1% Triton), was shown to significantly enhance the protein signature in profiling and IMS (~fourfold) when used after organic washes (70% EtOH followed by 90% EtOH), improving the quality and number of ion images obtained from mouse kidney and 14-day mouse fetus whole-body tissue sections while maintaining reproducibility similar to conventional tissue rinsing. Although some protein losses were observed, data mining demonstrated that these were primarily low-abundance signals and that the number of new peaks gained with the described procedure is greater. The proposed buffer thus proved highly efficient for tissue section preparation, providing novel and complementary information for direct on-tissue MALDI analysis compared with conventional organic rinsing alone.

Relevance:

30.00%

Publisher:

Abstract:

This article examines the extent and limits of non-state forms of authority in international relations. It analyses how the information and communication technology (ICT) infrastructure for the tradability of services in a global knowledge-based economy relies on informal regulatory practices for the adjustment of ICT-related skills. Companies and associations provide training and certification programmes as part of a growing market for educational services, setting their own standards. The existing literature on non-conventional forms of authority in the global political economy has emphasised that the consent of actors subject to informal rules, and explicit or implicit state recognition, remain crucial for the effectiveness of these new forms of power. However, analyses based on a limited sample of actors tend toward a narrow understanding of the issues and fail to fully explore the differentiated space in which non-state authority is emerging. This paper examines the form of authority underpinning the global knowledge-based economy within the broader perspective of the issues likely to be standardised by technical ICT specifications, the wide range of actors involved, and the highly differentiated space where standards become authoritative. The empirical findings highlight the role of different private actors in establishing international educational norms in this field. They also pinpoint the limits of profit-oriented standard-setting, notably with regard to generic norms.

Relevance:

30.00%

Publisher:

Abstract:

Background: Multiple logistic regression is precluded from many practical applications in ecology that aim to predict the geographic distributions of species, because it requires absence data, which are rarely available or are unreliable. In order to use multiple logistic regression, many studies have simulated "pseudo-absences" through a number of strategies, but it is unknown how the choice of strategy influences models and their geographic predictions. In this paper we evaluate the effect of several prevailing pseudo-absence strategies on predictions of the geographic distribution of a virtual species whose "true" distribution and relationship to three environmental predictors were predefined. We evaluated the effect of using (a) real absences, (b) pseudo-absences selected randomly from the background, and (c) two-step approaches in which pseudo-absences are selected from low-suitability areas predicted by either Ecological Niche Factor Analysis (ENFA) or BIOCLIM. We compared how the choice of pseudo-absence strategy affected model fit, predictive power, and information-theoretic model selection results. Results: Models built with true absences had the best predictive power and discriminatory power, and the "true" model (the one that contained the correct predictors) was supported by the data according to AIC, as expected. Models based on random pseudo-absences had among the lowest fit, but yielded the second highest AUC value (0.97), and the "true" model was also supported by the data. Models based on two-step approaches had intermediate fit, the lowest predictive power, and the "true" model was not supported by the data. Conclusion: If ecologists wish to build parsimonious GLMs that allow robust predictions, a reasonable approach is to use a large number of randomly selected pseudo-absences and to perform model selection based on an information-theoretic approach. However, the resulting models can be expected to have limited fit.
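For readers who want to experiment with strategy (b), the following minimal sketch simulates a virtual species, draws presences and randomly selected background pseudo-absences, fits a logistic regression, and reports AIC and AUC. The virtual species, sample sizes and coefficients are illustrative assumptions, not the paper's actual simulation design.

```python
# Sketch: random background pseudo-absences + logistic regression (GLM),
# scored by AIC and AUC. The virtual species below is illustrative only.
import numpy as np
import statsmodels.api as sm
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Environmental predictors over a "landscape" of candidate cells
n_cells = 5000
X_env = rng.normal(size=(n_cells, 3))                   # three predictors
true_logit = 1.5 * X_env[:, 0] - 2.0 * X_env[:, 1]      # the third predictor is irrelevant
p_true = 1 / (1 + np.exp(-true_logit))                  # "true" suitability
occupied = rng.random(n_cells) < p_true                 # virtual species distribution

# Presence records plus randomly selected background pseudo-absences
presence_idx = rng.choice(np.flatnonzero(occupied), size=300, replace=False)
pseudo_abs_idx = rng.choice(n_cells, size=300, replace=False)   # strategy (b): random background
idx = np.concatenate([presence_idx, pseudo_abs_idx])
y = np.concatenate([np.ones(300), np.zeros(300)])
X = sm.add_constant(X_env[idx])

model = sm.Logit(y, X).fit(disp=False)
print("AIC:", model.aic)                                # information-theoretic model selection
print("AUC:", roc_auc_score(occupied, model.predict(sm.add_constant(X_env))))
```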

Relevance:

30.00%

Publisher:

Abstract:

Monitoring of posture allocations and activities enables accurate estimation of energy expenditure and may aid in obesity prevention and treatment. At present, accurate devices rely on multiple sensors distributed on the body and thus may be too obtrusive for everyday use. This paper presents a novel wearable sensor, which is capable of very accurate recognition of common postures and activities. The patterns of heel acceleration and plantar pressure uniquely characterize postures and typical activities while requiring minimal preprocessing and no feature extraction. The shoe sensor was tested in nine adults performing sitting and standing postures and while walking, running, ascending and descending stairs, and cycling. Support vector machines (SVMs) were used for classification. A fourfold validation of a six-class subject-independent group model showed 95.2% average accuracy of posture/activity classification on the full sensor set and over 98% on an optimized sensor set. Using a combination of acceleration and pressure also enabled a pronounced reduction of the sampling frequency (25 to 1 Hz) without significant loss of accuracy (98% versus 93%). Subjects had shoe sizes (US) of M9.5-11 and W7-9 and body mass indices from 18.1 to 39.4 kg/m², suggesting that the device can be used by individuals with varying anthropometric characteristics.
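As an illustration of the classification step, the sketch below fits an RBF-kernel SVM with subject-grouped four-fold cross-validation on synthetic stand-in features; the feature set, class labels and parameters are placeholders, not the actual shoe-sensor data or tuning.

```python
# Sketch: subject-independent SVM classification of postures/activities from
# acceleration + pressure features, with four-fold cross-validation.
# The synthetic features below stand in for the real shoe-sensor data.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import GroupKFold, cross_val_score

rng = np.random.default_rng(1)
n_samples, n_features = 900, 6                  # e.g. heel acceleration + plantar pressure channels
X = rng.normal(size=(n_samples, n_features))
y = rng.integers(0, 6, size=n_samples)          # six classes: sit, stand, walk, run, stairs, cycle
subjects = rng.integers(0, 9, size=n_samples)   # nine subjects -> subject-independent grouping

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale"))
cv = GroupKFold(n_splits=4)                     # four folds; groups keep each subject's data together
scores = cross_val_score(clf, X, y, cv=cv, groups=subjects)
print("mean accuracy:", scores.mean())
```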

Relevance:

30.00%

Publisher:

Abstract:

The Helvetic nappe system in Western Switzerland is a stack of fold nappes and thrust sheets emplaced under low-grade metamorphism. Fold nappes and thrust sheets are also some of the most common features in orogens. Fold nappes are kilometer-scale recumbent folds which feature a weakly deformed normal limb and an intensely deformed overturned limb. Thrust sheets, on the other hand, are characterized by the absence of an overturned limb and can be defined as almost rigid blocks of crust that are displaced sub-horizontally over up to several tens of kilometers. The Morcles and Doldenhorn nappes are classic examples of fold nappes and constitute the so-called infra-Helvetic complex in Western and Central Switzerland, respectively. This complex is overridden by thrust sheets such as the Diablerets and Wildhörn nappes in Western Switzerland. One of the most famous examples of a thrust sheet worldwide is the Glarus thrust sheet in Central Switzerland, which features over 35 kilometers of thrusting accommodated by a ~1 m thick shear zone. Since the work of the early Alpine geologists such as Heim and Lugeon, knowledge of these nappes has been steadily refined, and today the geometry and kinematics of the Helvetic nappe system are generally agreed upon. However, despite the extensive knowledge we have today of the kinematics of fold nappes and thrust sheets, the mechanical processes leading to the emplacement of these nappes are still poorly understood. For a long time geologists faced the so-called 'mechanical paradox', which arises from the fact that a block of rock several kilometers high and tens of kilometers long (i.e. a nappe) would break internally rather than start moving on a low-angle plane. Several solutions were proposed to solve this apparent paradox. Certainly the most successful is the theory of critical wedges (e.g. Chapple, 1978; Dahlen, 1984). In this theory the orogen is considered as a whole, and this change of scale allows thrust-sheet-like structures to form while remaining consistent with mechanics. However, this theory is intricately linked to brittle rheology, and fold nappes, which are inherently ductile structures, cannot be created in these models. When considering the problem of nappe emplacement from the perspective of ductile rheology, the problem of strain localization arises. The aim of this thesis was to develop and apply models based on continuum mechanics, integrating heat transfer, to understand the emplacement of nappes. Models were solved either analytically or numerically. In the first two papers of this thesis we derived a simple model which describes channel flow in a homogeneous material with temperature-dependent viscosity. We applied this model to the Morcles fold nappe and to several kilometer-scale shear zones worldwide. In the last paper we zoomed out and studied the tectonics of (i) ductile and (ii) visco-elasto-plastic, temperature-dependent wedges, focusing on the relationship between basement and cover deformation. We demonstrated that during the compression of a ductile passive margin both fold nappes and thrust sheets can develop and that these apparently different structures constitute two end-members of a single structure (i.e. the nappe). The transition from fold nappe to thrust sheet is controlled to first order by the deformation of the basement.
-- The Helvetic nappe system in Western Switzerland is a stack of fold nappes and thrust sheets emplaced under low-grade metamorphism. Fold nappes and thrust sheets are among the most common geological objects in orogens. Fold nappes are kilometer-scale recumbent folds characterized by a weakly deformed normal limb and an intensely deformed overturned limb. Thrust sheets, conversely, are characterized by the absence of a well-defined overturned limb. They can be defined as blocks of crust that move almost rigidly and are displaced sub-horizontally over up to several tens of kilometers. The Morcles nappe and the Doldenhorn nappe are classic examples of fold nappes and constitute the infra-Helvetic complex in Western and Central Switzerland, respectively. This complex lies beneath thrust sheets such as the Diablerets and Wildhörn nappes in Western Switzerland. The Glarus nappe in Central Switzerland stands out for a displacement of over 35 kilometers accommodated by a basal shear zone only about 1 meter thick. Today the geometry and kinematics of the Alpine nappes are the subject of a general consensus. Despite this, the mechanical processes by which these nappes were emplaced remain poorly understood. Throughout the first half of the twentieth century, geologists were confronted with the 'mechanical paradox', which arises from the fact that a block of rock several kilometers high and several tens of kilometers long (i.e. a nappe) would fracture internally rather than slide on a frictional surface. Several solutions have been proposed to circumvent this apparent paradox. The most popular is the theory of critical wedges (e.g. Chapple, 1978; Dahlen, 1984). In this theory the orogen is considered as a whole, and this simple change of scale resolves the mechanical paradox (the internal fracturing of the orogen corresponds to the nappes). This theory is, however, closely tied to brittle rheology, and consequently fold nappes cannot be created within a critical wedge. The aim of this thesis was to develop and apply models based on continuum mechanics and heat transfer to understand the emplacement of nappes. These models were solved analytically or numerically. In the first two articles presented in this thesis we derived a model of flow in a channel of homogeneous material whose viscosity depends on temperature. We applied this model to the Morcles nappe and to several kilometer-scale shear zones from different orogens around the world. In the last article we considered the problem at the scale of the orogen and studied the tectonics of (i) ductile and (ii) visco-elasto-plastic wedges, taking heat transfer into account. We demonstrated that during the compression of a ductile passive margin both fold nappes and thrust sheets can develop, and that fold nappes and thrust sheets are two end-member cases of a single structure (i.e. a nappe). Whether a fold nappe or a thrust sheet develops is controlled to first order by the deformation of the basement.
-- The Helvetic nappe system in Western Switzerland is a stack of fold nappes and thrust sheets emplaced under low-grade metamorphism. Fold nappes and thrust sheets are among the most common geological objects in orogens. Fold nappes are kilometer-scale recumbent folds characterized by a weakly deformed normal limb and an intensely deformed overturned limb. Thrust sheets, conversely, are characterized by the absence of a well-defined overturned limb. They can be defined as blocks of crust that move almost rigidly and are displaced sub-horizontally over up to several tens of kilometers. The Morcles nappe and the Doldenhorn nappe are classic examples of fold nappes and constitute the infra-Helvetic complex in Western and Central Switzerland, respectively. This complex lies beneath thrust sheets such as the Diablerets and Wildhörn nappes in Western Switzerland. The Glarus nappe in Central Switzerland is certainly the most famous example of a thrust sheet in the world. It stands out for a displacement of over 35 kilometers accommodated by a basal shear zone only about 1 meter thick. The geometry and kinematics of the Alpine nappes are the subject of a general consensus among geologists. By contrast, the physical processes by which these nappes were emplaced remain poorly understood. The sediments that form the Alpine nappes were deposited during the Mesozoic and Tertiary on the basement of the European margin, which was stretched during the opening of the Tethys ocean. During the closure of the Tethys, which gave rise to the Alps, the basement and sediments of the European margin were deformed to form the Alpine nappes. The aim of this thesis was to develop and apply models based on continuum mechanics and heat transfer to understand the emplacement of nappes. These models were solved analytically or numerically. In the first two articles presented in this thesis we focused on strain localization at the scale of a nappe. We applied the model we developed to the Morcles nappe and to several shear zones from different orogens around the world. In the last article we studied the relationship between the deformation of the basement and that of the sediments. We demonstrated that fold nappes and thrust sheets are the end-member cases of a continuum. The transition from fold nappe to thrust sheet is intrinsically linked to the deformation of the basement on which the sediments rest.
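To make the channel-flow idea concrete, the following sketch computes a velocity profile across a simple-shear channel with an Arrhenius temperature-dependent viscosity. The geometry, rheological parameters and boundary velocity are illustrative assumptions and do not reproduce the thesis models.

```python
# Sketch: velocity profile in a simple-shear "channel" (e.g. a basal shear zone)
# with an Arrhenius temperature-dependent viscosity. Parameter values are
# illustrative only and do not reproduce the thesis models.
import numpy as np

H = 1000.0                           # channel thickness [m]
u_top = 0.01 / 3.15e7                # velocity of the overriding block [m/s] (~1 cm/yr)
y = np.linspace(0.0, H, 201)
T = np.linspace(600.0, 700.0, 201)   # linear temperature profile across the channel [K]

Q, R, mu0 = 200e3, 8.314, 1e3        # activation energy [J/mol], gas constant, pre-factor [Pa s]
mu = mu0 * np.exp(Q / (R * T))       # viscosity increases where temperature is lower

# For simple shear the shear stress tau is constant with depth:
#   du/dy = tau / mu(y), with tau fixed by the top-velocity boundary condition.
seg = np.diff(y) * 0.5 * (1.0 / mu[1:] + 1.0 / mu[:-1])   # trapezoid segments of the integral of dy/mu
tau = u_top / seg.sum()                                   # shear stress from u(H) = u_top
u = tau * np.concatenate([[0.0], np.cumsum(seg)])         # velocity profile u(y)

print(f"shear stress ~ {tau / 1e6:.2f} MPa; shear strain localizes where T is highest")
```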

Relevance:

30.00%

Publisher:

Abstract:

Rockfall hazard zoning is usually achieved using a qualitative estimate of hazard rather than an absolute scale. In Switzerland, danger maps, which correspond to a hazard zoning that depends on the intensity of the considered phenomenon (e.g. kinetic energy for rockfalls), are replacing hazard maps. Basically, the danger grows with the mean frequency and with the intensity of the rockfall. This principle, based on intensity thresholds, may also be applied with intensity threshold values other than those used in the Swiss rockfall hazard zoning method, i.e. danger mapping. In this paper, we explore the effect of slope geometry and rockfall frequency on rockfall hazard zoning. First, the transition from 2D zoning to 3D zoning based on rockfall trajectory simulation is examined; then, its dependency on slope geometry is emphasized. The spatial extent of hazard zones is examined, showing that limits may vary widely depending on the rockfall frequency. This approach is especially dedicated to highly populated regions, because the hazard zoning has to be very fine in order to delineate the greatest possible territory containing acceptable risks.
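The sketch below illustrates how intensity-threshold danger mapping can be encoded. The 30/300 kJ kinetic-energy thresholds are the values commonly cited for Swiss rockfall intensity classes, but the frequency classes and the danger matrix itself are illustrative assumptions rather than the zoning rules used in the paper.

```python
# Sketch: assigning a danger class from rockfall intensity (kinetic energy) and
# mean frequency, in the spirit of intensity-threshold danger mapping.
# Frequency break points and the matrix below are illustrative assumptions.
def intensity_class(energy_kj: float) -> int:
    """0 = low (<30 kJ), 1 = medium (30-300 kJ), 2 = high (>300 kJ)."""
    return 0 if energy_kj < 30 else (1 if energy_kj <= 300 else 2)

def frequency_class(return_period_yr: float) -> int:
    """0 = rare, 1 = occasional, 2 = frequent (illustrative break points)."""
    return 2 if return_period_yr <= 30 else (1 if return_period_yr <= 100 else 0)

# Illustrative danger matrix: rows = intensity class, columns = frequency class.
DANGER = [["residual", "moderate", "moderate"],
          ["moderate", "moderate", "high"],
          ["high",     "high",     "high"]]

def danger_zone(energy_kj: float, return_period_yr: float) -> str:
    return DANGER[intensity_class(energy_kj)][frequency_class(return_period_yr)]

print(danger_zone(450.0, 50.0))   # -> "high": an energetic block with a moderate return period
```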

Relevance:

30.00%

Publisher:

Abstract:

Recently, kernel-based machine learning methods have gained great popularity in many data analysis and data mining fields: pattern recognition, biocomputing, speech and vision, engineering, remote sensing, etc. This paper describes the use of kernel methods for processing large datasets from environmental monitoring networks. Several typical problems of the environmental sciences and their solutions provided by kernel-based methods are considered: classification of categorical data (soil type classification), mapping of continuous environmental and pollution data (soil pollution by radionuclides), and mapping with auxiliary information (climatic data from the Aral Sea region). Promising developments, such as automatic emergency hot-spot detection and monitoring network optimization, are discussed as well.
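As a small illustration of the continuous-mapping task, the sketch below fits an RBF kernel ridge regression to synthetic monitoring-network data and predicts values on a regular grid; the data, kernel choice and hyper-parameter grid are assumptions for illustration only.

```python
# Sketch: kernel-based mapping of a continuous environmental variable from
# scattered monitoring points (synthetic stand-in for e.g. soil radionuclide data).
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(2)
coords = rng.uniform(0, 100, size=(400, 2))          # monitoring network (x, y) [km]
values = np.exp(-((coords - 60) ** 2).sum(axis=1) / 500) + 0.05 * rng.normal(size=400)

# RBF kernel ridge regression; hyper-parameters tuned by cross-validation
grid = GridSearchCV(KernelRidge(kernel="rbf"),
                    {"alpha": [1e-3, 1e-2, 1e-1], "gamma": [1e-3, 1e-2, 1e-1]},
                    cv=5)
grid.fit(coords, values)

# Predict on a regular grid to produce the map
gx, gy = np.meshgrid(np.linspace(0, 100, 50), np.linspace(0, 100, 50))
pollution_map = grid.predict(np.c_[gx.ravel(), gy.ravel()]).reshape(gx.shape)
print("cross-validated R^2:", grid.best_score_)
```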

Relevance:

30.00%

Publisher:

Abstract:

Synthesis report: The article that is the subject of my thesis evaluates a new pedagogical approach for learning certain chapters of pathophysiology. The teaching setup alternates ex-cathedra lectures with the use of a website containing clinical vignettes. When consulting these vignettes, the student is invited to request the laboratory tests whose relevance he or she can justify for the clinical case under study. The novelty of the approach lies in the fact that, before the ex-cathedra lecture, the teacher can consult the statistics of laboratory requests and thus orient the lecture according to the elements the students have misunderstood. After the ex-cathedra lecture, the students can consult the complete clinical vignette online, with explanations. At the end of the whole course, an evaluation was conducted among the students. The setup was run for two consecutive years, and the article discusses the results. We concluded that this innovative teaching method leads students to prepare better for ex-cathedra lectures while allowing the teacher to identify more precisely which topics were difficult for the students and thus to adjust the course accordingly. My thesis work consisted of designing this learning setup, creating the web application of clinical vignettes and implementing it for two consecutive years. I then analysed the evaluation data and wrote the article, which I submitted to the journal 'Medical Teacher'. After a few corrections and clarifications requested by the reviewers, the article was accepted and published. This work led to a second version of the web application, which is currently used in module 3.1 of the third year at the Ecole de Médecine in Lausanne. Summary: Since the early days of sexual selection, our understanding of the selective forces acting on males and females during reproduction has increased remarkably. However, despite a long tradition of experimental and theoretical work in this field and relentless effort, numerous questions remain unanswered and many results are conflicting. Moreover, the interface between sexual selection and conservation biology has to date received little attention, despite existing evidence for its importance. In the present thesis, I first used an empirical approach to test various sexual selection hypotheses in a population of whitefish from central Switzerland. This particular population is characterized by a high prevalence of gonadal alterations in males. In particular, I challenged the hypothesis that whitefish males displaying peculiar gonadal features are of lower genetic quality than other, seemingly normal males. Additionally, I also worked on identifying important determinants of sperm behavior. During a second, theoretical part of my work, which is part of a larger project on the evolution of female mate preferences in harvested fish populations, I developed an individual-based simulation model to estimate how different mate discrimination costs affect the demographic behavior of fish populations and the evolutionary trajectories of female mate preferences. This latter work provided me with some insight into a recently published article addressing the importance of sexual selection for harvesting-induced evolution. I built upon this insight in a short perspective paper.
In parallel, I let some methodological questions drive my thoughts and wrote an essay about possible synergies between the biological, philosophical and statistical approaches to biological questions.
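The individual-based modelling mentioned above can be sketched very roughly as follows: a toy simulation in which a heritable female preference for large males carries a fecundity cost while large males are removed by harvesting. Every parameter, the inheritance scheme and the choice rule are illustrative assumptions, not the thesis model.

```python
# Toy individual-based sketch: evolution of a costly female preference for large
# males under size-selective harvesting. Entirely illustrative.
import numpy as np

rng = np.random.default_rng(3)
N, GENS = 500, 100
cost, harvest_rate, size_limit = 0.2, 0.5, 1.0

pref = rng.normal(0.5, 0.2, N)      # female preference strength (heritable)
size = rng.normal(0.0, 1.0, N)      # male body size

for _ in range(GENS):
    # Size-selective harvest removes a fraction of the largest males
    survivors = size[~((size > size_limit) & (rng.random(N) < harvest_rate))]
    # Choosier females inspect more males (and keep the largest) but pay a fecundity cost
    n_inspect = 1 + np.clip(np.round(3 * pref), 0, 10).astype(int)
    fecundity = np.exp(-cost * np.clip(pref, 0, None))
    mate_size = np.array([survivors[rng.integers(0, survivors.size, k)].max() for k in n_inspect])
    # Next generation: mothers drawn in proportion to fecundity; traits inherited with noise
    mothers = rng.choice(N, size=N, p=fecundity / fecundity.sum())
    pref = np.clip(pref[mothers] + rng.normal(0, 0.05, N), 0, None)
    size = 0.5 * mate_size[mothers] + rng.normal(0, 0.5, N)   # paternal size contribution + noise

print("mean preference after", GENS, "generations:", round(float(pref.mean()), 2))
```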

Relevance:

30.00%

Publisher:

Abstract:

Several ink dating methods based on solvent analysis using gas chromatography/mass spectrometry (GC/MS) were proposed in the last decades. These methods follow the drying of solvents from ballpoint pen inks on paper and seem very promising. However, several questions have arisen over the last few years among questioned document examiners regarding the transparency and reproducibility of the proposed techniques. These questions should be carefully studied to ensure accurate and ethical application of this methodology in casework. Inspired by a real investigation involving ink dating, the present paper discusses this issue through four main topics: aging processes, dating methods, validation procedures and data interpretation. This work presents a wide picture of the ink dating field, warns about potential shortcomings and proposes some solutions to avoid reporting errors in court.
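As a purely illustrative sketch of the "aging processes" topic, the code below fits a single-exponential drying curve to hypothetical solvent peak areas; the data, the model and the parameters are placeholders and do not represent any validated ink-dating procedure.

```python
# Sketch: fitting an exponential "drying curve" to solvent peak areas measured by
# GC/MS at several ink ages. Data and the single-exponential model are illustrative.
import numpy as np
from scipy.optimize import curve_fit

age_days = np.array([1, 3, 7, 14, 30, 60, 120], dtype=float)      # hypothetical sampling times
peak_area = np.array([95, 80, 62, 45, 28, 18, 12], dtype=float)   # hypothetical solvent signal

def drying_curve(t, a, k, c):
    """Solvent abundance decays toward a residual level c."""
    return a * np.exp(-k * t) + c

(a, k, c), _ = curve_fit(drying_curve, age_days, peak_area, p0=(100, 0.05, 10))
print(f"estimated drying rate k = {k:.3f} per day, residual level c = {c:.1f}")
```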

Relevance:

30.00%

Publisher:

Abstract:

Computed Tomography (CT) represents the standard imaging modality for tumor volume delineation in radiotherapy treatment planning of retinoblastoma, despite some inherent limitations. CT is very useful for providing physical density information for dose calculation and morphological volumetric information, but it has low sensitivity in assessing tumor viability. On the other hand, 3D ultrasound (US) allows a highly accurate definition of the tumor volume thanks to its high spatial resolution, but it is currently used only for diagnosis and follow-up rather than being integrated into treatment planning. Our ultimate goal is the automatic segmentation of the gross tumor volume (GTV) in 3D US, the segmentation of the organs at risk (OAR) in CT, and the registration of the two modalities. In this paper, we present some preliminary results in this direction. We present 3D active-contour-based segmentation of the eyeball and the lens in CT images; the presented approach incorporates prior anatomical knowledge through a 3D geometrical eye model. The automated segmentation results are validated by comparison with manual segmentations. We then present two approaches for the fusion of 3D CT and US images: (i) landmark-based transformation, and (ii) object-based transformation that makes use of eyeball contour information in the CT and US images.
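Approach (i) can be sketched as a classical landmark-based rigid registration: given paired landmarks in CT and US, the SVD-based Kabsch method recovers the rotation and translation. The landmark coordinates below are placeholders, and this is a generic sketch rather than the authors' implementation.

```python
# Sketch of approach (i): estimating a rigid (rotation + translation) transform
# from paired anatomical landmarks in CT and US via the SVD-based Kabsch method.
import numpy as np

def rigid_from_landmarks(P, Q):
    """Find R, t minimizing ||R @ P_i + t - Q_i|| over paired 3-D landmarks."""
    P_c, Q_c = P - P.mean(axis=0), Q - Q.mean(axis=0)
    U, _, Vt = np.linalg.svd(P_c.T @ Q_c)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = Q.mean(axis=0) - R @ P.mean(axis=0)
    return R, t

# Placeholder landmarks: the "US" points are a rotated + translated copy of the "CT" points
ct_landmarks = np.array([[10.0, 22.0, 5.0], [14.0, 25.0, 7.0], [9.0, 30.0, 6.0], [12.0, 27.0, 9.0]])
true_rot = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
us_landmarks = ct_landmarks @ true_rot.T + np.array([2.0, -3.0, 1.0])

R, t = rigid_from_landmarks(ct_landmarks, us_landmarks)
print("max residual:", np.abs(ct_landmarks @ R.T + t - us_landmarks).max())
```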

Relevance:

30.00%

Publisher:

Abstract:

Molecular species identification in mixed or contaminated biological material has always been problematic. We developed a simple and accurate method for mammal DNA identification in mixtures, based on interspecific length polymorphism of the mitochondrial DNA control region. Unlike other published methods dealing with species mixtures, our protocol requires a single universal primer pair and a single amplification step, and is not based on a pre-defined panel of species. This protocol has been routinely employed by our laboratory for species identification in dozens of human and animal forensic cases. Six representative forensic cases involving the specific identification of mixed animal samples are reported in this paper in order to demonstrate the applicability and usefulness of the method.
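As a rough illustration of how length polymorphism separates contributors in a mixture, the sketch below groups observed electropherogram peak lengths into distinct length classes; the peak values and the tolerance are hypothetical placeholders, not part of the published protocol.

```python
# Sketch: grouping the fragment lengths observed in a mixed extract into distinct
# length classes, i.e. counting how many mitochondrial contributors are present
# before each class is identified (e.g. by sequencing). Tolerance is a placeholder.
def length_classes(observed_bp, tolerance=2):
    classes = []
    for length in sorted(observed_bp):
        for cls in classes:
            if abs(length - cls[-1]) <= tolerance:   # close enough to an existing class
                cls.append(length)
                break
        else:
            classes.append([length])                 # otherwise start a new length class
    return classes

# Hypothetical peaks from a two-species mixture fall into two classes
print(length_classes([441, 442, 480, 481, 442]))     # -> [[441, 442, 442], [480, 481]]
```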

Relevance:

30.00%

Publisher:

Abstract:

The high complexity of cortical convolutions in humans is very challenging, both for engineers to measure and compare and for biologists and physicians to understand. In this paper, we propose a surface-based method for the quantification of cortical gyrification. Our method uses an accurate 3-D cortical reconstruction and computes local measurements of gyrification at thousands of points over the whole cortical surface. The potential of our method to precisely identify and localize gyral abnormalities is illustrated by a clinical study on a group of children affected by 22q11 Deletion Syndrome, compared with control individuals.
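To illustrate the area-ratio idea behind gyrification measures, the sketch below computes a crude global gyrification proxy (cortical mesh area divided by convex-hull area); the paper's method instead computes local measurements at thousands of surface points, so this is only a simplified illustration.

```python
# Sketch: a crude *global* gyrification proxy -- ratio of cortical mesh area to the
# area of its convex hull. The paper computes *local* gyrification over the surface;
# this only illustrates the area-ratio idea.
import numpy as np
from scipy.spatial import ConvexHull

def triangle_areas(vertices, faces):
    a, b, c = (vertices[faces[:, i]] for i in range(3))
    return 0.5 * np.linalg.norm(np.cross(b - a, c - a), axis=1)

def gyrification_index(vertices, faces):
    cortical_area = triangle_areas(vertices, faces).sum()
    hull = ConvexHull(vertices)          # hull.area is the hull's surface area in 3-D
    return cortical_area / hull.area

# Tiny check: a closed convex mesh (a tetrahedron) has an index of ~1.
verts = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
faces = np.array([[0, 1, 2], [0, 1, 3], [0, 2, 3], [1, 2, 3]])
print(gyrification_index(verts, faces))  # -> 1.0
# A real mesh could be loaded elsewhere, e.g. vertices, faces = nibabel.freesurfer.read_geometry("lh.pial")
```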

Relevance:

30.00%

Publisher:

Abstract:

This contribution builds upon a former paper by the authors (Lipps and Betz 2004), in which a stochastic population projection for East and West Germany was performed. The aim was to forecast relevant population parameters and their distributions in a consistent way. We now present some modifications that have been introduced since then. First, population parameters for the entire German population are modelled. In order to overcome the modelling problem posed by the structural break in the East during reunification, we show that the East's adaptation of the relevant figures can be considered complete by now. As a consequence, German parameters can be modelled using West German historical patterns together with the starting population of the whole of Germany. Second, a new model to simulate age-specific fertility rates is presented, based on a quadratic spline approach. This offers greater flexibility in modelling various age-specific fertility curves. The simulation results are compared with the scenario-based official forecasts for Germany in 2050. For some population parameters (e.g. the dependency ratio), it can be shown that the range spanned by the medium and extreme variants corresponds to the s-intervals in the stochastic framework. It therefore seems more appropriate to treat this range as an s-interval covering about two thirds of the true distribution.
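The quadratic-spline idea can be illustrated with a short sketch that smooths noisy age-specific fertility rates with a k = 2 spline and reports the implied total fertility rate; the rates and smoothing factor are illustrative, not the calibrated German schedules.

```python
# Sketch: smoothing age-specific fertility rates with a quadratic (k=2) spline,
# the kind of flexible curve used to simulate fertility schedules.
# The rates below are illustrative, not German data.
import numpy as np
from scipy.interpolate import UnivariateSpline

ages = np.arange(15, 50)
rates = np.exp(-0.5 * ((ages - 30) / 5.5) ** 2) * 0.11          # bell-shaped schedule
rates += np.random.default_rng(4).normal(0, 0.004, ages.size)   # sampling noise

spline = UnivariateSpline(ages, rates, k=2, s=len(ages) * 0.004 ** 2)
fine_ages = np.linspace(15, 49, 200)
smoothed = np.clip(spline(fine_ages), 0, None)                  # smooth curve for simulation

tfr = np.clip(spline(ages), 0, None).sum()                      # implied total fertility rate
print("peak fertility age:", round(float(fine_ages[smoothed.argmax()]), 1))
print(f"implied TFR ~ {tfr:.2f}")
```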

Relevance:

30.00%

Publisher:

Abstract:

Machine learning has been widely applied to analyze data in various domains, but it is still new to personalized medicine, especially dose individualization. In this paper, we focus on the prediction of drug concentrations using Support Vector Machines (SVM) and the analysis of the influence of each feature on the prediction results. Our study shows that SVM-based approaches achieve prediction results similar to those of a pharmacokinetic model. The two proposed example-based SVM methods demonstrate that the individual features help to increase the accuracy of drug concentration predictions with a reduced library of training data.
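As a generic illustration of SVM-based concentration prediction (not the paper's example-based variants), the sketch below trains a support vector regressor on synthetic patient covariates and reports cross-validated accuracy; the features and the synthetic kinetics are assumptions.

```python
# Sketch: predicting a drug concentration from patient covariates with support
# vector regression. Features and the synthetic "kinetics" are illustrative.
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
n = 300
dose = rng.uniform(100, 600, n)            # mg
weight = rng.uniform(45, 110, n)           # kg
hours_post_dose = rng.uniform(1, 24, n)    # h
concentration = dose / weight * np.exp(-0.1 * hours_post_dose) + rng.normal(0, 0.3, n)

X = np.column_stack([dose, weight, hours_post_dose])
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
print("CV R^2:", cross_val_score(model, X, concentration, cv=5).mean().round(2))
```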

Relevance:

30.00%

Publisher:

Abstract:

This paper presents the segmentation of the bilateral parotid glands in Head and Neck (H&N) CT images using active-contour-based atlas registration. We compare segmentation results from three atlas selection strategies: (i) selection of the single most similar atlas for each image to be segmented, (ii) fusion of segmentation results from multiple atlases using STAPLE, and (iii) fusion of segmentation results using majority voting. Among these three approaches, fusion using majority voting provided the best results. Finally, we present a detailed evaluation on a dataset of eight images (provided as part of the H&N auto-segmentation challenge conducted in conjunction with the MICCAI 2010 conference) using the majority voting strategy.
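Strategy (iii) can be sketched in a few lines: per-voxel majority voting over binary label maps propagated from several registered atlases. The toy masks below are placeholders for the propagated parotid segmentations.

```python
# Sketch of strategy (iii): fusing binary segmentations propagated from several
# registered atlases by per-voxel majority voting.
import numpy as np

def majority_vote(label_maps):
    """label_maps: list of binary arrays of identical shape (one per atlas)."""
    votes = np.stack(label_maps).astype(np.uint8).sum(axis=0)
    return (votes * 2 > len(label_maps)).astype(np.uint8)   # strict majority

# Three toy 2-D "atlas-propagated" masks
a = np.array([[1, 1, 0], [0, 1, 0]])
b = np.array([[1, 0, 0], [0, 1, 1]])
c = np.array([[1, 1, 0], [0, 0, 1]])
print(majority_vote([a, b, c]))   # -> [[1 1 0] [0 1 1]]
```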