856 results for Population set-based methods
Abstract:
In this thesis we develop solutions to common issues affecting widefield microscopes, addressing the intensity inhomogeneity of single images and two strong limitations: the impossibility of acquiring highly detailed images representative of whole samples, and of imaging deep 3D objects. First, we address the non-uniform distribution of the light signal within a single image, known as vignetting. In particular, we propose, for both light and fluorescence microscopy, non-parametric multi-image methods in which the vignetting function is estimated directly from the sample without requiring any prior information. Having obtained flat-field corrected images, we then study how to overcome the limited field of view of the camera so as to acquire large areas at high magnification. To this end, we developed mosaicing techniques that work on-line: starting from a set of manually acquired overlapping images, we validated a fast registration approach that accurately stitches the images together. Finally, we virtually extend the field of view of the camera in the third dimension, reconstructing a single, fully in-focus image from objects that have substantial depth or lie in different focal planes. After reviewing existing approaches for extending the microscope's depth of focus, we propose a general method that requires no prior information. To compare the outcomes of existing methods, several standard metrics are commonly used in the literature, but none is applicable to real cases. We therefore first validated a metric that ranks methods as the Universal Quality Index does, yet needs no reference ground truth, and then showed that our approach performs better in both synthetic and real cases.
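To make the multi-image flat-field idea concrete, here is a minimal sketch, not the thesis's actual algorithm: assuming the sample content varies from frame to frame, the per-pixel median of many images isolates the static vignetting pattern, which is then normalized and divided out. All function names are illustrative.

```python
import numpy as np

def estimate_vignetting(images):
    """Estimate a multiplicative vignetting field from a stack of images.

    Assumes sample content varies across frames, so the per-pixel median
    isolates the static illumination pattern (an illustrative simplification
    of the non-parametric multi-image estimation described above).
    """
    stack = np.stack([img.astype(np.float64) for img in images])
    field = np.median(stack, axis=0)
    return field / field.max()                 # normalize so the peak is 1

def flat_field_correct(image, field, eps=1e-6):
    """Divide out the estimated vignetting to obtain a flat-field image."""
    return image.astype(np.float64) / np.maximum(field, eps)
```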
Abstract:
This thesis studies molecular dynamics simulations at two levels of resolution: the detailed level of atomistic simulations, where the motion of explicit atoms in a many-particle system is followed, and the coarse-grained level, where the motion of superatoms composed of up to 10 atoms is modeled. While atomistic models can describe material-specific effects on small scales, the time and length scales they can cover are limited by their computational cost. Polymer systems are typically characterized by effects on a broad range of length and time scales, so it is often impossible to simulate atomistically the processes that determine macroscopic properties. Coarse-grained (CG) simulations extend the range of accessible time and length scales by three to four orders of magnitude; however, no standardized coarse-graining procedure has been established yet. Following the ideas of structure-based coarse-graining, a coarse-grained model for polystyrene is presented. Structure-based methods parameterize CG models to reproduce static properties of atomistic melts, such as radial distribution functions between superatoms or other probability distributions over coarse-grained degrees of freedom. Two enhancements of the coarse-graining methodology are suggested. First, correlations between local degrees of freedom are implicitly taken into account by additional potentials acting between neighboring superatoms along the polymer chain. This improves the reproduction of local chain conformations and allows the study of different tacticities of polystyrene. It also gives better control of the chain stiffness, which agrees perfectly with the atomistic model, and reproduces experimental results for overall chain dimensions, such as the characteristic ratio, for all tacticities. The second new aspect is the computationally cheap development of nonbonded CG potentials based on sampling pairs of oligomers in vacuum. Static properties of polymer melts are thus obtained as predictions of the CG model, in contrast to other structure-based CG models, which are iteratively refined to reproduce reference melt structures. The dynamics of simulations at the two levels of resolution are then compared. The time scales of dynamical processes in atomistic and coarse-grained simulations can be connected by a time scaling factor, which depends on several system-specific properties such as molecular weight, density, temperature, and the presence of other components in mixtures. This thesis studies the influence of molecular weight in oligomer systems and the behavior of two-component mixtures. For a system of small additives in a melt of long polymer chains, the temperature dependence of additive diffusion is predicted and compared to experiments.
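As an aside, the central target of structure-based coarse-graining, the radial distribution function between superatoms, can be computed along the following lines. This is a generic sketch for a periodic cubic box with standard ideal-gas normalization, not the thesis's code.

```python
import numpy as np

def radial_distribution(positions, box, r_max, n_bins=100):
    """Radial distribution function g(r) for a periodic cubic box.

    positions: (N, 3) array of superatom coordinates
    box: cubic box edge length (same units as positions)
    """
    n = len(positions)
    diffs = positions[:, None, :] - positions[None, :, :]
    diffs -= box * np.round(diffs / box)            # minimum-image convention
    dists = np.sqrt((diffs ** 2).sum(-1))[np.triu_indices(n, k=1)]
    counts, edges = np.histogram(dists, bins=n_bins, range=(0.0, r_max))
    r = 0.5 * (edges[1:] + edges[:-1])
    shell_vol = 4.0 * np.pi * r**2 * np.diff(edges)
    density = n / box**3
    # normalize pair counts by the ideal-gas expectation
    g = counts / (shell_vol * density * n / 2)
    return r, g
```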
Abstract:
Over the past ten years, the cross-correlation of long time series of ambient seismic noise (ASN) has been widely adopted to extract the surface-wave part of the Green's function (GF). This stochastic procedure relies on the assumption that the ASN wavefield is diffuse and stationary. At frequencies below 1 Hz, the ASN is mainly composed of surface waves, whose origin is attributed to the sea-wave climate. Consequently, marked directional properties may be observed, which call for an accurate investigation of the location and temporal evolution of the ASN sources before attempting any GF retrieval. Within this general context, this thesis investigates the feasibility and robustness of noise-based methods for imaging complex geological structures at the local (~10-50 km) scale. The study focuses on the analysis of an extended (11-month) seismological data set collected at the Larderello-Travale geothermal field (Italy), an area for which the underground geological structures are well constrained thanks to decades of geothermal exploration. Focusing on the secondary microseism band (SM; f > 0.1 Hz), I first investigate the spectral features and kinematic properties of the noise wavefield using beamforming analysis, highlighting a marked variability with time and frequency. In the 0.1-0.3 Hz band and during spring and summer, the SM waves propagate with high apparent velocities and from well-defined directions, likely associated with ocean storms in the southern hemisphere. Conversely, at frequencies above 0.3 Hz the distribution of back-azimuths is more scattered, indicating that this frequency band is the most appropriate for applying stochastic techniques. For this latter interval, I tested two correlation-based methods, acting in the time (NCF) and frequency (modified SPAC) domains and yielding estimates of group- and phase-velocity dispersion, respectively. The velocity data provided by the two methods are markedly discordant; comparison with independent geological and geophysical constraints suggests that the NCF results are more robust and reliable.
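For orientation, a daily noise cross-correlation function (the NCF of the time-domain method mentioned above) is typically computed by spectral whitening followed by frequency-domain correlation and, over many days, stacking. The sketch below is a generic illustration under those standard assumptions, not the processing chain used in the thesis.

```python
import numpy as np

def daily_ncf(trace_a, trace_b, fs, max_lag_s=100.0):
    """One-day noise cross-correlation between two stations' records.

    Spectral whitening flattens the amplitude spectrum so that phase
    (travel-time) information dominates; stacking many daily NCFs is
    assumed to converge toward the surface-wave Green's function.
    """
    n = len(trace_a)
    fa, fb = np.fft.rfft(trace_a), np.fft.rfft(trace_b)
    fa /= np.abs(fa) + 1e-12                   # spectral whitening
    fb /= np.abs(fb) + 1e-12
    cc = np.fft.irfft(fa * np.conj(fb), n=n)
    cc = np.concatenate([cc[-(n // 2):], cc[:(n + 1) // 2]])  # center zero lag
    lags = (np.arange(len(cc)) - n // 2) / fs
    keep = np.abs(lags) <= max_lag_s
    return lags[keep], cc[keep]
```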
Abstract:
In this thesis, new advances in the development of spectroscopy-based methods for the characterization of heritage materials have been achieved. Concerning FTIR spectroscopy, new approaches exploiting the near- and far-IR regions for the characterization of inorganic and organic materials have been tested. Paint cross-sections have been analysed by FTIR spectroscopy in the NIR range, and an ad hoc chemometric approach has been developed for processing the resulting hyperspectral maps. Moreover, a new method for the characterization of calcite, based on the use of grinding curves, has been set up in both the MIR and FAR regions. Calcite is a material widely used in cultural heritage, and this spectroscopic approach is an efficient and rapid tool for distinguishing between different calcite samples. Different enhanced vibrational techniques for the characterisation of dyed fibres have also been tested. First, a SEIRA (Surface Enhanced Infra-Red Absorption) protocol was optimised, allowing the analysis of colorant micro-extracts thanks to the enhancement produced by the addition of gold nanoparticles. These preliminary studies led to a new enhanced FTIR method, named ATR/RAIRS, which reaches lower detection limits. Regarding Raman microscopy, the research followed two lines, both aimed at avoiding the use of colloidal solutions. AgI-based supports, obtained by deposition on gold-coated glass slides, were developed and tested by spotting colorant solutions; a SERS spectrum can be obtained thanks to the photoreduction that the laser induces on the silver salt. Moreover, these supports can be used for the TLC separation of colorant mixtures, and the separated spots can then be analysed successfully by both Raman/SERS and ATR-RAIRS. Finally, a photoreduction method for the on-fibre analysis of colorants, without the need for any extraction, has been optimised.
Abstract:
In the present work, one top-down (TD) and two bottom-up (BU) MALDI/ESI mass spectrometry/HPLC methods were developed with the aim of analysing ocular surface components, i.e., tear film and conjunctival cells. A detailed insight into the development steps is provided, and the approaches are examined for suitability and methodological limitations. While the TD approach proved suitable mainly for the analysis of raw, largely unprocessed cell samples, the BU approach allowed processed conjunctival cells, as well as tear film, to be analysed proteomically with high sensitivity and accuracy. The LC-MALDI BU method catalogued more than 200 tear proteins, and the LC-ESI method more than 1000 tear and conjunctival cell proteins. The ESI and MALDI methods differed markedly in the quantity and quality of their results, so different proteomic fields of application were proposed for the two methods. Furthermore, the developed LC MALDI/ESI BU platform, building on its advantages over the TD approach, was used to investigate therapeutic influences on the ocular surface, focusing on the topical application of taurine and of Taflotan® sine. For taurine, anti-inflammatory effects were documented, evidenced by dynamic changes in the tear film; beneficial, concentration-dependent modes of action were also shown in studies on conjunctival cells. For preservative-free Taflotan® sine, LC-ESI BU analysis demonstrated, on the basis of dynamic tear proteome changes, a regeneration of the ocular surface in patients with primary open-angle glaucoma (POAG) suffering from dry eye after a therapeutic switch from Xalatan®. These results were confirmed by microarray (MA) analyses. In both the taurine studies and the Taflotan® sine study, characteristic ocular surface proteins were documented that allow an objective assessment of ocular surface health. A combination of Taflotan® sine and taurine was proposed and discussed as a possible strategy for treating dry eye in POAG patients.
Abstract:
Conventional inorganic materials for X-ray radiation sensors suffer from several drawbacks, including their inability to cover large curved areas, mechanical stiffness, lack of tissue-equivalence, and toxicity. Semiconducting organic polymers represent an alternative and have been employed as direct photoconversion material in organic diodes. In contrast to inorganic detector materials, polymers allow low-cost, large-area fabrication by solvent-based methods, and their processing is compatible with flexible low-temperature substrates. Flexible, large-area detectors are needed for dosimetry in medical radiotherapy and in security applications. The objective of my thesis is to achieve optimized organic polymer diodes for flexible, direct X-ray detectors. To this end, polymer diodes based on two different semiconducting polymers, polyvinylcarbazole (PVK) and poly(9,9-dioctylfluorene) (PFO), have been fabricated. The diodes show state-of-the-art rectifying behaviour and hole-transport mobilities comparable to reference materials. To improve the X-ray stopping power, high-Z nanoparticles (Bi2O3 or WO3) were added to realize a polymer-nanoparticle composite with optimized properties. X-ray detector characterization yielded sensitivities of up to 14 µC/Gy/cm² for PVK when the diodes were operated in reverse bias. The addition of nanoparticles further improved the performance, and a maximum sensitivity of 19 µC/Gy/cm² was obtained for the PFO diodes. Compared to the pure PFO diode this corresponds to a five-fold increase and thus highlights the potential of nanoparticles for polymer detector design. Interestingly, the pure polymer diodes showed an order-of-magnitude increase in sensitivity when operated in the forward regime, attributed to a different detection mechanism based on the modulation of the diode's conductivity.
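The sensitivity figures quoted above follow the standard detector figure of merit, collected charge per unit dose per unit active area; a small worked example with made-up charge and dose values reproduces the 14 µC/Gy/cm² order of magnitude.

```python
def xray_sensitivity(charge_C, dose_Gy, area_cm2):
    """Detector sensitivity S = Q / (D * A) in C/Gy/cm^2."""
    return charge_C / (dose_Gy * area_cm2)

# Hypothetical numbers: 1.4 nC collected for a 0.1 mGy exposure on 1 cm^2.
s = xray_sensitivity(1.4e-9, 1e-4, 1.0)
print(s * 1e6, "uC/Gy/cm^2")                  # -> 14.0 uC/Gy/cm^2
```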
Abstract:
In this paper we present a novel hybrid approach for multimodal medical image registration based on diffeomorphic demons. Diffeomorphic demons have proven to be a robust and efficient method for intensity-based image registration, and a very recent extension even allows mutual information (MI) to be used as the similarity measure, enabling the registration of multimodal images. However, owing to the intensity-correspondence uncertainty in some anatomical regions, a purely intensity-based algorithm cannot always solve the registration problem. We therefore propose to combine the transformations resulting from both intensity-based and landmark-based methods for multimodal non-rigid registration based on diffeomorphic demons. Experiments on several types of MR images show that the hybrid approach yields a better anatomical correspondence between the images than either intensity information or landmarks alone.
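As a starting point, stock diffeomorphic demons (mono-modal, intensity-difference driven) are available in SimpleITK; the sketch below shows that baseline. The MI-driven multimodal variant and the landmark-derived transformation that the paper combines with it are not part of this stock filter.

```python
import SimpleITK as sitk

def demons_register(fixed, moving, iterations=100, sigma=1.5):
    """Baseline mono-modal diffeomorphic demons registration (SimpleITK).

    The hybrid method described above additionally uses MI as the
    similarity measure and fuses in a landmark-based transformation;
    neither is included in this minimal sketch.
    """
    fixed = sitk.Cast(fixed, sitk.sitkFloat32)
    moving = sitk.Cast(moving, sitk.sitkFloat32)
    demons = sitk.DiffeomorphicDemonsRegistrationFilter()
    demons.SetNumberOfIterations(iterations)
    demons.SetStandardDeviations(sigma)    # Gaussian regularization of the field
    displacement = demons.Execute(fixed, moving)
    transform = sitk.DisplacementFieldTransform(
        sitk.Cast(displacement, sitk.sitkVectorFloat64))
    return sitk.Resample(moving, fixed, transform)
```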
Abstract:
This study aimed to evaluate the influence of professional prophylactic methods on the performance of the DIAGNOdent 2095, DIAGNOdent 2190, and VistaProof devices in detecting occlusal caries. Assessments were performed on 110 permanent teeth at baseline and after application of a sodium bicarbonate jet or of a prophylactic paste followed by rinsing. Sensitivity improved after rinsing of the occlusal surfaces when the prophylactic paste was used, whereas the sodium bicarbonate jet did not significantly influence the performance of the fluorescence-based methods. It can be concluded that different professional prophylactic methods can significantly influence the performance of fluorescence-based methods for occlusal caries detection.
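Sensitivity (and its counterpart, specificity) in such device studies is computed against a gold standard, e.g., histological validation; a minimal sketch of that computation, illustrative rather than the study's own analysis:

```python
import numpy as np

def sensitivity_specificity(device_calls, gold_standard):
    """Sensitivity and specificity of caries calls against a gold
    standard (e.g., histology); both inputs are 0/1 arrays per site."""
    device = np.asarray(device_calls, dtype=bool)
    gold = np.asarray(gold_standard, dtype=bool)
    tp = np.sum(device & gold)            # true positives
    tn = np.sum(~device & ~gold)          # true negatives
    return tp / gold.sum(), tn / (~gold).sum()
```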
Abstract:
Misconceptions exist in all fields of learning and develop through a person's preconceptions of how the world works. Students with misconceptions in chemical engineering cannot correctly transfer knowledge to a new situation and will likely arrive at an incorrect solution. The purpose of this thesis was to repair misconceptions in thermodynamics using inquiry-based activities. Inquiry-based learning is a teaching method built on hands-on learning and self-discovery, and previous work has shown that inquiry-based methods result in better conceptual understanding than traditional lectures. The thermodynamics activities were designed to guide students toward the correct conceptual understanding by letting them observe a preconception fail in an experiment or simulation. The activities focus on three topics: "internal energy versus enthalpy", "equilibrium versus steady state", and "entropy". For each topic, two activities were designed to clarify the concept and ensure it was properly grasped. Each activity was coupled with an instruction packet containing the experimental procedure as well as pre- and post-analysis questions, which were used to analyze the effect of the activities on the students' responses. Concept inventories were used to monitor students' conceptual understanding at the beginning and end of the semester. The results did not show a statistically significant increase in overall concept inventory scores for students who performed the activities compared to traditional learning. There was, however, a statistically significant increase in concept-area scores for "internal energy versus enthalpy" and "equilibrium versus steady state". Although there was no significant increase in concept inventory scores for "entropy", written analyses showed that most students' misconceptions were repaired. Students transferred knowledge effectively and retained most of the information in the concept areas of "internal energy versus enthalpy" and "equilibrium versus steady state".
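A pre/post comparison of concept-inventory scores of the kind reported above is commonly analyzed with a paired test; the following is a generic sketch, since the thesis does not specify which test was used.

```python
from scipy import stats

def compare_concept_scores(pre, post, alpha=0.05):
    """Paired t-test on concept-inventory scores before and after the
    activities (an assumed analysis, not the thesis's exact procedure)."""
    t, p = stats.ttest_rel(post, pre)
    return {"t": t, "p": p, "significant": p < alpha}
```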
Abstract:
Most butterfly monitoring protocols rely on counts along transects (Pollard walks) to generate species abundance indices and track population trends. It is still too often ignored that a population count results from two processes: the biological process (true abundance) and the statistical process (our ability to properly quantify abundance). Because individual detectability tends to vary in space (e.g., among sites) and time (e.g., among years), it remains unclear whether index counts truly reflect population sizes and trends. This study compares capture-mark-recapture (absolute abundance) and count-index (relative abundance) monitoring methods in three species (Maculinea nausithous and Iolana iolas: Lycaenidae; Minois dryas: Satyridae) in contrasting habitat types. We demonstrate that intraspecific variability in individual detectability under standard monitoring conditions is probably the rule rather than the exception, which questions the reliability of count-based indices for estimating and comparing population abundance. Our results suggest that the accuracy of count-based methods depends heavily on the ecology and behavior of the target species, as well as on the type of habitat in which surveys take place. Monitoring programs designed to assess the abundance and trends of butterfly populations should incorporate a measure of detectability. We discuss the relative advantages and drawbacks of current monitoring methods and analytical approaches with respect to the characteristics of the species under scrutiny and the resources available.
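For reference, the simplest two-session capture-mark-recapture estimator of absolute abundance is the Chapman-corrected Lincoln-Petersen estimate; the study itself likely used richer multi-session CMR models, so this sketch is illustrative only.

```python
def chapman_estimate(n1, n2, m2):
    """Chapman's bias-corrected Lincoln-Petersen abundance estimate:
    n1 animals marked in session 1, n2 caught in session 2, of which
    m2 carry marks."""
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

# e.g. 60 marked, 55 caught later, 15 recaptured:
print(chapman_estimate(60, 55, 15))   # -> 212.5, i.e. about 212 individuals
```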
Abstract:
Quantitative data on lung structure are essential for setting up structure-function models to assess the functional performance of the lung and for making statistically valid comparisons in experimental morphology, physiology, or pathology. The methods of choice for microscopy-based lung morphometry are those of stereology, the science of quantitative characterization of irregular three-dimensional objects on the basis of measurements made on two-dimensional sections. From a practical perspective, stereology is an assumption-free set of methods for unbiased sampling with geometric probes, built on a solid mathematical foundation. Here, we discuss the pitfalls of lung morphometry and present solutions, from specimen preparation to multi-stage sampling schemes, for obtaining unbiased estimates of morphometric parameters such as volumes, surfaces, lengths, and numbers, demonstrated with various examples. Stereological methods are accurate, efficient, simple, and transparent; the precision of the estimates depends on the size and distribution of the sample. For obtaining quantitative data on lung structure at all microscopic levels, state-of-the-art stereology is the gold standard.
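A basic stereological estimate, the volume fraction by point counting (Delesse principle, V_V = P_P), can be sketched as follows; real stereological designs use systematic uniform random grids rather than the purely random points assumed here.

```python
import numpy as np

def volume_fraction(section, n_points=400, rng=None):
    """Estimate volume fraction V_V by point counting on a 2-D section
    (Delesse principle: V_V = P_P on isotropic uniform random sections).

    `section` is a boolean mask of the structure of interest.
    """
    rng = np.random.default_rng(rng)
    rows = rng.integers(0, section.shape[0], n_points)
    cols = rng.integers(0, section.shape[1], n_points)
    return section[rows, cols].mean()  # fraction of test points hitting the phase
```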
Abstract:
Professor Sir David R. Cox (DRC) is widely acknowledged as among the most important scientists of the second half of the twentieth century. He inherited the mantle of statistical science from Pearson and Fisher, advanced their ideas, and translated statistical theory into practice so as to forever change the application of statistics in many fields, especially biology and medicine. The logistic and proportional hazards models he substantially developed are arguably among the most influential biostatistical methods in current practice. This paper looks forward over the period from DRC's 80th to 90th birthdays to speculate about the future of biostatistics, drawing lessons from DRC's contributions along the way. We consider "Cox's model" (CM) of biostatistics, an approach to statistical science that: formulates scientific questions or quantities in terms of parameters gamma in probability models f(y; gamma) that represent, in a parsimonious fashion, the underlying scientific mechanisms (Cox, 1997); partitions the parameters gamma = (theta, eta) into a subset of interest theta and other "nuisance parameters" eta necessary to complete the probability distribution (Cox and Hinkley, 1974); develops methods of inference about the scientific quantities that depend as little as possible upon the nuisance parameters (Barndorff-Nielsen and Cox, 1989); and thinks critically about the appropriate conditional distribution on which to base inferences. We briefly review exciting biomedical and public health challenges that are capable of driving statistical developments in the next decade. We discuss the statistical models and model-based inferences central to the CM approach, contrasting them with computationally intensive strategies for prediction and inference advocated by Breiman and others (e.g., Breiman, 2001) and with more traditional design-based methods of inference (Fisher, 1935). We discuss the hierarchical (multi-level) model as an example of the future challenges and opportunities for model-based inference. We then consider the role of conditional inference, a second key element of the CM; recent examples from genetics are used to illustrate these ideas. Finally, the paper examines causal inference and statistical computing, two other topics we believe will be central to biostatistics research and practice in the coming decade. Throughout the paper, we attempt to indicate how DRC's work and the "Cox Model" have set a standard of excellence to which all can aspire in the future.
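As an illustration of the proportional hazards model mentioned above, here is a minimal fit using the lifelines package on made-up survival data; the package choice and data are ours, not the paper's.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical survival data: time to event, censoring indicator, covariate.
df = pd.DataFrame({
    "time":      [5, 8, 12, 3, 9, 14, 7, 11, 6, 10],
    "event":     [1, 0, 1, 1, 0, 1, 1, 0, 1, 1],
    "treatment": [0, 0, 1, 0, 1, 1, 0, 1, 0, 1],
})
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()   # hazard ratio for the treatment covariate
```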
Abstract:
Simulation-based assessment is a popular and frequently necessary approach to the evaluation of statistical procedures. Sometimes overlooked is the ability to take advantage of underlying mathematical relations, and we focus on this aspect. We show how to exploit large-sample theory when conducting a simulation, using the analysis of genomic data as a motivating example. The approach uses convergence results to provide approximations to smaller-sample results that would otherwise be available only by simulation. We evaluate and compare a variety of ranking-based methods for identifying the most highly associated SNPs in a genome-wide association study, derive integral-equation representations of the pre-posterior distribution of percentiles produced by three ranking methods, and provide examples comparing performance. These results are of interest in their own right and set the framework for a more extensive set of comparisons.
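A toy version of such a ranking comparison can be simulated directly: shift the test statistics of a handful of truly associated SNPs, rank all SNPs by |z|, and record the percentile at which the associated SNPs land. The parameters below are arbitrary, and this stand-in ignores the integral-equation machinery of the paper.

```python
import numpy as np

def rank_percentiles(n_snps=10_000, n_assoc=50, effect=3.0,
                     n_reps=200, seed=0):
    """Simulate |z|-based SNP ranking and record the percentile at which
    the truly associated SNPs land across replicate GWAS-like data sets."""
    rng = np.random.default_rng(seed)
    out = []
    for _ in range(n_reps):
        z = rng.normal(size=n_snps)
        z[:n_assoc] += effect                 # shift truly associated SNPs
        order = np.argsort(-np.abs(z))        # strongest association first
        ranks = np.empty(n_snps, dtype=int)
        ranks[order] = np.arange(1, n_snps + 1)
        out.append(ranks[:n_assoc] / n_snps)  # percentile of each true SNP
    return np.concatenate(out)

# e.g. np.median(rank_percentiles()) is near 0 when the effect is large.
```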
Abstract:
This book will serve as a foundation for a variety of useful applications of graph theory to computer vision, pattern recognition, and related areas. It covers a representative set of novel graph-theoretic methods for complex computer vision and pattern recognition tasks. The first part of the book presents applications of graph theory to low-level processing of digital images, such as a new method for partitioning a given image into a hierarchy of homogeneous areas using graph pyramids and a study of the relationship between graph theory and digital topology. Part II presents graph-theoretic learning algorithms for high-level computer vision and pattern recognition applications, including a survey of graph-based methodologies for pattern recognition and computer vision, a series of computationally efficient algorithms for testing graph isomorphism and related graph-matching tasks, and a new graph distance measure for solving graph-matching problems. Finally, Part III provides detailed descriptions of several applications of graph-based methods to real-world pattern recognition tasks, including a critical review of the main graph-based and structural methods for fingerprint classification, a new method to visualize time series of graphs, and potential applications in computer network monitoring and abnormal event detection.
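For the graph-matching material in Part II, general-purpose routines exist in NetworkX; a small sketch of isomorphism testing and a generic graph distance follows (NetworkX's graph edit distance, not the measure proposed in the book).

```python
import networkx as nx

# Two labelled graphs that are structurally identical but named differently.
g1 = nx.cycle_graph(5)
g2 = nx.relabel_nodes(nx.cycle_graph(5), {i: chr(97 + i) for i in range(5)})

print(nx.is_isomorphic(g1, g2))        # True: same structure, different labels
# Graph edit distance as one generic graph distance measure;
# zero for isomorphic graphs.
print(nx.graph_edit_distance(g1, g2))  # 0.0
```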
Abstract:
Background: The Swiss government decided to freeze new accreditations for physicians in private practice in Switzerland, based on the assumption that demand-induced health care spending may be cut by limiting the supply of care. This legislation initiated an ongoing controversial public debate in Switzerland. The aim of this study is therefore to determine the socio-demographic and health-system-related factors of per-capita consultation rates with primary care physicians in the multicultural population of Switzerland. Methods: The data were derived from the complete claims data of Swiss health insurers for 2004 and comprised 21.4 million consultations provided by 6564 Swiss primary care physicians on a fee-for-service basis. Socio-demographic data were obtained from the Swiss Federal Statistical Office. Utilisation-based health service areas were created and used as the observational units; multivariate and hierarchical models were applied to analyze the data. Results: The models defined 1018 primary care service areas with a median population of 3754 and an average per-capita consultation rate of 2.95 per year. The statistical models yielded significant effects for various geographical, socio-demographic, and cultural factors. The regional density of physicians in independent practice was also significantly associated with annual consultation rates, indicating an increase of 0.10 for each additional primary care physician per 10,000 inhabitants. Considerable differences across Swiss language regions were observed in the supply of ambulatory health resources provided by primary care physicians, specialists, or hospital-based ambulatory care. Conclusion: The study documents large small-area variation in the utilisation and provision of health care resources in Switzerland. The effects of physician density appeared to be strongly related to the Swiss language regions and may be rooted in the different cultural backgrounds of the served populations.
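The hierarchical analysis described in the Methods can be sketched with a linear mixed model: consultation rate regressed on physician density, with language region as the grouping level. The data below are made up, since the claims data are not public.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical area-level data: annual per-capita consultation rate,
# physician density per 10,000 inhabitants, and language region.
df = pd.DataFrame({
    "rate":    [2.7, 3.1, 2.9, 3.4, 2.6, 3.0, 3.3, 2.8, 2.5, 3.2, 2.9, 3.1],
    "density": [5.2, 7.1, 6.0, 8.3, 4.9, 6.5, 7.8, 5.6, 4.5, 7.5, 6.2, 6.8],
    "region":  ["de", "de", "de", "de", "fr", "fr",
                "fr", "fr", "it", "it", "it", "it"],
})
model = smf.mixedlm("rate ~ density", df, groups=df["region"])
result = model.fit()
print(result.summary())   # fixed effect of density, region-level variance
```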