Abstract:
At the ecosystem level, sustainable exploitation of fisheries resources depends not only on the status of target species but also on that of bycatch species, some of which are even more sensitive to exploitation. This is the case for a number of elasmobranch (skate, ray and shark) species whose abundance declined during the 20th century. Furthermore, the biology of elasmobranchs is still poorly known, and traditional fisheries stock assessment methods, which estimate abundance from fisheries catches and scientific survey data, are expensive or even inapplicable due to the small numbers observed. The GenoPopTaille project attempts to apply recent genetic-based methods for absolute population abundance estimation to the case of the thornback ray (Raja clavata), as well as to characterize its genetic diversity and population structure in the Northeast Atlantic. The poster will present the objectives, challenges and progress made so far by the project.
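The abstract does not spell out the estimator, but genetic methods for absolute abundance typically follow a close-kin mark-recapture logic. A minimal sketch with hypothetical sample sizes (not the project's data), assuming random sampling of adults and offspring:

```python
def ckmr_abundance(n_offspring, n_adults, n_pops):
    """Naive close-kin mark-recapture estimate of adult abundance N.

    Every offspring has exactly two parents, so among the
    n_offspring * n_adults cross-comparisons we expect about
    n_offspring * n_adults * 2 / N parent-offspring pairs (POPs);
    solving for N gives the estimator below.
    """
    if n_pops <= 0:
        raise ValueError("need at least one parent-offspring pair")
    return 2 * n_offspring * n_adults / n_pops

# hypothetical: 500 juveniles and 400 adults genotyped, 8 POPs found
N_hat = ckmr_abundance(500, 400, 8)  # -> 50000.0
```

Real close-kin models additionally correct for mortality, ageing and sampling design; this sketch only shows the core accounting.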
Abstract:
Leishmania donovani is the known causative agent of both cutaneous (CL) and visceral leishmaniasis in Sri Lanka. CL is considered to be under-reported, partly due to the relatively poor sensitivity and specificity of microscopic diagnosis. We compared the robustness of three previously described polymerase chain reaction (PCR) based methods to detect Leishmania DNA in 38 punch biopsy samples from patients presenting with suspected lesions in 2010. Both the Leishmania genus-specific JW11/JW12 kDNA and the LITSR/L5.8S internal transcribed spacer (ITS)1 PCR assays detected 92% (35/38) of the samples, whereas a kDNA assay specific for L. donovani (LdF/LdR) detected only 71% (27/38) of samples. All positive samples showed a L. donovani banding pattern upon HaeIII ITS1 PCR-restriction fragment length polymorphism analysis. PCR assay specificity was evaluated in samples containing Mycobacterium tuberculosis, Mycobacterium leprae, and human DNA, and there was no cross-amplification in the JW11/JW12 and LITSR/L5.8S PCR assays. The LdF/LdR PCR assay did not amplify M. leprae or human DNA, although 500 bp and 700 bp bands were observed in M. tuberculosis samples. In conclusion, this study successfully showed that Sri Lankan CL can be diagnosed with high accuracy, to the level of genus and species identification, using Leishmania DNA PCR assays.
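The detection percentages quoted above follow directly from the stated counts; a quick check:

```python
def detection_rate(positives, total):
    """Fraction of samples in which the assay detected Leishmania DNA."""
    return positives / total

assays = {
    "JW11/JW12 kDNA":   (35, 38),
    "LITSR/L5.8S ITS1": (35, 38),
    "LdF/LdR kDNA":     (27, 38),
}
for name, (pos, n) in assays.items():
    # prints 92%, 92% and 71%, matching the abstract
    print(f"{name}: {pos}/{n} = {detection_rate(pos, n):.0%}")
```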
Abstract:
This dissertation verifies whether the following two hypotheses are true: (1) high-occupancy/toll (HOT) lanes (and therefore other dedicated lanes) have capacity that could still be used; (2) such unused capacity (or more precisely, “unused managed capacity”) can be sold successfully through a real-time auction. To show that the second statement is true, this dissertation proposes an auction-based metering (ABM) system, that is, a mechanism that regulates the traffic entering the dedicated lanes. Participation in the auction is voluntary and can be skipped by paying the toll or by not registering with the new system. This dissertation comprises the following four components: a measurement of unused managed capacity on an existing HOT facility, a game-theoretic model of an ABM system, an operational description of the ABM system, and a simulation-based evaluation of the system. Other, more specific contributions of this dissertation include the following: (1) it provides a definition and a methodology for measuring unused managed capacity and another important variable referred to as “potential volume increase”; (2) it proves that the game-theoretic model has a unique Bayesian Nash equilibrium; and (3) it provides a specific road design that can be applied or extended to other facilities. The results provide evidence that the hypotheses are true and suggest that the ABM system would benefit a public operator interested in significantly reducing traffic congestion, would benefit drivers making low-reliability trips (such as work-to-home trips), and would potentially benefit a private operator interested in raising revenue.
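The dissertation's actual ABM mechanism is richer than any toy model, but the core idea of selling entry slots on the managed lane through a real-time auction can be sketched with a simple uniform-price rule (the function name, bid values and pricing rule here are illustrative assumptions, not the dissertation's design):

```python
def run_metering_auction(bids, slots):
    """Toy uniform-price auction for entry slots on a managed lane.

    The `slots` highest bidders win entry; each pays the highest
    losing bid (a (k+1)-th price rule). Drivers who skip the auction
    pay the posted toll or stay out, as in the ABM description.
    """
    ranked = sorted(enumerate(bids), key=lambda pair: pair[1], reverse=True)
    winners = [idx for idx, _ in ranked[:slots]]
    clearing_price = ranked[slots][1] if len(ranked) > slots else 0.0
    return winners, clearing_price

# four drivers bid for two entry slots
winners, price = run_metering_auction([4.0, 7.5, 2.0, 6.0], slots=2)
# bidders 1 and 3 (bids 7.5 and 6.0) win; both pay the best losing bid, 4.0
```

A (k+1)-th price rule is chosen here because it keeps truthful bidding attractive in the one-shot case; the dissertation's equilibrium analysis concerns its own, more detailed mechanism.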
In Situ Characterization of Optical Absorption by Carbonaceous Aerosols: Calibration and Measurement
Abstract:
Light absorption by aerosols has a great impact on climate change. A photoacoustic spectrometer (PA) coupled with aerosol-based classification techniques represents an in situ method that can quantify light absorption by aerosols in real time, yet significant differences have been reported between this method and filter-based methods, or the so-called difference method based upon light extinction and light scattering measurements. This dissertation focuses on developing calibration techniques for instruments used in measuring the light absorption cross section, including both particle diameter measurements by the differential mobility analyzer (DMA) and light absorption measurements by the PA. Appropriate reference materials were explored for the calibration/validation of both measurements. The light absorption of carbonaceous aerosols was also investigated to provide a fundamental understanding of the absorption mechanism. The first topic of interest in this dissertation is the development of calibration nanoparticles. In this study, bionanoparticles were confirmed to be a promising reference material for particle diameter as well as ion mobility. Experimentally, bionanoparticles demonstrated outstanding homogeneity in mobility compared to currently used calibration particles. A numerical method was developed to calculate the true distribution and to explain the broadening of the measured distribution. The high stability of bionanoparticles was also confirmed. For PA measurements, three aerosols with spherical or near-spherical shapes were investigated as possible candidates for a reference standard: C60, copper and silver. Comparisons were made between experimental photoacoustic absorption data and Mie theory calculations. This resulted in the identification of C60 particles with a mobility diameter of 150 nm to 400 nm as an absorbing standard at wavelengths of 405 nm and 660 nm.
Copper particles with a mobility diameter of 80 nm to 300 nm are also shown to be a promising reference candidate at a wavelength of 405 nm. The second topic of this dissertation focuses on the investigation of light absorption by carbonaceous particles using the PA. Optical absorption spectra of size- and mass-selected laboratory-generated aerosols consisting of black carbon (BC), BC with a non-absorbing coating (ammonium sulfate and sodium chloride) and BC with a weakly absorbing coating (brown carbon derived from humic acid) were measured across the visible to near-IR (500 nm to 840 nm). The manner in which BC mixed with each coating material was investigated. The absorption enhancement of BC was determined to be wavelength dependent. Optical absorption spectra were also taken for size- and mass-selected smoldering smoke produced from six common wood types in a laboratory-scale apparatus.
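The wavelength-dependent absorption enhancement mentioned above is simply the ratio of coated-to-bare absorption at each wavelength; with made-up illustrative numbers (not the dissertation's data):

```python
# hypothetical absorption coefficients (Mm^-1) for bare vs. coated BC;
# illustrative values only, chosen to show a wavelength-dependent ratio
wavelengths = [500, 660, 840]            # nm
babs_bare = [12.0, 8.5, 6.2]
babs_coated = [21.6, 14.0, 9.3]

# E_abs(lambda): absorption enhancement due to the coating
enhancement = [c / b for c, b in zip(babs_coated, babs_bare)]
# decreases from 1.80 at 500 nm to 1.50 at 840 nm in this example,
# i.e. the enhancement is wavelength dependent
```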
Abstract:
Part 1: Introduction
Abstract:
Very high resolution remotely sensed images are an important tool for monitoring fragmented agricultural landscapes, allowing farmers and policy makers to make better decisions regarding management practices. An object-based methodology is proposed for the automatic generation of thematic maps of the available classes in the scene, which combines edge-based and superpixel processing for small agricultural parcels. The methodology employs superpixels instead of pixels as minimal processing units, and provides a link between them and meaningful objects (obtained by the edge-based method) in order to facilitate the analysis of parcels. Performance analysis on a scene dominated by small agricultural parcels indicates that the combination of superpixel and edge-based methods achieves a classification accuracy slightly better than when those methods are applied separately, and comparable to the accuracy of traditional object-based analysis, while remaining fully automatic.
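The link between superpixels and edge-based objects can be as simple as a majority-vote relabelling; a minimal numpy sketch of that idea (the function and the tiny label maps are illustrative, not the paper's implementation):

```python
import numpy as np

def assign_superpixels(superpixels, objects):
    """Relabel each superpixel with the edge-based object label it
    overlaps most (a majority vote over the superpixel's pixels).

    Both arguments are integer label maps of the same shape.
    """
    out = np.empty_like(objects)
    for sp in np.unique(superpixels):
        mask = superpixels == sp
        labels, counts = np.unique(objects[mask], return_counts=True)
        out[mask] = labels[np.argmax(counts)]
    return out

# tiny example: two superpixels vs. two edge-based objects
sp = np.array([[0, 0, 1, 1],
               [0, 0, 1, 1]])
obj = np.array([[5, 5, 5, 7],
                [5, 5, 7, 7]])
linked = assign_superpixels(sp, obj)  # superpixel 1 takes label 7
```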
Abstract:
Master's dissertation—Universidade de Brasília, Instituto de Ciências Exatas, Departamento de Ciência da Computação, 2016.
Abstract:
The Dirichlet process mixture model (DPMM) is a ubiquitous, flexible Bayesian nonparametric statistical model. However, full probabilistic inference in this model is analytically intractable, so computationally intensive techniques such as Gibbs sampling are required. As a result, DPMM-based methods, which have considerable potential, are restricted to applications in which computational resources and time for inference are plentiful. For example, they would not be practical for digital signal processing on embedded hardware, where computational resources are at a serious premium. Here, we develop a simplified yet statistically rigorous approximate maximum a-posteriori (MAP) inference algorithm for DPMMs. This algorithm is as simple as DP-means clustering and solves the MAP problem as well as Gibbs sampling, while requiring only a fraction of the computational effort. (For freely available code that implements the MAP-DP algorithm for Gaussian mixtures, see http://www.maxlittle.net/.) Unlike related small variance asymptotics (SVA), our method is non-degenerate and so inherits the “rich get richer” property of the Dirichlet process. It also retains a non-degenerate closed-form likelihood, which enables out-of-sample calculations and the use of standard tools such as cross-validation. We illustrate the benefits of our algorithm on a range of examples and contrast it to variational, SVA and sampling approaches from both a computational complexity perspective and in terms of clustering performance. We demonstrate the wide applicability of our approach by presenting an approximate MAP inference method for the infinite hidden Markov model, whose performance contrasts favorably with a recently proposed hybrid SVA approach.
Similarly, we show how our algorithm can be applied to a semiparametric mixed-effects regression model where the random-effects distribution is modelled using an infinite mixture model, as used in longitudinal progression modelling in population health science. Finally, we propose directions for future research on approximate MAP inference in Bayesian nonparametrics.
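The abstract positions MAP-DP as "as simple as DP-means clustering"; for reference, here is a compact numpy sketch of DP-means itself (not MAP-DP), in the usual hard-assignment form where a new cluster is opened whenever a point is farther than the penalty from every existing center:

```python
import numpy as np

def dp_means(X, lam, n_iter=25):
    """DP-means: hard-assignment clustering where a new cluster is
    opened whenever a point's squared distance to every existing
    center exceeds the penalty lam."""
    centers = [X[0].copy()]
    assign = np.zeros(len(X), dtype=int)
    for _ in range(n_iter):
        for i, x in enumerate(X):
            d2 = [float(np.sum((x - c) ** 2)) for c in centers]
            k = int(np.argmin(d2))
            if d2[k] > lam:          # too far from everything: new cluster
                centers.append(x.copy())
                k = len(centers) - 1
            assign[i] = k
        # recompute centers, keeping old ones for (rare) empty clusters
        centers = [X[assign == k].mean(axis=0) if np.any(assign == k)
                   else centers[k] for k in range(len(centers))]
    return np.array(centers), assign

# two well-separated blobs: DP-means should open exactly two clusters
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
centers, labels = dp_means(X, lam=1.0)
```

Unlike MAP-DP, this SVA-style scheme is degenerate (it has no likelihood and no "rich get richer" behaviour), which is exactly the contrast the abstract draws.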
Abstract:
This thesis is concerned with change point analysis for time series, i.e. with the detection of structural breaks in time-ordered, random data. This long-standing research field regained popularity over the last few years and is, like statistical analysis in general, still undergoing a transformation towards high-dimensional problems. We focus on the fundamental »change in the mean« problem and provide extensions of the classical non-parametric Darling-Erdős-type cumulative sum (CUSUM) testing and estimation theory within high-dimensional Hilbert space settings. In the first part we contribute to (long run) principal component based testing methods for Hilbert space valued time series under a rather broad (abrupt, epidemic, gradual, multiple) change setting and under dependence. For the dependence structure we consider either traditional m-dependence assumptions or more recently developed m-approximability conditions, which cover, e.g., MA, AR and ARCH models. We derive Gumbel and Brownian bridge type approximations of the distribution of the test statistic under the null hypothesis of no change and consistency conditions under the alternative. A new formulation of the test statistic using projections on subspaces allows us to simplify the standard proof techniques and to weaken common assumptions on the covariance structure. Furthermore, we propose to adjust the principal components by an implicit estimation of a (possible) change direction. This approach adds flexibility to projection based methods, weakens typical technical conditions and provides better consistency properties under the alternative. In the second part we contribute to estimation methods for common changes in the means of panels of Hilbert space valued time series. We analyze weighted CUSUM estimates within a recently proposed »high-dimensional low sample size (HDLSS)« framework, where the sample size is fixed but the number of panels increases.
We derive sharp conditions on »pointwise asymptotic accuracy« or »uniform asymptotic accuracy« of those estimates in terms of the weighting function. Particularly, we prove that a covariance-based correction of Darling-Erdős-type CUSUM estimates is required to guarantee uniform asymptotic accuracy under moderate dependence conditions within panels and that these conditions are fulfilled, e.g., by any MA(1) time series. As a counterexample we show that for AR(1) time series, close to the non-stationary case, the dependence is too strong and uniform asymptotic accuracy cannot be ensured. Finally, we conduct simulations to demonstrate that our results are practically applicable and that our methodological suggestions are advantageous.
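The classical (unweighted) CUSUM statistic that this theory extends can be stated in a few lines; a numpy sketch for a single change in the mean (the Darling-Erdős weighting and the Hilbert space machinery discussed above are omitted):

```python
import numpy as np

def cusum_changepoint(x):
    """Classical CUSUM for a single change in the mean: the change
    point estimate maximises |S_k|, with S_k = sum_{i<=k}(x_i - mean(x))."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = np.cumsum(x - x.mean())
    k = int(np.argmax(np.abs(s[:-1])))     # candidate points 1..n-1
    stat = np.abs(s[k]) / (np.sqrt(n) * x.std())
    return k + 1, stat                     # first index of the new regime

# mean jumps from 0 to 2 after observation 50
x = np.concatenate([np.zeros(50), np.full(50, 2.0)])
cp, stat = cusum_changepoint(x)  # cp == 50
```

Large values of the normalised statistic indicate a change; calibrating the threshold (via Gumbel or Brownian bridge approximations) is exactly where the theory in the first part comes in.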
Abstract:
Research has demonstrated that mining activities can cause serious impacts on the environment, as well as on the surrounding communities, mainly due to the unsafe storage of mine tailings. This research focuses on the sustainability assessment of new technologies for the recovery of metals from mine residues. The assessment consists of the evaluation of the environmental, economic, and social impacts through the Life Cycle based methods: Life Cycle Assessment (LCA), Life Cycle Costing (LCC), and Social Life Cycle Assessment (SLCA). The analyses are performed on the Mondo Minerals bioleaching project, whose aim is to recover nickel and cobalt from the Sotkamo and Vuonos mine tailings. The LCA demonstrates that the project contributes to the avoided production of nickel and cobalt concentrates from new resources, hence reducing several environmental impacts. The LCC analysis shows that the company’s main costs are linked to the bioleaching process, driven by electricity consumption and the chemicals used. The SLCA analyses the impacts on three main stakeholder categories: workers, the local community, and society. The results demonstrated that a fair salary (or the absence of it) impacts the workers the most, while the impacts on the local community stakeholder category are related to access to material resources. Health and safety is the most impacted category for the society stakeholder. The environmental and economic analyses demonstrate that the recovery of mine tailings may represent a good opportunity for mining companies both to reduce the environmental impacts linked to mine tailings and to increase profitability. In particular, the project helps reduce the amounts of metals extracted from new resources and demonstrates that the use of bioleaching technology for the extraction of metals can be economically profitable.
Abstract:
Biology is now a “Big Data Science” thanks to technological advancements allowing the characterization of the whole macromolecular content of a cell or a collection of cells. This opens interesting perspectives, but only a small portion of this data can be experimentally characterized. From this derives the demand for accurate and efficient computational tools for the automatic annotation of biological molecules. This is even more true when dealing with membrane proteins, on which my research project is focused, leading to the development of two machine learning-based methods: BetAware-Deep and SVMyr. BetAware-Deep is a tool for the detection and topology prediction of transmembrane beta-barrel proteins found in Gram-negative bacteria. These proteins are involved in many biological processes and are primary candidates as drug targets. BetAware-Deep exploits the combination of a deep learning framework (bidirectional long short-term memory) and a probabilistic graphical model (grammatical-restrained hidden conditional random field). Moreover, it introduces a modified formulation of the hydrophobic moment, designed to include evolutionary information. BetAware-Deep outperformed all the available methods in topology prediction and reported high scores in the detection task. Glycine myristoylation in Eukaryotes is the attachment of a myristic acid to an N-terminal glycine. SVMyr is a fast method based on support vector machines designed to predict this modification in datasets of proteomic scale. It uses octapeptides as input and exploits computational scores derived from experimental examples together with mean physicochemical features. SVMyr outperformed all the available methods for co-translational myristoylation prediction. In addition, it allows (as a unique feature) the prediction of post-translational myristoylation. Both tools described here are designed with the best practices for the development of machine learning-based tools outlined by the bioinformatics community in mind.
Moreover, they are made available via user-friendly web servers. All this makes them valuable tools for filling the gap between sequence data and annotated data.
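As an illustration of the "mean physicochemical features" computed over octapeptide input, one standard such feature is the mean Kyte-Doolittle hydropathy; the sketch below is illustrative and is not SVMyr's actual feature set (the residue table is truncated to the amino acids used in the example):

```python
# Kyte-Doolittle hydropathy values (a standard scale); truncated here
# to the residues appearing in the example octapeptide below
KD = {"A": 1.8, "G": -0.4, "S": -0.8, "L": 3.8,
      "K": -3.9, "N": -3.5, "E": -3.5}

def mean_hydropathy(octapeptide):
    """One example of a mean physicochemical feature over an octapeptide."""
    if len(octapeptide) != 8:
        raise ValueError("expected an octapeptide")
    return sum(KD[aa] for aa in octapeptide) / 8.0

# myristoylation motifs start with an N-terminal glycine;
# this particular peptide is a made-up example
feature = mean_hydropathy("GNAASLKE")
```

In an SVM-based predictor, several such averaged features (hydropathy, charge, volume, and so on) would form the fixed-length vector fed to the classifier.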
Abstract:
Inverse problems are at the core of many challenging applications. Variational and learning models provide estimated solutions of inverse problems as the outcome of specific reconstruction maps. In the variational approach, the result of the reconstruction map is the solution of a regularized minimization problem encoding information on the acquisition process and prior knowledge on the solution. In the learning approach, the reconstruction map is a parametric function whose parameters are identified by solving a minimization problem depending on a large set of data. In this thesis, we go beyond this apparent dichotomy between variational and learning models and show that they can be harmoniously merged in unified hybrid frameworks preserving their main advantages. We develop several highly efficient methods based on both these model-driven and data-driven strategies, for which we provide a detailed convergence analysis. The resulting algorithms are applied to solve inverse problems involving images and time series. For each task, we show that the proposed schemes improve on the performance of many other existing methods in terms of both computational burden and quality of the solution. In the first part, we focus on gradient-based regularized variational models, which are shown to be effective for segmentation purposes and for thermal and medical image enhancement. We consider gradient sparsity-promoting regularized models for which we develop different strategies to estimate the regularization strength. Furthermore, we introduce a novel gradient-based Plug-and-Play convergent scheme employing a deep learning based denoiser trained on the gradient domain. In the second part, we address the tasks of natural image deblurring, image and video super-resolution microscopy, and positioning time series prediction through deep learning based methods.
We boost the performance of both supervised deep learning strategies, such as trained convolutional and recurrent networks, and unsupervised ones, such as Deep Image Prior, by penalizing the losses with handcrafted regularization terms.
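A penalized loss of the kind described, data fidelity plus a handcrafted gradient regularization term, can be minimized by plain gradient descent; a minimal numpy sketch on a 1-D signal (a smooth stand-in under simplified assumptions, not the thesis's Plug-and-Play scheme):

```python
import numpy as np

def denoise_gradient_penalty(y, mu=1.0, steps=200, lr=0.1):
    """Gradient descent on F(u) = 0.5*||u - y||^2 + 0.5*mu*||Du||^2,
    where D is the forward-difference operator."""
    u = y.astype(float).copy()
    for _ in range(steps):
        du = np.diff(u)
        grad_reg = np.zeros_like(u)
        grad_reg[:-1] -= du   # derivative of the penalty w.r.t. u_i:
        grad_reg[1:] += du    # each difference touches two entries
        u -= lr * ((u - y) + mu * grad_reg)
    return u

# toy "noisy" signal: an alternating sequence the penalty will smooth
y = np.array([0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0])
u = denoise_gradient_penalty(y)
```

Replacing the explicit penalty gradient with a learned denoiser applied in the gradient domain is, in rough terms, the move that Plug-and-Play schemes make.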
Abstract:
Contaminants of emerging concern are increasingly detected in the water cycle, with endocrine-disrupting chemicals (EDCs) receiving attention due to their potential to cause adverse health effects even at low concentrations. Although the EU has recently introduced some EDCs into drinking water legislation, most drinking water treatment plants (DWTPs) are not designed to remove EDCs, making their detection and removal in DWTPs an important challenge. The aim of this doctoral project was to investigate hormones and phenolic compounds as suspected EDCs in drinking waters across the Romagna area (Italy). The main objectives were to assess the occurrence of the considered contaminants in source and drinking water from three DWTPs, characterize the effectiveness of removal by different water treatment processes, and evaluate the potential biological impact on drinking water and human health. Specifically, a complementary approach of target chemical analysis and effect-based methods was adopted to explore drinking water quality, treatment efficacy, and biological potential. This study found that nonylphenol (NP) was prevalent in all samples, followed by BPA. Sporadic contamination by hormones was found only in source waters. Although the measured EDC concentrations in drinking water did not exceed threshold guideline values, the potential role of DWTPs as an additional source of EDC contamination should be considered. Significant increases in BPA and NP levels were observed during water treatment steps, which were also reflected in estrogenic and mutagenic responses in water samples after ultrafiltration. This highlights the need to monitor water quality during the various treatment processes to improve the efficiency of DWTPs. Biological assessments on finished water did not reveal any bioactivity, except for a few treated water samples that exhibited estrogenic responses.
Overall, the data emphasize the high quality of produced drinking water and the value of applying integrated chemical analysis and in vitro bioassays for water quality assessment.
Abstract:
The scope of this thesis is to broaden the knowledge of axially loaded pipe piles, which can serve as foundations for offshore wind turbines based on jacket structures. The goal of the work was pursued by interpreting experimental data on large-scale model piles and by developing numerical tools for the prediction of their monotonic response to tensile and compressive loads up to failure. The availability of experimental results on large-scale model piles produced in two different campaigns at Fraunhofer IWES (Hannover, Germany) represented the reference for the whole work. Data from CPTs, blow counts during installation and load-displacement curves allowed observations on the experimental results and comparisons with empirical methods from the literature, such as CPT-based methods and load transfer methods. The study also addressed soil-structure interaction mechanisms in order to better assess the mechanical response of the sand, with the aim of helping to develop predictive tools for the experiments. A lack of information on the response of Rohsand 3152 in contact with steel was highlighted, so the necessity of better assessing its response was addressed with a comprehensive campaign of interface shear tests. It was found that the response of the sand at ultimate conditions evolves with the roughness of the steel, which is valuable information to take into account when attempting the prediction of a pile's capacity. In parallel, the work developed a numerical modelling procedure that was validated against the available large-scale model piles at IWES. The modelling strategy builds an FE model whose mechanical properties of the sand come from an interpretation of commonly available geotechnical tests. The results of the FE model were compared with other predictive tools currently used in engineering practice.
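A CPT-based capacity method in its simplest form maps averaged cone resistance to shaft and base resistance through empirical coefficients; the sketch below uses invented coefficients purely for illustration (real methods, such as those compared in the thesis, use depth- and friction-fatigue-dependent factors):

```python
import math

def pile_capacity_from_cpt(qc_shaft_avg, qc_base_avg, D, L,
                           alpha_s=0.004, alpha_b=0.2):
    """Very simplified CPT-based axial pile capacity (kN).

    qc values in kPa; diameter D and embedded length L in m.
    alpha_s and alpha_b are fixed fractions of the averaged cone
    resistance, invented here for illustration only.
    """
    shaft_area = math.pi * D * L          # lateral surface of the shaft
    base_area = math.pi * D ** 2 / 4      # pile tip cross-section
    return (alpha_s * qc_shaft_avg * shaft_area
            + alpha_b * qc_base_avg * base_area)

# 0.5 m pile, 10 m embedment, qc averages of 10 MPa (shaft) / 15 MPa (base)
Q = pile_capacity_from_cpt(10_000.0, 15_000.0, D=0.5, L=10.0)
```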
Abstract:
Honey bees are considered a keystone species in ecosystems. The effect of pesticides harmful to honey bees and the action of extreme climatic events, with their consequences for honey bee health, can cause the loss of many colonies, which may contribute to a reduction of the effective population size and incentivize the use of non-autochthonous queens to replace dead colonies. Over the last decades, the use of non-ligustica bee subspecies in Italy has increased and, together with the phenomena mentioned above, has exposed native honey bees to hybridization, leading to dramatic genetic erosion and admixture. Healthy genetic diversity within honey bee populations is critical to provide tolerance and resistance to current and future threats. It is now urgent to design strategies for the conservation of local subspecies and their valorisation on a productive scale. In this Thesis, genomic tools for the analysis of the genetic diversity and the genomic integrity of honey bee populations in Italy are described. mtDNA-based methods are presented that use honey bee DNA or honey eDNA as a source of information on the genetic diversity of A. mellifera at different levels. Taken together, the results derived from these studies should enlarge the knowledge of the genetic diversity and integrity of the honey bee populations in Italy, filling the gap of information necessary to design efficient conservation programmes. Furthermore, the methods presented in these works will provide a tool for honey authentication, to sustain and valorise beekeeping products and the sector against fraud.