912 results for Appearance-based methods


Relevance: 80.00%

Abstract:

In the past decade, several major food safety crises originated from problems with feed. Consequently, there is an urgent need for early detection of fraudulent adulteration and contamination in the feed chain. Strategies are presented for two specific cases, viz. adulteration of (i) soybean meal with melamine and other types of adulterants/contaminants and (ii) vegetable oils with mineral oil, transformer oil or other oils. These strategies comprise screening at the feed mill or port of entry with non-destructive spectroscopic methods (NIRS and Raman), followed by post-screening and confirmation in the laboratory with MS-based methods. The spectroscopic techniques are suitable for on-site and on-line applications. Currently they are suited to detecting fraudulent adulteration at relatively high levels, but not low-level contamination. The potential use of these strategies for non-targeted analysis is demonstrated.
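
To make the screening-then-confirmation logic concrete, the following minimal sketch flags a sample whose NIR spectrum deviates from a library of authentic spectra and routes it to MS-based confirmation; the simulated spectra, the per-band z-score rule and the threshold are hypothetical illustrations, not the methods of the study.

import numpy as np

rng = np.random.default_rng(4)
wavelengths = 200
library = rng.normal(1.0, 0.02, size=(50, wavelengths))   # 50 authentic NIR spectra
mean, std = library.mean(axis=0), library.std(axis=0)

def screen(spectrum, z_threshold=4.0):
    # Non-targeted check: flag any band deviating strongly from the library.
    z = np.abs((spectrum - mean) / std)
    return bool(np.any(z > z_threshold))    # True -> send to MS confirmation

authentic = rng.normal(1.0, 0.02, size=wavelengths)
adulterated = authentic.copy()
adulterated[120:130] += 0.3                 # hypothetical spurious absorption band
print(screen(authentic), screen(adulterated))   # typically: False True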

Relevance: 80.00%

Abstract:

Background: Esophageal adenocarcinoma (EA) is one of the fastest-rising cancers in Western countries. Barrett's Esophagus (BE) is the premalignant precursor of EA. However, only a subset of BE patients develop EA, which complicates clinical management in the absence of valid predictors. Genetic risk factors for BE and EA are incompletely understood. This study aimed to identify novel genetic risk factors for BE and EA.

Methods: Within an international consortium of groups involved in the genetics of BE/EA, we performed the first meta-analysis of all genome-wide association studies (GWAS) available, involving 6,167 BE patients, 4,112 EA patients, and 17,159 representative controls, all of European ancestry, genotyped on Illumina high-density SNP arrays and collected from four separate studies within North America, Europe, and Australia. Meta-analysis was conducted using the fixed-effects inverse-variance-weighting approach. We used the standard genome-wide significance threshold of 5×10⁻⁸ for this study. We also conducted an association analysis following re-weighting of loci using an approach that investigates annotation enrichment among the genome-wide significant loci. The entire GWAS data set was also analyzed using bioinformatics approaches, including functional annotation databases as well as gene-based and pathway-based methods, in order to identify pathophysiologically relevant cellular pathways.

Findings: We identified eight new risk loci associated with BE and EA, within or near the CFTR (rs17451754, P=4.8×10⁻¹⁰), MSRA (rs17749155, P=5.2×10⁻¹⁰), BLK (rs10108511, P=2.1×10⁻⁹), KHDRBS2 (rs62423175, P=3.0×10⁻⁹), TPPP/CEP72 (rs9918259, P=3.2×10⁻⁹), TMOD1 (rs7852462, P=1.5×10⁻⁸), SATB2 (rs139606545, P=2.0×10⁻⁸), and HTR3C/ABCC5 (rs9823696, P=1.6×10⁻⁸) genes. A further novel risk locus at LPA (rs12207195, posterior probability = 0.925) was identified after re-weighting using significantly enriched annotations. This study thereby doubled the number of known risk loci. The strongest disease pathways identified (P<10⁻⁶) belong to muscle cell differentiation and to mesenchyme development/differentiation, which fit with current pathophysiological BE/EA concepts. To our knowledge, this study identified for the first time an EA-specific association (rs9823696, P=1.6×10⁻⁸) near HTR3C/ABCC5 which is independent of BE development (P=0.45).

Interpretation: The identified disease loci and pathways reveal new insights into the etiology of BE and EA. Furthermore, the EA-specific association at HTR3C/ABCC5 may constitute a novel genetic marker for the prediction of transition from BE to EA. Mutations in CFTR, one of the new risk loci identified in this study, cause cystic fibrosis (CF), the most common recessive disorder in Europeans. Gastroesophageal reflux (GER) belongs to the phenotypic CF spectrum and represents the main risk factor for BE/EA. Thus, the CFTR locus may trigger a common GER-mediated pathophysiology.
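
As an aside, the fixed-effects inverse-variance-weighting approach mentioned above can be summarized in a few lines; the per-study effect sizes below are invented placeholders, not consortium data.

import numpy as np
from scipy.stats import norm

def fixed_effects_meta(betas, ses):
    # Pool per-study effect sizes (e.g., log-odds ratios) with
    # inverse-variance weights.
    betas, ses = np.asarray(betas), np.asarray(ses)
    w = 1.0 / ses ** 2                        # weight each study by its precision
    beta = np.sum(w * betas) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    p = 2 * norm.sf(abs(beta / se))           # two-sided p-value
    return beta, se, p

# Hypothetical estimates for one SNP across four studies.
beta, se, p = fixed_effects_meta([0.12, 0.09, 0.15, 0.11],
                                 [0.03, 0.04, 0.05, 0.03])
print(f"pooled beta={beta:.3f}, SE={se:.3f}, P={p:.2e}")
print("genome-wide significant:", p < 5e-8)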

Relevance: 80.00%

Abstract:

Estimates of HIV prevalence are important for policy in order to establish the health status of a country's population and to evaluate the effectiveness of population-based interventions and campaigns. However, participation rates in testing for surveillance conducted as part of household surveys, on which many of these estimates are based, can be low. HIV-positive individuals may be less likely to participate because they fear disclosure, in which case estimates obtained using conventional approaches to deal with missing data, such as imputation-based methods, will be biased. We develop a Heckman-type simultaneous-equation approach that accounts for non-ignorable selection but, unlike previous implementations, allows for spatial dependence and does not impose a homogeneous selection process on all respondents. In addition, our framework addresses the issue of separation, where, for instance, some factors are severely unbalanced and highly predictive of the response, which would ordinarily prevent model convergence. Estimation is carried out within a penalized-likelihood framework where smoothing is achieved using a parametrization of the smoothing criterion that makes estimation more stable and efficient. We provide software for straightforward implementation of the proposed approach, and apply our methodology to estimating national and sub-national HIV prevalence in Swaziland, Zimbabwe and Zambia.
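
For illustration, a classical two-step Heckman correction (a simplified stand-in for the penalized-likelihood, spatially explicit model described above) can be sketched as follows; the simulated survey data and variable names are hypothetical.

import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 5000
x = rng.normal(size=n)                        # covariate in both equations
z = rng.normal(size=n)                        # instrument: affects consent only
u, v = rng.multivariate_normal([0, 0], [[1.0, 0.5], [0.5, 1.0]], size=n).T
consent = (0.5 + 0.6 * x + 0.8 * z + u > 0)   # selection: who agrees to test
y = 1.0 + 0.6 * x + v                         # outcome (e.g., latent HIV risk score)

# Step 1: probit for consent, then the inverse Mills ratio for consenters.
X1 = sm.add_constant(np.column_stack([x, z]))
probit = sm.Probit(consent.astype(float), X1).fit(disp=0)
xb = probit.fittedvalues                      # linear predictor
imr = norm.pdf(xb) / norm.cdf(xb)

# Step 2: outcome regression on consenters, adding the Mills ratio term.
X2 = sm.add_constant(np.column_stack([x[consent], imr[consent]]))
print(sm.OLS(y[consent], X2).fit().params)    # slope on x recovers ~0.6;
                                              # naive OLS without imr is biased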

Relevance: 80.00%

Abstract:

BACKGROUND AND OBJECTIVE: The main difficulty of PCR-based clonality studies for B-cell lymphoproliferative disorders (B-LPD) is discrimination between monoclonal and polyclonal PCR products, especially when there is a high background of polyclonal B cells in the tumor sample. Currently, PCR-based methods for clonality assessment require additional analysis of the PCR products in order to discriminate between monoclonal and polyclonal samples. Heteroduplex analysis represents an attractive approach since it is easy to perform and avoids the use of radioactive substrates or expensive equipment. DESIGN AND METHODS: We studied the sensitivity and specificity of heteroduplex PCR analysis for detecting monoclonality in samples from 90 B-cell non-Hodgkin's lymphoma (B-NHL) patients and in 28 individuals without neoplastic B-cell disorders (negative controls). Furthermore, in 42 B-NHL samples and the same 28 negative controls, we compared heteroduplex analysis with the classical PCR technique. We also compared ethidium bromide (EtBr) versus silver nitrate (AgNO₃) staining, as well as agarose versus polyacrylamide gel electrophoresis (PAGE). RESULTS: Using two pairs of consensus primers located at VH (FR3 and FR2) and at JH, 91% of B-NHL samples displayed monoclonal products after heteroduplex PCR analysis using PAGE and AgNO₃ staining. Moreover, no polyclonal sample showed a monoclonal PCR product. By contrast, false-positive results were obtained when using agarose (5/28) and PAGE without heteroduplex analysis: 2/28 and 8/28 with EtBr and AgNO₃ staining, respectively. In addition, false-negative results only appeared with EtBr staining: 13/42 in agarose, 4/42 in PAGE without heteroduplex analysis and 7/42 in PAGE after heteroduplex analysis. INTERPRETATION AND CONCLUSIONS: We conclude that AgNO₃-stained PAGE after heteroduplex analysis is the most suitable strategy for detecting monoclonal rearrangements in B-NHL samples because it does not produce false-positive results and the risk of false-negative results is very low.

Relevance: 80.00%

Abstract:

Enabling natural human-robot interaction using computer-vision-based applications requires fast and accurate hand detection. However, previous works in this field assume different constraints, such as a limitation in the number of detected gestures, because hands are highly complex objects that are difficult to locate. This paper presents an approach that integrates temporal coherence cues and hand detection based on wrists using a cascade classifier. With this approach, we introduce three main contributions: (1) a transparent initialization mechanism without user participation for segmenting hands independently of their gesture, (2) a larger number of detected gestures as well as a faster training phase than previous cascade-classifier-based methods, and (3) near real-time performance for hand pose detection in video streams.
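
A minimal sketch of cascade-classifier detection with a temporal coherence cue might look as follows; OpenCV's CascadeClassifier is a real API, but the trained model file wrist_cascade.xml and the simple overlap rule are assumptions, not the paper's implementation.

import cv2

cascade = cv2.CascadeClassifier("wrist_cascade.xml")   # hypothetical model file
if cascade.empty():
    raise SystemExit("trained cascade not found")

cap = cv2.VideoCapture(0)
prev_boxes = []                                        # detections from the last frame

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    boxes = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    # Temporal coherence cue: prefer detections overlapping one from the
    # previous frame, which suppresses single-frame false positives.
    stable = [b for b in boxes
              if any(abs(b[0] - p[0]) < b[2] and abs(b[1] - p[1]) < b[3]
                     for p in prev_boxes)] or list(boxes)
    prev_boxes = list(boxes)
    for (bx, by, bw, bh) in stable:
        cv2.rectangle(frame, (bx, by), (bx + bw, by + bh), (0, 255, 0), 2)
    cv2.imshow("hands", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()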

Relevance: 80.00%

Abstract:

BACKGROUND: Post-abortion contraceptive use in India is low and the use of modern methods of contraception is rare, especially in rural areas. This study primarily compares contraceptive use among women whose abortion outcome was assessed in-clinic with women who assessed their abortion outcome at home, in a low-resource, primary health care setting. Moreover, it investigates how background characteristics and abortion service provision influence contraceptive use post-abortion. METHODS: A randomized controlled, non-inferiority trial (RCT) compared clinic follow-up with home assessment of abortion outcome at 2 weeks post-abortion. Additionally, contraceptive use at 3 months post-abortion was investigated through a cross-sectional follow-up interview with a largely urban sub-sample of women from the RCT. Women seeking abortion with a gestational age of up to 9 weeks and who agreed to a 2-week follow-up were included (n = 731). Women with known contraindications to medical abortion, Hb < 85 g/l or aged below 18 were excluded. Data were collected between April 2013 and August 2014 in six primary health-care clinics in Rajasthan. A computerised random number generator created the randomisation sequence (1:1) in blocks of six. Contraceptive use was measured at 2 weeks among women successfully followed up (n = 623) and at 3 months in the sub-set of women who were included if they were recruited at one of the urban study sites, owned a phone and agreed to a 3-month follow-up (n = 114). RESULTS: There were no differences in contraceptive use or continuation between study groups at 3 months (76% clinic follow-up, 77% home assessment); however, women in the clinic follow-up group were more likely to adopt a contraceptive method at 2 weeks (62 ± 12%), while women in the home-assessment group were more likely to adopt a method after the next menstruation (60 ± 13%). Fifty-two per cent of women who initiated a method at 2 weeks chose the 3-month injection or the copper intrauterine device. Only 4% of women preferred sterilization. Caste, educational attainment and type of residence did not influence contraceptive use. CONCLUSIONS: Simplified follow-up after early medical abortion will not change women's opportunities to access contraception in a low-resource setting if contraceptive services are provided as intra-abortion services as early as day one. Women's post-abortion contraceptive use at 3 months is unlikely to be affected by the mode of follow-up after medical abortion, even in a low-resource setting. Clinical guidelines need to encourage intra-abortion contraception, offering the full spectrum of evidence-based methods, especially long-acting reversible methods. TRIAL REGISTRATION: Clinicaltrials.gov NCT01827995.

Relevance: 80.00%

Abstract:

In recent years, vibration-based structural damage identification has been the subject of significant research in structural engineering. The basic idea of vibration-based methods is that damage induces changes in mechanical properties that cause anomalies in the dynamic response of the structure; measuring these anomalies allows damage and its extent to be localized. Measured vibration data, such as frequencies and mode shapes, can be used in Finite Element Model Updating in order to adjust structural parameters sensitive to damage (e.g. Young's modulus). The novel aspect of this thesis is the introduction into the objective function of accurate measures of strain mode shapes, evaluated through FBG sensors. After a review of the relevant literature, the case study, i.e. an irregular prestressed concrete beam intended for the roofing of industrial structures, is presented. The mathematical model was built through FE models, studying the static and dynamic behaviour of the element. Another analytical model, based on the Ritz method, was developed in order to investigate the possible interaction between the RC beam and the steel supporting table used for testing. Experimental data, recorded through the simultaneous use of different measurement techniques (optical fibers, accelerometers, LVDTs), were compared with theoretical data, allowing the best model to be identified, for which the settings of the updating procedure have been outlined.
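
The model updating idea can be illustrated with a toy loop that adjusts Young's modulus until the model's natural frequencies match measured ones; the two-DOF system, the stiffness scaling and the measured values below are hypothetical, not the thesis's beam model.

import numpy as np
from scipy.linalg import eigh
from scipy.optimize import minimize_scalar

M = np.diag([100.0, 100.0])                  # lumped masses (kg), assumed

def frequencies(E):
    k = E * 1e-4                             # toy stiffness proportional to E
    K = np.array([[2 * k, -k], [-k, k]])     # 2-DOF shear-frame stiffness
    lam = eigh(K, M, eigvals_only=True)      # generalized eigenvalues (omega^2)
    return np.sqrt(lam) / (2 * np.pi)        # natural frequencies (Hz)

f_measured = np.array([2.1, 5.6])            # hypothetical measured frequencies (Hz)

def objective(E):
    return np.sum((frequencies(E) - f_measured) ** 2)

res = minimize_scalar(objective, bounds=(1e8, 1e11), method="bounded")
print(f"updated E = {res.x:.3e} Pa")         # parameter adjusted to match the data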

Relevance: 80.00%

Abstract:

Objectives: Mycological contamination of occupational environments can result from the dispersion of fungal spores in the air and on surfaces. Therefore, it is very important to assess it in both types of samples. In the present study we assessed fungal contamination in air and surface samples to show the relevance of surface sampling in complementing the results obtained from air samples. Material and Methods: In total, 42 settings were assessed by the analysis of air and surface samples. The settings were divided into settings with a high fungal load (7 poultry farms and 7 pig farms, 3 cork industries, 3 waste management plants, 2 wastewater treatment plants and 1 horse stable) and a low fungal load (10 hospital canteens, 8 college canteens and 1 maternity hospital). In addition to culture-based methods, molecular tools were also applied to detect fungal burden in the settings with a higher fungal load. Results: Of the 218 sampling sites, 140 (64.2%) presented different species on the examined surfaces compared with the species identified in the air. A positive association was found in the high fungal load settings between the presence of different species in the air and on surfaces. Wastewater treatment plants constituted the setting with the highest number of species differing between air and surfaces. Conclusions: We observed that surface sampling and the application of molecular tools showed the same efficacy of species detection in high fungal load settings, corroborating the fact that surface sampling is crucial for a correct and complete analysis of occupational scenarios.

Relevance: 80.00%

Abstract:

Perturbation of natural ecosystems, namely by increasing freshwater use and its degradative use, as well as by topsoil erosion by water in land-use production systems, has been emerging as a topic of high environmental concern. Freshwater use has become a focus of attention in the last few years for all stakeholders involved in the production of goods, mainly agro-industrial and forest-based products, which are freshwater-intensive and require large inputs of green and blue water. This thesis presents a global review of the available Water Footprint Assessment and Life Cycle Assessment (LCA)-based methods for measuring and assessing the environmental relevance of freshwater resource use from a life cycle perspective. Using some of the available midpoint LCA-based methods, the freshwater use-related impacts of a Portuguese wine (white 'vinho verde') were assessed. However, the relevance of environmental green water has been neglected because of the absence of a comprehensive impact assessment method associated with green water flows. To overcome this constraint, this thesis helps to improve and enhance LCA-based methods by providing a midpoint, spatially explicit Life Cycle Impact Assessment (LCIA) method for assessing impacts on terrestrial green water flow and for addressing reductions in surface blue water production caused by reductions in surface runoff due to land-use production systems. The applicability of the proposed method is illustrated by a case study on Eucalyptus globulus conducted in Portugal, as the growth of short-rotation forestry is largely dependent on local precipitation. Topsoil erosion by water has been characterised as one of the most serious problems for rivers. Because of this, this thesis also focuses on the ecosystem impacts caused by suspended solids (SS) from topsoil erosion that reach freshwater systems. A framework to model spatially distributed SS delivery to freshwater streams, and a fate-and-effect LCIA method to derive site-specific characterisation factors (CFs) for endpoint damage to aquatic ecosystem diversity, namely to algae, macrophytes and macroinvertebrates, were developed. The applicability of this framework, combined with the derived site-specific CFs, is shown through a case study on E. globulus stands located in Portugal as an example of a land-use-based system. A spatially explicit LCA was shown to be necessary, since the impacts associated with both green water flows and SS vary greatly as a function of spatial location.

Relevance: 80.00%

Abstract:

Relationships between organisms within an ecosystem are one of the main focuses in the study of ecology and evolution. For instance, host-parasite interactions have long been of close interest to ecology, evolutionary biology and conservation science, due to the great variety of strategies and interaction outcomes. Monogenean ecto-parasites constitute a significant portion of flatworms. Gyrodactylus salaris is a monogenean freshwater ecto-parasite of Atlantic salmon (Salmo salar) whose damage can make fish prone to further bacterial and fungal infections. G. salaris is the only such parasite whose genome has been studied so far. The RNA-seq data analyzed in this thesis had already been annotated using LAST. The RNA-seq data were obtained from Illumina sequencing; the resulting reads were assembled into 15,777 transcripts. LAST resulted in annotation of 46% of the transcripts, and the remainder were left unknown. This thesis work started with the whole data set, and the annotation process was continued by the use of PANNZER, CDD and InterProScan. This annotation resulted in 56% of sequences successfully annotated, with parasite-specific proteins identified. This thesis represents the first monogenean transcriptomic resource, which provides an important source for further research on this species. Additionally, the comparison of annotation methods interestingly revealed that description- and domain-based methods perform better than simple similarity-search methods. It is therefore advisable to use these tools and databases for functional annotation. These results also emphasize the need for using multiple methods and databases, and highlight the need for more genomic information related to G. salaris.

Relevance: 80.00%

Abstract:

This work presents the analysis of wave and turbulence measurements collected at a tidal energy site. A new method is introduced to produce more consistent and rigorous estimates of the power spectral densities of the velocity fluctuations. An analytical function is further proposed to fit the observed spectra, which could serve as input to numerical models predicting power production and structural loading on tidal turbines. Another new approach is developed to correct for the effect of Doppler noise on the high-frequency power spectral densities. The analysis of velocity time series combining wave and turbulent contributions demonstrates that the turbulent motions are coherent throughout the water column, rendering wave coherence-based methods inapplicable to our dataset. To avoid this problem, an alternative approach relying on the pressure data collected by the ADCP is introduced, showing appreciable improvement in the wave-turbulence separation.
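
As a rough illustration, Welch's method plus subtraction of a flat noise floor estimated at the highest frequencies gives the flavour of a Doppler-noise correction; treating the ADCP noise as white and estimating it from the top of the spectrum is a simplifying assumption here, not the method introduced in the work.

import numpy as np
from scipy.signal import welch

fs = 2.0                                      # sampling frequency (Hz), assumed
rng = np.random.default_rng(1)
u = np.cumsum(rng.normal(scale=0.01, size=4096))   # toy "turbulent" velocity record
u = u + rng.normal(scale=0.05, size=u.size)        # additive Doppler noise

f, pxx = welch(u, fs=fs, nperseg=512)

# Estimate a flat noise floor from the highest frequencies, where the true
# spectrum has rolled off below the noise level, then subtract it.
noise_floor = np.median(pxx[f > 0.8 * f.max()])
pxx_corrected = np.clip(pxx - noise_floor, 0.0, None)
print(f"estimated noise floor: {noise_floor:.3e} (m/s)^2/Hz")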

Relevance: 80.00%

Abstract:

Seafood fraud, the misrepresentation of seafood products, has been discovered all around the world in different forms, such as false labeling, species substitution, short-weighting or over-glazing, in order to hide the correct identity, origin or weight of the products. Given the value of seafood products such as canned tuna, swordfish or grouper, the commercial fraud involving these species is mainly the replacement of valuable species with others of little or no value. A similar situation occurs with shelled shrimp or shellfish that are reduced to pieces for commercialization. Food fraud by species substitution is an emerging risk given the increasingly global food supply chain and the potential food safety issues. Economic food fraud is committed when food is deliberately placed on the market for financial gain, deceiving consumers (Woolfe, M. & Primrose, S. 2004). As a result of the increased demand and the globalization of the seafood supply, more fish species are encountered in the market. In this scenario, it becomes essential to unequivocally identify the species. Traditional taxonomy, based primarily on species identification keys, has shown a number of limitations in the use of distinctive features in many animal taxa, amplified when fish, crustaceans or shellfish are commercially processed. Many fish species show a similar texture, thus the certification of fish products is particularly important when fish have undergone procedures which affect the overall anatomical structure, such as heading, slicing or filleting (Marko et al., 2004). The absence of morphological traits, the main characteristic usually used to identify animal species, represents a challenge, and molecular identification methods are required. Among them, DNA-based methods are the most frequently employed for food authentication (Lockley & Bardsley, 2000). In addition to food authentication and traceability, studies of taxonomy, population and conservation genetics, as well as analyses of dietary habits and prey selection, also rely on genetic analyses including DNA barcoding technology (Arroyave & Stiassny, 2014; Galimberti et al., 2013; Mafra, Ferreira, & Oliveira, 2008; Nicolé et al., 2012; Rasmussen & Morrissey, 2008), consisting of PCR amplification and sequencing of a specific region of the mitochondrial COI gene. The system proposed by P. Hebert et al. (2003) locates within the mitochondrial COI gene (cytochrome oxidase subunit I) a bioidentification system useful in the taxonomic identification of species (Lo Brutto et al., 2007). The COI region used for genetic identification, the DNA barcode, is short enough to allow, with current technology, the sequence (the pairs of nucleotide bases) to be decoded in a single step. Although this region represents only a tiny fraction of the mitochondrial DNA content of each cell, it has sufficient variability to distinguish the majority of species (Biondo et al. 2016). This technique has already been employed to address the demand for assessing the actual identity and/or provenance of marketed products, as well as to unmask mislabelling and fraudulent substitutions, which are difficult to detect especially in processed seafood (Barbuto et al., 2010; Galimberti et al., 2013; Filonzi, Chiesa, Vaghi, & Nonnis Marzano, 2010).
Nowadays, research concerns the use of genetic markers to identify not only the species and/or varieties of fish, but also molecular characters able to trace the origin and to provide an effective control tool for producers and consumers along the supply chain, in agreement with local regulations.
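
A toy version of barcode-based identification, matching a query COI fragment against reference barcodes by percent identity, is sketched below; real workflows use alignment tools such as BLAST and curated databases such as BOLD, and the sequences and threshold here are invented placeholders.

REFERENCES = {
    "Thunnus albacares": "ATGGCACTTTACCTAGTATT",
    "Xiphias gladius":   "ATGGCTCTCTATTTAGTATT",
}

def percent_identity(a: str, b: str) -> float:
    # Ungapped identity over the shared length (toy metric, no alignment).
    n = min(len(a), len(b))
    matches = sum(x == y for x, y in zip(a[:n], b[:n]))
    return 100.0 * matches / n

def identify(query: str, min_identity: float = 97.0):
    # Return the best-matching reference species if it clears the threshold.
    species, ref = max(REFERENCES.items(),
                       key=lambda kv: percent_identity(query, kv[1]))
    ident = percent_identity(query, ref)
    return (species, ident) if ident >= min_identity else (None, ident)

print(identify("ATGGCACTTTACCTAGTATT"))   # -> ('Thunnus albacares', 100.0)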

Relevance: 80.00%

Abstract:

The Graphical User Interface (GUI) is an integral component of contemporary computer software. A stable and reliable GUI is necessary for the correct functioning of software applications. Comprehensive verification of the GUI is a routine part of most software development life-cycles. The input space of a GUI is typically large, making exhaustive verification difficult. GUI defects are often revealed by exercising parts of the GUI that interact with each other. It is challenging for a verification method to drive the GUI into states that might contain defects. In recent years, model-based methods that target specific GUI interactions have been developed. These methods create a formal model of the GUI's input space from the specification of the GUI, visible GUI behaviors and static analysis of the GUI's program-code. GUIs are typically dynamic in nature; their user-visible state is guided by the underlying program-code and dynamic program-state. This research extends existing model-based GUI testing techniques by modelling interactions between the visible GUI of a GUI-based software application and its underlying program-code. The new model is able to test the GUI, efficiently and effectively, in ways that were not possible using existing methods. The thesis is this: long, useful GUI test cases can be created by examining the interactions between the GUI of a GUI-based application and its program-code. To explore this thesis, a model-based GUI testing approach is formulated and evaluated. In this approach, program-code-level interactions between GUI event handlers are examined, modelled and deployed for constructing long GUI test cases. These test cases are able to drive the GUI into states that were not possible using existing models. Implementation and evaluation have been conducted using GUITAR, a fully-automated, open-source GUI testing framework.
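
The test-case generation idea can be sketched as a traversal of an event-interaction graph, where an edge (a, b) records that handler a interacts with handler b at the code level; the graph below is a hypothetical example, not GUITAR's actual model format.

from itertools import islice

EVENT_GRAPH = {
    "open_dialog": ["type_name", "cancel"],
    "type_name":   ["click_ok"],
    "click_ok":    ["open_dialog"],
    "cancel":      [],
}

def test_cases(graph, start, max_len):
    # Enumerate event sequences up to max_len by depth-first traversal.
    stack = [[start]]
    while stack:
        path = stack.pop()
        yield path
        if len(path) < max_len:
            for nxt in graph.get(path[-1], []):
                stack.append(path + [nxt])

# Generate a handful of longer test cases from the model.
for case in islice(test_cases(EVENT_GRAPH, "open_dialog", 5), 8):
    print(" -> ".join(case))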

Relevance: 80.00%

Abstract:

Understanding how aquatic species grow is fundamental in fisheries because stock assessment often relies on growth-dependent statistical models. Length-frequency-based methods become important when more suitable data for growth model estimation are either not available or very expensive. In this article, we develop a new framework for growth estimation from length-frequency data using a generalized von Bertalanffy growth model (VBGM) framework that allows time-dependent covariates to be incorporated. A finite mixture of normal distributions is used to model the length-frequency cohorts of each month, with the means constrained to follow a VBGM. The variances of the finite mixture components are constrained to be a function of mean length, reducing the number of parameters and allowing for an estimate of the variance at any length. To optimize the likelihood, we use a minorization–maximization (MM) algorithm with a Nelder–Mead sub-step. This work was motivated by the decline in catches of the blue swimmer crab (BSC) (Portunus armatus) off the east coast of Queensland, Australia. We test the method with a simulation study and then apply it to the BSC fishery data.
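
A simplified sketch of the constrained-mixture idea follows: cohort lengths are normal with means tied to a von Bertalanffy curve L(t) = Linf(1 - exp(-K(t - t0))) and standard deviations proportional to the mean length; the two-cohort setup, ages and data are hypothetical, and plain Nelder-Mead stands in for the MM algorithm with a Nelder-Mead sub-step.

import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(2)
ages = np.array([1.0, 2.0])                  # assumed cohort ages (years)
lengths = np.concatenate([rng.normal(60, 5, 300), rng.normal(95, 7, 200)])

def vbgm(t, linf, k, t0):
    return linf * (1.0 - np.exp(-k * (t - t0)))

def neg_log_lik(params):
    linf, k, t0, cv, w = params
    mu = vbgm(ages, linf, k, t0)
    if not (0 < w < 1 and 0 < cv < 1 and k > 0 and np.all(mu > 0)):
        return np.inf                        # keep the optimizer in a valid region
    sd = cv * mu                             # variance tied to mean length
    mix = w * norm.pdf(lengths, mu[0], sd[0]) + \
          (1 - w) * norm.pdf(lengths, mu[1], sd[1])
    return -np.sum(np.log(mix + 1e-300))

res = minimize(neg_log_lik, x0=[120.0, 0.5, 0.0, 0.1, 0.6],
               method="Nelder-Mead", options={"maxiter": 5000})
print(res.x)                                 # [Linf, K, t0, CV, mixing weight]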

Relevance: 80.00%

Abstract:

Many exchange rate papers articulate the view that instabilities constitute a major impediment to exchange rate predictability. In this thesis we implement Bayesian and other techniques to account for such instabilities, and examine some of the main obstacles to exchange rate models' predictive ability. We first consider in Chapter 2 a time-varying parameter model in which fluctuations in exchange rates are related to short-term nominal interest rates ensuing from monetary policy rules, such as Taylor rules. Unlike existing exchange rate studies, the parameters of our Taylor rules are allowed to change over time, in light of the widespread evidence of shifts in fundamentals - for example in the aftermath of the Global Financial Crisis. Focusing on quarterly data frequency from the crisis, we detect forecast improvements upon a random walk (RW) benchmark for at least half, and for as many as seven out of 10, of the currencies considered. Results are stronger when we allow the time-varying parameters of the Taylor rules to differ between countries. In Chapter 3 we look closely at the role of time-variation in parameters and other sources of uncertainty in hindering exchange rate models' predictive power. We apply a Bayesian setup that incorporates the notion that the relevant set of exchange rate determinants and their corresponding coefficients change over time. Using statistical and economic measures of performance, we first find that predictive models which allow for sudden, rather than smooth, changes in the coefficients yield significant forecast improvements and economic gains at horizons beyond 1 month. At shorter horizons, however, our methods fail to forecast better than the RW. We identify uncertainty in the estimation of the coefficients, and uncertainty about the precise degree of coefficient variability to incorporate in the models, as the main factors obstructing predictive ability. Chapter 4 focuses on the problem of the time-varying predictive ability of economic fundamentals for exchange rates. It uses bootstrap-based methods to uncover the time-specific conditioning information for predicting fluctuations in exchange rates. Employing several metrics for the statistical and economic evaluation of forecasting performance, we find that our approach, based on pre-selecting and validating fundamentals across bootstrap replications, generates more accurate forecasts than the RW. The approach, known as bumping, robustly reveals parsimonious models with out-of-sample predictive power at the 1-month horizon, and outperforms alternative methods, including Bayesian, bagging, and standard forecast combinations. Chapter 5 exploits the predictive content of daily commodity prices for monthly commodity-currency exchange rates. It builds on the idea that the effect of daily commodity price fluctuations on commodity currencies is short-lived, and therefore harder to pin down at low frequencies. Using MIxed DAta Sampling (MIDAS) models, and Bayesian estimation methods to account for time-variation in predictive ability, the chapter demonstrates the usefulness of suitably exploiting such short-lived effects to improve exchange rate forecasts. It further shows that the usual low-frequency predictors, such as money supplies and interest rate differentials, typically receive little support from the data at the monthly frequency, whereas MIDAS models featuring daily commodity prices are strongly supported. The chapter also introduces the random walk Metropolis-Hastings technique as a new tool to estimate MIDAS regressions.
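
The MIDAS ingredient can be illustrated by aggregating a month of daily commodity-price returns into a single monthly regressor using exponential Almon lag weights; the parameter values and data below are hypothetical illustrations.

import numpy as np

def exp_almon_weights(n_lags, theta1, theta2):
    # Normalized exponential Almon weights over lags j = 0..n_lags-1,
    # with j = 0 the most recent daily observation.
    j = np.arange(n_lags)
    w = np.exp(theta1 * j + theta2 * j ** 2)
    return w / w.sum()

def midas_regressor(daily_returns, theta1=-0.05, theta2=-0.01):
    # Weighted aggregate of one month of daily returns (most recent first).
    w = exp_almon_weights(len(daily_returns), theta1, theta2)
    return float(np.dot(w, daily_returns))

rng = np.random.default_rng(3)
daily = rng.normal(0.0, 0.01, size=22)        # ~22 trading days in a month
x_t = midas_regressor(daily)
# x_t then enters the monthly exchange-rate regression, e.g.
#   s_{t+1} - s_t = alpha + beta * x_t + error
print(x_t)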