22 results for NIRS. Plum. Multivariate calibration. Variables selection
at Universidade do Minho
Abstract:
The aim of this paper is to predict time series of SO2 concentrations emitted by coal-fired power stations, in order to estimate emission episodes in advance and to analyze the influence of some meteorological variables on the prediction. An emission episode is said to occur when the series of bi-hourly means of SO2 exceeds a specific level. For coal-fired power stations it is essential to predict emission episodes sufficiently in advance so that appropriate preventive measures can be taken. We propose a methodology to predict SO2 emission episodes based on an additive model and an algorithm for variable selection. The methodology was applied to the estimation of SO2 emissions registered at sampling locations near a coal-fired power station in Northern Spain. The results indicate a good performance of the model using only two terms of the time series, and that the inclusion of the meteorological variables in the model is not significant.
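The scheme in this abstract can be sketched in a few lines. This is a minimal stand-in, not the authors' estimator: it fits *linear* terms of two lags of the series where the paper uses smooth additive terms, and the synthetic series and the 95th-percentile episode threshold are illustrative assumptions.

```python
import numpy as np

def lagged_matrix(series, lags=(1, 2)):
    """Build X (lagged values) and y (current value) from a 1-D series."""
    max_lag = max(lags)
    X = np.column_stack([series[max_lag - l:-l] for l in lags])
    y = series[max_lag:]
    return X, y

rng = np.random.default_rng(0)
so2 = np.abs(np.cumsum(rng.normal(size=500))) + 5.0   # synthetic bi-hourly SO2 means

X, y = lagged_matrix(so2)                             # two terms of the time series
X1 = np.column_stack([np.ones(len(X)), X])            # intercept + two lag terms
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)         # linear stand-in for the GAM fit

pred = X1 @ coef                                      # one-step-ahead predictions
threshold = np.percentile(so2, 95)                    # hypothetical episode level
episode_flag = pred > threshold                       # predicted emission episodes
```

Flagging the prediction (rather than the observation) against the threshold is what gives the advance warning the abstract describes.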
Abstract:
Customer lifetime value (LTV) enables using client characteristics, such as recency, frequency and monetary (RFM) value, to describe the value of a client over time in terms of profitability. We present the concept of LTV applied to telemarketing for improving the return on investment, using a recent (from 2008 to 2013) and real case study of bank campaigns to sell long-term deposits. The goal was to benefit from past contact history to extract additional knowledge. A total of twelve LTV input variables were tested, under a forward selection method and using a realistic rolling windows scheme, highlighting the validity of five new LTV features. The results achieved by our LTV data-driven approach using neural networks allowed an improvement of up to 4 percentage points in the cumulative Lift curve for targeting deposit subscribers when compared with a baseline model (with no history data). Explanatory knowledge was also extracted from the proposed model, revealing two highly relevant LTV features: the last result of the previous campaign to sell the same product, and the frequency of past client successes. The results are particularly valuable for contact center companies, which can improve predictive performance without even having to ask for more information from the companies they serve.
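The forward-selection-over-rolling-windows scheme can be illustrated as below. The paper scores candidates with a neural network and Lift analysis; here a least-squares fit and negative test MSE stand in for that scoring (an assumption), so only the selection loop and the forward-sliding train/test window mechanics are shown.

```python
import numpy as np

def rolling_windows(n, train, test):
    """Yield (train_idx, test_idx) pairs sliding forward in time."""
    start = 0
    while start + train + test <= n:
        yield (np.arange(start, start + train),
               np.arange(start + train, start + train + test))
        start += test

def forward_select(X, y, windows, k):
    """Greedily add the k features that most improve mean out-of-window score."""
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(k):
        def score(j):
            cand = selected + [j]
            vals = []
            for tr, te in windows:
                w, *_ = np.linalg.lstsq(X[tr][:, cand], y[tr], rcond=None)
                vals.append(-np.mean((X[te][:, cand] @ w - y[te]) ** 2))
            return np.mean(vals)                  # higher is better
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected
```

Scoring each candidate only on windows that lie strictly after its training data is what makes the evaluation "realistic" in the abstract's sense: no future information leaks into the selection.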
Abstract:
Glazing is a technique used to retard fish deterioration during storage. This work focuses on the study of distinct variables (fish temperature, coating temperature, dipping time) that affect the thickness of edible coatings (water glazing and 1.5% chitosan) applied to frozen fish. Samples of frozen Atlantic salmon (Salmo salar) at -15, -20, and -25 °C were either glazed with water at 0.5, 1.5 or 2.5 °C or coated with a 1.5% chitosan solution at 2.5, 5 or 8 °C, by dipping for 10 to 60 s. For both water and chitosan coatings, lowering the salmon and coating solution temperatures resulted in an increase of coating thickness. Under the same conditions, higher thickness values were obtained when using chitosan (max. thickness of 1.41±0.05 mm) compared to water (max. thickness of 0.84±0.03 mm). The freezing temperature and crystallization heat were found to be lower for the 1.5% chitosan solution than for water, thus favoring phase change. Salmon temperature profiles allowed determining, for different dipping conditions, whether the salmon temperature remained within food safety standards to prevent the growth of pathogenic microorganisms. The concept of safe dipping time is proposed to define how long a frozen product can be dipped into a solution without the temperature rising to a point where it can constitute a hazard.
Abstract:
Novel input modalities such as touch, tangibles or gestures try to exploit humans' innate skills rather than imposing new learning processes. However, despite the recent boom of different natural interaction paradigms, it has not been systematically evaluated how these interfaces influence a user's performance, or whether each interface could be more or less appropriate for: 1) different age groups; and 2) different basic operations, such as data selection, insertion or manipulation. This work presents the first step of an exploratory evaluation of whether or not users' performance is indeed influenced by the different interfaces. The key point is to understand how different interaction paradigms affect specific target audiences (children, adults and older adults) when dealing with a selection task. Sixty participants took part in this study to assess how different interfaces may influence the interaction of specific groups of users with regard to their age. Four input modalities were used to perform a selection task, and the methodology was based on usability testing (speed, accuracy and user preference). The study suggests a statistically significant difference between mean selection times for each group of users, and also raises new issues regarding the “old” mouse input versus the “new” input modalities.
Abstract:
Lecture Notes in Computer Science, 9273
Abstract:
Nowadays, the sustainability of buildings is extremely important. This concept is aligned with the European aims of the Horizon 2020 Programme, which concerns the reduction of environmental impacts through aspects such as energy efficiency and renewable technologies, among others. Sustainability is an extremely broad concept, but this work is concerned specifically with sustainability in buildings. Within this concept, which aims to integrate the environmental, social and economic levels towards the preservation of the planet and the integrity of the users, there are currently several environmental certification tools applicable to the construction industry (LEED, BREEAM, DGNB, SBTool, among others). In this context, the SBTool (Sustainable Building Tool) stands out, as it is employed in several countries and can be subject to review in institutions of basic education, which are the basis for the formation of critical minds and for the development of a country. The main aim of this research is to select indicators that can be used in a methodology for the sustainability assessment (SBTool) of school buildings in Portugal and Brazil. To achieve this, other methodologies that already incorporate parameters directly related to the school environment, such as BREEAM or LEED, will also be analyzed.
Abstract:
This paper describes the trigger and offline reconstruction, identification and energy calibration algorithms for hadronic decays of tau leptons, employed for the data collected from pp collisions in 2012 with the ATLAS detector at the LHC center-of-mass energy √s = 8 TeV. The performance of these algorithms is measured in most cases with Z decays to tau leptons, using the full 2012 dataset, corresponding to an integrated luminosity of 20.3 fb⁻¹. An uncertainty on the offline reconstructed tau energy scale of 2% to 4%, depending on transverse energy and pseudorapidity, is achieved using two independent methods. The offline tau identification efficiency is measured with a precision of 2.5% for hadronically decaying tau leptons with one associated track, and of 4% for the case of three associated tracks, inclusive in pseudorapidity and for a visible transverse energy greater than 20 GeV. For hadronic tau lepton decays selected by offline algorithms, the tau trigger identification efficiency is measured with a precision of 2% to 8%, depending on the transverse energy. The performance of the tau algorithms, both offline and at the trigger level, is found to be stable with respect to the number of concurrent proton-proton interactions, and has supported a variety of physics results using hadronically decaying tau leptons at ATLAS.
Abstract:
In longitudinal studies of disease, patients may experience several events over a follow-up period. In these studies, the sequentially ordered events are often of interest and lead to problems that have received much attention recently. Issues of interest include the estimation of the bivariate survival, the marginal distributions and the conditional distribution of the gap times. In this work we consider the estimation of the survival function conditional on a previous event. Different nonparametric approaches are considered for estimating these quantities, all based on the Kaplan-Meier estimator of the survival function. We explore the finite-sample behavior of the estimators through simulations. The methods proposed in this article are applied to a data set from the German Breast Cancer Study. The methods are used to obtain predictors of the conditional survival probabilities, as well as to study the influence of recurrence on overall survival.
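The conditional quantity this abstract targets can be illustrated with a minimal Kaplan-Meier sketch. Taking S(t | T > s) = S(t)/S(s), with both factors read off the same Kaplan-Meier curve, corresponds to the simplest of the approaches compared in such work; the function names and the toy data in the test are illustrative, not the authors' code.

```python
import numpy as np

def kaplan_meier(time, event):
    """Return the distinct event times and the KM survival estimate at them."""
    order = np.argsort(time)
    time, event = time[order], event[order]
    uniq = np.unique(time[event == 1])
    surv, s = [], 1.0
    for t in uniq:
        at_risk = np.sum(time >= t)                 # subjects still under observation
        deaths = np.sum((time == t) & (event == 1)) # events exactly at t
        s *= 1.0 - deaths / at_risk
        surv.append(s)
    return uniq, np.array(surv)

def conditional_survival(uniq, surv, s_time, t_time):
    """S(t | T > s) = S(t)/S(s), read off the KM step function."""
    def S(x):
        idx = np.searchsorted(uniq, x, side="right") - 1
        return 1.0 if idx < 0 else surv[idx]
    return S(t_time) / S(s_time)
```

Dividing two points of the same curve is what "conditional on a previous event" amounts to in the simplest case; the article's contribution lies in comparing this with alternative nonparametric estimators of the same quantity.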
Abstract:
The immune system can recognize virtually any antigen, yet T cell responses against several pathogens, including Mycobacterium tuberculosis, are restricted to a limited number of immunodominant epitopes. The host factors that affect immunodominance are incompletely understood. Whether immunodominant epitopes elicit protective CD8+ T cell responses, or instead act as decoys to subvert immunity and allow pathogens to establish chronic infection, is unknown. Here we show that anatomically distinct human granulomas contain clonally expanded CD8+ T cells with overlapping T cell receptor (TCR) repertoires. Similarly, the murine CD8+ T cell response against M. tuberculosis is dominated by TB10.4(4-11)-specific T cells with extreme TCRβ bias. Using a retrogenic model of TB10.4(4-11)-specific CD8+ T cells, we show that TCR dominance can arise because of competition between clonotypes driven by differences in affinity. Finally, we demonstrate that TB10.4-specific CD8+ T cells mediate protection against tuberculosis, which requires interferon-γ production and TAP1-dependent antigen presentation in vivo. Our study of how immunodominance, biased TCR repertoires, and protection are interrelated provides a new way to measure the quality of T cell immunity, which, if applied to vaccine evaluation, could enhance our understanding of how to elicit protective T cell immunity.
Abstract:
The currently available clinical imaging methods do not provide highly detailed information about the location and severity of axonal injury, or the expected recovery time of patients with traumatic brain injury [1]. High-Definition Fiber Tractography (HDFT) is a novel imaging modality that allows directly visualizing and quantifying the degree of axonal damage, predicting functional deficits due to traumatic axonal injury and loss of cortical projections. This imaging modality is based on diffusion technology [2]. The lack of a phantom able to properly mimic the human brain hinders the possibility of testing, calibrating and validating these medical imaging techniques. Most research done in this area falls short on key points, such as the reproduced size scale of the brain fibers and the quick and easy reproducibility of phantoms [3]. For that reason, it is necessary to develop similar structures matching the micron scale of axon tubes. Flexible textiles can play an important role, since they allow producing controlled packing densities and crossing structures that closely match the crossing patterns of the human brain. To build a brain phantom, several parameters must be taken into account regarding materials selection, such as hydrophobicity, density and fiber diameter, since these factors directly influence the values of fractional anisotropy. Fiber cross-section shape is another important parameter. Earlier studies showed that synthetic fibrous materials are a good choice for building a brain phantom [4]. The present work is part of a broader project that aims to develop a brain phantom made of fibrous materials to validate and calibrate HDFT. Due to the similarity between the thousands of hollow multifilaments in a fibrous arrangement, such as a yarn, and the axons, low-twist polypropylene multifilament yarns were selected for this development.
In this sense, extruded hollow filaments were analysed by scanning electron microscopy to characterize their main dimensions and shape. In order to approximate the dimensional scale to that of human axons, five types of polypropylene yarns with different linear densities (denier) were used, aiming to understand the effect of linear density on the filament inner and outer areas. Moreover, in order to achieve the required dimensions, the cross-section of the polypropylene filaments was reduced in a drawing stage of a filament extrusion line. Subsequently, tensile tests were performed to characterize the mechanical behaviour of the hollow filaments and to evaluate the differences between stretched and non-stretched filaments. In general, an increase in linear density causes an increase in the size of the filament cross-section. With the increase in the structural orientation of the filaments, induced by stretching, breaking tenacity increases and elongation at break decreases. The production of hollow fibers with the required characteristics is one of the key steps to create a brain phantom that properly mimics the human brain and may be used for the validation and calibration of HDFT, an imaging approach that is expected to contribute significantly to brain-related research.
Abstract:
Integrated master's dissertation in Biomedical Engineering (specialization in Medical Electronics)
Abstract:
Olive oil quality grading is traditionally assessed by human sensory evaluation of positive and negative attributes (olfactory, gustatory, and final olfactory-gustatory sensations). However, it is not guaranteed that trained panelists can correctly classify monovarietal extra-virgin olive oils according to olive cultivar. In this work, the potential application of human (sensory panelists) and artificial (electronic tongue) sensory evaluation of olive oils was studied, aiming to discriminate eight single-cultivar extra-virgin olive oils. Linear discriminant, partial least squares discriminant, and sparse partial least squares discriminant analyses were evaluated. The best predictive classification was obtained using linear discriminant analysis with a simulated annealing selection algorithm. A low-level data fusion approach (18 electronic tongue signals and nine sensory attributes) enabled 100% leave-one-out cross-validation correct classification, improving on the discrimination capability of the individual use of sensor profiles or sensory attributes (70% and 57% leave-one-out correct classifications, respectively). Thus, human sensory evaluation and electronic tongue analysis may be used as complementary tools, allowing successful monovarietal olive oil discrimination.
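The low-level fusion step in this abstract is simply column-wise concatenation of the two feature blocks before classification. The sketch below shows that step with a nearest-class-centroid rule standing in for the paper's linear discriminant analysis (the two coincide when the pooled within-class covariance is spherical), scored by leave-one-out cross-validation; the synthetic data and the omission of the simulated annealing variable-selection step are assumptions of this sketch.

```python
import numpy as np

def loo_nearest_centroid(X, y):
    """Leave-one-out accuracy of a nearest-class-centroid classifier."""
    correct = 0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i                 # hold out sample i
        classes = np.unique(y[mask])
        centroids = np.array([X[mask][y[mask] == c].mean(axis=0) for c in classes])
        pred = classes[np.argmin(np.linalg.norm(centroids - X[i], axis=1))]
        correct += int(pred == y[i])
    return correct / len(y)

rng = np.random.default_rng(0)
labels = np.repeat(np.arange(4), 10)                  # toy stand-in for 8 cultivars
tongue = rng.normal(size=(40, 18)) + labels[:, None]  # 18 e-tongue signals
sensory = rng.normal(size=(40, 9)) + labels[:, None]  # 9 sensory attributes

fused = np.hstack([tongue, sensory])                  # low-level (feature) fusion
accuracy = loo_nearest_centroid(fused, labels)
```

Because fusion happens at the feature level, the classifier sees both blocks jointly, which is why the fused model can outperform either block alone, as the abstract reports.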
Abstract:
A high-resolution mtDNA phylogenetic tree allowed us to look backward in time to investigate purifying selection. Purifying selection was very strong in the last 2,500 years, continuously eliminating pathogenic mutations back until the end of the Younger Dryas (∼11,000 years ago), when a large population expansion likely relaxed selection pressure. This was preceded by a phase of stable selection until another relaxation occurred in the out-of-Africa migration. Demography and selection are closely related: expansions led to relaxation of selection, and mutations of higher pathogenicity significantly decreased the growth of descendants. The only detectable positive selection was the recurrence of highly pathogenic nonsynonymous mutations (m.3394T>C-m.3397A>G-m.3398T>C) at interior branches of the tree, preventing the formation of a dinucleotide STR (TATATA) in the MT-ND1 gene. At the most recent time scale, in 124 mother-child transmissions, purifying selection was detectable through the loss of mtDNA variants with high predicted pathogenicity. A few haplogroup-defining sites were also heteroplasmic, in agreement with a significant propensity of 349 positions in the phylogenetic tree to revert to the ancestral variant. This nonrandom mutation property explains the observation of heteroplasmic mutations at some haplogroup-defining sites in sequencing datasets, which may not indicate poor quality as has been claimed.
Abstract:
Doctoral thesis in Business Sciences
Abstract:
Doctoral thesis in Medicine.