59 results for Initial data problem


Relevance:

30.00%

Publisher:

Abstract:

Biometric system performance can be improved by means of data fusion. Several kinds of information can be fused in order to obtain a more accurate classification (identification or verification) of an input sample. In this paper we present a method for computing the weights of a weighted-sum fusion of scores by means of a likelihood model. The maximum likelihood estimation is posed as a linear programming problem. The scores are derived from GMM classifiers, each working on a different feature extractor. Our experimental results assess the robustness of the system against changes over time (different sessions) and against a change of microphone. The improvements obtained were significantly better (error bars of two standard deviations) than a uniform weighted sum, a uniform weighted product, or the best single classifier. The proposed method scales computationally with the number of scores to be fused in the same way as the simplex method for linear programming.
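
As a rough illustration of the weighted-sum score combination described above (not the paper's likelihood-based weight estimation, which is posed as a linear program), a minimal Python sketch; the score values and weights below are hypothetical:

    import numpy as np

    def weighted_sum_fusion(scores, weights):
        """Combine per-classifier scores with a weighted sum.
        scores: array of shape (n_classifiers, n_samples)
        weights: array of shape (n_classifiers,), summing to 1
        """
        scores = np.asarray(scores, dtype=float)
        weights = np.asarray(weights, dtype=float)
        return weights @ scores  # fused score per sample

    # Hypothetical scores from three GMM-based classifiers for four samples
    scores = np.array([[0.2, 0.8, 0.6, 0.1],
                       [0.3, 0.7, 0.5, 0.2],
                       [0.1, 0.9, 0.4, 0.3]])
    weights = np.array([0.5, 0.3, 0.2])   # in the paper these come from ML estimation
    print(weighted_sum_fusion(scores, weights))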

Relevance:

30.00%

Publisher:

Abstract:

Amino-N is preserved because of the scarcity and nutritional importance of protein. Excretion requires its conversion to ammonia, later incorporated into urea. Under conditions of excess dietary energy, the body cannot easily dispose of the excess amino-N against the evolutionarily adapted mechanisms that prevent its wastage; thus ammonia and glutamine formation (and urea excretion) are decreased. High lipid (and energy) availability limits the utilisation of glucose, and high glucose spares the production of ammonium from amino acids, limiting the synthesis of glutamine and its utilisation by the intestine and kidney. The amino acid composition of the diet affects the production of ammonium depending on its composition and the individual amino acid catabolic pathways. Surplus amino acids enhance protein synthesis and growth, and the synthesis of non-protein-N-containing compounds. But these outlets are not enough; consequently, less conventional mechanisms are activated, such as increased synthesis of NO∙ followed by higher nitrite (and nitrate) excretion, and changes in the microbiota. There is also a significant production of N2 gas, through unknown mechanisms. The health consequences of an amino-N surplus are difficult to fathom because of the sparse data available, but it can be speculated that the effects may be negative, largely because fundamental N homeostasis is stretched out of normalcy, forcing N removal through pathways unprepared for that task. The unreliable results of high-protein (hyperproteic) diets, and part of the dysregulation found in the metabolic syndrome, may be an unwanted consequence of this N disposal conflict.

Relevance:

30.00%

Publisher:

Abstract:

In this correspondence, we propose applying hidden Markov model (HMM) theory to the problem of blind channel estimation and data detection. The Baum–Welch (BW) algorithm, which is able to estimate all the parameters of the model, is enriched by introducing some linear constraints emerging from a linear FIR hypothesis on the channel. Additionally, a version of the algorithm that is suitable for time-varying channels is also presented. Performance is analyzed in a GSM environment using standard test channels and is found to be close to that obtained with a non-blind receiver.
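
For orientation, a minimal Baum-Welch (EM) re-estimation step for a generic discrete-observation HMM is sketched below; it omits the paper's linear FIR channel constraints and the time-varying extension, and all quantities are generic rather than taken from the paper:

    import numpy as np

    def baum_welch_step(obs, A, B, pi):
        """One Baum-Welch (EM) re-estimation step for a discrete HMM.
        obs: sequence of observation indices, length T
        A: (N, N) state transition matrix; B: (N, M) emission matrix
        pi: (N,) initial state distribution. Returns updated (A, B, pi)."""
        T, N = len(obs), A.shape[0]

        # Forward pass (scaled to avoid underflow)
        alpha = np.zeros((T, N)); c = np.zeros(T)
        alpha[0] = pi * B[:, obs[0]]; c[0] = alpha[0].sum(); alpha[0] /= c[0]
        for t in range(1, T):
            alpha[t] = (alpha[t-1] @ A) * B[:, obs[t]]
            c[t] = alpha[t].sum(); alpha[t] /= c[t]

        # Backward pass (same scaling)
        beta = np.zeros((T, N)); beta[-1] = 1.0
        for t in range(T-2, -1, -1):
            beta[t] = (A @ (B[:, obs[t+1]] * beta[t+1])) / c[t+1]

        # State and transition posteriors
        gamma = alpha * beta
        gamma /= gamma.sum(axis=1, keepdims=True)
        xi = np.zeros((N, N))
        for t in range(T-1):
            x = alpha[t][:, None] * A * (B[:, obs[t+1]] * beta[t+1])[None, :]
            xi += x / x.sum()

        # Re-estimation
        A_new = xi / gamma[:-1].sum(axis=0)[:, None]
        B_new = np.zeros_like(B)
        for k in range(B.shape[1]):
            B_new[:, k] = gamma[np.asarray(obs) == k].sum(axis=0)
        B_new /= gamma.sum(axis=0)[:, None]
        return A_new, B_new, gamma[0]

    # Hypothetical toy run: 2 states, 2 symbols
    rng = np.random.default_rng(0)
    obs = rng.integers(0, 2, size=200)
    A = np.array([[0.7, 0.3], [0.4, 0.6]])
    B = np.array([[0.9, 0.1], [0.2, 0.8]])
    pi = np.array([0.5, 0.5])
    for _ in range(5):
        A, B, pi = baum_welch_step(obs, A, B, pi)
    print(A)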

Relevance:

30.00%

Publisher:

Abstract:

Context. The understanding of Galaxy evolution can be facilitated by the use of population synthesis models, which make it possible to test hypotheses on the star formation history, stellar evolution, and the chemical and dynamical evolution of the Galaxy. Aims. The new version of the Besançon Galaxy Model (hereafter BGM) aims to provide a more flexible and powerful tool to investigate the Initial Mass Function (IMF) and Star Formation Rate (SFR) of the Galactic disc. Methods. We present a new strategy for the generation of thin-disc stars which assumes the IMF, SFR and evolutionary tracks as free parameters. We have updated most of the ingredients for the star count production and, for the first time, binary stars are generated in a consistent way. We keep in this new scheme the local dynamical self-consistency as in Bienaymé et al. (1987). We then compare simulations from the new model with Tycho-2 data and the local luminosity function, as a first test to verify and constrain the new ingredients. The effects of changing thirteen different ingredients of the model are systematically studied. Results. For the first time, a full-sky comparison is performed between the BGM and data. This strategy allows us to constrain the IMF slope at high masses, which is found to be close to 3.0, excluding a shallower slope such as Salpeter's. The SFR is found to be decreasing whatever IMF is assumed. The model is compatible with a local dark matter density of 0.011 M⊙ pc⁻³, implying that there is no compelling evidence for a significant amount of dark matter in the disc. Although the model is fitted to Tycho-2 data, a magnitude-limited sample with V < 11, we check that it is still consistent with fainter stars. Conclusions. The new model constitutes a new basis for further comparisons with large-scale surveys and is being prepared to become a powerful tool for the analysis of the Gaia mission data.
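
As a side note on the quantity being constrained (the high-mass IMF slope close to 3.0), the sketch below draws stellar masses from a single power-law IMF dN/dm ∝ m^(-alpha) by inverse-transform sampling; the mass limits and the single power-law form are illustrative assumptions and the code is not part of the BGM itself:

    import numpy as np

    def sample_powerlaw_imf(n, alpha=3.0, m_min=1.0, m_max=100.0, seed=None):
        """Draw n stellar masses from dN/dm ∝ m**(-alpha) on [m_min, m_max]
        using inverse-transform sampling (requires alpha != 1)."""
        rng = np.random.default_rng(seed)
        u = rng.random(n)
        a = 1.0 - alpha
        return (m_min**a + u * (m_max**a - m_min**a)) ** (1.0 / a)

    masses = sample_powerlaw_imf(100000, alpha=3.0, seed=0)  # slope quoted in the abstract
    print(masses.mean(), masses.max())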

Relevance:

30.00%

Publisher:

Abstract:

A newspaper content management system has to deal with a very heterogeneous information space, as the experience at the Diari Segre newspaper has shown us. The greatest problem is to harmonise the different ways the users involved (journalists, archivists, etc.) structure the newspaper information space, i.e. news, topics, headlines, etc. Our approach is based on ontologies and differentiated universes of discourse (UoD). Users interact with the system and, from this interaction, integration rules are derived. These rules are based on Description Logic ontological relations for subsumption and equivalence. They relate the different UoDs and produce a shared conceptualisation of the newspaper information domain.

Relevance:

30.00%

Publisher:

Abstract:

Alteration and contamination processes modify the chemical composition of ceramic artefacts. This is not restricted solely to the affected elements, but also changes the overall concentrations. This is due to the compositional nature of chemical data, which are constrained by the unit-sum restriction. Since it is impossible to know prior to data treatment whether the original compositions have been changed by such processes, the methodological approach used in provenance studies must be robust enough to handle materials that might have been altered or contaminated. The ability of the logratio transformation proposed by Aitchison to handle compositional data is studied and compared with that of current data treatments. The logratio transformation appears to offer the most robust approach.
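
For reference, a minimal sketch of one member of Aitchison's logratio family, the centred logratio (clr) transformation, applied to a hypothetical composition; the paper compares this kind of transformation with other data treatments:

    import numpy as np

    def clr(composition):
        """Centred logratio transform of a composition (parts summing to a constant).
        clr(x)_i = ln(x_i / g(x)), where g(x) is the geometric mean of the parts."""
        x = np.asarray(composition, dtype=float)
        x = x / x.sum()                      # close the composition to unit sum
        g = np.exp(np.mean(np.log(x)))       # geometric mean of the parts
        return np.log(x / g)

    # Hypothetical oxide concentrations (wt%) of a ceramic sample
    sample = np.array([55.0, 20.0, 10.0, 8.0, 5.0, 2.0])
    print(clr(sample))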

Relevance:

30.00%

Publisher:

Abstract:

This paper proposes a new method for blindly inverting a nonlinear mapping that transforms a sum of random variables. This is the case in post-nonlinear (PNL) source separation mixtures. The importance of the method lies in the fact that it makes it possible to decouple the estimation of the nonlinear part from the estimation of the linear one. Only the nonlinear part is inverted, without considering the linear part. Hence the initial problem is transformed into a linear one that can then be solved with any convenient linear algorithm. The method is compared with other existing algorithms for blindly approximating nonlinear mappings. Experiments show that the proposed algorithm outperforms the other algorithms and yields reasonably well linearized data.
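
One standard way to exploit the near-Gaussianity of a sum of independent variables is marginal Gaussianization of the observed signal; the sketch below illustrates that idea, but it is not necessarily the estimator used in the paper, and the mixture and nonlinearity are hypothetical (scipy is assumed to be available):

    import numpy as np
    from scipy.stats import norm

    def gaussianize(observed):
        """Map an observed scalar signal through its empirical CDF composed with
        the Gaussian quantile function. If the signal is an unknown invertible
        nonlinearity applied to a roughly Gaussian sum of sources, this
        approximately inverts the nonlinearity (up to an affine transform)."""
        x = np.asarray(observed, dtype=float)
        ranks = np.argsort(np.argsort(x))
        u = (ranks + 0.5) / len(x)           # empirical CDF values in (0, 1)
        return norm.ppf(u)

    # Hypothetical post-nonlinear observation: tanh applied to a sum of sources
    rng = np.random.default_rng(0)
    s = rng.uniform(-1, 1, size=(4, 5000))   # four independent sources
    mixed = s.sum(axis=0)                    # linear part (a plain sum here)
    observed = np.tanh(0.8 * mixed)          # unknown invertible nonlinearity
    linearized = gaussianize(observed)       # ideally proportional to 'mixed'
    print(np.corrcoef(linearized, mixed)[0, 1])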

Relevance:

30.00%

Publisher:

Abstract:

This paper proposes a pose-based algorithm to solve the full SLAM problem for an autonomous underwater vehicle (AUV) navigating in an unknown and possibly unstructured environment. The technique incorporates probabilistic scan matching with range scans gathered from a mechanical scanning imaging sonar (MSIS) and the robot's dead-reckoning displacements estimated from a Doppler velocity log (DVL) and a motion reference unit (MRU). The proposed method uses two extended Kalman filters (EKF). The first estimates the local path travelled by the robot while grabbing the scan, as well as its uncertainty, and provides position estimates for correcting the distortions that the vehicle motion produces in the acoustic images. The second is an augmented-state EKF that estimates and keeps the poses of the registered scans. The raw data from the sensors are processed and fused online. No prior structural information or initial pose is assumed. The algorithm has been tested on an AUV guided along a 600 m path within a marina environment, showing the viability of the proposed approach.
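
For context, a minimal generic extended Kalman filter predict/update cycle is sketched below; it is not the paper's two-filter, pose-based formulation, and the process and measurement models in the usage example are hypothetical:

    import numpy as np

    def ekf_step(x, P, u, z, f, F, h, H, Q, R):
        """One generic EKF predict/update cycle.
        x, P: prior state estimate and covariance; u, z: control and measurement
        f, h: process and measurement functions; F, H: their Jacobians
        Q, R: process and measurement noise covariances."""
        # Predict
        x_pred = f(x, u)
        F_k = F(x, u)
        P_pred = F_k @ P @ F_k.T + Q
        # Update
        H_k = H(x_pred)
        y = z - h(x_pred)                          # innovation
        S = H_k @ P_pred @ H_k.T + R               # innovation covariance
        K = P_pred @ H_k.T @ np.linalg.inv(S)      # Kalman gain
        x_new = x_pred + K @ y
        P_new = (np.eye(len(x)) - K @ H_k) @ P_pred
        return x_new, P_new

    # Hypothetical 2D dead-reckoning example: state = (x, y), control = displacement
    f = lambda x, u: x + u
    F = lambda x, u: np.eye(2)
    h = lambda x: x                                # direct position measurement
    H = lambda x: np.eye(2)
    Q, R = 0.05 * np.eye(2), 0.2 * np.eye(2)
    x, P = np.zeros(2), np.eye(2)
    x, P = ekf_step(x, P, u=np.array([1.0, 0.5]), z=np.array([1.1, 0.4]),
                    f=f, F=F, h=h, H=H, Q=Q, R=R)
    print(x, P)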

Relevance:

30.00%

Publisher:

Abstract:

Majolica pottery was the most characteristic tableware produced in Europe during the Medieval and Renaissance periods. Because of the prestige and importance attributed to this ware, Spanish majolica was imported in vast quantities into the Americas during the Spanish Colonial period. A study of Spanish majolica was conducted on a set of 186 samples from the 10 primary majolica production centres on the Iberian Peninsula and 22 sherds from two early colonial archaeological sites on the Canary Islands. The samples were analysed by neutron activation analysis (NAA), and the resulting data were interpreted using an array of multivariate statistical approaches. Our results show a clear discrimination between different production centres, allowing a reliable provenance attribution of the sherds from the Canary Islands.

Relevance:

30.00%

Publisher:

Abstract:

The Catalan Research Portal (Portal de la Recerca de Catalunya, or PRC) is an initiative carried out by the Consortium for University Services in Catalonia (CSUC) in coordination with nearly all universities in Catalonia. The Portal will provide an online CERIF-compliant collection of all research outputs produced by Catalan HEIs, together with appropriate contextual information describing the specific environment where each output was generated (such as researchers, research groups, research projects, etc.). The initial emphasis of the Catalan Research Portal will be on publications, but other outputs such as patents and, eventually, research data will be addressed as well. These guidelines provide information for PRC data providers on how to expose and exchange their research information metadata in a CERIF-XML compatible structure, thus allowing them not just to exchange validated CERIF XML data with the PRC platform, but to improve their general interoperability by being able to deliver CERIF-compatible outputs.

Relevance:

30.00%

Publisher:

Abstract:

This paper sets out to identify the initial positions of the different decision makers who intervene in a group decision-making process with a reduced number of actors, and to establish possible consensus paths between these actors. As methodological support, it employs one of the most widely known multicriteria decision techniques, namely the Analytic Hierarchy Process (AHP). Assuming that the judgements elicited by the decision makers follow the so-called multiplicative model (Crawford and Williams, 1985; Altuzarra et al., 1997; Laininen and Hämäläinen, 2003) with log-normal errors and unknown variance, a Bayesian approach is used in the estimation of the relative priorities of the alternatives being compared. These priorities, estimated by way of the median of the posterior distribution and normalised in a distributive manner (priorities add up to one), are a clear example of compositional data that will be used in the search for consensus between the actors involved in the resolution of the problem, through the use of Multidimensional Scaling tools.
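
As a point of reference, the sketch below computes priorities from a single pairwise comparison matrix with the classical row geometric mean (logarithmic least squares) estimator, the natural point estimate under the multiplicative model with log-normal errors; the paper's fully Bayesian treatment with posterior medians is more involved, and the comparison matrix here is hypothetical:

    import numpy as np

    def ahp_priorities_geometric_mean(M):
        """Priorities from a pairwise comparison matrix M under the multiplicative
        model m_ij ≈ w_i / w_j with log-normal errors: row geometric means,
        normalised so that the priorities add up to one (distributive mode)."""
        M = np.asarray(M, dtype=float)
        w = np.exp(np.mean(np.log(M), axis=1))   # row geometric means
        return w / w.sum()

    # Hypothetical 3x3 comparison matrix from one decision maker
    M = np.array([[1.0,   3.0, 5.0],
                  [1/3.0, 1.0, 2.0],
                  [1/5.0, 1/2.0, 1.0]])
    print(ahp_priorities_geometric_mean(M))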

Relevance:

30.00%

Publisher:

Abstract:

One main assumption in the theory of rough sets applied to information tables is that the elements that exhibit the same information are indiscernible (similar) and form blocks that can be understood as elementary granules of knowledge about the universe. We propose a variant of this concept, defining a measure of similarity between the elements of the universe in order to consider that two objects can be indiscernible even though they do not share all the attribute values, because the knowledge is partial or uncertain. The set of similarities defines the matrix of a fuzzy relation satisfying reflexivity and symmetry but not transitivity; thus a partition of the universe is not attained. This problem can be solved by computing its transitive closure, which ensures a partition for each level belonging to the unit interval [0,1]. This procedure allows the theory of rough sets to be generalised depending on the minimum level of similarity accepted. This new point of view increases the rough character of the data because it enlarges the set of indiscernible objects. Finally, we apply our results to a synthetic (not real) application in order to highlight the differences and improvements between this methodology and the classical one.
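
A minimal sketch of the construction described above, assuming a max-min composition for the transitive closure and a hypothetical similarity matrix; alpha-cuts of the closed relation then yield a partition for each level in [0,1]:

    import numpy as np

    def maxmin_transitive_closure(S):
        """Max-min transitive closure of a reflexive, symmetric fuzzy relation S.
        Iterates S <- max(S, S∘S) until it stabilises; the result is a fuzzy
        similarity (equivalence) relation whose alpha-cuts are partitions."""
        S = np.asarray(S, dtype=float)
        while True:
            # max-min composition: (S∘S)[i, j] = max_k min(S[i, k], S[k, j])
            comp = np.max(np.minimum(S[:, :, None], S[None, :, :]), axis=1)
            T = np.maximum(S, comp)
            if np.allclose(T, S):
                return T
            S = T

    def alpha_cut_classes(T, alpha):
        """Equivalence classes of the crisp relation T >= alpha (a partition
        when T is max-min transitive)."""
        n = len(T)
        adj = T >= alpha
        classes, seen = [], set()
        for i in range(n):
            if i in seen:
                continue
            block = {j for j in range(n) if adj[i, j]}
            classes.append(sorted(block)); seen |= block
        return classes

    # Hypothetical similarity matrix between five objects
    S = np.array([[1.0, 0.8, 0.4, 0.2, 0.2],
                  [0.8, 1.0, 0.5, 0.2, 0.2],
                  [0.4, 0.5, 1.0, 0.3, 0.2],
                  [0.2, 0.2, 0.3, 1.0, 0.9],
                  [0.2, 0.2, 0.2, 0.9, 1.0]])
    T = maxmin_transitive_closure(S)
    print(alpha_cut_classes(T, alpha=0.5))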

Relevance:

30.00%

Publisher:

Abstract:

Background: None of the HIV T-cell vaccine candidates that have reached advanced clinical testing have been able to induce protective T-cell immunity. A major reason for these failures may have been suboptimal T-cell immunogen designs. Methods: To overcome this problem, we used a novel immunogen design approach that is based on functional T-cell response data from more than 1,000 HIV-1 clade B and clade C infected individuals and which aims to direct the T-cell response to the most vulnerable sites of HIV-1. Results: Our approach identified 16 regions in Gag, Pol, Vif and Nef that were relatively conserved and predominantly targeted by individuals with reduced viral loads. These regions formed the basis of the HIVACAT T-cell Immunogen (HTI) sequence, which is 529 amino acids in length, includes more than 50 optimally defined CD4+ and CD8+ T-cell epitopes restricted by a wide range of HLA class I and II molecules, and covers viral sites where mutations led to a dramatic reduction in viral replicative fitness. In both C57BL/6 mice and Indian rhesus macaques, immunization with an HTI-expressing DNA plasmid (DNA.HTI) induced broad and balanced T-cell responses to several segments within Gag, Pol, and Vif. DNA.HTI induced robust CD4+ and CD8+ T-cell responses that were increased by a booster vaccination with modified vaccinia Ankara (MVA.HTI), expanding the DNA.HTI-induced response to up to 3.2% IFN-γ T-cells in macaques. HTI-specific T cells showed a central and effector memory phenotype, with a significant fraction of the IFN-γ+ CD8+ T cells being Granzyme B+ and able to degranulate (CD107a+). Conclusions: These data demonstrate the immunogenicity of a novel HIV-1 T-cell vaccine concept that induced broadly balanced responses to vulnerable sites of HIV-1 while avoiding the induction of responses to potential decoy targets that may divert effective T-cell responses towards variable and less protective viral determinants.