846 results for Kernel Functions


Relevance: 20.00%

Abstract:

Let $Q$ be a suitable real function on $C$. An $n$-Fekete set corresponding to $Q$ is a subset $\{Z_{n1}, \dotsc, Z_{nn}\}$ of $C$ which maximizes the expression $\prod_{i<j}^{n} |Z_{ni} - Z_{nj}|^{2} \, e^{-n(Q(Z_{ni}) + Q(Z_{nj}))}$.

Relevance: 20.00%

Abstract:

Our consumption of groundwater, in particular as drinking water and for irrigation, has considerably increased over the years, and groundwater is becoming an increasingly scarce and endangered resource. Nowadays, we are facing many problems ranging from water prospection to sustainable management and remediation of polluted aquifers. Independently of the hydrogeological problem, the main challenge remains dealing with the incomplete knowledge of the underground properties. Stochastic approaches have been developed to represent this uncertainty by considering multiple geological scenarios and generating a large number of realizations. The main limitation of this approach is the computational cost associated with performing complex flow simulations in each realization. In the first part of the thesis, we explore this issue in the context of uncertainty propagation, where an ensemble of geostatistical realizations is identified as representative of the subsurface uncertainty. To propagate this lack of knowledge to the quantity of interest (e.g., the concentration of pollutant in extracted water), it is necessary to evaluate the flow response of each realization. Due to computational constraints, state-of-the-art methods make use of approximate flow simulations to identify a subset of realizations that represents the variability of the ensemble. The complex and computationally heavy flow model is then run for this subset, based on which inference is made. Our objective is to increase the performance of this approach by using all of the available information and not solely the subset of exact responses. Two error models are proposed to correct the approximate responses following a machine learning approach. For the subset identified by a classical approach (here the distance kernel method), both the approximate and the exact responses are known. This information is used to construct an error model and correct the ensemble of approximate responses to predict the "expected" responses of the exact model. The proposed methodology makes use of all the available information without perceptible additional computational costs and leads to an increase in accuracy and robustness of the uncertainty propagation. The strategy explored in the first chapter consists in learning, from a subset of realizations, the relationship between proxy and exact curves.
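To make the first-chapter strategy concrete, here is a minimal sketch in Python on synthetic data. It is not the thesis code: the dimension-reduction and regression choices (PCA scores, linear regression) and all names are illustrative assumptions. It learns, on a small subset where both responses are available, a map from proxy curves to exact curves, then corrects the whole ensemble.

```python
# A minimal sketch (not the thesis code) of the error-model idea: learn, on a
# small learning set where both responses are available, a regression from
# proxy curves to exact curves, then correct every proxy response.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic stand-ins: 500 realizations, each response a curve on 100 steps.
t = np.linspace(0.0, 1.0, 100)
exact = np.array([np.sin(2 * np.pi * (t + rng.normal(0.0, 0.1))) for _ in range(500)])
proxy = 0.8 * exact + 0.1 + rng.normal(0.0, 0.05, size=exact.shape)  # biased, noisy

# Learning set: the subset for which the expensive exact model was actually run.
learn = rng.choice(len(exact), size=40, replace=False)

# FPCA-like step: represent each family of curves by a few principal components.
pca_proxy, pca_exact = PCA(n_components=5), PCA(n_components=5)
scores_proxy = pca_proxy.fit_transform(proxy)
scores_exact = pca_exact.fit_transform(exact[learn])

# Error model: regression between functional scores (proxy -> exact).
model = LinearRegression().fit(scores_proxy[learn], scores_exact)

# Predicted "exact" curves for the whole ensemble from the corrected scores.
predicted = pca_exact.inverse_transform(model.predict(scores_proxy))
print("mean |error| before:", np.mean(np.abs(proxy - exact)))   # synthetic truth,
print("mean |error| after: ", np.mean(np.abs(predicted - exact)))  # known here only
```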
In the second part of this thesis, the strategy is formalized in a rigorous mathematical framework by defining a regression model between functions. As this problem is ill-posed, it is necessary to reduce its dimensionality. The novelty of the work comes from the use of functional principal component analysis (FPCA), which not only performs the dimensionality reduction while maximizing the retained information, but also allows a diagnostic of the quality of the error model in the functional space. The proposed methodology is applied to a pollution problem involving a non-aqueous phase liquid. The error model allows a strong reduction of the computational cost while providing a good estimate of the uncertainty. The individual correction of the proxy response by the error model leads to an excellent prediction of the exact response, opening the door to many applications. The concept of a functional error model is useful not only in the context of uncertainty propagation, but also, and maybe even more so, to perform Bayesian inference. Markov chain Monte Carlo (MCMC) algorithms are the most common choice to ensure that the generated realizations are sampled in accordance with the observations. However, this approach suffers from low acceptance rates in high-dimensional problems, resulting in a large number of wasted flow simulations. This led to the introduction of two-stage MCMC, where the computational cost is decreased by avoiding unnecessary simulations of the exact flow model thanks to a preliminary evaluation of the proposal. In the third part of the thesis, a proxy is coupled to an error model to provide an approximate response for the two-stage MCMC set-up. We demonstrate an increase in acceptance rate by a factor of three with respect to one-stage MCMC results. An open question remains: how do we choose the size of the learning set and identify the realizations that optimize the construction of the error model? This requires devising an iterative strategy to construct the error model, such that, as new flow simulations are performed, the error model is iteratively improved by incorporating the new information. This is discussed in the fourth part of the thesis, in which we apply this methodology to a problem of saline intrusion in a coastal aquifer.
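The two-stage mechanism itself is compact enough to sketch. In the following Python sketch, `proxy_loglik` is a hypothetical stand-in for the proxy plus error model and `exact_loglik` for the expensive flow simulation; the second-stage ratio is the standard two-stage Metropolis correction, not the thesis implementation.

```python
# A minimal two-stage MCMC sketch: a cheap approximate likelihood screens each
# proposal, and the expensive exact likelihood is evaluated only for proposals
# that survive stage one. Both likelihoods here are invented stand-ins.
import numpy as np

rng = np.random.default_rng(1)

def exact_loglik(x):   # stand-in for the expensive exact flow simulation
    return -0.5 * np.sum(x ** 2)

def proxy_loglik(x):   # stand-in for proxy + error model (deterministic, biased)
    return -0.5 * 1.05 * np.sum(x ** 2)

x = np.zeros(10)
ll_exact, ll_proxy = exact_loglik(x), proxy_loglik(x)
exact_calls = accepted = 0
for _ in range(2000):
    prop = x + rng.normal(0.0, 0.3, size=x.shape)   # symmetric random-walk proposal
    llp = proxy_loglik(prop)
    # Stage 1: cheap screening; bad proposals are rejected without any exact call.
    if np.log(rng.uniform()) >= llp - ll_proxy:
        continue
    # Stage 2: exact evaluation; the corrected ratio restores detailed balance.
    exact_calls += 1
    lle = exact_loglik(prop)
    if np.log(rng.uniform()) < (lle - ll_exact) - (llp - ll_proxy):
        x, ll_exact, ll_proxy = prop, lle, llp
        accepted += 1
print(f"exact-model calls: {exact_calls} / 2000, accepted: {accepted}")
```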

Relevance: 20.00%

Abstract:

One of the most important problems in optical pattern recognition by correlation is the appearance of sidelobes in the correlation plane, which causes false alarms. We present a method that eliminates sidelobes of up to a given height, provided certain conditions are satisfied. The method can be applied to any generalized synthetic discriminant function filter and is capable of rejecting lateral peaks that are even higher than the central correlation peak. Satisfactory results were obtained in both computer simulations and optical implementation.
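The sidelobe problem itself is easy to reproduce numerically. The sketch below is illustrative only: it implements plain FFT-based matched-filter correlation, not the proposed generalized SDF method, and counts correlation values that come within a chosen fraction of the central peak.

```python
# Illustrative only: plain FFT-based correlation showing sidelobes in the
# correlation plane; this is not the sidelobe-elimination method of the paper.
import numpy as np

rng = np.random.default_rng(2)
template = rng.random((16, 16))
scene = np.zeros((128, 128))
scene[40:56, 40:56] = template                 # target embedded in the scene
scene += 0.3 * rng.random(scene.shape)         # clutter that produces sidelobes

# Cross-correlation via the frequency domain (zero-padding the template).
corr = np.real(np.fft.ifft2(np.fft.fft2(scene) *
                            np.conj(np.fft.fft2(template, scene.shape))))
peak = corr.max()
high = (corr > 0.8 * peak).sum() - 1           # values near peak height, peak excluded
print("peak location:", np.unravel_index(corr.argmax(), corr.shape))
print("correlation values above 80% of the peak (potential false alarms):", high)
```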

Relevance: 20.00%

Abstract:

The main challenge for gaining biological insights from genetic associations is identifying which genes and pathways explain the associations. Here we present DEPICT, an integrative tool that employs predicted gene functions to systematically prioritize the most likely causal genes at associated loci, highlight enriched pathways and identify tissues/cell types where genes from associated loci are highly expressed. DEPICT is not limited to genes with established functions and prioritizes relevant gene sets for many phenotypes.

Relevance: 20.00%

Abstract:

The thesis studies role-based access control and its suitability in the enterprise environment. The aim is to research how extensively role-based access control can be implemented in the case organization and how it supports the organization's business and IT functions. The study points out the enterprise's needs for access control, the factors of access control in the enterprise environment, the requirements for implementation, and the benefits and challenges it brings along. To determine how extensively role-based access control can be implemented in the case organization, the actual state of access control is first examined. Secondly, a rudimentary desired state (how things should be) is defined, and thirdly it is completed using the results of the implementation of a role-based access control application. The study results in a role model for a unit of the case organization, together with the building blocks and the framework for the organization-wide implementation. The ultimate value for the organization is delivered by facilitating its normal operations whilst protecting its information assets.
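As a minimal sketch of the core RBAC idea the thesis builds on (all role and permission names invented for illustration, not taken from the case organization), access decisions flow through roles rather than through direct user-to-permission grants.

```python
# A minimal role-based access control sketch: users acquire permissions only
# through roles, never directly. Role and permission names are invented.
from dataclasses import dataclass, field

@dataclass
class Role:
    name: str
    permissions: set[str] = field(default_factory=set)

@dataclass
class User:
    name: str
    roles: set[str] = field(default_factory=set)

ROLES = {
    "hr_clerk": Role("hr_clerk", {"read:personnel_file"}),
    "hr_manager": Role("hr_manager", {"read:personnel_file", "write:personnel_file"}),
}

def has_permission(user: User, permission: str) -> bool:
    """Access is decided by the user's roles, never granted directly."""
    return any(permission in ROLES[r].permissions
               for r in user.roles if r in ROLES)

alice = User("alice", roles={"hr_clerk"})
print(has_permission(alice, "read:personnel_file"))   # True
print(has_permission(alice, "write:personnel_file"))  # False
```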

Relevance: 20.00%

Abstract:

The conversion of cellular prion protein (PrPc), a GPI-anchored protein, into a proteinase K-resistant and infective form (generally termed PrPsc) is mainly responsible for Transmissible Spongiform Encephalopathies (TSEs), characterized by neuronal degeneration and progressive loss of basic brain functions. Although PrPc is expressed by a wide range of tissues throughout the body, the complete repertoire of its functions has not been fully determined. Recent studies have confirmed its participation in basic physiological processes such as cell proliferation and the regulation of cellular homeostasis. Other studies indicate that PrPc interacts with several molecules to activate signaling cascades with a high number of cellular effects. To determine PrPc functions, transgenic mouse models have been generated in the last decade. In particular, mice lacking specific domains of the PrPc protein have revealed the contribution of these domains to neurodegenerative processes. A dual role of PrPc has been shown, since most authors report protective roles for this protein while others describe pro-apoptotic functions. In this review, we summarize new findings on PrPc functions, especially those related to neural degeneration and cell signaling.

Relevance: 20.00%

Abstract:

Objective: The aim of the current study was to investigate the long-term cognitive effects of electroconvulsive therapy (ECT) in a sample of adolescent patients in whom schizophrenia spectrum disorders were diagnosed. Methods: The sample was composed of nine adolescent subjects in whom schizophrenia or schizoaffective disorder was diagnosed according to DSM-IV-TR criteria on whom ECT was conducted (ECT group) and nine adolescent subjects matched by age, socioeconomic status, diagnosis, and Positive and Negative Syndrome Scale (PANSS) total score at baseline on whom ECT was not conducted (NECT group). Clinical and neuropsychological assessments were carried out at baseline before ECT treatment and at 2-year follow-up. Results: Significant differences were found between groups in the number of unsuccessful medication trials. No statistically significant differences were found between the ECT group and the NECT group either in severity as assessed by the PANSS or in any cognitive variables at baseline. At follow-up, both groups showed significant improvement in clinical variables (the positive and general subscales and total score of the PANSS, and Clinical Global Impressions-Improvement). In the cognitive assessment at follow-up, significant improvement was found in both groups in the semantic category of the verbal fluency task and in digits forward. However, no significant differences were found between groups in any clinical or cognitive variable at follow-up. Repeated-measures analysis found no significant time × group interaction in any clinical or neuropsychological measures. Conclusions: The current study showed no significant differences in change over time in clinical or neuropsychological variables between the ECT group and the NECT group at 2-year follow-up. Thus, ECT did not show any negative influence on long-term neuropsychological variables in our sample.

Relevance: 20.00%

Abstract:

Previous functional MRI (fMRI) studies have associated anterior hippocampus with imagining and recalling scenes, imagining the future, recalling autobiographical memories and visual scene perception. We have observed that this typically involves the medial rather than the lateral portion of the anterior hippocampus. Here, we investigated which specific structures of the hippocampus underpin this observation. We had participants imagine novel scenes during fMRI scanning, as well as recall previously learned scenes from two different time periods (one week and 30 min prior to scanning), with analogous single object conditions as baselines. Using an extended segmentation protocol focussing on anterior hippocampus, we first investigated which substructures of the hippocampus respond to scenes, and found both imagination and recall of scenes to be associated with activity in presubiculum/parasubiculum, a region associated with spatial representation in rodents. Next, we compared imagining novel scenes to recall from one week or 30 min before scanning. We expected a strong response to imagining novel scenes and 1-week recall, as both involve constructing scene representations from elements stored across cortex. By contrast, we expected a weaker response to 30-min recall, as representations of these scenes had already been constructed but not yet consolidated. Both imagination and 1-week recall of scenes engaged anterior hippocampal structures (anterior subiculum and uncus respectively), indicating possible roles in scene construction. By contrast, 30-min recall of scenes elicited significantly less activation of anterior hippocampus but did engage posterior CA3. Together, these results elucidate the functions of different parts of the anterior hippocampus, a key brain area about which little is definitively known.

Relevance: 20.00%

Abstract:

The most suitable method for the estimation of size diversity is investigated. Size diversity is computed on the basis of the Shannon diversity expression adapted for continuous variables, such as size. It takes the form of an integral involving the probability density function (pdf) of the size of the individuals. Different approaches for the estimation of the pdf are compared: parametric methods, assuming that data come from a determinate family of pdfs, and nonparametric methods, where the pdf is estimated using some kind of local evaluation. Exponential, generalized Pareto, normal, and log-normal distributions have been used to generate simulated samples using parameters estimated from real samples. Nonparametric methods include discrete computation of data histograms based on size intervals and continuous kernel estimation of the pdf. The kernel approach gives an accurate estimation of size diversity, whilst parametric methods are only useful when the reference distribution has a shape similar to the real one. Special attention is given to data standardization. The division of data by the sample geometric mean is proposed as the most suitable standardization method, which shows additional advantages: the same size diversity value is obtained when using original sizes or log-transformed data, and size measurements with different dimensionality (lengths, areas, volumes, or biomasses) may be immediately compared with the simple addition of ln k, where k is the dimensionality (1, 2, or 3, respectively). Thus, kernel estimation, after data standardization by division by the sample geometric mean, arises as the most reliable and generalizable method of size diversity evaluation.
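A minimal sketch of the recommended procedure follows, assuming a Gaussian kernel and synthetic log-normal sizes (not the authors' data or code): standardize by the sample geometric mean, estimate the pdf with a kernel, and integrate the continuous Shannon expression -∫ p(x) ln p(x) dx on a grid.

```python
# Minimal sketch of kernel-based size diversity: divide sizes by the sample
# geometric mean, estimate the pdf with a Gaussian kernel, and integrate the
# continuous Shannon expression H = -∫ p(x) ln p(x) dx. Synthetic data.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(3)
sizes = rng.lognormal(mean=1.0, sigma=0.6, size=200)   # synthetic size sample

std_sizes = sizes / np.exp(np.mean(np.log(sizes)))     # geometric-mean standardization
kde = gaussian_kde(std_sizes)                          # kernel estimate of the pdf

x = np.linspace(0.0, std_sizes.max() * 2.0, 4000)
p = kde(x)
dx = x[1] - x[0]
mask = p > 1e-12                                       # avoid log(0) where pdf vanishes
H = -np.sum(p[mask] * np.log(p[mask])) * dx            # Shannon size diversity
print(f"kernel-based size diversity: {H:.3f}")
```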

Relevance: 20.00%

Abstract:

We prove that every transcendental meromorphic map $f$ with disconnected Julia set has a weakly repelling fixed point. This implies that the Julia set of Newton's method for finding zeroes of an entire map is connected. Moreover, extending a result of Cowen for holomorphic self-maps of the disc, we show the existence of absorbing domains for holomorphic self-maps of hyperbolic regions, whose iterates tend to a boundary point. In particular, the results imply that periodic Baker domains of Newton's method for entire maps are simply connected, which solves a well-known open question.
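For reference, Newton's method for an entire map $f$ is itself a meromorphic map whose finite fixed points are exactly the zeros of $f$; a zero of multiplicity $m$ has multiplier $1 - 1/m$ and is therefore attracting, so Newton maps have no weakly repelling fixed points in the plane, and the connectivity of their Julia sets follows from the first result by contraposition.

```latex
% Newton's map of an entire function f; its finite fixed points are the zeros
% of f, with multiplier 1 - 1/m at a zero of multiplicity m, hence attracting.
N_f(z) \;=\; z - \frac{f(z)}{f'(z)}
```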

Relevance: 20.00%

Abstract:

An extensive literature suggests a link between executive functions and aggressive behavior in humans, pointing mostly to an inverse relationship, i.e., increased tendencies toward aggression in individuals scoring low on executive function tests. This literature is limited, though, in terms of the groups studied and the measures of executive functions. In this paper, we present data from two studies addressing these issues. In a first behavioral study, we asked whether high trait aggressiveness is related to reduced executive functions. A sample of over 600 students completed an extensive behavioral test battery including paradigms addressing executive functions such as the Eriksen Flanker task, Stroop task, n-back task, and Tower of London (TOL). High-trait-aggressive participants were found to have a significantly reduced latency score in the TOL, indicating more impulsive behavior compared to low-trait-aggressive participants. No other differences were detected. In an EEG study, we assessed neural and behavioral correlates of error monitoring and response inhibition in participants who were characterized based on their laboratory-induced aggressive behavior in a competitive reaction time task. Participants who retaliated more in the aggression paradigm and had reduced frontal activity when being provoked did not, however, show any reduction in behavioral or neural correlates of executive control compared to the less aggressive participants. Our results question a strong relationship between aggression and executive functions, at least for healthy, high-functioning people.

Relevance: 20.00%

Abstract:

Several studies have suggested a bilingual advantage in executive functions, presumably due to bilinguals' massive practice with language switching that requires executive resources, but the results are still somewhat controversial. Previous studies are also plagued by the inherent limitations of a natural groups design, where the participant groups are bound to differ in many ways in addition to the variable used to classify them. In an attempt to introduce a complementary analysis approach, we employed multiple regression to study whether the performance of 30- to 75-year-old Finnish-Swedish bilinguals (N = 38) on tasks measuring different executive functions (inhibition, updating, and set shifting) could be predicted by the frequency of language switches in everyday life (as measured by a language switching questionnaire), by L2 age of acquisition, or by the self-estimated degree of use of both languages in everyday life. The most consistent effects were found for the set shifting task, where a higher rate of everyday language switches was related to a smaller mixing cost in errors. Mixing cost is thought to reflect top-down management of competing task sets, thus resembling the bilingual situation where the decision of which language to use has to be made in each conversation. These findings provide additional support to the idea that some executive functions in bilinguals are affected by a lifelong experience in language switching and, perhaps even more importantly, suggest a complementary approach to the study of this issue.

Relevance: 20.00%

Abstract:

During the last part of the 1990s the chance of surviving breast cancer increased. Changes in survival functions reflect a mixture of effects: both the introduction of adjuvant treatments and early screening with mammography played a role in the decline in mortality. Evaluating the contribution of these interventions using mathematical models requires survival functions before and after their introduction. Furthermore, the required survival functions may differ by age group and are related to disease stage at diagnosis. Sometimes detailed information is not available, as was the case for the region of Catalonia (Spain); then one may derive the functions using information from other geographical areas. This work presents the methodology used to estimate age- and stage-specific Catalan breast cancer survival functions from scarce Catalan survival data by adapting the age- and stage-specific US functions. Methods: Cubic splines were used to smooth data and obtain continuous hazard rate functions. Afterwards, we fitted a Poisson model to derive hazard ratios; the model included time as a covariate. The hazard ratios were then applied to US survival functions detailed by age and stage to obtain the Catalan estimations. Results: We started by estimating the hazard ratios for Catalonia versus the USA before and after the introduction of screening. The hazard ratios were then multiplied by the age- and stage-specific breast cancer hazard rates from the USA to obtain the Catalan hazard rates. We also compared breast cancer survival in Catalonia and the USA in two time periods, before cancer control interventions (USA 1975–79, Catalonia 1980–89) and after (USA and Catalonia 1990–2001). Survival in Catalonia in the 1980–89 period was worse than in the USA during 1975–79, but the differences disappeared in 1990–2001. Conclusion: Our results suggest that access to better treatments and quality of care contributed to large improvements in survival in Catalonia. In addition, we obtained detailed breast cancer survival functions that will be used for modeling the effect of screening and adjuvant treatments in Catalonia.
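A minimal sketch of the adaptation step implied by the Methods, under the proportional-hazards assumption the model entails (all numbers invented for illustration): multiplying the reference hazard by a hazard ratio HR is equivalent to raising the reference survival function to the power HR.

```python
# Minimal sketch (invented numbers): adapting a reference survival curve with a
# hazard ratio. Under proportional hazards, h_target = HR * h_ref, which is
# equivalent to S_target(t) = S_ref(t) ** HR.
import numpy as np

t = np.arange(0, 11)                 # years since diagnosis
h_ref = 0.05                         # constant reference (e.g., US) hazard rate
S_ref = np.exp(-h_ref * t)           # reference survival function

HR = 1.3                             # estimated hazard ratio, target vs. reference
S_target = S_ref ** HR               # adapted (e.g., Catalan) survival function
print("S_ref   :", np.round(S_ref, 3))
print("S_target:", np.round(S_target, 3))
```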