817 results for Proxy servers


Relevance: 10.00%

Publisher:

Abstract:

Several clinical studies have reported that EEG synchrony is affected by Alzheimer's disease (AD). This paper presents a frequency-band analysis of AD EEG signals, with the aim of improving EEG-based diagnosis of AD. Multiple synchrony measures are assessed through statistical tests (Mann–Whitney U test), including correlation, phase synchrony, and Granger causality measures. Moreover, linear discriminant analysis (LDA) is conducted with those synchrony measures as features. For the data set at hand, the frequency range 5–6 Hz, which lies within the classical theta band (4–8 Hz), yields the best accuracy for diagnosing AD. The corresponding classification error is 4.88% for the directed transfer function (DTF) Granger causality measure. Interestingly, the results show that the EEG of AD patients is more synchronous than that of healthy subjects within the optimized 5–6 Hz range, in sharp contrast with the loss of synchrony in AD EEG reported in many earlier studies. This new finding may provide new insights into the neurophysiology of AD. Additional testing on larger AD data sets is required to verify the effectiveness of the proposed approach.
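The statistical step this abstract describes — comparing a synchrony feature between AD patients and controls with a Mann–Whitney U test — can be sketched in a few lines of pure Python. The synchrony values below are hypothetical, invented for illustration; they are not from the study:

```python
from itertools import product

def mann_whitney_u(xs, ys):
    """Mann-Whitney U statistic: over all (x, y) pairs, count how
    often x exceeds y (ties count 0.5). Values near len(xs)*len(ys)
    or near 0 suggest the two groups differ in location."""
    return sum(1.0 if x > y else 0.5 if x == y else 0.0
               for x, y in product(xs, ys))

# Hypothetical 5-6 Hz synchrony values (not from the study)
ad      = [0.62, 0.71, 0.58, 0.74, 0.69]  # AD patients
control = [0.55, 0.49, 0.60, 0.52, 0.61]  # age-matched controls

u_ad = mann_whitney_u(ad, control)
print(u_ad)  # 23.0: most of the 25 pairs favour the AD group
```

In practice one would convert U to a p-value (e.g. via a normal approximation or an exact table) before calling the difference significant.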

Relevance: 10.00%

Publisher:

Abstract:

Electroencephalographic (EEG) recordings are more often than not corrupted by spurious artifacts, which should be rejected or cleaned by the practitioner. As human screening of scalp EEG is error-prone, automatic artifact detection is an issue of capital importance for ensuring objective and reliable results. In this paper we propose a new approach for discriminating muscular activity in human scalp quantitative EEG (QEEG), based on time-frequency shape analysis. The impact of muscular activity on the EEG can be evaluated with this methodology. We present an application of this scoring as a preprocessing step for EEG signal analysis, in order to evaluate the amount of muscular activity in two sets of EEG recordings: dementia patients with early-stage Alzheimer's disease and age-matched control subjects.

Relevance: 10.00%

Publisher:

Abstract:

As wireless local area networks rapidly become the technology of choice for large networks, enforcing acceptable-use policies becomes necessary. This thesis describes how users can be compelled to follow the usage policies of public WLAN networks. The problems addressed concern methods for exposing rogue DHCP servers, and for exposing users with self-assigned IP addresses, i.e., addresses not granted by the official DHCP server. For each method, we consider how such users can be prevented from violating the usage policies. In addition, the use of centralized data collection for carrying out the described tasks is discussed. The presented solutions were designed specifically for a test network, but the general ideas are applicable in any wireless network.

Relevance: 10.00%

Publisher:

Abstract:

Past temperature variations are usually inferred from proxy data or estimated using general circulation models. Comparisons between climate estimates derived from proxy records and from model simulations help to better understand the mechanisms driving climate variations, and also offer the possibility of identifying deficiencies in both approaches. This paper presents regional temperature reconstructions based on tree-ring maximum density series in the Pyrenees and compares them with the output of global simulations for this region and with regional climate model simulations conducted for the target region. An ensemble of 24 reconstructions of May-to-September regional mean temperature was derived from 22 maximum density tree-ring site chronologies distributed over the larger Pyrenees area. Four different tree-ring series standardization procedures were applied, combining two detrending methods: a 300-yr spline and regional curve standardization (RCS). Additionally, different methodological variants of the regional chronology were generated by using three different aggregation methods. Calibration-verification trials were performed in split periods and using two methods: regression and simple variance matching. The resulting set of temperature reconstructions was compared with climate simulations performed with global (ECHO-G) and regional (MM5) climate models. The 24 variants of May-to-September temperature reconstructions reveal a generally coherent pattern of inter-annual to multi-centennial temperature variations in the Pyrenees region for the last 750 yr. However, some reconstructions display a marked positive trend for the entire length of the reconstruction, indicating that the application of the RCS method to a suboptimal set of samples may lead to unreliable results.
Climate model simulations agree with the tree-ring based reconstructions at multi-decadal time scales, suggesting solar variability and volcanism as the main factors controlling preindustrial mean temperature variations in the Pyrenees. Nevertheless, the comparison also highlights differences from the reconstructions, mainly in the amplitude of past temperature variations and in the 20th-century trends. Neither proxy-based reconstructions nor model simulations are able to perfectly track the temperature variations of the instrumental record, suggesting that both approaches still need further improvement.
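The "simple variance matching" calibration mentioned in the abstract has a compact closed form: rescale the proxy so that its mean and standard deviation match the instrumental series over the calibration period. A minimal sketch with invented tree-ring indices and temperatures (not the Pyrenees data):

```python
import statistics as st

def variance_match(proxy_cal, instr_cal, proxy_full):
    """Rescale a proxy series so that, over the calibration period,
    its mean and standard deviation equal those of the instrumental
    series; then apply the same linear transform to the full proxy."""
    a = st.pstdev(instr_cal) / st.pstdev(proxy_cal)  # slope
    b = st.mean(instr_cal) - a * st.mean(proxy_cal)  # intercept
    return [a * x + b for x in proxy_full]

# Invented standardized tree-ring indices and temperatures (degC)
proxy_cal = [-1.0, 0.0, 1.0, 2.0]
instr_cal = [12.0, 13.0, 14.0, 15.0]
recon = variance_match(proxy_cal, instr_cal, [-2.0, -1.0, 0.5])
print(recon)  # [11.0, 12.0, 13.5]
```

Unlike regression, variance matching preserves the full variance of the proxy, which is one reason the two calibration methods can yield reconstructions of different amplitude.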

Relevance: 10.00%

Publisher:

Abstract:

Our consumption of groundwater, in particular as drinking water and for irrigation, has considerably increased over the years, and groundwater is becoming an increasingly scarce and endangered resource. Nowadays, we are facing many problems ranging from water prospection to sustainable management and remediation of polluted aquifers. Independently of the hydrogeological problem, the main challenge remains dealing with the incomplete knowledge of the underground properties. Stochastic approaches have been developed to represent this uncertainty by considering multiple geological scenarios and generating a large number of realizations. The main limitation of this approach is the computational cost associated with performing complex flow simulations in each realization. In the first part of the thesis, we explore this issue in the context of uncertainty propagation, where an ensemble of geostatistical realizations is identified as representative of the subsurface uncertainty. To propagate this lack of knowledge to the quantity of interest (e.g., the concentration of pollutant in extracted water), it is necessary to evaluate the flow response of each realization.
Due to computational constraints, state-of-the-art methods make use of approximate flow simulation to identify a subset of realizations that represents the variability of the ensemble. The complex and computationally heavy flow model is then run for this subset, based on which inference is made. Our objective is to increase the performance of this approach by using all of the available information and not solely the subset of exact responses. Two error models are proposed to correct the approximate responses following a machine-learning approach. For the subset identified by a classical approach (here the distance kernel method), both the approximate and the exact responses are known. This information is used to construct an error model and correct the ensemble of approximate responses to predict the "expected" responses of the exact model. The proposed methodology makes use of all the available information without perceptible additional computational costs and leads to an increase in accuracy and robustness of the uncertainty propagation. The strategy explored in the first chapter consists in learning, from a subset of realizations, the relationship between proxy and exact curves. In the second part of this thesis, the strategy is formalized in a rigorous mathematical framework by defining a regression model between functions. As this problem is ill-posed, it is necessary to reduce its dimensionality. The novelty of the work comes from the use of functional principal component analysis (FPCA), which not only performs the dimensionality reduction while maximizing the retained information, but also allows a diagnostic of the quality of the error model in the functional space. The proposed methodology is applied to a pollution problem by a non-aqueous phase liquid. The error model allows a strong reduction of the computational cost while providing a good estimate of the uncertainty.
The individual correction of the proxy response by the error model leads to an excellent prediction of the exact response, opening the door to many applications. The concept of a functional error model is useful not only in the context of uncertainty propagation, but also, and maybe even more so, to perform Bayesian inference. Markov chain Monte Carlo (MCMC) algorithms are the most common choice to ensure that the generated realizations are sampled in accordance with the observations. However, this approach suffers from a low acceptance rate in high-dimensional problems, resulting in a large number of wasted flow simulations. This led to the introduction of two-stage MCMC, where the computational cost is decreased by avoiding unnecessary simulations of the exact flow model thanks to a preliminary evaluation of the proposal. In the third part of the thesis, a proxy is coupled to an error model to provide an approximate response for the two-stage MCMC set-up. We demonstrate an increase in acceptance rate by a factor of 1.5 to 3 with respect to one-stage MCMC results. An open question remains: how do we choose the size of the learning set and identify the realizations that optimize the construction of the error model? This requires devising an iterative strategy to construct the error model, such that, as new flow simulations are performed, the error model is iteratively improved by incorporating the new information. This is discussed in the fourth part of the thesis, in which we apply this methodology to a problem of saline intrusion in a coastal aquifer.
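The two-stage MCMC scheme described in the third part — screening each proposal with a cheap proxy before running the exact model — can be illustrated on a toy one-dimensional posterior. Everything here (the Gaussian target, the biased proxy, the step size) is invented for illustration; this is a sketch of the general algorithm, not the thesis code:

```python
import math, random

def two_stage_mcmc(log_post_exact, log_post_proxy, x0, n_steps,
                   step=0.5, seed=1):
    """Two-stage Metropolis with a symmetric random-walk proposal:
    stage 1 accepts/rejects on the cheap proxy posterior; only the
    survivors trigger an expensive exact-posterior evaluation in
    stage 2. Returns the chain and the number of exact evaluations."""
    rng = random.Random(seed)
    x, chain, n_exact = x0, [x0], 1
    lp_exact, lp_proxy = log_post_exact(x), log_post_proxy(x)
    for _ in range(n_steps):
        y = x + rng.gauss(0.0, step)
        lq_proxy = log_post_proxy(y)
        if math.log(rng.random()) < lq_proxy - lp_proxy:   # stage 1 (cheap)
            lq_exact = log_post_exact(y)                   # stage 2 (expensive)
            n_exact += 1
            if math.log(rng.random()) < (lq_exact - lp_exact) - (lq_proxy - lp_proxy):
                x, lp_exact, lp_proxy = y, lq_exact, lq_proxy
        chain.append(x)
    return chain, n_exact

exact = lambda x: -0.5 * x * x              # toy "exact" posterior
proxy = lambda x: -0.5 * (x - 0.2) ** 2     # cheap, slightly biased proxy
chain, n_exact = two_stage_mcmc(exact, proxy, 0.0, 2000)
print(n_exact, "exact evaluations for", len(chain) - 1, "proposals")
```

Proposals rejected in stage 1 never cost an exact simulation, which is where the savings come from; the stage-2 correction keeps the chain targeting the exact posterior despite the proxy's bias.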

Relevance: 10.00%

Publisher:

Abstract:

The oxidative potential (OP) of particulate matter has been proposed as a toxicologically relevant metric. This concept is already frequently used for hazard characterization of ambient particles, but it is still seldom applied in the occupational field. The objective of this study was to assess the OP in two different types of workplaces and to investigate the relationship between the OP and the physicochemical characteristics of the collected particles. At a toll station at the entrance of a tunnel ('Tunnel' site) and at three different mechanical yards ('Depot' sites), we assessed particle mass (PM4 and PM2.5 and size distribution), number and surface area, organic and elemental carbon, polycyclic aromatic hydrocarbons (PAH), and four quinones, as well as iron and copper concentrations. The OP was determined directly on filters, without extraction, by using the dithiothreitol assay (DTT assay, OP(DTT)). The average mass concentration of respirable particles (PM4) at the Tunnel site was about twice that at the Depot sites (173±103 and 90±36 µg m^-3, respectively), whereas the OP(DTT) was practically identical across sites (10.6±7.2 pmol DTT min^-1 µg^-1 at the Tunnel site; 10.4±4.6 pmol DTT min^-1 µg^-1 at the Depot sites). The OP(DTT) of PM4 was mostly carried by the smallest, PM2.5, fraction (OP(DTT) PM2.5: 10.2±8.1 pmol DTT min^-1 µg^-1; OP(DTT) PM4: 10.5±5.8 pmol DTT min^-1 µg^-1 across all sites), suggesting the presence of redox-inactive components in the PM2.5-4 fraction. Although the reactivity was similar at the Tunnel and Depot sites irrespective of the metric chosen (OP(DTT) µg^-1 or OP(DTT) m^-3), the chemicals associated with OP(DTT) differed between the two types of workplaces. Organic carbon, quinones, and/or metal content (Fe, Cu) were strongly associated with the DTT reactivity at the Tunnel site, whereas only Fe and PAH were associated (positively and negatively, respectively) with this reactivity at the Depot sites.
These results demonstrate the feasibility of measuring the OP(DTT) in occupational environments and suggest that the particulate OP(DTT) integrates different physicochemical properties. This parameter could be a useful exposure proxy for investigating particle exposure-related oxidative stress and its consequences. Further research is needed, mostly to demonstrate the association of OP(DTT) with relevant oxidative endpoints in humans exposed to particles.

Relevance: 10.00%

Publisher:

Abstract:

Stable isotope abundances of carbon (δ13C) and nitrogen (δ15N) in the bone of 13 species of marine mammals from the northwest coast of Africa were investigated to assess their positions in the local trophic web and their preferred habitats. Also, samples of primary producers and potential prey species from the study area were collected to characterise the local isotopic landscape. This characterisation indicated that δ13C values increased from offshore to nearshore and that δ15N was a good proxy for trophic level. Therefore, the most coastal species were Monachus monachus and Sousa teuszii, whereas the most pelagic were Physeter macrocephalus and Balaenoptera acutorostrata. δ15N values indicated that marine mammals located at the lowest trophic level were B. acutorostrata, Stenella coeruleoalba and Delphinus sp., and those occupying the highest trophic level were M. monachus and P. macrocephalus. The trophic level of Orcinus orca was similar to that of M. monachus, suggesting that O. orca preys on fish. Conservation of coastal and threatened species (M. monachus and S. teuszii) off NW Africa should be a priority because these species, as the main apex predators, cannot be replaced by other marine mammals.
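Using δ15N as a trophic-level proxy, as this abstract does, usually relies on the standard enrichment formula TL = TL_base + (δ15N_consumer − δ15N_base) / Δ, with Δ ≈ 3.4‰ per trophic step. A minimal sketch with hypothetical isotope values (the study's actual baseline and enrichment factor may differ):

```python
def trophic_level(d15n_consumer, d15n_base, base_tl=2.0, enrichment=3.4):
    """Standard d15N trophic-level estimate: one trophic step per
    ~3.4 permil enrichment above a baseline consumer at TL 2."""
    return base_tl + (d15n_consumer - d15n_base) / enrichment

# Hypothetical bone-collagen values, baseline prey at TL 2
print(round(trophic_level(16.2, 9.4), 2))  # 4.0
```

A consumer 6.8‰ above the baseline thus sits two trophic steps higher, which is the sense in which δ15N "is a good proxy for trophic level".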

Relevance: 10.00%

Publisher:

Abstract:

The analysis of the activity of neuronal cultures is considered to be a good proxy of the functional connectivity of in vivo neuronal tissues. Thus, the functional complex network inferred from activity patterns is a promising way to unravel the interplay between the structure and functionality of neuronal systems. Here, we monitor the spontaneous self-sustained dynamics in neuronal cultures formed by interconnected aggregates of neurons (clusters). The dynamics is characterized by the fast activation of groups of clusters in sequences termed bursts. The analysis of the time delays between cluster activations within the bursts allows the reconstruction of the directed functional connectivity of the network. We propose a method to statistically infer this connectivity and analyze the resulting properties of the associated complex networks. Surprisingly, in contrast to what has been reported for many biological networks, the clustered neuronal cultures present assortative mixing connectivity values, meaning that clusters prefer to link to other clusters that share similar functional connectivity, as well as a rich-club core, which shapes a "connectivity backbone" in the network. These results point out that the grouping of neurons and the assortative connectivity between clusters are intrinsic survival mechanisms of the culture.
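Assortative mixing, as reported above, is usually quantified by the degree-assortativity coefficient: the Pearson correlation between the degrees at the two ends of each edge. A minimal pure-Python sketch on toy graphs (not the cultures' actual networks):

```python
def degree_assortativity(edges):
    """Degree assortativity r: Pearson correlation between the degrees
    at the two ends of every edge (each undirected edge counted in
    both directions). r > 0 means similar-degree nodes tend to link."""
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    pairs = [(deg[u], deg[v]) for u, v in edges]
    pairs += [(y, x) for x, y in pairs]
    xs, ys = zip(*pairs)
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# A hub-and-spoke (star) graph is maximally disassortative;
# the clustered cultures in the study showed the opposite sign.
star = [(0, 1), (0, 2), (0, 3), (0, 4)]
print(degree_assortativity(star))  # -1.0
```

On real networks one would typically use a library implementation (e.g. networkx) that also handles directed variants, but the quantity being computed is the same.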

Relevance: 10.00%

Publisher:

Abstract:

BACKGROUND: Smoking is an important cardiovascular disease risk factor, but the mechanisms linking smoking to blood pressure are poorly understood. METHODS AND RESULTS: Data on 141 317 participants (62 666 never, 40 669 former, 37 982 current smokers) from 23 population-based studies were included in observational and Mendelian randomization meta-analyses of the associations of smoking status and smoking heaviness with systolic and diastolic blood pressure, hypertension, and resting heart rate. For the Mendelian randomization analyses, the genetic variant rs16969968/rs1051730 was used as a proxy for smoking heaviness in current smokers. In observational analyses, current as compared with never smoking was associated with lower systolic and diastolic blood pressure and lower hypertension risk, but with a higher resting heart rate. In observational analyses among current smokers, each additional cigarette/day of smoking heaviness was associated with a higher resting heart rate (0.21 bpm; 95% confidence interval 0.19-0.24) and slightly higher diastolic blood pressure (0.05 mm Hg; 95% confidence interval 0.02-0.08) and systolic blood pressure (0.08 mm Hg; 95% confidence interval 0.03-0.13). However, in Mendelian randomization analyses among current smokers, although each smoking-increasing allele of rs16969968/rs1051730 was associated with a higher resting heart rate (0.36 bpm/allele; 95% confidence interval 0.18-0.54), there was no strong association with diastolic blood pressure, systolic blood pressure, or hypertension. This would suggest a 7 bpm higher heart rate in those who smoke 20 cigarettes/day. CONCLUSIONS: This Mendelian randomization meta-analysis supports a causal association of smoking heaviness with a higher resting heart rate, but not with blood pressure. These findings suggest that part of the cardiovascular risk of smoking may operate through an increased resting heart rate.
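The abstract's closing arithmetic (0.36 bpm per allele scaling to roughly 7 bpm at 20 cigarettes/day) follows single-instrument Mendelian-randomization logic via the Wald ratio. In the sketch below, the assumption that one allele corresponds to about one cigarette/day of smoking heaviness is mine, inferred from that arithmetic; it is not stated in the abstract:

```python
def wald_ratio(beta_outcome, beta_exposure):
    """Single-instrument MR estimate: per-allele effect on the outcome
    divided by per-allele effect on the exposure."""
    return beta_outcome / beta_exposure

bpm_per_allele = 0.36    # from the abstract
cigs_per_allele = 1.0    # assumed scaling (not stated in the abstract)
bpm_per_cig = wald_ratio(bpm_per_allele, cigs_per_allele)
print(round(20 * bpm_per_cig, 1))  # 7.2 bpm at 20 cigarettes/day
```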

Relevance: 10.00%

Publisher:

Abstract:

We tested whether stereotypical situations would affect low-status group members' performance more strongly than high-status group members'. Experiments 1 and 2 tested this hypothesis using gender as a proxy of chronic social status and a gender-neutral task that has been randomly presented to favor boys (men-superiority condition), favor girls (women-superiority condition), or show no gender preference (control condition). Both experiments found that women's (Experiment 1) and girls' (Experiment 2) performance suffered more from the evoked stereotypes than did men's and boys'. This result was replicated in Experiment 3, indicating that short men (low-status group) were more affected than tall men (high-status group). Additionally, men were more affected than women when they perceived height as a threat. Hence, individuals are more or less vulnerable to identity threats as a function of the chronic social status at play; enjoying a high status provides protection, while endorsing a low one weakens individual performance in stereotypical situations.

Relevance: 10.00%

Publisher:

Abstract:

Cluster formed by a main HEAD node plus 19 compute nodes of the SGI Altix XE Servers and Clusters range, connected in a master-subordinate topology, with a total of 40 Dual Core processors and approximately 160 GB of RAM.

Relevance: 10.00%

Publisher:

Abstract:

This paper provides further insights into the dynamics of exports and outward foreign direct investment (FDI) flows in Spain from a time-series approach. The contribution of the paper is twofold. First, the existence of either a substitution or a complementarity relationship between Spanish outward investments and exports is empirically tested using a multivariate cointegrated model (VECM). The evolution in exchange flows (1993-2008) and country-specific variables (such as world demand - including Spain's main recently growing foreign markets - for trade flows, and the relative price of exports as a proxy for new global competitors) are taken into account for the first time. Second, the growth in the trade of services in recent decades leads us to test a specific causality relationship by disaggregating between goods and services flows. Our results provide evidence of a positive (Granger) causality relationship running from FDI to exports of goods (stronger) and to exports of services (weaker) in the long run, a complementarity relation consistent with vertical FDI strategies. In the short run, however, only exports of goods are affected (positively) by FDI.

Relevance: 10.00%

Publisher:

Abstract:

GDP has usually been used as a proxy for human well-being. Nevertheless, other social aspects should also be considered, such as life expectancy, infant mortality, educational enrolment and crime issues. With this paper we investigate not only economic convergence but also social convergence between regions in a developing country, Colombia, in the period 1975-2005. We consider several techniques in our analysis: sigma convergence, stochastic kernel estimations, and also several empirical models to find out the beta convergence parameter (cross section and panel estimates, with and without spatial dependence). The main results confirm that we can talk about convergence in Colombia in key social variables, although not in the classic economic variable, GDP per capita. We have also found that spatial autocorrelation reinforces convergence processes through deepening market and social factors, while isolation condemns regions to nonconvergence.
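Sigma convergence, the first technique this abstract lists, simply tracks the cross-regional dispersion of an indicator over time: a falling dispersion means the regions are becoming more alike. A minimal sketch with invented regional values (not the Colombian data):

```python
import statistics as st

def sigma_convergence(panel):
    """Sigma convergence: the cross-regional coefficient of variation
    of an indicator, computed year by year. A decreasing series
    indicates convergence across regions."""
    return [st.pstdev(regions) / st.mean(regions) for regions in panel]

# Hypothetical regional life-expectancy values for three years
panel = [
    [60.0, 70.0, 80.0],   # 1975
    [65.0, 71.0, 79.0],   # 1990
    [70.0, 72.0, 76.0],   # 2005
]
cv = sigma_convergence(panel)
print([round(c, 3) for c in cv])  # dispersion falls: sigma convergence
```

Beta convergence, by contrast, is estimated by regressing growth on initial levels; the two notions can disagree, which is why the paper reports both.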

Relevance: 10.00%

Publisher:

Abstract:

Different types of aerosolization and deagglomeration testing systems exist for studying the properties of nanomaterial powders and their aerosols. However, results are dependent on the specific methods used. In order to have well-characterized aerosols, we require a better understanding of how system parameters and testing conditions influence the properties of the aerosols generated. In the present study, four experimental setups delivering different aerosolization energies were used to test the resultant aerosols of two distinct nanomaterials (hydrophobic and hydrophilic TiO2). The reproducibility of results within each system was good. However, the number concentrations and size distributions of the aerosols created varied across the four systems; for number concentrations, e.g., from 10^3 to 10^6 #/cm^3. Moreover, distinct differences were also observed between the two materials with different surface coatings. The article discusses how system characteristics and other pertinent conditions modify the test results. We propose using air velocity as a suitable proxy for estimating energy input levels in aerosolization systems. The information derived from this work will be especially useful for establishing standard operating procedures for testing nanopowders, as well as for estimating their release rates under different energy input conditions, which is relevant for occupational exposure.

Relevance: 10.00%

Publisher:

Abstract:

Background: Huntington's disease (HD) is an inherited neurodegenerative disorder triggered by an expanded polyglutamine tract in huntingtin that is thought to confer a new conformational property on this large protein. The propensity of small amino-terminal fragments with mutant, but not wild-type, glutamine tracts to self-aggregate is consistent with an altered conformation, but such fragments occur relatively late in the disease process in human patients and in mouse models expressing the full-length mutant protein. This suggests that the altered conformational property may act within the full-length mutant huntingtin to initially trigger pathogenesis. Indeed, genotype-phenotype studies in HD have defined genetic criteria for the disease-initiating mechanism, and these are all fulfilled by phenotypes associated with expression of full-length mutant huntingtin, but not the amino-terminal fragment, in mouse models. As the in vitro aggregation of the amino-terminal mutant huntingtin fragment offers a ready assay to identify small compounds that interfere with the conformation of the polyglutamine tract, we have identified a number of aggregation inhibitors and tested whether these are also capable of reversing a phenotype caused by endogenous expression of mutant huntingtin in a striatal cell line from the HdhQ111/Q111 knock-in mouse. Results: We screened the NINDS Custom Collection of 1,040 FDA-approved drugs and bioactive compounds for their ability to prevent in vitro aggregation of the Q58-htn 1-171 amino-terminal fragment. Ten compounds were identified that inhibited aggregation with IC50 < 15 µM, including gossypol, gambogic acid, juglone, celastrol, sanguinarine, and anthralin. Of these, both juglone and celastrol were effective in reversing the abnormal cellular localization of full-length mutant huntingtin observed in mutant HdhQ111/Q111 striatal cells.
Conclusions: At least some compounds identified as aggregation inhibitors also prevent a neuronal cellular phenotype caused by full-length mutant huntingtin, suggesting that in vitro fragment aggregation can act as a proxy for monitoring the disease-producing conformational property in HD. Thus, identification and testing of compounds that alter in vitro aggregation is a viable approach for defining potential therapeutic compounds that may act on the deleterious conformational property of full-length mutant huntingtin.