34 results for penalty-based aggregation functions


Relevance:

30.00%

Publisher:

Abstract:

Peatlands are widely exploited archives of paleoenvironmental change. We developed and compared multiple transfer functions to infer peatland depth to the water table (DWT) and pH based on testate amoeba (percentage or presence/absence), bryophyte presence/absence, and vascular plant presence/absence data from sub-alpine peatlands in the SE Swiss Alps, in order to 1) compare the performance of single-proxy vs. multi-proxy models and 2) assess the performance of presence/absence models. Bootstrap cross-validation showed that the best-performing single-proxy transfer functions for both DWT and pH were those based on bryophytes. The best-performing transfer functions overall were, for DWT, those based on combined testate amoeba percentages, bryophytes and vascular plants and, for pH, those based on testate amoebae and bryophytes. DWT and pH inferred from testate amoeba percentage and presence/absence data showed similar general patterns but differed in the magnitude and timing of some shifts. These results open new directions for paleoenvironmental research, 1) suggesting that it is possible to build well-performing transfer functions from presence/absence data, although with some loss of accuracy, and 2) supporting the idea that multi-proxy inference models may improve paleoecological reconstructions. The performance of multi-proxy and single-proxy transfer functions should be compared further on paleoecological data.
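The transfer-function idea can be sketched in a few lines. Below is a minimal weighted-averaging model with bootstrap cross-validation; the study likely used established tools (e.g. weighted averaging or WA-PLS in R), so the function names, the toy data and the simple resampling scheme here are illustrative assumptions only:

```python
import random

def wa_fit(abundances, env):
    """Weighted-averaging optima: one optimum per taxon (column)."""
    n_taxa = len(abundances[0])
    optima = []
    for k in range(n_taxa):
        num = sum(row[k] * x for row, x in zip(abundances, env))
        den = sum(row[k] for row in abundances)
        optima.append(num / den if den > 0 else None)
    return optima

def wa_predict(row, optima):
    """Infer the environmental variable at one site from taxon abundances."""
    num = sum(a * u for a, u in zip(row, optima) if u is not None)
    den = sum(a for a, u in zip(row, optima) if u is not None)
    return num / den

def bootstrap_rmsep(abundances, env, n_boot=200, seed=1):
    """Root-mean-square error of prediction from bootstrap resampling of sites."""
    rng = random.Random(seed)
    n = len(env)
    errors = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]       # bootstrap sample of sites
        oob = [i for i in range(n) if i not in idx]      # out-of-bag sites
        if not oob:
            continue
        optima = wa_fit([abundances[i] for i in idx], [env[i] for i in idx])
        errors.extend((wa_predict(abundances[i], optima) - env[i]) ** 2 for i in oob)
    return (sum(errors) / len(errors)) ** 0.5
```

A presence/absence variant simply replaces each abundance by 0 or 1 before fitting, which is why some accuracy loss is expected: the weighting information is discarded.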

The responses of many real-world problems can only be evaluated subject to noise. To make efficient optimization of such problems possible, intelligent optimization strategies that cope successfully with noisy evaluations are required. This article provides a comprehensive review of existing kriging-based methods for the optimization of noisy functions. In total, ten methods for choosing the sequential samples are described using a unified formalism. They are compared on analytical benchmark problems in which the usual assumption of homoscedastic Gaussian noise made in the underlying models is met. Different problem configurations (noise level, evaluation budget, initial sample size) and model setups (covariance functions) are considered. It is found that the choices of the initial sample size and the covariance function are not critical, whereas the choice of the sampling method can result in significant differences in performance. In particular, the three most intuitive criteria turn out to be poor alternatives. Although no criterion is consistently more efficient than the others, two specialized methods appear more robust on average.
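A kriging model under the homoscedastic Gaussian noise assumption, combined with one simple sequential criterion (a lower confidence bound), can be sketched as follows. The covariance function, hyperparameters and criterion are illustrative assumptions, not the paper's benchmark setup:

```python
import numpy as np

def k_se(a, b, ell=0.3, sigma2=1.0):
    """Squared-exponential covariance between 1-D input arrays."""
    return sigma2 * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

def gp_posterior(X, y, Xs, tau2=0.05):
    """Kriging posterior mean/variance with homoscedastic Gaussian noise tau2."""
    K = k_se(X, X) + tau2 * np.eye(len(X))   # nugget models the evaluation noise
    Ks = k_se(X, Xs)
    mu = Ks.T @ np.linalg.solve(K, y)
    var = k_se(Xs, Xs).diagonal() - np.einsum('ij,ij->j', Ks, np.linalg.solve(K, Ks))
    return mu, np.clip(var, 0.0, None)

def lcb_next(X, y, grid, kappa=2.0, tau2=0.05):
    """Choose the next noisy evaluation by minimizing a lower confidence bound."""
    mu, var = gp_posterior(X, y, grid, tau2)
    return grid[np.argmin(mu - kappa * np.sqrt(var))]
```

With kappa = 0 the criterion greedily trusts the kriging mean; larger kappa rewards exploration, which matters precisely because individual noisy observations cannot be trusted.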

Several strategies relying on kriging have recently been proposed for adaptively estimating contour lines and excursion sets of functions under severely limited evaluation budgets. The recently released R package KrigInv 3 offers a sound implementation of various sampling criteria for such inverse problems. KrigInv is based on the DiceKriging package and thus benefits from a number of options concerning the underlying kriging models. Six implemented sampling criteria are detailed in a tutorial and illustrated with graphical examples, and the different functionalities of KrigInv are explained step by step. Additionally, two recently proposed criteria for batch-sequential inversion are presented, enabling advanced users to distribute function evaluations in parallel on clusters or clouds of machines. Finally, auxiliary problems are discussed, including the fine-tuning of the numerical integration and optimization procedures used in computing and optimizing the considered criteria.
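The flavour of such inversion criteria can be conveyed with a simplified pointwise stand-in: given the kriging mean and standard deviation on a grid, compute the probability of exceeding the target threshold and sample where membership in the excursion set is most ambiguous. KrigInv's actual criteria integrate uncertainty over the whole domain, so the code below is only illustrative:

```python
import math

def excursion_prob(mu, sd, threshold):
    """P(f(x) >= threshold) under a Gaussian kriging posterior at each grid point."""
    return [0.5 * math.erfc((threshold - m) / (s * math.sqrt(2))) if s > 0
            else float(m >= threshold)
            for m, s in zip(mu, sd)]

def next_point(grid, mu, sd, threshold):
    """Pointwise criterion: sample where the excursion probability is closest
    to 1/2, i.e. where set membership is most uncertain."""
    p = excursion_prob(mu, sd, threshold)
    return min(zip(grid, p), key=lambda t: abs(t[1] - 0.5))[0]
```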

Kriging-based optimization relying on noisy evaluations of complex systems has recently motivated contributions from various research communities. Five strategies have been implemented in the DiceOptim package. The corresponding functions constitute a user-friendly tool for solving expensive noisy optimization problems in a sequential framework, while offering some flexibility for advanced users. Moreover, the implementation is done in a unified environment, making the package a useful device for studying the relative performance of existing approaches under different experimental setups. An overview of the package structure and interface is provided, as well as a description of the strategies and some insight into the implementation challenges and the proposed solutions. The strategies are compared with some existing optimization packages on analytical test functions and show promising performance.
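As a sketch of one family of criteria such a package implements, the following computes expected improvement with a plug-in incumbent, a common adaptation for noisy observations (the incumbent is taken as the best kriging mean rather than the best, noise-corrupted, observation). This is not DiceOptim's exact implementation, only a minimal stand-in:

```python
import math

def expected_improvement(mu, sd, best):
    """EI for minimization with a plug-in 'best' value, evaluated pointwise."""
    ei = []
    for m, s in zip(mu, sd):
        if s <= 0:
            ei.append(max(best - m, 0.0))
            continue
        z = (best - m) / s
        pdf = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)
        cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))
        ei.append((best - m) * cdf + s * pdf)
    return ei

def next_evaluation(grid, mu, sd):
    """Next point to evaluate: maximize EI against the best kriging mean,
    which is robust to individual noisy observations."""
    best = min(mu)
    ei = expected_improvement(mu, sd, best)
    return grid[max(range(len(grid)), key=ei.__getitem__)]
```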

BACKGROUND: Moraxella catarrhalis, a major nasopharyngeal pathogen of the human respiratory tract, is exposed to rapid downshifts of environmental temperature when humans breathe cold air. The prevalence of pharyngeal colonization and respiratory tract infections caused by M. catarrhalis is greatest in winter. We investigated how M. catarrhalis uses the physiologic exposure to cold air to regulate pivotal survival systems that may contribute to its virulence. RESULTS: In this study we used RNA-seq to quantitatively catalogue the transcriptome of M. catarrhalis exposed to a 26 °C cold shock or to continuous growth at 37 °C. Validation with quantitative RT-PCR demonstrated the RNA-seq results to be highly reliable. We observed that a 26 °C cold shock induces the expression of genes that in other bacteria have been related to virulence: a strong induction was observed for genes involved in high-affinity phosphate transport and iron acquisition, indicating that M. catarrhalis makes better use of both phosphate and iron resources after exposure to cold shock. We detected the induction of genes involved in nitrogen metabolism, as well as of several outer membrane proteins, including ompA, the M35-like porin and the multidrug efflux pump (acrAB), indicating that M. catarrhalis remodels its membrane components in response to a temperature downshift. Furthermore, we demonstrate that a 26 °C cold shock enhances the induction of genes encoding the type IV pili that are essential for natural transformation, and increases the genetic competence of M. catarrhalis, which may facilitate the rapid spread and acquisition of novel virulence-associated genes. CONCLUSION: Cold shock at the physiologically relevant temperature of 26 °C induces in M. catarrhalis a complex of adaptive mechanisms that could convey novel pathogenic functions and may contribute to enhanced colonization and virulence.
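The quantitative cataloguing step can be illustrated with a toy fold-change computation on library-size-normalized counts. The gene labels and counts below are invented for illustration (real RNA-seq analyses use dedicated statistical models with replicates and dispersion estimation):

```python
import math

def cpm(counts):
    """Counts-per-million library-size normalization for one sample."""
    total = sum(counts.values())
    return {g: c * 1e6 / total for g, c in counts.items()}

def log2_fold_change(cold, warm, pseudo=0.5):
    """Per-gene log2(cold/warm) on normalized counts, with a pseudocount."""
    a, b = cpm(cold), cpm(warm)
    return {g: math.log2((a[g] + pseudo) / (b[g] + pseudo)) for g in a}

def induced(lfc, threshold=1.0):
    """Genes whose expression at least doubles after cold shock (lfc >= 1)."""
    return sorted(g for g, v in lfc.items() if v >= threshold)
```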

In e+e− event-shape studies at LEP, two different measurements were sometimes performed: a “calorimetric” measurement using both charged and neutral particles, and a “track-based” measurement using just charged particles. Whereas calorimetric measurements are infrared and collinear safe, and therefore calculable in perturbative QCD, track-based measurements necessarily depend on nonperturbative hadronization effects. On the other hand, track-based measurements typically have smaller experimental uncertainties. In this paper, we present the first calculation of the event shape “track thrust” and compare it to measurements performed at ALEPH and DELPHI. This calculation is made possible through the recently developed formalism of track functions, which are nonperturbative objects describing how energetic partons fragment into charged hadrons. By incorporating track functions into soft-collinear effective theory, we calculate the distribution of track thrust with next-to-leading logarithmic resummation. Owing to a partial cancellation between nonperturbative parameters, the distributions for calorimeter thrust and track thrust are remarkably similar, a feature also seen in LEP data.
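Thrust is the event shape obtained by maximizing over an axis n̂: T = max over n̂ of Σ|p·n̂| / Σ|p|; the track-based variant simply restricts the sums to charged particles. A minimal two-dimensional sketch (the brute-force axis scan and the 2-D momenta are simplifications for illustration):

```python
import math

def thrust(momenta):
    """Thrust T = max over axes of sum|p.n| / sum|p|, for 2-D momenta,
    via a brute-force scan of candidate axis directions."""
    total = sum(math.hypot(px, py) for px, py in momenta)
    best = 0.0
    for k in range(360):
        a = math.pi * k / 360.0                # half-degree steps over [0, pi)
        s = sum(abs(px * math.cos(a) + py * math.sin(a)) for px, py in momenta)
        best = max(best, s / total)
    return best

def track_thrust(particles):
    """Thrust recomputed from charged particles only, as in a track-based
    measurement; 'particles' are (px, py, charged) tuples."""
    return thrust([(px, py) for px, py, charged in particles if charged])
```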

By using observables that only depend on charged particles (tracks), one can efficiently suppress pileup contamination at the LHC. Such measurements are not infrared safe in perturbation theory, so any calculation of track-based observables must account for hadronization effects. We develop a formalism to perform these calculations in QCD by matching partonic cross sections onto new nonperturbative objects called track functions, which absorb infrared divergences. The track function Ti(x) describes the energy fraction x of a hard parton i that is converted into charged hadrons. We give a field-theoretic definition of the track function and derive its renormalization group evolution, which is in excellent agreement with the Pythia parton shower. We then perform a next-to-leading-order calculation of the total energy fraction of charged particles in e+e− → hadrons. To demonstrate the implications of our framework for the LHC, we match the Pythia parton shower onto a set of track functions to describe the track mass distribution in Higgs plus one jet events. We also show how to reduce smearing due to hadronization fluctuations by measuring dimensionless track-based ratios.
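The role of a track function Ti(x) can be illustrated numerically: each parton's charged-energy fraction is drawn from its track function, and a track-based jet observable combines the draws with energy weights. Everything below (the clipped-Gaussian toy track function, the two-parton jet) is an illustrative assumption, since real track functions are nonperturbative inputs extracted from data or simulation:

```python
import random

def sample_track_fraction(rng, mean=0.6, spread=0.15):
    """Toy track function: charged-energy fraction x in [0, 1] for one parton."""
    x = rng.gauss(mean, spread)
    return min(max(x, 0.0), 1.0)

def jet_charged_fraction(rng, z=0.5):
    """Charged-energy fraction of a two-parton jet: the energy-weighted
    combination of two independent track-function draws."""
    return z * sample_track_fraction(rng) + (1 - z) * sample_track_fraction(rng)

rng = random.Random(42)
draws = [jet_charged_fraction(rng) for _ in range(20000)]
mean = sum(draws) / len(draws)
```

Because the two draws are independent, the combined fraction fluctuates less than a single parton's draw, a toy version of the smearing reduction that motivates dimensionless track-based ratios.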

This work addresses the evolution of an artificial neural network (ANN) to assist in the problem of indoor robotic localization. We investigate the design and building of an autonomous localization system based on information gathered from wireless networks (WN). The article focuses on the evolved ANN, which provides the position of a robot in space as Cartesian coordinates, contributing to the evolutionary robotics research area and showing its practical viability. The proposed system was tested in several experiments, evaluating not only the impact of different evolutionary computation parameters but also the role of the transfer functions in the evolution of the ANN. Results show that slight variations in the parameters lead to significant differences in the evolution process and, therefore, in the accuracy of the robot position.
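The evolutionary setup can be sketched as a (1+λ) evolution strategy over the weight vector of a tiny network, with the transfer function passed in as a swappable parameter. The network size, mutation scale and fitness data below are illustrative assumptions, not the article's configuration:

```python
import math
import random

def net(weights, inputs, transfer):
    """Tiny 3-2-1 feedforward network; 'weights' is a flat list of 11 values."""
    hidden = [transfer(sum(w * x for w, x in zip(weights[3 * i:3 * i + 3], inputs))
                       + weights[6 + i])
              for i in range(2)]
    return weights[8] * hidden[0] + weights[9] * hidden[1] + weights[10]

def evolve(data, transfer, gens=150, lam=10, seed=0):
    """(1+lambda) evolution strategy over the weight vector (no backpropagation).
    Returns the best mean squared error reached."""
    rng = random.Random(seed)
    best = [rng.uniform(-1.0, 1.0) for _ in range(11)]
    mse = lambda w: sum((net(w, x, transfer) - y) ** 2 for x, y in data) / len(data)
    best_err = mse(best)
    for _ in range(gens):
        for _ in range(lam):
            cand = [w + rng.gauss(0.0, 0.2) for w in best]   # Gaussian mutation
            err = mse(cand)
            if err < best_err:
                best, best_err = cand, err
    return best_err
```

Swapping `transfer` (e.g. `math.tanh` vs. a logistic sigmoid) while holding everything else fixed is the kind of comparison the article describes.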

Introduction: Prospective memory (PM), the ability to remember to perform intended activities in the future (Kliegel & Jäger, 2007), is crucial for success in everyday life. PM seems to improve gradually over the childhood years (Zimmermann & Meier, 2006), yet little is known about PM competences in young school children in general, and even less about the factors influencing its development. A number of current studies suggest that executive functions (EF) are potentially influencing processes (Ford, Driscoll, Shum & Macaulay, 2012; Mahy & Moses, 2011). Additionally, metacognitive processes (MC: monitoring and control) are assumed to be involved in optimizing one's performance (Krebs & Roebers, 2010, 2012; Roebers, Schmid, & Roderer, 2009). Yet the relations between PM, EF and MC remain relatively unspecified. We intend to examine the structural relations between these constructs empirically. Method: A cross-sectional study including 119 2nd graders (M_age = 95.03 months, SD_age = 4.82 months) will be presented. Participants (n = 68 girls) completed three EF tasks (Stroop, updating, shifting), a computerised event-based PM task and a MC spelling task. The latent variables PM, EF and MC, represented by manifest variables derived from the administered tasks, were interrelated by structural equation modelling. Results: Analyses revealed clear associations between the three cognitive constructs PM, EF and MC (r_PM-EF = .45, r_PM-MC = .23, r_EF-MC = .20). A three-factor model, as opposed to one- or two-factor models, fitted the data excellently (χ²(17, N = 119) = 18.86, p = .34, RMSEA = .030, CFI = .990, TLI = .978). Discussion: The results indicate that already in young elementary school children, PM, EF and MC are empirically well distinguishable, but nevertheless substantially interrelated. PM and EF seem to share a substantial amount of variance, while for MC more unique processes may be assumed.

Increased renal resistive index (RRI) has been recently associated with target organ damage and cardiovascular or renal outcomes in patients with hypertension and diabetes mellitus. However, reference values in the general population and information on familial aggregation are largely lacking. We determined the distribution of RRI, associated factors, and heritability in a population-based study. Families of European ancestry were randomly selected in 3 Swiss cities. Anthropometric parameters and cardiovascular risk factors were assessed. A renal Doppler ultrasound was performed, and RRI was measured in 3 segmental arteries of both kidneys. We used multilevel linear regression analysis to explore the factors associated with RRI, adjusting for center and family relationships. Sex-specific reference values for RRI were generated according to age. Heritability was estimated by variance components using the ASSOC program (SAGE software). Four hundred women (mean age±SD, 44.9±16.7 years) and 326 men (42.1±16.8 years) with normal renal ultrasound had mean RRI of 0.64±0.05 and 0.62±0.05, respectively (P<0.001). In multivariable analyses, RRI was positively associated with female sex, age, systolic blood pressure, and body mass index. We observed an inverse correlation with diastolic blood pressure and heart rate. Age had a nonlinear association with RRI. We found no independent association of RRI with diabetes mellitus, hypertension treatment, smoking, cholesterol levels, or estimated glomerular filtration rate. The adjusted heritability estimate was 42±8% (P<0.001). In a population-based sample with normal renal ultrasound, RRI normal values depend on sex, age, blood pressure, heart rate, and body mass index. The significant heritability of RRI suggests that genes influence this phenotype.
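Variance-components heritability can be illustrated with a simple one-way (sibling-design) ANOVA sketch. The ASSOC program in SAGE fits far richer models with covariates and mixed relationships, so the equal-family-size stand-in below, with full-sib relatedness assumed, is only illustrative:

```python
def variance_components(families):
    """One-way ANOVA variance components from trait values grouped by family
    (equal family sizes assumed in this sketch)."""
    k = len(families[0])                      # members per family
    n = len(families)                         # number of families
    grand = sum(sum(f) for f in families) / (n * k)
    msb = k * sum((sum(f) / k - grand) ** 2 for f in families) / (n - 1)
    msw = sum(sum((x - sum(f) / k) ** 2 for x in f) for f in families) / (n * (k - 1))
    s2_between = max((msb - msw) / k, 0.0)    # between-family component
    return s2_between, msw

def heritability(families, relatedness=0.5):
    """h2 = sigma2_A / sigma2_P, with sigma2_A approximated as the
    between-family variance divided by the expected relatedness (full sibs: 0.5)."""
    s2b, s2w = variance_components(families)
    total = s2b + s2w
    return min(s2b / relatedness / total, 1.0) if total > 0 else 0.0
```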

This paper addresses the novel notion of offering a radio access network as a service. Its components may be instantiated on general-purpose platforms with pooled resources (both radio and hardware) dimensioned on demand, elastically, and following the pay-per-use principle. A novel architecture is proposed to support this concept. The architecture's strength lies in its modularity, well-defined functional elements and clean separation between operational and control functions. By moving much of the processing traditionally located in dedicated hardware into the cloud, it allows the optimization of hardware utilization and reduces deployment and operation costs. It enables operators to upgrade their network as well as to quickly deploy and adapt resources to demand. Moreover, new players may easily enter the market, permitting a virtual network operator to provide connectivity to its users.

The vestibular system contributes to the control of posture and eye movements and is also involved in various cognitive functions including spatial navigation and memory. These functions are subserved by projections to a vestibular cortex, whose exact location in the human brain is still a matter of debate (Lopez and Blanke, 2011). The vestibular cortex can be defined as the network of all cortical areas receiving inputs from the vestibular system, including areas where vestibular signals influence the processing of other sensory (e.g. somatosensory and visual) and motor signals. Previous neuroimaging studies used caloric vestibular stimulation (CVS), galvanic vestibular stimulation (GVS), and auditory stimulation (clicks and short-tone bursts) to activate the vestibular receptors and localize the vestibular cortex. However, these three methods differ regarding the receptors stimulated (otoliths, semicircular canals) and the concurrent activation of the tactile, thermal, nociceptive and auditory systems. To evaluate the convergence between these methods and provide a statistical analysis of the localization of the human vestibular cortex, we performed an activation likelihood estimation (ALE) meta-analysis of neuroimaging studies using CVS, GVS, and auditory stimuli. We analyzed a total of 352 activation foci reported in 16 studies carried out in a total of 192 healthy participants. The results reveal that the main regions activated by CVS, GVS, or auditory stimuli were located in the Sylvian fissure, insula, retroinsular cortex, fronto-parietal operculum, superior temporal gyrus, and cingulate cortex. Conjunction analysis indicated that regions showing convergence between two stimulation methods were located in the median (short gyrus III) and posterior (long gyrus IV) insula, parietal operculum and retroinsular cortex (Ri). The only area of convergence between all three methods of stimulation was located in Ri.
The data indicate that Ri, parietal operculum and posterior insula are vestibular regions where afferents converge from otoliths and semicircular canals, and may thus be involved in the processing of signals informing about body rotations, translations and tilts. Results from the meta-analysis are in agreement with electrophysiological recordings in monkeys showing main vestibular projections in the transitional zone between Ri, the insular granular field (Ig), and SII.
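The ALE computation itself can be sketched in one dimension: each reported focus contributes a Gaussian "modeled activation" probability, and probabilities are combined as unions within and across studies. The grid, FWHM and foci below are illustrative (real ALE works on 3-D brain coordinates with sample-size-dependent kernels and permutation-based thresholding):

```python
import math

def focus_map(grid, focus, fwhm=10.0):
    """Gaussian 'modeled activation' probability for one focus (1-D sketch)."""
    sigma = fwhm / 2.355                       # FWHM -> standard deviation
    return [math.exp(-0.5 * ((g - focus) / sigma) ** 2) for g in grid]

def study_map(grid, foci, fwhm=10.0):
    """Probabilistic union of the per-focus maps within one study."""
    ma = [0.0] * len(grid)
    for f in foci:
        fm = focus_map(grid, f, fwhm)
        ma = [1 - (1 - a) * (1 - b) for a, b in zip(ma, fm)]
    return ma

def ale_map(grid, studies, fwhm=10.0):
    """ALE values: union across studies of their modeled-activation maps."""
    ale = [0.0] * len(grid)
    for foci in studies:
        ma = study_map(grid, foci, fwhm)
        ale = [1 - (1 - a) * (1 - b) for a, b in zip(ale, ma)]
    return ale
```

Foci that cluster across studies (here, near coordinate 50) reinforce each other through the union, which is exactly how convergence regions such as Ri emerge from scattered per-study coordinates.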

BACKGROUND: Higher visual functions can be defined as cognitive processes responsible for object recognition, color and shape perception, and motion detection. People with impaired higher visual functions after unilateral brain lesion are often tested with paper-and-pencil tests, but such tests do not assess the degree of interaction between the healthy brain hemisphere and the impaired one. Hence, visual functions are not tested separately in the contralesional and ipsilesional visual hemifields. METHODS: A new measurement setup, involving real-time comparisons of the shape and size of objects, the orientation of lines, and the speed and direction of moving patterns in the right or left visual hemifield, has been developed. The setup was implemented in an immersive, hemisphere-shaped environment to take into account the effects of peripheral and central vision, as well as possible visual field losses. Due to the non-flat screen of the hemisphere, a distortion algorithm was needed to adapt the projected images to the surface. Several approaches were studied and, based on a comparison between projected images and original ones, the best one was used for the implementation of the test. Fifty-seven healthy volunteers were then tested in a pilot study. A satisfaction questionnaire was used to assess the usability of the new measurement setup. RESULTS: The results of the distortion algorithm showed a structural similarity between the warped images and the original ones higher than 97%. The results of the pilot study showed an accuracy in comparing images in the two visual hemifields of 0.18 visual degrees and 0.19 visual degrees for size and shape discrimination, respectively, 2.56° for line orientation, 0.33 visual degrees/s for speed perception and 7.41° for recognition of motion direction. The outcome of the satisfaction questionnaire showed a high acceptance of the battery by the participants.
CONCLUSIONS: A new method to measure higher visual functions in an immersive environment was presented. The study focused on the usability of the developed battery rather than the performance at the visual tasks. A battery of five subtasks to study the perception of size, shape, orientation, speed and motion direction was developed. The test setup is now ready to be tested in neurological patients.
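The reported structural similarity can be computed with the standard SSIM formula. The single-window (global) variant below is a simplification of the usual locally windowed SSIM, with the standard stabilizing constants for images scaled to [0, 1] assumed:

```python
import numpy as np

def ssim_global(x, y, c1=1e-4, c2=9e-4):
    """Global (single-window) structural similarity between two images in [0, 1].
    c1, c2 are the usual (0.01*L)^2 and (0.03*L)^2 constants for L = 1."""
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cxy = ((x - mx) * (y - my)).mean()         # covariance between the images
    return ((2 * mx * my + c1) * (2 * cxy + c2)) / \
           ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))
```

A warped-then-compared image pair scoring above 0.97 under such a measure corresponds to the ">97% structural similarity" reported for the distortion algorithm.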

We present a novel approach for the reconstruction of spectra from Euclidean correlator data that makes close contact with modern Bayesian concepts. It is based upon an axiomatically justified, dimensionless prior distribution which, in the case of a constant prior function m(ω), imprints only smoothness on the reconstructed spectrum. In addition, we are able to analytically integrate out the only relevant overall hyperparameter α in the prior, removing the necessity for the Gaussian approximations found, e.g., in the Maximum Entropy Method (MEM). Using a quasi-Newton minimizer and high-precision arithmetic, we are then able to find the unique global extremum of P[ρ|D] in the full Nω ≫ Nτ dimensional search space. The method yields gradually improving reconstruction results as the quality of the supplied input data increases, without introducing the artificial peak structures often encountered in the MEM. To support these statements we present mock data analyses for the case of zero-width delta peaks and for more realistic scenarios, based on the perturbative Euclidean Wilson loop as well as the Wilson line correlator in Coulomb gauge.
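The underlying inverse problem, D(τ) = ∫ dω exp(−ωτ) ρ(ω), can be illustrated with a quadratic-smoothness stand-in for the Bayesian prior. Note the simplifications: the actual method uses a non-Gaussian prior that enforces positivity and integrates out α analytically, whereas the sketch below fixes α, allows negative ρ, and reduces the MAP problem to a single linear solve:

```python
import numpy as np

def reconstruct(D, taus, omegas, alpha=0.1, noise=1e-3):
    """MAP estimate of rho in D(tau_i) = sum_j exp(-omega_j * tau_i) * rho_j * dw,
    regularized by a quadratic smoothness prior on rho (a stand-in for the
    non-Gaussian Bayesian reconstruction prior)."""
    dw = omegas[1] - omegas[0]
    K = np.exp(-np.outer(taus, omegas)) * dw          # discretized kernel
    L = np.diff(np.eye(len(omegas)), n=2, axis=0)     # second-difference operator
    A = K.T @ K / noise ** 2 + alpha * L.T @ L        # data term + smoothness
    b = K.T @ D / noise ** 2
    return np.linalg.solve(A, b)
```

Because the exponential kernel is severely ill-conditioned, the regularizer is what makes the solve meaningful at all; in the full method this role is played by the prior with its hyperparameter integrated out.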

This study investigated the empirical differentiation of prospective memory, executive functions, and metacognition and their structural relationships in 119 elementary school children (M = 95 months, SD = 4.8 months). These cognitive abilities share many characteristics on the theoretical level and are all highly relevant in many everyday contexts when intentions must be executed. Nevertheless, their empirical relationships have not been examined on the latent level, although an empirical approach would contribute to our knowledge concerning the differentiation of cognitive abilities during childhood. We administered a computerized event-based prospective memory task, three executive function tasks (updating, inhibition, shifting), and a metacognitive control task in the context of spelling. Confirmatory factor analysis revealed that the three cognitive abilities are already empirically differentiable in young elementary school children. At the same time, prospective memory and executive functions were found to be strongly related, and there was also a close link between prospective memory and metacognitive control. Furthermore, executive functions and metacognitive control were marginally significantly related. The findings are discussed within a framework of developmental differentiation and conceptual similarities and differences.