74 results for multiple-input single-output FRF
at Université de Lausanne, Switzerland
Abstract:
The sparsely spaced, highly permeable fractures of the granitic rock aquifer at Stang-er-Brune (Brittany, France) form a well-connected fracture network of high permeability but unknown geometry. Previous work based on optical and acoustic logging together with single-hole and cross-hole flowmeter data acquired in three neighbouring boreholes (70-100 m deep) has identified the most important permeable fractures crossing the boreholes and their hydraulic connections. To constrain possible flow paths by estimating the geometries of known and previously unknown fractures, we have acquired, processed and interpreted multifold single- and cross-hole GPR data using 100 and 250 MHz antennas. The GPR data processing scheme, consisting of time-zero corrections, scaling, bandpass filtering, F-X deconvolution, eigenvector filtering, muting, pre-stack Kirchhoff depth migration and stacking, was used to differentiate fluid-filled fracture reflections from source-generated noise. The final stacked and pre-stack depth-migrated GPR sections provide high-resolution images of individual fractures (dipping 30-90°) in the surroundings (2-20 m for the 100 MHz antennas; 2-12 m for the 250 MHz antennas) of each borehole in a 2D plane projection that are of superior quality to those obtained from single-offset sections. Most fractures previously identified from hydraulic testing can be correlated with reflections in the single-hole data. Several previously unknown major near-vertical fractures have also been identified away from the boreholes.
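As an illustration of one step in such a processing chain, the sketch below applies a generic zero-phase Butterworth bandpass filter to a synthetic trace; the filter design, sampling rate and corner frequencies are illustrative assumptions, not the parameters actually used in the study.

```python
# Hedged sketch of a single processing step (bandpass filtering); all
# frequencies below are illustrative, not the study's actual parameters.
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass(trace, low_hz, high_hz, fs_hz, order=4):
    """Zero-phase Butterworth bandpass of a single trace."""
    nyq = fs_hz / 2.0
    b, a = butter(order, [low_hz / nyq, high_hz / nyq], btype="band")
    return filtfilt(b, a, trace)  # forward-backward pass: no phase distortion

# Illustrative use: keep a 500 Hz component, suppress a 50 Hz component
fs = 5000.0
t = np.arange(0, 1, 1 / fs)
trace = np.sin(2 * np.pi * 50 * t) + np.sin(2 * np.pi * 500 * t)
filtered = bandpass(trace, 200.0, 800.0, fs)
```

In a real GPR workflow this step would sit between scaling and F-X deconvolution, applied trace by trace to the radargram.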
Abstract:
Problem solving (including insight and divergent thinking) seems to rely on the right hemisphere (RH). These functions are difficult to assess behaviorally. We propose anagram resolution as a suitable paradigm. University students (n = 32) performed three tachistoscopic lateralized visual half-field experiments (stimulus presentation 150 ms). In Experiment 1, participants recalled four-letter strings. Subsequently, participants provided solutions for four-letter anagrams (one solution in Experiment 2; two solutions in Experiment 3). Additionally, participants completed a schizotypy questionnaire (O-LIFE). Results showed a right visual field advantage in Experiments 1 and 2, but no visual field advantage in Experiment 3. In Experiment 1, increasing positive schizotypy was associated with a RH performance shift. Problem solving seems to rely increasingly on the RH when facing several solutions rather than one. This result supports previous studies on the RH's role in remote associative, metaphor and discourse processing. The more complex the language requirements, the less personality traits seem to matter.
Abstract:
Measuring school efficiency is a challenging task. First, a performance measurement technique has to be selected. Within Data Envelopment Analysis (DEA), one such technique, alternative models have been developed in order to deal with environmental variables. The majority of these models lead to diverging results. Second, the choice of input and output variables to be included in the efficiency analysis is often dictated by data availability. The choice of the variables remains an issue even when data is available. As a result, the choice of technique, model and variables is probably, and ultimately, a political judgement. Multi-criteria decision analysis methods can help decision makers to select the most suitable model. The number of selection criteria should remain parsimonious and not be oriented towards the results of the models, in order to avoid opportunistic behaviour. The selection criteria should also be backed by the literature or by an expert group. Once the most suitable model is identified, the principle of permanence of methods should be applied in order to avoid a change of practices over time. Within DEA, the two-stage model developed by Ray (1991) is the most convincing model that allows for an environmental adjustment. In this model, an efficiency analysis is conducted with DEA, followed by an econometric analysis to explain the efficiency scores. An environmental variable of particular interest, tested in this thesis, is the fact that certain schools operate on multiple sites. Results show that being located on more than one site has a negative influence on efficiency. A likely way to mitigate this negative influence would be to improve the use of ICT in school management and teaching. The planning of new schools should also consider the advantages of a single site, which allows a critical size in terms of pupils and teachers to be reached.
The fact that underprivileged pupils perform worse than privileged pupils has been public knowledge since Coleman et al. (1966). As a result, underprivileged pupils have a negative influence on school efficiency. This is confirmed by this thesis for the first time in Switzerland. Several countries have developed priority education policies in order to compensate for the negative impact of disadvantaged socioeconomic status on school performance. These policies have failed. As a result, other actions need to be taken. In order to define these actions, one has to identify the social-class differences that explain why disadvantaged children underperform. Child-rearing and literacy practices, health characteristics, housing stability and economic security all influence pupil achievement. Rather than allocating more resources to schools, policymakers should therefore focus on related social policies. For instance, they could define pre-school, family, health, housing and benefits policies in order to improve the conditions for disadvantaged children.
Abstract:
Brain metastases (BM) occur in 20-50% of NSCLC and 50-80% of SCLC cases. In this review, we will look at evidence-based medicine data and give some perspectives on the management of BM. We will address the problems of multiple BM, single BM and prophylactic cranial irradiation. Recursive Partitioning Analysis (RPA) is a powerful prognostic tool to facilitate treatment decisions. Dealing with multiple BM, the use of corticosteroids was established more than 40 years ago by a unique randomized controlled trial (RCT). The palliative effect is high (~80%), as are the side-effects. Whole brain radiotherapy (WBRT) was evaluated in many RCTs with a high (60-90%) response rate; several RT regimens are equivalent, but very high doses per fraction should be avoided. In multiple BM from SCLC, the effect of WBRT is comparable to that in NSCLC, but chemotherapy (CXT), although advocated, is probably less effective than RT. Single BM from NSCLC occurs in 30% of all BM cases; several prognostic classifications, including RPA, are very useful. Several options are available in single BM: WBRT, surgery (SX), radiosurgery (RS) or any combination of these. All were studied in RCTs and will be reviewed: the addition of WBRT to SX or RS gives better neurological tumour control, has little or no impact on survival, and may be more toxic. However, omitting WBRT after SX alone gives a higher risk of cerebro-spinal fluid dissemination. Prophylactic cranial irradiation (PCI) has a major role in SCLC. In limited disease, meta-analyses have shown a positive impact of PCI on the decrease of brain relapse and on survival improvement, especially for patients in complete remission. Surprisingly, this has recently been confirmed in extensive disease as well. Experience with PCI for NSCLC is still limited, but RCTs suggest a reduction of BM with no impact on survival. Toxicity of PCI is a matter of debate, as neurological or neuro-cognitive impairment is already present prior to PCI in almost half of patients.
However, RT toxicity is probably related to total dose and dose per fraction. Perspectives: future research should concentrate on (1) combined modalities in multiple BM; (2) exploration of treatments in oligo-metastases; (3) further exploration of PCI in NSCLC; and (4) exploration of new, toxicity-sparing radiotherapy techniques (IMRT, tomotherapy, etc.).
Abstract:
It is now widely accepted that adult neurogenesis plays a fundamental role in hippocampal function. Neurons born in the adult dentate gyrus of the hippocampus undergo a series of events before they fully integrate into the network and eventually become indistinguishable from neurons born during embryogenesis. Adult hippocampal neurogenesis is strongly regulated by neuronal activity and neurotransmitters, and the synaptic integration of adult-born neurons occurs in discrete steps, some of which are very different from perinatal synaptogenesis. Here, we review the current knowledge on the development of the synaptic input and output of neurons born in the adult hippocampus, from the stem/progenitor cell to the fully mature neuron. We also provide insight into the regulation of adult neurogenesis by some neurotransmitters and discuss some specificities of the integration of new neurons in an adult environment. Understanding the mechanisms regulating the synaptic integration of adult-born neurons is not only crucial for our understanding of brain plasticity, but also provides a framework for the manipulation and monitoring of endogenous adult neurogenesis as well as grafted cells, for potential therapeutic applications.
Abstract:
This guide introduces Data Envelopment Analysis (DEA), a performance measurement technique, in such a way as to be appropriate for decision makers with little or no background in economics and operational research. The use of mathematics is kept to a minimum. The guide therefore adopts a strongly practical approach in order to allow decision makers to conduct their own efficiency analyses and to interpret the results easily. DEA helps decision makers for the following reasons:
- By calculating an efficiency score, it indicates whether a firm is efficient or has capacity for improvement.
- By setting target values for input and output, it calculates how much input must be decreased or output increased in order to become efficient.
- By identifying the nature of returns to scale, it indicates whether a firm has to decrease or increase its scale (or size) in order to minimize the average cost.
- By identifying a set of benchmarks, it specifies which other firms' processes need to be analysed in order for a firm to improve its own practices.
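For readers who want to see the mechanics behind the efficiency score, the sketch below solves the input-oriented, constant-returns-to-scale (CCR) envelopment program with a generic LP solver; the function name and the two-firm data set are hypothetical, and a real analysis would rely on dedicated DEA software.

```python
# Minimal input-oriented CCR DEA sketch: for unit o, minimise theta such that
# a convex-cone combination of all units uses at most theta times o's inputs
# while producing at least o's outputs. Illustrative code, not a full tool.
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, o):
    """Efficiency score of unit o. X: (n, m) inputs; Y: (n, s) outputs."""
    n, m = X.shape
    s = Y.shape[1]
    # Decision variables: [theta, lambda_1 ... lambda_n]
    c = np.zeros(1 + n)
    c[0] = 1.0                                   # minimise theta
    # Inputs:  sum_j lam_j * x_ji <= theta * x_oi
    A_in = np.hstack([-X[o].reshape(m, 1), X.T])
    b_in = np.zeros(m)
    # Outputs: sum_j lam_j * y_jr >= y_or  (flipped into <= form)
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    b_out = -Y[o]
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(0, None)] * (1 + n))
    return float(res.x[0])

# Hypothetical data: two firms, one input, one output.
X = np.array([[2.0], [4.0]])
Y = np.array([[1.0], [1.0]])
score0 = dea_efficiency(X, Y, 0)   # firm 0 is efficient (score 1.0)
score1 = dea_efficiency(X, Y, 1)   # firm 1 could halve its input (score 0.5)
```

A score of 1 marks a firm on the efficient frontier; a score below 1 is exactly the proportional input reduction needed to reach it, which is how the target values mentioned above are derived.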
Abstract:
This contribution introduces Data Envelopment Analysis (DEA), a performance measurement technique. DEA helps decision makers for the following reasons: (1) by calculating an efficiency score, it indicates whether a firm is efficient or has capacity for improvement; (2) by setting target values for input and output, it calculates how much input must be decreased or output increased in order to become efficient; (3) by identifying the nature of returns to scale, it indicates whether a firm has to decrease or increase its scale (or size) in order to minimise the average total cost; (4) by identifying a set of benchmarks, it specifies which other firms' processes need to be analysed in order for a firm to improve its own practices. This contribution presents the essentials of DEA, alongside a case study that builds an intuitive understanding of its application. It also introduces Win4DEAP, a software package that conducts efficiency analysis based on the DEA methodology. The methodological background of DEA is presented for more demanding readers. Finally, four advanced topics of DEA are treated: adjustment to the environment, preferences, sensitivity analysis and time series data.
Abstract:
The identification of associations between interleukin-28B (IL-28B) variants and the spontaneous clearance of hepatitis C virus (HCV) raises the issues of causality and the net contribution of host genetics to the trait. To estimate more precisely the net effect of IL-28B genetic variation on HCV clearance, we optimized genotyping and compared the host contributions in multiple- and single-source cohorts to control for viral and demographic effects. The analysis included individuals with chronic or spontaneously cleared HCV infections from a multiple-source cohort (n = 389) and a single-source cohort (n = 71). We performed detailed genotyping in the coding region of IL-28B and searched for copy number variations to identify the genetic variant or haplotype carrying the strongest association with viral clearance. This analysis was used to compare the effects of IL-28B variation in the two cohorts. Haplotypes characterized by carriage of the major alleles at IL-28B single-nucleotide polymorphisms (SNPs) were highly overrepresented in individuals with spontaneous clearance versus those with chronic HCV infections (66.1% versus 38.6%, P = 6 × 10^-9). The odds ratios for clearance were 2.1 [95% confidence interval (CI) = 1.6-3.0] and 3.9 (95% CI = 1.5-10.2) in the multiple- and single-source cohorts, respectively. Protective haplotypes were in perfect linkage (r^2 = 1.0) with a nonsynonymous coding variant (rs8103142). Copy number variants were not detected. We identified IL-28B haplotypes highly predictive of spontaneous HCV clearance. The high linkage disequilibrium between IL-28B SNPs indicates that association studies need to be complemented by functional experiments to identify single causal variants. The point estimate for the genetic effect was higher in the single-source cohort, which was used to effectively control for viral diversity, sex, and coinfections and, therefore, offered a precise estimate of the net host genetic contribution.
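Odds ratios and confidence intervals like those quoted above follow the standard 2×2-table calculation, sketched below with a Wald-type interval; the counts in the usage example are illustrative and are not the study's data.

```python
# Sketch of the standard odds-ratio calculation from a 2x2 table with a Wald
# 95% CI. The counts below are made up for illustration only.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a = carriers who cleared, b = carriers with chronic infection,
    c = non-carriers who cleared, d = non-carriers with chronic infection."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of the log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Illustrative counts only:
or_, lo, hi = odds_ratio_ci(40, 20, 20, 40)
```

The interval is symmetric on the log scale, which is why published CIs such as 1.6-3.0 around 2.1 are asymmetric on the natural scale.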
Abstract:
The adult hippocampus generates functional dentate granule cells (GCs) that release glutamate onto target cells in the hilus and cornu ammonis 3 (CA3) region, and receive glutamatergic and γ-aminobutyric acid (GABA)ergic inputs that tightly control their spiking activity. The slow and sequential development of their excitatory and inhibitory inputs makes them particularly relevant for information processing. Although they are still immature, new neurons are recruited by afferent activity and display increased excitability, enhanced activity-dependent plasticity of their input and output connections, and a high rate of synaptogenesis. Once fully mature, new GCs show all the hallmarks of neurons generated during development. In this review, we focus on how developing neurons remodel the adult dentate gyrus and discuss key aspects that illustrate the potential of neurogenesis as a mechanism for circuit plasticity and function.
Abstract:
Diverse sources of GABAergic inhibition are a major feature of cortical networks, but distinct inhibitory input systems have not been systematically characterized in the thalamus. Here, we contrasted the properties of two independent GABAergic pathways in the posterior thalamic nucleus of rat, one input from the reticular thalamic nucleus (nRT), and one "extrareticular" input from the anterior pretectal nucleus (APT). The vast majority of nRT-thalamic terminals formed single synapses per postsynaptic target and innervated thin distal dendrites of relay cells. In contrast, single APT-thalamic terminals formed synaptic contacts exclusively via multiple, closely spaced synapses on thick relay cell dendrites. Quantal analysis demonstrated that the two inputs displayed comparable quantal amplitudes, release probabilities, and multiple release sites. The morphological and physiological data together indicated multiple, single-site contacts for nRT and multisite contacts for APT axons. The contrasting synaptic arrangements of the two pathways were paralleled by different short-term plasticities. The multisite APT-thalamic pathway showed larger charge transfer during 50-100 Hz stimulation compared with the nRT pathway and a greater persistent inhibition accruing during stimulation trains. Our results demonstrate that the two inhibitory systems are morpho-functionally distinct and suggest that multisite GABAergic terminals are tailored for maintained synaptic inhibition even at high presynaptic firing rates. These data explain the efficacy of extrareticular inhibition in timing relay cell activity in sensory and motor thalamic nuclei. Finally, based on the classic nomenclature and the difference between reticular and extrareticular terminals, we define a novel, multisite GABAergic terminal type (F3) in the thalamus.
Abstract:
Arbuscular mycorrhizal fungi (AMF) are highly successful plant symbionts. They reproduce clonally, producing multinucleate spores. It has been suggested that some AMF harbor genetically different nuclei. However, recent advances in sequencing the Glomus irregulare genome have indicated very low within-fungus polymorphism. We tested the null hypothesis that, with no genetic differences among nuclei, no significant genetic or phenotypic variation would occur among clonal single-spore lines generated from one initial AMF spore. Furthermore, no additional variation would be expected in the following generations of single-spore lines. Genetic diversity contained in one initial spore repeatedly gave rise to genetically different variants of the fungus with novel phenotypes. The genetic changes represented quantitative changes in allele frequencies, most probably as a result of changes in the frequency of genetic variation partitioned on different nuclei. The genetic and phenotypic variation is remarkable, given that it arose repeatedly from one clonal individual. Our results highlight the dynamic nature of AMF genetics. Even though within-fungus genetic variation is low, some is probably partitioned among nuclei and potentially causes changes in the phenotype. Our results are important for understanding AMF genetics, as well as for researchers and biotechnologists hoping to use AMF genetic diversity for the improvement of AMF inoculum.
Abstract:
In this paper, we study the relevance of multiple kernel learning (MKL) for the automatic selection of time series inputs. Recently, MKL has gained great attention in the machine learning community due to its flexibility in modelling complex patterns and performing feature selection. In general, MKL constructs the kernel as a weighted linear combination of basis kernels, exploiting different sources of information. SimpleMKL, an efficient algorithm that wraps a Support Vector Regression model to optimize the MKL weights, is used for the analysis. In this sense, MKL performs feature selection by discarding inputs/kernels with low or null weights. The proposed approach is tested with simulated linear and nonlinear time series (autoregressive, Hénon and Lorenz series).
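The core MKL idea, a weighted linear combination of basis kernels in which near-zero weights discard the corresponding inputs, can be sketched as follows. For brevity, the weights here come from simple kernel-target alignment rather than the SimpleMKL optimisation described above; all names and parameters are illustrative.

```python
# Toy MKL sketch: one RBF basis kernel per candidate time-series input, a
# weighted combined kernel, and alignment-based weights as a stand-in for the
# SimpleMKL wrapper. Illustrative only.
import numpy as np

def rbf_kernel(x, gamma=1.0):
    """Gram matrix of a 1-D candidate input under an RBF kernel."""
    d = (x[:, None] - x[None, :]) ** 2
    return np.exp(-gamma * d)

def mkl_weights(candidate_inputs, y):
    """Weights proportional to each kernel's alignment with the target yy^T
    (nonnegative because RBF kernels are positive semidefinite)."""
    target = np.outer(y, y)
    scores = []
    for x in candidate_inputs:
        K = rbf_kernel(x)
        scores.append(np.sum(K * target) /
                      (np.linalg.norm(K) * np.linalg.norm(target)))
    scores = np.array(scores)
    return scores / scores.sum()

def combined_kernel(candidate_inputs, w):
    """Weighted linear combination of the basis kernels."""
    return sum(wi * rbf_kernel(x) for wi, x in zip(w, candidate_inputs))
```

Candidate inputs whose weight falls below a chosen threshold would be dropped, which is how MKL performs input selection; SimpleMKL instead learns the weights jointly with the SVR model.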
Abstract:
The M-Coffee server is a web server that makes it possible to compute multiple sequence alignments (MSAs) by running several MSA methods and combining their output into a single model. This allows users to run all their methods of choice simultaneously without having to arbitrarily choose one of them. The MSA is delivered along with a local estimation of its consistency with the individual MSAs it was derived from. The computation of the consensus multiple alignment is carried out using a special mode of the T-Coffee package [Notredame, Higgins and Heringa (T-Coffee: a novel method for fast and accurate multiple sequence alignment. J. Mol. Biol. 2000; 302: 205-217); Wallace, O'Sullivan, Higgins and Notredame (M-Coffee: combining multiple sequence alignment methods with T-Coffee. Nucleic Acids Res. 2006; 34: 1692-1699)]. Given a set of sequences (DNA or proteins) in FASTA format, M-Coffee delivers a multiple alignment in the most common formats. M-Coffee is a freeware open-source package distributed under a GPL license, and it is available either as a standalone package or as a web service from www.tcoffee.org.
Abstract:
In 1851, the French social economist Auguste Ott discussed the problem of gluts and commercial crises, together with the issue of distributive justice between workers in co-operative societies. He did so by means of a 'simple reproduction scheme' sharing some features with modern intersectoral transactions tables, in particular in terms of their graphical representation. This paper presents Ott's theory of crises (which was based on the disappointment of expectations) and the context of his model, and discusses its peculiarities, supplying a new piece for the reconstruction of the prehistory of input-output analysis.