888 results for Reproducing Kernel
Abstract:
The research considers the problem of spatial data classification using machine learning algorithms: probabilistic neural networks (PNN) and support vector machines (SVM). As a benchmark model, the simple k-nearest neighbor algorithm is considered. The PNN is a neural-network reformulation of the well-known nonparametric principles of probability density modeling using kernel density estimation and Bayesian optimal or maximum a posteriori decision rules. PNNs are well suited to problems where not only predictions but also quantification of accuracy and integration of prior information are necessary. An important property of PNNs is that they can easily be used in decision support systems dealing with problems of automatic classification. The support vector machine is an implementation of the principles of statistical learning theory for classification tasks. Recently, SVMs have been successfully applied to a variety of environmental topics: classification of soil types and hydro-geological units, optimization of monitoring networks, and susceptibility mapping of natural hazards. In the present paper, both simulated and real data case studies (low- and high-dimensional) are considered. The main attention is paid to the detection and learning of spatial patterns by the applied algorithms.
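A minimal sketch of the PNN decision rule just described, i.e. Parzen-window class densities combined with a maximum a posteriori rule; the Gaussian kernel, the bandwidth h, and all names are illustrative, not taken from the study:

```python
import numpy as np

def pnn_classify(x, train_X, train_y, h=0.5):
    """Parzen-window class densities combined with a MAP decision rule."""
    classes = np.unique(train_y)
    scores = []
    for c in classes:
        Xc = train_X[train_y == c]
        # Gaussian kernel density estimate of p(x | class c), up to a constant
        d2 = np.sum((Xc - x) ** 2, axis=1)
        density = np.mean(np.exp(-d2 / (2 * h ** 2)))
        prior = len(Xc) / len(train_X)   # empirical class prior
        scores.append(prior * density)   # posterior score, up to a constant
    return classes[int(np.argmax(scores))]
```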
Abstract:
This paper studies some magnitudes related to the social reproduction approach. First, three fundamental ideas are emphasized: the notions of a) outputs minus inputs; b) outputs divided by inputs; and c) subsystems. Next, the obstacles to the direct quantification of these concepts are underlined, and the routes suggested for overcoming the difficulties are reviewed (by means of theoretical constructs proposed by Leontief, von Neumann, and Sraffa). Two new indicators are then examined: the "specific rate of surplus", which refers to self-reproducible goods, and the "net reproduction coefficient", which applies to all basic goods. In passing, some hints are given for establishing indicators of the same kind in fields such as ecological economics and feminist economics. Finally, some conjectures concerning the appropriate direction of technical change are noted.
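A plausible formalization of the first two notions, offered only for orientation (the symbols q_i and x_i and the formulas are illustrative; the paper's exact definitions are not reproduced here):

```latex
% q_i: gross output of commodity i;  x_i: its total productive consumption.
s_i = q_i - x_i \quad \text{(physical surplus)}, \qquad
r_i = \frac{q_i}{x_i} \quad \text{(reproduction ratio)} .
```

On this reading, the specific rate of surplus of a self-reproducible good would be r_i - 1 = (q_i - x_i)/x_i.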
Abstract:
Uniform-price assignment games are introduced as those assignment markets whose core reduces to a segment. In these games, competitive prices are uniform for all active agents, although products may be non-homogeneous. A characterization in terms of the assignment matrix is given. The only assignment markets in which all submarkets are uniform are the Böhm-Bawerk horse markets. We prove that for uniform-price assignment games the kernel, or set of symmetrically pairwise-bargained allocations, either coincides with the core or reduces to the nucleolus.
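As a toy illustration of the underlying assignment market (the 2x2 matrix below is invented for illustration; in the uniform-price case the core is a segment and all active agents face the same price), the worth of the grand coalition is the value of an optimal matching:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Illustrative assignment matrix a[i, j]: joint gain if buyer i trades with seller j.
a = np.array([[5, 2],
              [4, 3]])

# The worth of the grand coalition is the value of an optimal matching.
rows, cols = linear_sum_assignment(a, maximize=True)
print("optimal matching:", list(zip(rows, cols)), "worth:", a[rows, cols].sum())
```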
Abstract:
In groundwater applications, Monte Carlo methods are employed to model the uncertainty in geological parameters. However, their brute-force application becomes computationally prohibitive for highly detailed geological descriptions, complex physical processes, and large numbers of realizations. The Distance Kernel Method (DKM) overcomes this issue by clustering the realizations in a multidimensional space based on the flow responses obtained by means of an approximate (computationally cheaper) model; the uncertainty is then estimated from the exact responses, which are computed only for one representative realization per cluster (the medoid). Usually, DKM is employed to decrease the size of the sample of realizations considered to estimate the uncertainty. We propose to use the information from the approximate responses for uncertainty quantification. The subset of exact solutions provided by DKM is then employed to construct an error model and correct the potential bias of the approximate model. Two error models are devised; both employ the difference between approximate and exact medoid solutions, but they differ in the way medoid errors are interpolated to correct the whole set of realizations. The Local Error Model rests upon the clustering defined by DKM and can be seen as a natural way to account for intra-cluster variability; the Global Error Model employs a linear interpolation of all medoid errors regardless of the cluster to which a given realization belongs. These error models are evaluated for an idealized pollution problem in which the uncertainty of the breakthrough curve needs to be estimated. For this numerical test case, we demonstrate that the error models improve the uncertainty quantification provided by the DKM algorithm and are effective in correcting the bias of the estimate computed solely from the approximate (multiscale finite volume, MsFV) results. The framework presented here is not specific to the methods considered and can be applied to other combinations of approximate models and techniques for selecting a subset of realizations.
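A rough sketch of the workflow described above, under simplifying assumptions: KMeans stands in for the clustering, each medoid is approximated by the realization closest to its centroid, and the Global Error Model is taken to be an ordinary linear regression of medoid errors on the approximate responses (the paper's exact interpolation scheme may differ):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

def dkm_global_error_model(approx, exact_solver, n_clusters=5, seed=0):
    """Run the exact model only at one representative realization per
    cluster, then correct the whole ensemble with a linear error model."""
    km = KMeans(n_clusters=n_clusters, random_state=seed).fit(approx)
    # Approximate each medoid by the realization closest to its centroid.
    medoids = [int(np.argmin(((approx - c) ** 2).sum(axis=1)))
               for c in km.cluster_centers_]
    errors = np.array([exact_solver(m) - approx[m] for m in medoids])
    # Global error model: linear interpolation of medoid errors.
    reg = LinearRegression().fit(approx[medoids], errors)
    return approx + reg.predict(approx)
```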
Abstract:
BACKGROUND: We present the results of EGASP, a community experiment to assess the state-of-the-art in genome annotation within the ENCODE regions, which span 1% of the human genome sequence. The experiment had two major goals: the assessment of the accuracy of computational methods to predict protein coding genes; and the overall assessment of the completeness of the current human genome annotations as represented in the ENCODE regions. For the computational prediction assessment, eighteen groups contributed gene predictions. We evaluated these submissions against each other based on a 'reference set' of annotations generated as part of the GENCODE project. These annotations were not available to the prediction groups prior to the submission deadline, so that their predictions were blind and an external advisory committee could perform a fair assessment. RESULTS: The best methods had at least one gene transcript correctly predicted for close to 70% of the annotated genes. Nevertheless, the multiple transcript accuracy, taking into account alternative splicing, reached only approximately 40% to 50% accuracy. At the coding nucleotide level, the best programs reached an accuracy of 90% in both sensitivity and specificity. Programs relying on mRNA and protein sequences were the most accurate in reproducing the manually curated annotations. Experimental validation shows that only a very small percentage (3.2%) of the selected 221 computationally predicted exons outside of the existing annotation could be verified. CONCLUSION: This is the first such experiment in human DNA, and we have followed the standards established in a similar experiment, GASP1, in Drosophila melanogaster. We believe the results presented here contribute to the value of ongoing large-scale annotation projects and should guide further experimental methods when being scaled up to the entire human genome sequence.
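For orientation, nucleotide-level sensitivity and specificity as used in gene-prediction assessments can be computed as below (a schematic sketch; note that in this literature "specificity" denotes TP/(TP+FP), i.e., precision):

```python
def nucleotide_accuracy(predicted, annotated):
    """Nucleotide-level accuracy of gene predictions; both arguments are
    sets of genomic positions labeled as coding."""
    tp = len(predicted & annotated)
    sensitivity = tp / len(annotated)  # annotated coding bases recovered
    specificity = tp / len(predicted)  # predicted bases that are truly coding
    return sensitivity, specificity
```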
Abstract:
We present a lattice model to study the equilibrium phase diagram of ordered alloys with one magnetic component that exhibit a low-temperature phase separation between paramagnetic and ferromagnetic phases. The model is constructed from the experimental facts observed in Cu3-xAlMnx, and it includes a coupling between configurational and magnetic degrees of freedom that is appropriate for reproducing the low-temperature miscibility gap. The essential ingredient for the occurrence of such a coexistence region is the development of ferromagnetic order induced by the long-range atomic order of the magnetic component. A comparative study of both mean-field and Monte Carlo solutions is presented. Moreover, the model may enable the study of the structure of ferromagnetic domains embedded in the nonmagnetic matrix, which is relevant to phenomena such as magnetoresistance and paramagnetism.
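A generic Metropolis sweep, as a stand-in for the Monte Carlo solution mentioned above; the Ising-like magnetic subsystem and the coupling J are illustrative, not the paper's coupled configurational-magnetic Hamiltonian:

```python
import numpy as np

def metropolis_sweep(spins, beta, J, rng):
    """One Metropolis sweep over an L x L lattice of +/-1 spins."""
    L = spins.shape[0]
    for _ in range(L * L):
        i, j = rng.integers(L, size=2)
        nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2 * J * spins[i, j] * nb   # energy cost of flipping spin (i, j)
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i, j] *= -1
    return spins
```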
Abstract:
Accurately calibrated effective field theories are used to compute atomic parity nonconserving (APNC) observables. Although accurately calibrated, these effective field theories predict a large spread in the neutron skin of heavy nuclei. While the neutron skin is strongly correlated with numerous physical observables, in this contribution we focus on its impact on searches for new physics through APNC observables. The addition of an isoscalar-isovector coupling constant to the effective Lagrangian generates a wide range of values for the neutron skin of heavy nuclei without compromising the success of the model in reproducing well-constrained nuclear observables. Earlier studies have suggested that the use of isotopic ratios of APNC observables may eliminate their sensitivity to atomic structure. This leaves nuclear-structure uncertainties as the main impediment to identifying physics beyond the standard model. We establish that uncertainties in the neutron skin of heavy nuclei are at present too large to measure isotopic ratios to better than the 0.1% accuracy required to test the standard model. However, we argue that such uncertainties will be significantly reduced by the upcoming measurement of the neutron radius of ^208Pb at the Jefferson Laboratory.
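For orientation, the standard APNC weak-charge decomposition (textbook form, not necessarily the paper's notation) shows why isotopic ratios cancel atomic structure but retain neutron-density sensitivity:

```latex
Q_W \simeq Z\,(1 - 4\sin^2\theta_W)\,q_p \;-\; N\,q_n ,
\qquad
q_{n,p} = \int f(r)\,\rho_{n,p}(r)\,d^3r ,
```

where f(r) folds the atomic (electronic) structure over the nucleus: the atomic factor cancels in a ratio between isotopes, while q_n, and hence the neutron distribution ρ_n and the neutron skin, does not.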
Abstract:
We derive analytical expressions for the excitation energies of the isoscalar giant monopole and quadrupole resonances in finite nuclei by using the scaling method and the extended Thomas-Fermi approach to relativistic mean-field theory. We study the ability of several commonly used nonlinear σ-ω parameter sets to reproduce the experimental data. For monopole oscillations, the calculations agree better with experiment when the nuclear-matter incompressibility of the relativistic interaction lies in the range 220-260 MeV. The breathing-mode energies of the scaling method compare satisfactorily with those obtained in relativistic RPA and time-dependent mean-field calculations. For quadrupole oscillations, all the analyzed nonlinear parameter sets reproduce the empirical trends reasonably well.
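For reference, the nonrelativistic scaling-model estimate of the breathing-mode energy takes the well-known form below; the paper derives its relativistic counterpart, so this is for orientation only:

```latex
E_{\mathrm{GMR}} \;=\; \sqrt{\frac{\hbar^{2} K_{A}}{m \,\langle r^{2} \rangle}} ,
```

where K_A is the incompressibility of the finite nucleus, m the nucleon mass, and ⟨r²⟩ the ground-state mean-square radius.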
Abstract:
Like many organisms, the cladoceran Simocephalus vetulus (Müller) continues to grow while reproducing, whereas the optimal strategy is to stop growing at maturity and to invest all available production into reproduction thereafter. It has been proposed that a size constraint is responsible for the observed strategy (Perrin, Ruedi & Saiah, 1987), by preventing organisms from investing more than a given amount of energy into reproduction. This hypothesis is developed here and the two following predictions are derived: (1) the onset of reproduction should be independent of age, and (2) the reproductive investment should be size-specific, and thus independent of the production rate. Both predictions are tested by rearing a clone of S. vetulus in a gradient of productivity. The results support the first prediction but not the second, so the size-constraint hypothesis is disproved.
Abstract:
We investigate the depinning transition occurring in dislocation assemblies. In particular, we consider the cases of regularly spaced pileups and low-angle grain boundaries interacting with a disordered stress landscape provided by solute atoms, or by other immobile dislocations present in nonactive slip systems. Using linear elasticity, we compute the stress originated by small deformations of these assemblies and the corresponding energy cost in two and three dimensions. Contrary to the case of isolated dislocation lines, which are usually approximated as elastic strings with an effective line tension, the deformations of a dislocation assembly cannot be described by local elastic interactions with a constant tension or stiffness. A nonlocal elastic kernel results as a consequence of long-range interactions between dislocations. In light of this result, we revise statistical depinning theories of dislocation assemblies and compare the theoretical results with numerical simulations and experimental data.
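Schematically, the contrast with a local line tension can be stated in Fourier space (a generic long-wavelength form, not the paper's detailed kernel):

```latex
E[u] \;=\; \frac{1}{2}\int \frac{dq}{2\pi}\, K(q)\,|u(q)|^{2},
\qquad
K(q) \,\sim\, \Gamma\,|q| \quad (q \to 0),
```

whereas a string with constant line tension would give the local form K(q) = Γ q²; the |q| behavior encodes the long-range interactions between dislocations.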
Abstract:
We point out that using the heat kernel on a cone to compute the first quantum correction to the entropy of Rindler space does not yield the correct temperature dependence. In order to obtain the physics at arbitrary temperature one must compute the heat kernel in a geometry with different topology (without a conical singularity). This is done in two ways, which are shown to agree with computations performed by other methods. Also, we discuss the ambiguities in the regularization procedure and their physical consequences.
Abstract:
We propose a criterion for the validity of semiclassical gravity (SCG) which is based on the stability of the solutions of SCG with respect to quantum metric fluctuations. We pay special attention to the two-point quantum correlation functions for the metric perturbations, which contain both intrinsic and induced fluctuations. These fluctuations can be described by the Einstein-Langevin equation obtained in the framework of stochastic gravity. Specifically, the Einstein-Langevin equation yields stochastic correlation functions for the metric perturbations which agree, to leading order in the large N limit, with the quantum correlation functions of the theory of gravity interacting with N matter fields. The homogeneous solutions of the Einstein-Langevin equation are equivalent to the solutions of the perturbed semiclassical equation, which describe the evolution of the expectation value of the quantum metric perturbations. The information on the intrinsic fluctuations, which are connected to the initial fluctuations of the metric perturbations, can also be retrieved entirely from the homogeneous solutions. However, the induced metric fluctuations proportional to the noise kernel can only be obtained from the Einstein-Langevin equation (the inhomogeneous term). These equations exhibit runaway solutions with exponential instabilities. A detailed discussion about different methods to deal with these instabilities is given. We illustrate our criterion by showing explicitly that flat space is stable and a description based on SCG is a valid approximation in that case.
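Schematically, the Einstein-Langevin equation referred to above takes the form (standard stochastic-gravity notation, condensed here):

```latex
G_{ab}[g+h] \;=\; 8\pi G \left( \langle \hat{T}_{ab}[g+h] \rangle + \xi_{ab} \right),
\qquad
\langle \xi_{ab}(x)\,\xi_{cd}(y) \rangle_{s} \;=\; N_{abcd}(x,y),
```

where h_{ab} is the metric perturbation, ξ_{ab} a Gaussian stochastic source, and N_{abcd} the noise kernel built from the symmetrized two-point function of the stress-tensor fluctuations; the homogeneous part reproduces the perturbed semiclassical solutions, while the inhomogeneous ξ term generates the induced fluctuations.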
Abstract:
Serum-free aggregating brain cell cultures are free-floating three-dimensional primary cell cultures able to reconstitute spontaneously a histotypic brain architecture to reproduce critical steps of brain development and to reach a high level of structural and functional maturity. This culture system offers, therefore, a unique model for neurotoxicity testing both during the development and at advanced cellular differentiation, and the high number of aggregates available combined with the excellent reproducibility of the cultures facilitates routine test procedures. This chapter presents a detailed description of the preparation, maintenance, and use of these cultures for neurotoxicity studies and a comparison of the developmental characteristics between cultures derived from the telencephalon and cultures derived from the whole brain. For culture preparation, mechanically dissociated embryonic brain tissue is used. The initial cell suspension, composed of neural stem cells, neural progenitor cells, immature postmitotic neurons, glioblasts, and microglial cells, is kept in a serum-free, chemically defined medium under continuous gyratory agitation. Spherical aggregates form spontaneously and are maintained in suspension culture for several weeks. Within the aggregates, the cells rearrange and mature, reproducing critical morphogenic events, such as migration, proliferation, differentiation, synaptogenesis, and myelination. For experimentation, replicate cultures are prepared by the randomization of aggregates from several original flasks. The high yield and reproducibility of the cultures enable multiparametric endpoint analyses, including "omics" approaches.
Abstract:
In the domain of two-sided assignment games, an axiomatization of the nucleolus is presented as the unique solution that satisfies consistency with respect to the derived game defined by Owen (1992) and monotonicity of the sectors' complaints with respect to their cardinality. As a consequence, we obtain a geometric characterization of the nucleolus by means of a bisection property stronger than the one satisfied by the points of the kernel (Maschler et al., 1979).