884 results for kernel


Relevance: 10.00%

Publisher:

Abstract:

A key trait of Free and Open Source Software (FOSS) development is its distributed nature. Nevertheless, two project-level operations, the fork and the merge of program code, are among the least well understood events in the lifespan of a FOSS project. Some projects have explicitly adopted these operations as the primary means of concurrent development. In this study, we examine the effect that highly distributed software development, as found in the Linux kernel project, has on the collection and modelling of software development data. We find that distributed development calls for sophisticated temporal modelling techniques where several versions of the source code tree can exist at once. Attention must be turned towards the methods of quality assurance and peer review that projects employ to manage these parallel source trees. Our analysis indicates that two new metrics, fork rate and merge rate, could be useful for determining the role of distributed version control systems in FOSS projects. The study presents a preliminary data set consisting of version control and mailing list data.
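The proposed merge rate can be read directly off the commit graph, since merge commits are exactly those with more than one parent; fork rate would additionally require tracking when parallel source trees split off. A minimal sketch (the `Commit` structure and the toy history are illustrative, not taken from the study's data set):

```python
from dataclasses import dataclass

@dataclass
class Commit:
    sha: str
    parents: tuple  # SHAs of parent commits

def merge_rate(commits, days):
    """Merges (commits with more than one parent) per day of observation."""
    merges = sum(1 for c in commits if len(c.parents) > 1)
    return merges / days

# toy history: three ordinary commits and one merge of two parallel lines
history = [
    Commit("a1", ()),
    Commit("b2", ("a1",)),
    Commit("c3", ("a1",)),
    Commit("d4", ("b2", "c3")),  # merge commit
]
print(merge_rate(history, days=2))  # 0.5 merges per day
```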

Relevance: 10.00%

Publisher:

Abstract:

According to certain arguments, computation is observer-relative either in the sense that many physical systems implement many computations (Hilary Putnam), or in the sense that almost all physical systems implement all computations (John Searle). If sound, these arguments have a potentially devastating consequence for the computational theory of mind: if arbitrary physical systems can be seen to implement arbitrary computations, the notion of computation seems to lose all explanatory power as far as brains and minds are concerned. David Chalmers and B. Jack Copeland have attempted to counter these relativist arguments by placing certain constraints on the definition of implementation. In this thesis, I examine their proposals and find both wanting in some respects. During the course of this examination, I give a formal definition of the class of combinatorial-state automata, upon which Chalmers's account of implementation is based. I show that this definition implies two theorems (one an observation due to Curtis Brown) concerning the computational power of combinatorial-state automata, theorems which speak against founding the theory of implementation upon this formalism. Toward the end of the thesis, I sketch a definition of the implementation of Turing machines in dynamical systems, and offer this as an alternative to Chalmers's and Copeland's accounts of implementation. I demonstrate that the definition does not imply Searle's claim for the universal implementation of computations. However, the definition may support claims that are weaker than Searle's, yet still troubling to the computationalist. There remains a kernel of relativity in implementation at any rate, since the interpretation of physical systems seems itself to be an observer-relative matter, to some degree at least. This observation helps clarify the role the notion of computation can play in cognitive science. Specifically, I will argue that the notion should be conceived as an instrumental rather than as a fundamental or foundational one.
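A combinatorial-state automaton, in the sense used here, is an automaton whose total state is a vector of substates, with each component's next value determined by the whole current state vector together with the input. A toy sketch of one transition step (the particular rules are invented for illustration, not drawn from the thesis):

```python
def csa_step(state, inp, rules):
    """One CSA transition: every component's next value is a function of
    the entire current state vector and the input."""
    return tuple(rule(state, inp) for rule in rules)

# toy CSA with two binary substates:
# the first toggles on the input bit, the second copies the first
rules = (
    lambda s, i: s[0] ^ i,
    lambda s, i: s[0],
)
state = (0, 0)
for bit in [1, 0, 1]:
    state = csa_step(state, bit, rules)
print(state)  # (0, 1)
```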

Relevance: 10.00%

Publisher:

Abstract:

Statistical learning algorithms provide a viable framework for geotechnical engineering modeling. This paper describes two statistical learning algorithms applied to site characterization modeling based on standard penetration test (SPT) data. More than 2700 field SPT values (N) have been collected from 766 boreholes spread over an area of 220 sq. km in Bangalore. To obtain the corrected value (N_c), the N values have been corrected for different parameters such as overburden stress, size of borehole, type of sampler, length of connecting rod, etc. In the three-dimensional site characterization model, the function N_c = N_c(X, Y, Z), where X, Y and Z are the coordinates of a point corresponding to an N_c value, is to be approximated, so that the N_c value at any half-space point in Bangalore can be determined. The first algorithm uses the least-square support vector machine (LSSVM), which is related to a ridge regression type of support vector machine. The second algorithm uses the relevance vector machine (RVM), which combines the strengths of kernel-based methods and Bayesian theory to establish the relationships between a set of input vectors and a desired output. The paper also presents a comparative study between the developed LSSVM and RVM models for site characterization. Copyright (C) 2009 John Wiley & Sons, Ltd.
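As a rough illustration of the LSSVM idea, the sketch below fits a ridge-regularized kernel regressor by solving the dual linear system with an RBF kernel. This is a generic sketch, not the authors' implementation: the bias term is omitted, the data are invented, and `gamma`/`reg` are arbitrary.

```python
import math

def rbf(x, y, gamma=1.0):
    """Gaussian (RBF) kernel on scalars."""
    return math.exp(-gamma * (x - y) ** 2)

def solve(A, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        p = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[p] = M[p], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def lssvm_fit(xs, ys, gamma=1.0, reg=10.0):
    """Solve the (bias-free) LSSVM dual (K + I/reg) a = y; return predictor."""
    n = len(xs)
    K = [[rbf(xs[i], xs[j], gamma) + (1.0 / reg if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    a = solve(K, ys)
    return lambda x: sum(a[j] * rbf(x, xs[j], gamma) for j in range(n))

f = lssvm_fit([0.0, 1.0, 2.0], [0.0, 1.0, 0.0])
```

With a small ridge term the fitted function reproduces the training targets only approximately, which is the intended regularizing behavior.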

Relevance: 10.00%

Publisher:

Abstract:

We present a measurement of the mass of the top quark using data corresponding to an integrated luminosity of 1.9 fb^-1 of p-pbar collisions collected at sqrt{s} = 1.96 TeV with the CDF II detector at Fermilab's Tevatron. This is the first measurement of the top quark mass using top-antitop pair candidate events in the lepton + jets and dilepton decay channels simultaneously. We reconstruct two observables in each channel and use a non-parametric kernel density estimation technique to derive two-dimensional probability density functions from simulated signal and background samples. The observables are the top quark mass and the invariant mass of two jets from the W decay in the lepton + jets channel, and the top quark mass and the scalar sum of transverse energy of the event in the dilepton channel. We perform a simultaneous fit for the top quark mass and the jet energy scale, which is constrained in situ by the hadronic W boson mass. Using 332 lepton + jets candidate events and 144 dilepton candidate events, we measure the top quark mass to be m_top = 171.9 +/- 1.7 (stat. + JES) +/- 1.1 (syst.) GeV/c^2 = 171.9 +/- 2.0 GeV/c^2.
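Two-dimensional kernel density estimation of the kind used to build probability densities from simulated samples can be sketched with a product-Gaussian kernel (the samples and bandwidth below are illustrative, not CDF data):

```python
import math

def kde2(samples, x, y, h=1.0):
    """Product-Gaussian kernel density estimate at (x, y) from 2-D samples."""
    n = len(samples)
    norm = 1.0 / (n * 2.0 * math.pi * h * h)  # 2-D Gaussian normalization
    return norm * sum(
        math.exp(-((x - sx) ** 2 + (y - sy) ** 2) / (2.0 * h * h))
        for sx, sy in samples
    )

# density is highest where simulated events cluster
events = [(171.0, 80.0), (172.0, 81.0), (171.5, 79.5)]
near = kde2(events, 171.5, 80.0)
far = kde2(events, 150.0, 40.0)
```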

Relevance: 10.00%

Publisher:

Abstract:

Core Vector Machine (CVM) is suitable for efficient large-scale pattern classification. In this paper, a method for improving the performance of CVM with the Gaussian kernel function, irrespective of the ordering of patterns belonging to different classes within the data set, is proposed. This method employs selective-sampling-based training of CVM using a novel kernel-based scalable hierarchical clustering algorithm. Empirical studies made on synthetic and real-world data sets show that the proposed strategy performs well on large data sets.
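Kernel-based clustering of this kind rests on the fact that distances in the induced feature space can be evaluated from the kernel alone, without ever forming the feature map: ||phi(x) - phi(y)||^2 = k(x,x) + k(y,y) - 2 k(x,y). A minimal sketch (the RBF kernel and `gamma` value are arbitrary choices, not the paper's):

```python
import math

def rbf(x, y, gamma=0.5):
    """Gaussian (RBF) kernel on vectors."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def feature_dist2(x, y, k=rbf):
    """Squared distance between phi(x) and phi(y) in the kernel-induced
    feature space, computed via the kernel trick:
    ||phi(x) - phi(y)||^2 = k(x,x) + k(y,y) - 2 k(x,y)."""
    return k(x, x) + k(y, y) - 2.0 * k(x, y)

print(feature_dist2((0.0, 0.0), (0.0, 0.0)))  # 0.0 — identical points coincide
```

Any clustering algorithm driven purely by pairwise distances can then operate in feature space by substituting this quantity.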

Relevance: 10.00%

Publisher:

Abstract:

This paper discusses a method for scaling SVM with Gaussian kernel function to handle large data sets by using a selective sampling strategy for the training set. It employs a scalable hierarchical clustering algorithm to construct cluster indexing structures of the training data in the kernel induced feature space. These are then used for selective sampling of the training data for SVM to impart scalability to the training process. Empirical studies made on real world data sets show that the proposed strategy performs well on large data sets.
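One simple scalable clustering scheme usable for such selective sampling is one-pass leader clustering, which retains a small set of representatives in place of the full training set. The sketch below uses plain Euclidean distance and an invented threshold; it illustrates the sampling idea, not the paper's actual indexing structure:

```python
def leaders(points, tau):
    """One-pass leader clustering: a point joins the first leader within
    squared Euclidean distance tau, otherwise it becomes a new leader."""
    def d2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    reps = []
    for p in points:
        if not any(d2(p, r) <= tau for r in reps):
            reps.append(p)
    return reps

# a dense cloud of 100 near-duplicate points collapses to one leader,
# plus one leader for the distant outlier; the leaders then stand in
# for the full set when training the SVM
cloud = [(i * 0.01, 0.0) for i in range(100)] + [(5.0, 5.0)]
print(len(leaders(cloud, tau=1.0)))  # 2
```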

Relevance: 10.00%

Publisher:

Abstract:

The near flow field of small aspect ratio elliptic turbulent free jets (issuing from nozzle and orifice) was experimentally studied using a 2D PIV. Two point velocity correlations in these jets revealed the extent and orientation of the large scale structures in the major and minor planes. The spatial filtering of the instantaneous velocity field using Gaussian convolution kernel shows that while a single large vortex ring circumscribing the jet seems to be present at the exit of nozzle, the orifice jet exhibited a number of smaller vortex ring pairs close to jet exit. The smaller length scale observed in the case of the orifice jet is representative of the smaller azimuthal vortex rings that generate axial vortex field as they are convected. This results in the axis-switching in the case of orifice jet and may have a mechanism different from the self induction process as observed in the case of contoured nozzle jet flow.
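Spatial filtering with a Gaussian convolution kernel, of the kind applied to the instantaneous velocity fields, amounts to a weighted local average that suppresses small-scale fluctuations. A one-dimensional sketch (field values and bandwidth are illustrative, not PIV data):

```python
import math

def gaussian_kernel(sigma, radius):
    """Discrete, normalized 1-D Gaussian weights over [-radius, radius]."""
    k = [math.exp(-(i * i) / (2.0 * sigma * sigma))
         for i in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]

def smooth(field, sigma=1.0, radius=2):
    """Gaussian convolution with edge clamping: a low-pass filter that
    spreads out sharp spikes in the signal."""
    k = gaussian_kernel(sigma, radius)
    n = len(field)
    out = []
    for i in range(n):
        acc = 0.0
        for j, w in enumerate(k):
            idx = min(max(i + j - radius, 0), n - 1)  # clamp at the borders
            acc += w * field[idx]
        out.append(acc)
    return out

noisy = [0.0, 0.0, 10.0, 0.0, 0.0]  # a single spurious spike
print(smooth(noisy))  # spike attenuated and spread over its neighbours
```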

Relevance: 10.00%

Publisher:

Abstract:

To enhance the utilization of the wood, the sawmills are forced to place more emphasis on planning to master the whole production chain from the forest to the end product. One significant obstacle to integrating the forest-sawmill-market production chain is the lack of appropriate information about forest stands. Since the wood procurement point of view in forest planning systems has been almost totally disregarded, there has been a great need to develop an easy and efficient pre-harvest measurement method, allowing separate measurement of stands prior to harvesting. The main purpose of this study was to develop a measurement method for pine stands which forest managers could use in describing the properties of the standing trees for sawing production planning. Study materials were collected from ten Scots pine stands (Pinus sylvestris) located in North Häme and South Pohjanmaa, in southern Finland. The data comprise test sawing data on 314 pine stems, dbh and height measures of all trees and measures of the quality parameters of pine sawlog stems in all ten study stands, as well as the locations of all trees in six stands. The study was divided into four sub-studies which deal with pine quality prediction, construction of diameter and dead branch height distributions, sampling designs and applying height and crown height models. The final proposal for the pre-harvest measurement method is a synthesis of the individual sub-studies. Quality analysis resulted in choosing dbh, distance from stump height to the first dead branch (dead branch height), crown height and tree height as the most appropriate quality characteristics of Scots pine. Dbh and dead branch height are measured from each pine sample tree, while height and crown height are derived from dbh measures with the aid of mixed height and crown height models. Pine and spruce diameter distributions as well as the dead branch height distribution are most effectively predicted by the kernel function.
Roughly 25 sample trees seem to be appropriate in pure pine stands. In mixed stands, the number of sample trees needs to be increased according to the proportion of pines in order to attain the same level of accuracy.
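Predicting a diameter distribution with the kernel function amounts to a kernel density estimate over the measured dbh values of the sample trees; a minimal one-dimensional sketch with invented data (the dbh values and bandwidth below are illustrative, not the study's measurements):

```python
import math

def kde(samples, x, h):
    """Gaussian kernel estimate of the diameter (dbh) density at x."""
    c = 1.0 / (len(samples) * h * math.sqrt(2.0 * math.pi))
    return c * sum(math.exp(-0.5 * ((x - s) / h) ** 2) for s in samples)

# hypothetical dbh values (cm) from a small set of sample trees
dbh = [18, 20, 21, 22, 22, 23, 24, 24, 25, 26, 27, 28, 30]
peak = kde(dbh, 24.0, h=2.0)  # density near the bulk of the stand
tail = kde(dbh, 40.0, h=2.0)  # density for a rare, very large tree
```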

Relevance: 10.00%

Publisher:

Abstract:

Fujikawa's method of evaluating the supercurrent and the superconformal current anomalies, using the heat-kernel regularization scheme, is extended to theories with gauge invariance, in particular, to the off-shell N=1 supersymmetric Yang-Mills (SSYM) theory. The Jacobians of supersymmetry and superconformal transformations are finite. Although the gauge-fixing term is not supersymmetric and the regularization scheme is not manifestly supersymmetric, we find that the regularized Jacobians are gauge invariant and finite and they can be expressed in such a way that there is no one-loop supercurrent anomaly for the N=1 SSYM theory. The superconformal anomaly is nonzero and the anomaly agrees with a similar result obtained using other methods.
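Schematically, Fujikawa's method extracts the anomaly from the Jacobian of the path-integral measure, regulated by a heat kernel. The notation below is the generic textbook form, not taken from this paper:

```latex
J = \exp\!\left( -2i \int d^4x\, \alpha(x)\, \mathcal{A}(x) \right),
\qquad
\mathcal{A}(x) = \lim_{M\to\infty} \sum_n \varphi_n^\dagger(x)\,
\Gamma\, e^{-\lambda_n^2/M^2}\, \varphi_n(x),
```

where alpha(x) is the infinitesimal transformation parameter, Gamma the generator of the transformation, the phi_n are eigenfunctions of the regulator with eigenvalues lambda_n, and M is the regulator mass taken to infinity.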

Relevance: 10.00%

Publisher:

Abstract:

Fujikawa's method of evaluating the anomalies is extended to the on-shell supersymmetric (SUSY) theories. The supercurrent and the superconformal current anomalies are evaluated for the Wess-Zumino model using the background-field formulation and heat-kernel regularization. We find that the regularized Jacobians for SUSY and superconformal transformations are finite. The results can be expressed in a form such that there is no supercurrent anomaly but a finite nonzero superconformal anomaly, in agreement with similar results obtained using other methods.

Relevance: 10.00%

Publisher:

Abstract:

A method was developed for relative radiometric calibration of a single multitemporal Landsat TM image, of several multitemporal images covering one another, and of several multitemporal images covering different geographic locations. The radiometrically calibrated difference images were used for detecting rapid changes in forest stands. The nonparametric kernel method was applied for change detection. The accuracy of the change detection was estimated by inspecting the image analysis results in the field. The change classification was applied for controlling the quality of the continuously updated forest stand information. The aim was to ensure that all the man-made changes and any forest damage were correctly updated, including the attribute and stand delineation information. The image analysis results were compared with the registered treatments and the stand information base. The stands with discrepancies between these two information sources were recommended for field inspection.
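The nonparametric kernel change-detection step can be sketched as flagging difference-image values that fall in low-density regions of a kernel density estimate: no-change pixels cluster near zero, while real changes sit in the sparse tails. The threshold, bandwidth and data below are invented for illustration, not the method's calibrated values:

```python
import math

def kde(samples, x, h=1.0):
    """Gaussian kernel density estimate at x."""
    c = 1.0 / (len(samples) * h * math.sqrt(2.0 * math.pi))
    return c * sum(math.exp(-0.5 * ((x - s) / h) ** 2) for s in samples)

def changed(diff_pixels, threshold=0.1, h=1.0):
    """Flag difference values lying in low-density regions of the estimate."""
    return [d for d in diff_pixels if kde(diff_pixels, d, h) < threshold]

diffs = [-0.5, -0.2, 0.0, 0.1, 0.3, 0.2, -0.1, 8.0]  # one clear-cut outlier
print(changed(diffs))  # [8.0]
```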

Relevance: 10.00%

Publisher:

Abstract:

The characteristic function for a contraction is a classical complete unitary invariant devised by Sz.-Nagy and Foias. Just as a contraction is related to the Szegő kernel k_S(z, w) = (1 - z w̄)^{-1} for |z|, |w| < 1, by means of (1/k_S)(T, T*) >= 0, we consider an arbitrary open connected domain Omega in C^n, a complete Pick kernel k on Omega and a tuple T = (T_1, ..., T_n) of commuting bounded operators on a complex separable Hilbert space H such that (1/k)(T, T*) >= 0. For a complete Pick kernel the 1/k functional calculus makes sense in a beautiful way. It turns out that the model theory works very well and a characteristic function can be associated with T. Moreover, the characteristic function is then a complete unitary invariant for a suitable class of tuples T.
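Writing out the positivity condition for the Szegő kernel shows how it recovers the contraction case:

```latex
\frac{1}{k_S(z, w)} = 1 - z\bar{w}
\quad\Longrightarrow\quad
\Bigl(\tfrac{1}{k_S}\Bigr)(T, T^*) = I - TT^* \ge 0
\;\Longleftrightarrow\; \|T\| \le 1,
```

so the single-operator hypothesis (1/k_S)(T, T*) >= 0 is exactly the statement that T is a contraction, and the general hypothesis (1/k)(T, T*) >= 0 plays the analogous role for a complete Pick kernel k.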

Relevance: 10.00%

Publisher:

Abstract:

Using a modified Green's function technique the two well-known basic problems of scattering of surface water waves by vertical barriers are reduced to the problem of solving a pair of uncoupled integral equations involving the “jump” and “sum” of the limiting values of the velocity potential on the two sides of the barriers in each case. These integral equations are then solved, in closed form, by the aid of an integral transform technique involving a general trigonometric kernel as applicable to the problems associated with a radiation condition.

Relevance: 10.00%

Publisher:

Abstract:

Several recent theoretical and computer simulation studies have considered solvation dynamics in a Brownian dipolar lattice which provides a simple model solvent for which detailed calculations can be carried out. In this article a fully microscopic calculation of the solvation dynamics of an ion in a Brownian dipolar lattice is presented. The calculation is based on the non‐Markovian molecular hydrodynamic theory developed recently. The main assumption of the present calculation is that the two‐particle orientational correlation functions of the solid can be replaced by those of the liquid state. It is shown that such a calculation provides an excellent agreement with the computer simulation results. More importantly, the present calculations clearly demonstrate that the frequency‐dependent dielectric friction plays an important role in the long time decay of the solvation time correlation function. We also find that the present calculation provides somewhat better agreement than either the dynamic mean spherical approximation (DMSA) or the Fried–Mukamel theory which use the simulated frequency‐dependent dielectric function. It is found that the dissipative kernels used in the molecular hydrodynamic approach and in the Fried–Mukamel theory are vastly different, especially at short times. However, in spite of this disagreement, the two theories still lead to comparable results in good agreement with computer simulation, which suggests that even a semiquantitatively accurate dissipative kernel may be sufficient to obtain a reliable solvation time correlation function. A new wave vector and frequency‐dependent dissipative kernel (or memory function) is proposed which correctly goes over to the appropriate expressions in both the single particle and the collective limits. This form is expected to lead to better results than all the existing descriptions.
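The quantity tracked in these comparisons is the solvation time correlation function; in its standard normalized form (a general definition, not a formula specific to this paper):

```latex
S(t) = \frac{E_{\mathrm{solv}}(t) - E_{\mathrm{solv}}(\infty)}
            {E_{\mathrm{solv}}(0) - E_{\mathrm{solv}}(\infty)},
```

where E_solv(t) is the time-dependent solvation energy of the ion, so that S(0) = 1 and S(t) decays to 0 as the solvent equilibrates around the newly created charge.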

Relevance: 10.00%

Publisher:

Abstract:

A theoretical analysis of the three currently popular microscopic theories of solvation dynamics, namely, the dynamic mean spherical approximation (DMSA), the molecular hydrodynamic theory (MHT), and the memory function theory (MFT) is carried out. It is shown that in the underdamped limit of momentum relaxation, all three theories lead to nearly identical results when the translational motions of both the solute ion and the solvent molecules are neglected. In this limit, the theoretical prediction is in almost perfect agreement with the computer simulation results of solvation dynamics in the model Stockmayer liquid. However, the situation changes significantly in the presence of the translational motion of the solvent molecules. In this case, DMSA breaks down but the other two theories correctly predict the acceleration of solvation in agreement with the simulation results. We find that the translational motion of a light solute ion can play an important role in its own solvation. None of the existing theories describe this aspect. A generalization of the extended hydrodynamic theory is presented which, for the first time, includes the contribution of solute motion towards its own solvation dynamics. The extended theory gives excellent agreement with the simulations where solute motion is allowed. It is further shown that in the absence of translation, the memory function theory of Fried and Mukamel can be recovered from the hydrodynamic equations if the wave vector dependent dissipative kernel in the hydrodynamic description is replaced by its long wavelength value. We suggest a convenient memory kernel which is superior to the limiting forms used in earlier descriptions. We also present an alternate, quite general, statistical mechanical expression for the time dependent solvation energy of an ion. This expression has remarkable similarity with that for the translational dielectric friction on a moving ion.