914 results for flame kernel
Abstract:
We derive a very general expression for the survival probability and the first-passage time distribution of a particle executing Brownian motion in full phase space with an absorbing boundary condition at a point in position space, valid irrespective of the statistical nature of the dynamics. The expression, together with Jensen's inequality, naturally leads to a lower bound on the actual survival probability and to an approximate first-passage time distribution. These are expressed in terms of the position-position, velocity-velocity, and position-velocity variances. Knowledge of these variances enables one to compute a lower bound on the survival probability and consequently the first-passage distribution function. As examples, we compute these for a Gaussian Markovian process and, in the non-Markovian case, for an exponentially decaying friction kernel and also for a power-law friction kernel. Our analysis shows that the survival probability decays exponentially at long times, irrespective of the nature of the dynamics, with an exponent equal to the transition-state rate constant.
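The exponential long-time decay can be checked numerically. The sketch below is a hedged illustration with made-up parameter values (friction, temperature, absorbing point), not anything from the paper: it estimates the survival probability of an underdamped Langevin particle by Monte Carlo, absorbing each trajectory when it crosses a point in position space.

```python
import numpy as np

def survival_probability(n_paths=2000, n_steps=400, dt=0.01,
                         gamma=1.0, temp=1.0, x_abs=2.0, seed=0):
    """Monte Carlo estimate of the survival probability S(t) for an
    underdamped Langevin particle with an absorbing point at x_abs.
    All parameter values here are illustrative placeholders."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n_paths)            # positions, started at the origin
    v = np.zeros(n_paths)            # velocities
    alive = np.ones(n_paths, dtype=bool)
    surv = np.empty(n_steps)
    noise_amp = np.sqrt(2.0 * gamma * temp * dt)
    for i in range(n_steps):
        v[alive] += (-gamma * v[alive]) * dt \
                    + noise_amp * rng.standard_normal(alive.sum())
        x[alive] += v[alive] * dt
        alive &= (x < x_abs)         # absorb trajectories past x_abs
        surv[i] = alive.mean()       # fraction surviving at this time
    return surv
```

The surviving fraction is nonincreasing by construction; at late times its slope on a log scale approximates the escape rate.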
Abstract:
House loss during unplanned bushfires is a complex phenomenon in which design, configuration, material and siting can significantly influence the loss. In collaboration with the Bushfire Cooperative Research Centre, the CSIRO has developed a tool to assess the vulnerability of a specific house at the urban interface. The tool is based on a spatial profiling of urban assets, including their design, material, surrounding objects and their relationships to one another. The analysis incorporates both probabilistic and deterministic parameters, and is based on the impact of radiant heat, flame and embers on the surrounding elements and the structure itself. It provides a breakdown of the attributes and design parameters that contribute to the vulnerability level. This paper describes the tool, which allows the user to explore the vulnerability of a house to varying levels of bushfire attack. The tool is aimed at government agencies interested in building design, town planning and community education for bushfire risk mitigation.
Abstract:
Dynamic systems involving convolution integrals with decaying kernels, of which fractionally damped systems form a special case, are non-local in time and hence infinite-dimensional. Straightforward numerical solution of such systems up to time t needs O(t^2) computations owing to the repeated evaluation of integrals over intervals that grow like t. Finite-dimensional and local approximations are thus desirable. We present here an approximation method which first rewrites the evolution equation as a coupled infinite-dimensional system with no convolution, and then uses Galerkin approximation with finite elements to obtain linear, finite-dimensional, constant-coefficient approximations for the convolution. This paper is a broad generalization, based on a new insight, of our prior work with fractional-order derivatives (Singh & Chatterjee 2006 Nonlinear Dyn. 45, 183-206). In particular, the decaying kernels we can address are now generalized to the Laplace transforms of known functions; of these, the power-law kernel of fractional-order differentiation is a special case. The approximation can be refined easily. The local nature of the approximation allows numerical solution up to time t with O(t) computations. Examples with several different kernels show excellent performance. A key feature of our approach is that the dynamic system in which the convolution integral appears is itself approximated using another system, as distinct from numerically approximating just the solution for the given initial values; this allows non-standard uses of the approximation, e.g. in stability analyses.
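To see why locality buys O(t) work, consider the special case of a kernel that is a finite sum of exponentials, K(t) = sum_i a_i exp(-b_i t). Each memory term then obeys a local ODE, dz_i/dt = -b_i z_i + a_i x(t), and the convolution is sum_i z_i, updatable in O(1) per step. The sketch below (illustrative coefficients, simple forward-Euler and rectangle-rule discretizations, not the paper's Galerkin scheme) contrasts this with direct O(t^2) quadrature:

```python
import numpy as np

def conv_via_odes(x, dt, a, b):
    """Local evaluation: auxiliary states z_i with
    dz_i/dt = -b_i z_i + a_i x(t); the convolution is z.sum().
    O(1) work per time step."""
    z = np.zeros(len(a))
    out = np.empty(len(x))
    for n, xn in enumerate(x):
        z += dt * (-b * z + a * xn)     # forward Euler update
        out[n] = z.sum()
    return out

def conv_direct(x, dt, a, b):
    """Direct quadrature of int_0^t K(t-s) x(s) ds; the integral
    grows with t, so total work is O(t^2)."""
    t = np.arange(len(x)) * dt
    kern = (a[:, None] * np.exp(-b[:, None] * t)).sum(axis=0)
    out = np.empty(len(x))
    for n in range(len(x)):
        out[n] = dt * np.sum(kern[:n + 1][::-1] * x[:n + 1])
    return out
```

For small time steps the two agree to discretization error, while the local version touches only a fixed number of states per step.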
Abstract:
In this paper we propose a novel family of kernels for multivariate time-series classification problems. Each time-series is approximated by a linear combination of piecewise polynomial functions in a Reproducing Kernel Hilbert Space by a novel kernel interpolation technique. Using the associated kernel function, a large-margin classification formulation is proposed which can discriminate between two classes. The formulation leads to kernels between two multivariate time-series which can be efficiently computed. The kernels have been successfully applied to writer-independent handwritten character recognition.
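A minimal stand-in for this construction (not the paper's exact kernel): summarize each channel of a multivariate series by least-squares polynomial coefficients, then compare the summaries with a Gaussian kernel. Everything below, including the choice of a single global polynomial rather than a piecewise one, is an illustrative simplification.

```python
import numpy as np

def series_features(ts, degree=3):
    """Summarise each channel of a (T, D) multivariate series by the
    coefficients of a least-squares polynomial fit; a stand-in for
    the paper's piecewise-polynomial RKHS interpolation."""
    t = np.linspace(0.0, 1.0, ts.shape[0])
    return np.concatenate([np.polyfit(t, ts[:, d], degree)
                           for d in range(ts.shape[1])])

def series_kernel(ts_a, ts_b, degree=3, gamma=1.0):
    """Gaussian kernel between polynomial summaries of two series."""
    fa = series_features(ts_a, degree)
    fb = series_features(ts_b, degree)
    return float(np.exp(-gamma * np.sum((fa - fb) ** 2)))
```

The resulting kernel is symmetric and equals 1 for identical series, so it can be plugged into any standard large-margin solver.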
Abstract:
Support Vector Machines (SVMs) are hyperplane classifiers defined in a kernel-induced feature space. The data-size-dependent training time complexity of SVMs usually prohibits their use in applications involving more than a few thousand data points. In this paper we propose a novel kernel-based incremental data clustering approach and its use for scaling non-linear Support Vector Machines to handle large data sets. The clustering method introduced can find cluster abstractions of the training data in a kernel-induced feature space. These cluster abstractions are then used for selective-sampling-based training of Support Vector Machines to reduce the training time without compromising the generalization performance. Experiments done with real-world datasets show that this approach gives good generalization performance at reasonable computational expense.
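One hedged sketch of incremental clustering driven by a kernel: a single-pass "leader" scheme where similarity is measured by an RBF kernel. This is a simplified stand-in for the authors' algorithm; the resulting cluster representatives are what one would then feed to an SVM trainer in place of the full data set.

```python
import numpy as np

def rbf(x, y, gamma=0.5):
    """Gaussian (RBF) kernel between two points."""
    return np.exp(-gamma * np.sum((x - y) ** 2))

def leader_cluster(X, threshold=0.8, gamma=0.5):
    """Single-pass leader clustering in the RBF-induced feature space:
    a point joins an existing cluster if its kernel similarity to that
    cluster's leader exceeds `threshold`; otherwise it becomes a new
    leader. Illustrative only, not the paper's method."""
    leaders, labels = [], []
    for x in X:
        sims = [rbf(x, l, gamma) for l in leaders]
        if sims and max(sims) >= threshold:
            labels.append(int(np.argmax(sims)))
        else:
            leaders.append(x)
            labels.append(len(leaders) - 1)
    return np.array(leaders), np.array(labels)
```

Because each point is compared only against the current leaders, the pass is cheap, and the leader set is typically far smaller than the training set.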
Abstract:
Automatic identification of software faults has enormous practical significance. This requires characterizing program execution behavior and the use of appropriate data mining techniques on the chosen representation. In this paper, we use the sequence of system calls to characterize program execution. The data mining tasks addressed are learning to map system call streams to fault labels and automatic identification of fault causes. Spectrum kernels and SVMs are used for the former, while latent semantic analysis is used for the latter. The techniques are demonstrated for the intrusion dataset containing system call traces. The results show that kernel techniques are as accurate as the best available results but are faster by orders of magnitude. We also show that latent semantic indexing is capable of revealing fault-specific features.
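The spectrum kernel itself is simple to state: the inner product of the k-mer count vectors of two sequences. A minimal sketch, where plain token sequences stand in for system-call streams:

```python
from collections import Counter

def spectrum_kernel(seq_a, seq_b, k=3):
    """k-spectrum kernel: inner product of k-mer count vectors.
    A 'sequence' is any string or list of system-call tokens."""
    def kmers(seq):
        return Counter(tuple(seq[i:i + k])
                       for i in range(len(seq) - k + 1))
    ca, cb = kmers(seq_a), kmers(seq_b)
    # Only k-mers present in both sequences contribute to the product.
    return sum(ca[m] * cb[m] for m in ca)
```

The kernel counts shared k-length subsequences, so two traces that share many short call patterns score high regardless of their overall lengths.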
Abstract:
In this paper, a new strategy for scaling burners based on "mild combustion" is evolved and adopted to scale a burner from 3 kW to 150 kW at a high heat release rate of 5 MW/m^3. Existing scaling methods (constant velocity, constant residence time, and Cole's procedure [Proc. Combust. Inst. 28 (2000) 1297]) are found to be inadequate for mild combustion burners. The constant velocity approach leads to reduced heat release rates at large sizes, and the constant residence time approach results in unacceptable levels of pressure drop across the system. To achieve mild combustion at high heat release rates at all scales, a modified approach with high recirculation is adopted in the present studies. Major geometrical dimensions are scaled as D ~ Q^(1/3) with an air injection velocity of ~100 m/s (Delta p ~ 600 mm water gauge). Using CFD support, the position of the air injection holes is selected to enhance the recirculation rates. The precise role of the secondary air is to increase the recirculation rates and burn up the residual CO in the downstream. Measurements of temperature and oxidizer concentrations inside the 3 kW and 150 kW burners and a jet flame are used to distinguish the combustion process in these burners. The burner can be used for a wide range of fuels, from LPG to producer gas as extremes. Up to 8 dB of noise level reduction is observed in comparison to the conventional combustion mode. Exhaust NO emissions below 26 and 3 ppm and temperatures of 1710 and 1520 K were measured for LPG and producer gas, respectively, when the burner is operated at stoichiometry. (c) 2004 The Combustion Institute. Published by Elsevier Inc. All rights reserved.
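The D ~ Q^(1/3) rule keeps the volumetric heat release rate Q/D^3 constant across scales. A one-line sketch, with the reference dimension below being an arbitrary placeholder rather than a value from the paper:

```python
def scaled_dimension(d_ref, q_ref, q_new):
    """Scale a linear burner dimension as D ~ Q^(1/3), so that the
    volumetric heat release rate Q / D^3 is preserved across scales.
    d_ref and q_ref are reference values (placeholders here)."""
    return d_ref * (q_new / q_ref) ** (1.0 / 3.0)
```

For the 3 kW to 150 kW scaling in the abstract, every major dimension grows by a factor of 50^(1/3), roughly 3.7.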
Abstract:
Microorganisms exist predominantly as sessile multispecies communities in natural habitats. Most bacterial species can form these matrix-enclosed microbial communities called biofilms. Biofilms occur in a wide range of environments, on every surface with sufficient moisture and nutrients, including surfaces in industrial settings and engineered water systems. This unwanted biofilm formation on equipment surfaces is called biofouling. Biofouling can significantly decrease equipment performance and lifetime and cause contamination and impaired quality of the industrial product. In this thesis we studied bacterial adherence to abiotic surfaces using coupons of stainless steel, either uncoated or coated with fluoropolymer or diamond-like carbon (DLC). As model organisms we used bacterial isolates from paper machines (Meiothermus silvanus, Pseudoxanthomonas taiwanensis and Deinococcus geothermalis) as well as a well-characterised species isolated from medical implants (Staphylococcus epidermidis). We found that coating the steel surface with these materials reduced its tendency towards biofouling: fluoropolymer and DLC coatings repelled all four biofilm formers on steel. We found great differences between bacterial species in their preference for surfaces to adhere to, as well as in their ultrastructural details, such as the number and thickness of the adhesion organelles they expressed. These details responded differently to the different surfaces the bacteria adhered to. We further found that biofilms of D. geothermalis formed on titanium dioxide coated coupons of glass, steel and titanium were effectively removed by photocatalytic action in response to irradiation at 360 nm. However, on non-coated glass or steel surfaces irradiation had no detectable effect on the amount of bacterial biomass. We showed that the adhesion organelles of bacteria on illuminated TiO2-coated coupons were completely destroyed, whereas on non-coated coupons they looked intact when observed by microscopy.
Stainless steel is the most widely used material for industrial process equipment and surfaces. The results in this thesis showed that stainless steel is prone to biofouling by phylogenetically distant bacterial species and that coating the steel may offer a tool for reducing biofouling of industrial equipment. Photocatalysis, on the other hand, is a potential technique for biofilm removal from surfaces in locations where a high level of hygiene is required. Our study of natural biofilms on barley kernel surfaces showed that there, too, the microbes possessed adhesion organelles visible with an electron microscope both before and after steeping. After steeping in water, which is the first step in malting, the microbial community of the dry barley kernels turned into a dense biofilm covered with slimy extracellular polymeric substance (EPS). We also presented evidence showing that certain strains of Lactobacillus plantarum and Wickerhamomyces anomalus, when used as starter cultures in the steeping water, could enter the barley kernel and colonise its tissues. By use of a starter culture it was possible to reduce the extensive production of EPS, which resulted in a faster filtration of the mash.
Abstract:
In this paper the problem of ignition and extinction has been formulated for the flow of a compressible fluid with Prandtl and Schmidt numbers taken as unity. In particular, the problems of (i) a jet impinging on a wall of combustible material and (ii) the opposed-jet diffusion flame have been studied. In the wall-jet case, three approximations to the momentum equation are studied, namely (i) potential flow, (ii) viscous incompressible flow with k = 1, and (iii) Lees' approximation (taking the pressure-gradient terms to be zero). It is shown that the predictions of the mass flow rates at extinction are not very sensitive to the approximations made in the momentum equation. The effects of varying the wall temperature in case (i) and the jet temperature in case (ii) on the extinction speeds have been studied. The effects of varying the activation energy and the free-stream oxidant concentration in case (ii) have also been investigated.
Abstract:
A key trait of Free and Open Source Software (FOSS) development is its distributed nature. Nevertheless, two project-level operations, the fork and the merge of program code, are among the least well understood events in the lifespan of a FOSS project. Some projects have explicitly adopted these operations as the primary means of concurrent development. In this study, we examine the effect of highly distributed software development, as found in the Linux kernel project, on the collection and modelling of software development data. We find that distributed development calls for sophisticated temporal modelling techniques where several versions of the source code tree can exist at once. Attention must be turned towards the methods of quality assurance and peer review that projects employ to manage these parallel source trees. Our analysis indicates that two new metrics, fork rate and merge rate, could be useful for determining the role of distributed version control systems in FOSS projects. The study presents a preliminary data set consisting of version control and mailing list data.
Abstract:
According to certain arguments, computation is observer-relative either in the sense that many physical systems implement many computations (Hilary Putnam), or in the sense that almost all physical systems implement all computations (John Searle). If sound, these arguments have a potentially devastating consequence for the computational theory of mind: if arbitrary physical systems can be seen to implement arbitrary computations, the notion of computation seems to lose all explanatory power as far as brains and minds are concerned. David Chalmers and B. Jack Copeland have attempted to counter these relativist arguments by placing certain constraints on the definition of implementation. In this thesis, I examine their proposals and find both wanting in some respects. During the course of this examination, I give a formal definition of the class of combinatorial-state automata, upon which Chalmers's account of implementation is based. I show that this definition implies two theorems (one an observation due to Curtis Brown) concerning the computational power of combinatorial-state automata, theorems which speak against founding the theory of implementation upon this formalism. Toward the end of the thesis, I sketch a definition of the implementation of Turing machines in dynamical systems, and offer this as an alternative to Chalmers's and Copeland's accounts of implementation. I demonstrate that the definition does not imply Searle's claim for the universal implementation of computations. However, the definition may support claims that are weaker than Searle's, yet still troubling to the computationalist. There remains a kernel of relativity in implementation at any rate, since the interpretation of physical systems seems itself to be an observer-relative matter, to some degree at least. This observation helps clarify the role the notion of computation can play in cognitive science.
Specifically, I will argue that the notion should be conceived as an instrumental rather than as a fundamental or foundational one.
Abstract:
Statistical learning algorithms provide a viable framework for geotechnical engineering modeling. This paper describes two statistical learning algorithms applied to site characterization modeling based on standard penetration test (SPT) data. More than 2700 field SPT values (N) have been collected from 766 boreholes spread over an area of 220 sq. km in Bangalore. The N values have been corrected (N_c) for different parameters such as overburden stress, size of borehole, type of sampler, length of connecting rod, etc. In the three-dimensional site characterization model, the function N_c = N_c(X, Y, Z), where X, Y and Z are the coordinates of the point corresponding to an N_c value, is approximated, from which the N_c value at any half-space point in Bangalore can be determined. The first algorithm uses the least-square support vector machine (LSSVM), which is related to a ridge regression type of support vector machine. The second algorithm uses the relevance vector machine (RVM), which combines the strengths of kernel-based methods and Bayesian theory to establish the relationships between a set of input vectors and a desired output. The paper also presents a comparative study between the developed LSSVM and RVM models for site characterization. Copyright (C) 2009 John Wiley & Sons, Ltd.
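LSSVM is closely related to kernel ridge regression, so the fit N_c = N_c(X, Y, Z) can be sketched as below. The Gaussian kernel, the hyperparameters, and the synthetic data in the usage are all illustrative stand-ins for the borehole records, not the paper's settings:

```python
import numpy as np

def kernel_ridge_fit_predict(X_train, y_train, X_test,
                             gamma=0.5, lam=1e-3):
    """Kernel ridge regression (closely related to the LS-SVM used in
    the paper) mapping coordinates (X, Y, Z) to corrected SPT values
    N_c. Gaussian kernel; gamma and lam are illustrative settings."""
    def gram(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)
    K = gram(X_train, X_train)
    # Solve (K + lam*I) alpha = y for the dual coefficients.
    alpha = np.linalg.solve(K + lam * np.eye(len(X_train)), y_train)
    return gram(X_test, X_train) @ alpha
```

With a small regularizer the model nearly interpolates the training boreholes, and predictions at new (X, Y, Z) points are smooth kernel-weighted combinations of the training values.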
First simultaneous measurement of the top quark mass in the lepton+jets and dilepton channels at CDF
Abstract:
We present a measurement of the mass of the top quark using data corresponding to an integrated luminosity of 1.9 fb^-1 of ppbar collisions collected at sqrt(s) = 1.96 TeV with the CDF II detector at Fermilab's Tevatron. This is the first measurement of the top quark mass using top-antitop pair candidate events in the lepton + jets and dilepton decay channels simultaneously. We reconstruct two observables in each channel and use a non-parametric kernel density estimation technique to derive two-dimensional probability density functions from simulated signal and background samples. The observables are the top quark mass and the invariant mass of two jets from the W decay in the lepton + jets channel, and the top quark mass and the scalar sum of transverse energy of the event in the dilepton channel. We perform a simultaneous fit for the top quark mass and the jet energy scale, which is constrained in situ by the hadronic W boson mass. Using 332 lepton + jets candidate events and 144 dilepton candidate events, we measure the top quark mass to be mtop = 171.9 +/- 1.7 (stat. + JES) +/- 1.1 (syst.) GeV/c^2 = 171.9 +/- 2.0 GeV/c^2.
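Kernel density estimation is the generic non-parametric ingredient here: each simulated event contributes a smooth bump, and the sum approximates the two-dimensional probability density of the pair of observables. A hedged numpy sketch with a diagonal Gaussian kernel and an illustrative fixed bandwidth (not the analysis code):

```python
import numpy as np

def kde2d(samples, points, bandwidth=1.0):
    """Two-dimensional Gaussian kernel density estimate: average of
    isotropic Gaussian bumps centred on the (N, 2) sample array,
    evaluated at the (P, 2) query points. Bandwidth is illustrative."""
    d2 = ((points[:, None, :] - samples[None, :, :]) ** 2).sum(-1)
    norm = len(samples) * 2.0 * np.pi * bandwidth ** 2
    return np.exp(-0.5 * d2 / bandwidth ** 2).sum(axis=1) / norm
```

In the measurement the samples would be simulated (m_top, m_jj) or (m_top, H_T) pairs; here any 2-D point cloud works.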
Abstract:
A knowledge of the concentration distribution around a burning droplet is essential if accurate estimates are to be made of the transport coefficients in that region, which influence the burning rate. There are two aspects to this paper: (1) determination of the concentration profiles, using the simple assumption of constant binary diffusion coefficients for all species, and comparison with experiments; and (2) postulation of a new relation for the thermal conductivity, which takes into account the variations of both temperature and the concentrations of various species. First, the theoretical concentration profiles are evaluated and compared with experimental results reported elsewhere [5]. It is found that the agreement between theory and experiment is fairly satisfactory. Then, by the use of these profiles and the relations proposed in the literature for the thermal conductivity of a mixture of nonpolar gases, a new relation for the thermal conductivity, K = (A1 + B1 T) + (A2 + B2 T) x_r, is suggested for analytical solutions of droplet combustion problems. Equations are presented to evaluate A1, A2, B1 and B2, and values of these terms for a few hydrocarbons are tabulated.
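The proposed relation is linear in T and in the concentration variable, so evaluating it is a one-liner. The coefficient values must come from the paper's tables; any numbers used in an example are placeholders:

```python
def thermal_conductivity(t, x_r, a1, b1, a2, b2):
    """Evaluate the proposed mixture conductivity
    K = (A1 + B1*T) + (A2 + B2*T) * x_r,
    linear in temperature T and in the concentration variable x_r.
    Coefficients a1, b1, a2, b2 are tabulated in the paper per fuel;
    the values used in any example here are placeholders."""
    return (a1 + b1 * t) + (a2 + b2 * t) * x_r
```

Setting x_r = 0 recovers the pure first-bracket conductivity A1 + B1 T, and increasing x_r shifts K by the second, species-dependent bracket.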