Abstract:
The object of the research presented here is Vessiot's theory of partial differential equations: for a given differential equation one constructs a distribution that is both tangential to the differential equation and contained within the contact distribution of the jet bundle. Within it, one then seeks n-dimensional subdistributions transversal to the base manifold, the integral distributions. These consist of integral elements, which in turn are adapted so that they form a subdistribution closed under the Lie bracket. Such a subdistribution is called a flat Vessiot connection. Solutions of the differential equation may be regarded as integral manifolds of these distributions. In the first part of the thesis, I give a survey of the present state of the formal theory of partial differential equations: one regards differential equations as fibred submanifolds of a suitable jet bundle and uses formal integrability and the stronger notion of involutivity to analyze their solvability. An arbitrary system may (locally) be represented in reduced Cartan normal form, which leads to a natural description of its geometric symbol. The Vessiot distribution can then be split into the direct sum of the symbol and a horizontal complement (which is not unique). The n-dimensional subdistributions that close under the Lie bracket and are transversal to the base manifold are the sought tangential approximations to the solutions of the differential equation. Their existence can be shown by analyzing the structure equations, which places Vessiot's theory on a rigorous foundation. Furthermore, the relation between Vessiot's approach and the crucial notions of the formal theory (such as formal integrability and involutivity of differential equations) is clarified, and the possible obstructions to involution of a differential equation are deduced explicitly.
In the second part of the thesis it is shown that Vessiot's step-by-step construction of the desired distributions succeeds if, and only if, the given system is involutive. First, an existence theorem for integral distributions is proven; then an existence theorem for flat Vessiot connections is shown. The differential-geometric structure of the underlying systems is analyzed and simplified compared with other approaches, in particular the structure equations used in the proofs of the existence theorems: here they reduce to a set of linear equations and an involutive system of differential equations. The definition of integral elements given here links Vessiot theory with the dual Cartan-Kähler theory of exterior systems. The analysis of the structure equations not only yields theoretical insight but also produces an algorithm that explicitly derives the coefficients of the vector fields spanning the integral distributions. This makes it possible to implement the algorithm in the computer algebra system MuPAD.
Calculation of the hyperfine structure transition energy and lifetime in the one-electron Bi^82+ ion
Abstract:
We calculate the energy and lifetime of the ground-state hyperfine structure transition in one-electron Bi^82+. The influence of various distributions of the magnetic moment and the electric charge in the nucleus ^209_83 Bi on the energy and lifetime is studied.
Abstract:
We present distribution-independent bounds on the generalization misclassification performance of a family of kernel classifiers with margin. Support Vector Machine (SVM) classifiers stem from this class of machines. The bounds are derived through computations of the $V_\gamma$ dimension of a family of loss functions to which the SVM loss belongs. Bounds that use functions of margin distributions (i.e. functions of the slack variables of the SVM) are also derived.
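To make the margin quantities in this abstract concrete, here is a minimal sketch of the SVM slack variables the bounds are expressed through; the decision values and labels below are invented for illustration and are not taken from the paper:

```python
import numpy as np

def hinge_slacks(scores, labels):
    """Slack variables xi_i = max(0, 1 - y_i * f(x_i)) of a margin classifier:
    xi = 0 means classified with full margin, 0 < xi < 1 means correct but
    inside the margin, and xi >= 1 means misclassified."""
    return np.maximum(0.0, 1.0 - labels * scores)

# hypothetical decision values f(x_i) and true labels y_i in {-1, +1}
scores = np.array([2.0, 0.5, -0.3])
labels = np.array([1.0, 1.0, 1.0])
xi = hinge_slacks(scores, labels)
```

Bounds of the kind described are then functions of the distribution of these xi values rather than of the data distribution itself.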
Abstract:
Low concentrations of elements in geochemical analyses have the peculiarity of being compositional data and, for a given level of significance, are likely to be below what laboratories can distinguish from complete absence, preventing laboratories from reporting extremely low concentrations of the analyte. Instead, what is reported is the detection limit, the minimum concentration that conclusively differentiates between presence and absence of the element. A spatially distributed exhaustive sample is employed in this study to generate unbiased sub-samples, which are further censored to observe the effect that different detection limits and sample sizes have on the inference of population distributions from geochemical analyses containing specimens below the detection limit (nondetects). The isometric logratio transformation is used to convert the compositional data in the simplex to samples in real space, allowing the practitioner to borrow properly from the large body of statistical techniques valid only in real space. The bootstrap method is used to investigate numerically the reliability of inferring several distributional parameters under different forms of imputation for the censored data. The case study illustrates that, in general, the best results are obtained when imputations are made using the distribution that best fits the readings above the detection limit, and it exposes the problems of other, more widely used practices. When the sample is spatially correlated, it is necessary to combine the bootstrap with stochastic simulation.
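The isometric logratio step for the simplest case, a two-part composition (analyte versus remainder), combined with a bootstrap in the transformed space, can be sketched as follows; the concentrations are invented for illustration and are not the case-study data:

```python
import numpy as np

rng = np.random.default_rng(0)

def ilr_2part(x1, x2):
    """Isometric logratio of a two-part composition (x1, x2): maps the
    simplex to the real line, where standard statistical techniques apply."""
    return np.log(x1 / x2) / np.sqrt(2.0)

# hypothetical concentrations in ppm; the complement closes each composition to 1e6 ppm
conc = np.array([12.0, 5.0, 30.0, 8.0, 21.0])
z = ilr_2part(conc, 1e6 - conc)

# bootstrap a 95% interval for the mean in ilr space
boot = [rng.choice(z, size=z.size, replace=True).mean() for _ in range(2000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
```

Imputation of nondetects would happen before the transform; the point here is only that inference (the bootstrap) is carried out on the real-space coordinates, not on the raw concentrations.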
Abstract:
AEA Technology has provided an assessment of the probability of α-mode containment failure for the Sizewell B PWR. After a preliminary review of the methodologies available it was decided to use the probabilistic approach described in the paper, based on an extension of the methodology developed by Theofanous et al. (Nucl. Sci. Eng. 97 (1987) 259–325). The input to the assessment is 12 probability distributions; the bases for the quantification of these distributions are discussed. The α-mode assessment performed for the Sizewell B PWR has demonstrated the practicality of the event-tree method with input data represented by probability distributions. The assessment itself has drawn attention to a number of topics, which may be plant and sequence dependent, and has indicated the importance of melt relocation scenarios. The α-mode failure probability following an accident that leads to core melt relocation to the lower head for the Sizewell B PWR has been assessed as a few parts in 10 000, on the basis of current information. This assessment has been the first to consider elevated pressures (6 MPa and 15 MPa) besides atmospheric pressure, but the results suggest only a modest sensitivity to system pressure.
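The event-tree-with-distributions approach can be sketched in a few lines of Monte Carlo; the beta distributions and the three-branch path below are invented stand-ins for illustration, not the twelve distributions quantified in the assessment:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# invented stand-ins for three of the input distributions: probability of melt
# relocation to the lower head, of an energetic fuel-coolant interaction given
# relocation, and of containment failure given such an event
p_reloc = rng.beta(2, 8, N)
p_fci = rng.beta(1, 20, N)
p_fail = rng.beta(1, 50, N)

# failure probability along this single event-tree path, sample by sample,
# giving a distribution (not a point value) for the path probability
p_alpha = p_reloc * p_fci * p_fail
mean_p, p95 = p_alpha.mean(), np.percentile(p_alpha, 95)
```

A full assessment combines many such paths and conditions the branch distributions on system pressure; the sketch shows only how distributional inputs propagate to a distributional output.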
Abstract:
There has been a paucity of information on trends in daily climate and climate extremes, especially from developing countries. We report the results of the analysis of daily temperature (maximum and minimum) and precipitation data from 14 south and west African countries over the period 1961-2000. Data were subject to quality control and processing into indices of climate extremes for release to the global community. Temperature extremes show patterns consistent with warming over most of the regions analyzed, with a large proportion of stations showing statistically significant trends for all temperature indices. Over 1961 to 2000, the regionally averaged occurrence of extreme cold (fifth percentile) days and nights has decreased by 3.7 and 6.0 days/decade, respectively. Over the same period, the occurrence of extreme hot (95th percentile) days and nights has increased by 8.2 and 8.6 days/decade, respectively. The average duration of warm (cold) spells has increased (decreased) by 2.4 (0.5) days/decade. Overall, it appears that the hot tails of the distributions of daily maximum temperature have changed more than the cold tails; for minimum temperatures, hot tails show greater changes in the NW of the region, while cold tails have changed more in the SE and east. The diurnal temperature range (DTR) does not exhibit a consistent trend across the region, with many neighboring stations showing opposite trends. However, the DTR shows consistent increases in a zone across Namibia, Botswana, Zambia, and Mozambique, coinciding with more rapid increases in maximum temperature than minimum temperature extremes. Most precipitation indices do not exhibit consistent or statistically significant trends across the region. Regionally averaged total precipitation has decreased, but the trend is not statistically significant. At the same time, there has been a statistically significant increase in regionally averaged daily rainfall intensity and dry spell duration.
While the majority of stations also show increasing trends for these two indices, only a few of these are statistically significant. There are increasing trends in regionally averaged rainfall on extreme precipitation days and in maximum annual 5-day and 1-day rainfall, but only trends for the latter are statistically significant.
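A percentile-based extreme-temperature index of the kind counted above can be sketched as follows; the data are synthetic, not the station records analyzed in the study:

```python
import numpy as np

def hot_days_index(tmax, base):
    """Number of days on which Tmax exceeds the 95th percentile of a base
    period - the style of index behind the 'extreme hot days' counts."""
    return int((tmax > np.percentile(base, 95)).sum())

rng = np.random.default_rng(1)
base = rng.normal(25.0, 5.0, size=365 * 30)   # synthetic multi-decade base period
year = rng.normal(27.0, 5.0, size=365)        # a synthetic warmer year
n_hot = hot_days_index(year, base)
```

By construction about 5% of base-period days exceed the threshold, so a trend in the index directly measures how the hot tail of the distribution has shifted relative to the base climate.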
Abstract:
Two simple and frequently used capture-recapture estimates of the population size are compared: Chao's lower-bound estimate and Zelterman's estimate allowing for contaminated distributions. In the Poisson case it is shown that if there are only counts of ones and twos, the estimator of Zelterman is always bounded above by Chao's estimator. If counts larger than two exist, the estimator of Zelterman becomes larger than Chao's whenever the ratio of the frequencies of counts of twos to counts of ones is small enough. A similar analysis is provided for the binomial case. For a two-component mixture of Poisson distributions the asymptotic bias of both estimators is derived, and it is shown that the Zelterman estimator can suffer from a large overestimation bias. A modified Zelterman estimator is suggested, and the bias-corrected version of Chao's estimator is also considered. All four estimators are compared in a simulation study.
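For reference, both estimators have simple closed forms in the Poisson case; a sketch with invented frequency counts (f1 singletons and f2 doubletons among n observed units):

```python
import math

def chao(n, f1, f2):
    """Chao's lower-bound estimator: observed units plus f1^2 / (2 * f2)."""
    return n + f1 ** 2 / (2.0 * f2)

def zelterman(n, f1, f2):
    """Zelterman's estimator: Poisson rate estimated from the ones and twos
    only (lambda = 2 * f2 / f1), then a Horvitz-Thompson type correction for
    the unobserved zero counts."""
    lam = 2.0 * f2 / f1
    return n / (1.0 - math.exp(-lam))

# invented frequency-of-frequencies data, not from the paper's simulation study
n, f1, f2 = 100, 60, 25
n_chao, n_zelt = chao(n, f1, f2), zelterman(n, f1, f2)
```

Because Zelterman's rate estimate uses only f1 and f2, it is robust to contamination in the higher counts, which is exactly the trade-off the comparison above analyzes.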
Abstract:
The concept of an organism's niche is central to ecological theory, but an operational definition is needed that allows both its experimental delineation and the interpretation of field distributions of the species. Here we use population growth rate (hereafter, pgr) to define the niche as the set of points in niche space where pgr > 0. If there are just two axes to the niche space, their relationship to pgr can be pictured as a contour map in which pgr varies along the axes in the same way that the height of land above sea level varies with latitude and longitude. In laboratory experiments we measured the pgr of Daphnia magna over a grid of values of pH and Ca2+, and so defined its "laboratory niche" in pH-Ca2+ space. The position of the laboratory niche boundary suggests that population persistence is only possible above 0.5 mg Ca2+/L and between pH 5.75 and pH 9, though more Ca2+ is needed at lower pH values. To see how well the measured niche predicts the field distribution of D. magna, we examined relevant field data from 422 sites in England and Wales. Of the 58 colonized water bodies, 56 lay within the laboratory niche. Very few of the sites near the niche boundary were colonized, probably because pgr there is so low that populations are vulnerable to extinction by other factors. Our study shows how the niche can be quantified and used to predict field distributions successfully.
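The pgr-contour picture of the niche translates directly into code. The response surface below is a toy stand-in, loosely shaped like the laboratory niche reported above (persistence roughly requires Ca2+ above 0.5 mg/L and pH between about 5.75 and 9), and is not the authors' fitted model:

```python
import numpy as np

def pgr(ph, ca):
    """Toy population growth rate over (pH, Ca2+ in mg/L): positive inside an
    invented niche-shaped region and negative outside, with the Ca requirement
    rising as the pH term eats into the growth budget."""
    return 1.0 - ((ph - 7.4) / 1.7) ** 2 - 0.5 / np.maximum(ca, 1e-9)

# map the niche: grid the two axes and keep the points where pgr > 0
PH, CA = np.meshgrid(np.linspace(4, 10, 61), np.linspace(0.1, 5, 50))
inside_niche = pgr(PH, CA) > 0
```

The boundary of `inside_niche` is the pgr = 0 contour; in the study the analogous contour is estimated from the experimental pgr measurements over the pH-Ca2+ grid.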
Abstract:
The Cape Floristic Region is exceptionally species-rich both for its area and latitude, and this diversity is highly unevenly distributed among genera. The modern flora is hypothesized to result largely from recent (post-Oligocene) speciation, and it has long been speculated that particular species-poor lineages pre-date this burst of speciation. Here, we employ molecular phylogenetic data in combination with fossil calibrations to estimate the minimum duration of Cape occupation by 14 unrelated putative relicts. Estimates vary widely between lineages (7-101 Myr ago), and when compared with the estimated timing of the onset of the modern flora's radiation, it is clear that many, but possibly not all, of these lineages pre-date its establishment. Statistical comparisons of diversities with lineage age show that the low species diversity of many of the putative relicts results from a lower rate of diversification than in dated Cape radiations. For other putative relicts, however, we cannot reject the possibility that they diversify at the same underlying rate as the radiations, but have been present in the Cape for insufficient time to accumulate higher diversity. Although the extremes in diversity of currently dated Cape lineages fall outside expectations under a single underlying diversification rate, sampling of all Cape lineages would be required to reject this null hypothesis.
Abstract:
Estimation of a population size by means of capture-recapture techniques is an important problem occurring in many areas of the life and social sciences. We consider the frequencies-of-frequencies situation, where a count variable is used to summarize how often a unit has been identified in the target population of interest. The distribution of this count variable is zero-truncated, since zero identifications do not occur in the sample. As an application we consider the surveillance of scrapie in Great Britain. In this case study, holdings with scrapie that are not identified (zero counts) do not enter the surveillance database. The count variable of interest is the number of scrapie cases per holding. For count distributions a common model is the Poisson distribution and, to adjust for potential heterogeneity, a discrete mixture of Poisson distributions is used. Mixtures of Poissons usually provide an excellent fit, as will be demonstrated in the application of interest. However, as has recently been demonstrated, mixtures also suffer from the so-called boundary problem, resulting in overestimation of population size. It is suggested here to select the mixture model on the basis of the Bayesian Information Criterion. This strategy is further refined by employing a bagging procedure leading to a series of estimates of population size. Using the median of this series, highly influential size estimates are avoided. In limited simulation studies it is shown that the procedure leads to estimates with remarkably small bias.
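The zero-truncated Poisson backbone of such models fits in a few lines. This is a sketch under the single-Poisson simplification (no mixture, no BIC selection, no bagging), with invented inputs:

```python
import math

def fit_ztpoisson(xbar, iters=200):
    """MLE of lambda for a zero-truncated Poisson, via the fixed point
    lambda = xbar * (1 - exp(-lambda)), where xbar (> 1) is the mean of the
    observed positive counts."""
    lam = xbar
    for _ in range(iters):
        lam = xbar * (1.0 - math.exp(-lam))
    return lam

def horvitz_thompson(n, lam):
    """Population size estimate: each of the n observed units is seen with
    probability 1 - exp(-lambda), so the total is n over that probability."""
    return n / (1.0 - math.exp(-lam))

lam = fit_ztpoisson(2.0)            # invented mean count among observed holdings
N_hat = horvitz_thompson(100, lam)  # invented count of 100 observed holdings
```

A mixture model replaces the single lambda with several components; the boundary problem discussed above arises when a fitted component drifts toward lambda = 0, inflating the 1/(1 - exp(-lambda)) correction.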
Abstract:
This paper investigates how sequential bilingual (L2) Turkish-English children comprehend English reflexives and pronouns and tests whether they pattern similarly to monolingual (L1) children, L2 adults, or children with Specific Language Impairment (SLI). Thirty-nine 6- to 9-year-old L2 children with an age of onset of 30-48 months and exposure to English of 30-72 months and 33 L1 age-matched control children completed the Advanced Syntactic Test of Pronominal Reference-Revised (van der Lely, 1997). The L2 children’s performance was compared to L2 adults from Demirci (2001) and children with SLI from van der Lely & Stollwerck (1997). The L2 children’s performance in the comprehension of reflexives was almost identical to that of their age-matched controls, and differed from L2 adults and children with SLI. In the comprehension of pronouns, L2 children showed an asymmetry between referential and quantificational NPs, a pattern attested in younger L1 children and children with SLI. Our study provides evidence that the development of comprehension of reflexives and pronouns in these children resembles monolingual L1 acquisition and not adult L2 acquisition or the acquisition of children with SLI.
Abstract:
The mean state, variability and extreme variability of the stratospheric polar vortices, with an emphasis on the Northern Hemisphere vortex, are examined using 2-dimensional moment analysis and Extreme Value Theory (EVT). The use of moments as an analysis tool gives rise to information about the vortex area, centroid latitude, aspect ratio and kurtosis. The application of EVT to these moment-derived quantities allows the extreme variability of the vortex to be assessed. The data used for this study are ECMWF ERA-40 potential vorticity fields on interpolated isentropic surfaces that range from 450K-1450K. Analyses show that extreme vortex variability occurs most commonly in late January and early February, consistent with when most planetary wave driving from the troposphere is observed. Composites around sudden stratospheric warming (SSW) events reveal that the moment diagnostics evolve in statistically different ways between vortex splitting events and vortex displacement events, in contrast to the traditional diagnostics. Histograms of the vortex diagnostics on the 850K (∼10 hPa) surface over the 1958-2001 period are fitted with parametric distributions, and show that SSW events comprise the majority of data in the tails of the distributions. The distribution of each diagnostic is computed on various surfaces throughout the depth of the stratosphere, and shows that in general the vortex becomes more circular, with higher filamentation, at the upper levels. The Northern Hemisphere (NH) and Southern Hemisphere (SH) vortices are also compared through the analysis of their respective vortex diagnostics, confirming that the SH vortex is less variable and lacks extreme events compared to the NH vortex. Finally, extreme value theory is used to statistically model the vortex diagnostics and make inferences about the underlying dynamics of the polar vortices.
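The 2-D moment diagnostics have a compact numerical expression. Below is a sketch on a synthetic field (not the ERA-40 potential vorticity data), with the centroid taken from the first moments and the aspect ratio from the eigenvalues of the weighted second-moment matrix:

```python
import numpy as np

def vortex_moments(q, X, Y):
    """Moment diagnostics of a non-negative 2-D field q on coordinate grids
    X, Y: returns the centroid and the aspect ratio sqrt(lmax / lmin) of the
    q-weighted second-moment (covariance) matrix."""
    w = q / q.sum()
    cx, cy = (w * X).sum(), (w * Y).sum()
    dx, dy = X - cx, Y - cy
    cov = np.array([[(w * dx * dx).sum(), (w * dx * dy).sum()],
                    [(w * dx * dy).sum(), (w * dy * dy).sum()]])
    lmin, lmax = np.linalg.eigvalsh(cov)   # eigenvalues in ascending order
    return (cx, cy), float(np.sqrt(lmax / lmin))

# synthetic elongated "vortex": twice as wide in x as in y
x = np.linspace(-10, 10, 201)
X, Y = np.meshgrid(x, x)
q = np.exp(-(X / 2.0) ** 2 - Y ** 2)
(cx, cy), aspect = vortex_moments(q, X, Y)
```

On a sphere the study works with equivalent-latitude coordinates and elliptical diagnostics, but the underlying computation is of this weighted-moment form; an aspect ratio near 1 indicates a circular vortex, and large excursions flag splitting events.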
Abstract:
It is generally assumed that the variability of neuronal morphology has an important effect on both the connectivity and the activity of the nervous system, but this effect has not been thoroughly investigated. Neuroanatomical archives represent a crucial tool to explore structure-function relationships in the brain. We are developing computational tools to describe, generate, store and render large sets of three-dimensional neuronal structures in a format that is compact, quantitative, accurate and readily accessible to the neuroscientist. Single-cell neuroanatomy can be characterized quantitatively at several levels. In computer-aided neuronal tracing files, a dendritic tree is described as a series of cylinders, each represented by diameter, spatial coordinates and the connectivity to other cylinders in the tree. This ‘Cartesian’ description constitutes a completely accurate mapping of dendritic morphology but it bears little intuitive information for the neuroscientist. In contrast, a classical neuroanatomical analysis characterizes neuronal dendrites on the basis of the statistical distributions of morphological parameters, e.g. maximum branching order or bifurcation asymmetry. This description is intuitively more accessible, but it only yields information on the collective anatomy of a group of dendrites, i.e. it is not complete enough to provide a precise ‘blueprint’ of the original data. We are adopting a third, intermediate level of description, which consists of the algorithmic generation of neuronal structures within a certain morphological class based on a set of ‘fundamental’, measured parameters. This description is as intuitive as a classical neuroanatomical analysis (parameters have an intuitive interpretation), and as complete as a Cartesian file (the algorithms generate and display complete neurons). The advantages of the algorithmic description of neuronal structure are immense.
If an algorithm can measure the values of a handful of parameters from an experimental database and generate virtual neurons whose anatomy is statistically indistinguishable from that of their real counterparts, a great deal of data compression and amplification can be achieved. Data compression results from the quantitative and complete description of thousands of neurons with a handful of statistical distributions of parameters. Data amplification is possible because, from a set of experimental neurons, many more virtual analogues can be generated. This approach could allow one, in principle, to create and store a neuroanatomical database containing data for an entire human brain in a personal computer. We are using two programs, L-NEURON and ARBORVITAE, to investigate systematically the potential of several different algorithms for the generation of virtual neurons. Using these programs, we have generated anatomically plausible virtual neurons for several morphological classes, including guinea pig cerebellar Purkinje cells and cat spinal cord motor neurons. These virtual neurons are stored in an online electronic archive of dendritic morphology. This process highlights the potential and the limitations of the ‘computational neuroanatomy’ strategy for neuroscience databases.
Abstract:
View-based and Cartesian representations provide rival accounts of visual navigation in humans, and here we explore possible models for the view-based case. A visual “homing” experiment was undertaken by human participants in immersive virtual reality. The distributions of end-point errors on the ground plane differed significantly in shape and extent depending on the visual landmark configuration and the relative goal location. A model based on simple visual cues captures important characteristics of these distributions. Augmenting the visual features to include 3D elements such as stereo and motion parallax results in a set of models that describe the data accurately, demonstrating the effectiveness of a view-based approach.