996 results for Statistical Convergence


Relevance: 20.00%

Abstract:

A central composite rotatable experimental design was constructed for a statistical study of the ethylation of benzene in the liquid phase, with aluminum chloride catalyst, in an agitated tank system. The conversion of benzene and ethylene and the yields of monoethyl- and diethylbenzene are characterized by the response surface technique. In the experimental range studied, agitation rate has no significant effect. Catalyst concentration, rate of ethylene flow, and temperature are the influential factors. The response surfaces may be adequately approximated by planes.
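Since the fitted surfaces are planar, the analysis reduces to an ordinary least-squares fit of a first-order model in the coded factors. A minimal sketch follows; the factor codings, runs, and responses below are invented for illustration and are not the paper's measurements:

```python
# Illustrative sketch: fitting a planar (first-order) response surface
# to central-composite-design data with ordinary least squares.
# All data here are hypothetical, not from the paper.
import numpy as np

# Coded levels for the three influential factors:
# x1 = catalyst concentration, x2 = ethylene flow rate, x3 = temperature
X = np.array([
    [-1, -1, -1], [ 1, -1, -1], [-1,  1, -1], [ 1,  1, -1],
    [-1, -1,  1], [ 1, -1,  1], [-1,  1,  1], [ 1,  1,  1],
    [ 0,  0,  0], [ 0,  0,  0],                 # center points
])
y = np.array([62., 71., 68., 80., 65., 74., 72., 84., 73., 72.])  # e.g. % yield

# Design matrix with an intercept column; the fitted model is the plane
# y = b0 + b1*x1 + b2*x2 + b3*x3
A = np.column_stack([np.ones(len(X)), X])
coef, residuals, rank, sv = np.linalg.lstsq(A, y, rcond=None)
print("b0, b1, b2, b3 =", np.round(coef, 3))
```

The signs and magnitudes of b1, b2 and b3 then indicate how each influential factor tilts the plane.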

Relevance: 20.00%

Abstract:

It is well known that the numerical accuracy of a series solution to a boundary-value problem by the direct method depends on the technique of approximate satisfaction of the boundary conditions and on the stage of truncation of the series. On the other hand, it does not appear to be generally recognized that, when the boundary conditions can be described in alternative equivalent forms, the convergence of the solution is significantly affected by the actual form in which they are stated. The importance of the last aspect is studied for three different techniques of computing the deflections of simply supported regular polygonal plates under uniform pressure. It is also shown that it is sometimes possible to modify the technique of analysis to make the accuracy independent of the description of the boundary conditions.
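A standard example of such alternative equivalent forms, for a simply supported plate with straight edges (consistent with, though not necessarily identical to, the formulations compared in the paper): because w = 0 along a straight edge forces the tangential curvature to vanish there, the moment condition can be stated either in full or in its reduced Laplacian form:

```latex
% Simply supported straight edge of a thin plate (flexural rigidity D,
% Poisson ratio \nu; n = normal, s = tangent along the edge).
% Since w = 0 along the edge implies \partial^2 w/\partial s^2 = 0 there,
% the two statements below are equivalent, yet a truncated series may
% converge differently depending on which form is imposed.
w = 0,\quad
M_n = -D\!\left(\frac{\partial^2 w}{\partial n^2}
      + \nu\,\frac{\partial^2 w}{\partial s^2}\right) = 0
\qquad\Longleftrightarrow\qquad
w = 0,\quad \nabla^2 w = 0 .
```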

Relevance: 20.00%

Abstract:

Two algorithms are outlined, each of which has interesting features for modeling the spatial variability of rock depth. In this paper, the reduced level of rock at Bangalore, India, is derived from data on 652 boreholes in an area covering 220 sq. km. Support vector machine (SVM) and relevance vector machine (RVM) models have been utilized to predict the reduced level of rock in the subsurface of Bangalore and to study the spatial variability of the rock depth. The SVM, which is firmly grounded in statistical learning theory, performs regression using an epsilon-insensitive loss function. The RVM is a probabilistic model similar to the widespread SVM, but one whose training takes place in a Bayesian framework. Prediction results show the ability of these learning machines to build accurate models of the spatial variability of rock depth with strong predictive capabilities. The paper also highlights the advantage of the RVM over the SVM model.
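As a sketch of the SVM side of this setup, epsilon-insensitive support vector regression can be fit on borehole coordinates to interpolate rock level. Everything below (coordinates, depths, hyperparameters) is synthetic and only illustrates the technique, not the paper's actual model:

```python
# Hedged sketch: epsilon-insensitive SVM regression of rock depth from
# borehole coordinates. The coordinates and depths are synthetic
# placeholders, not the Bangalore borehole data.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
XY = rng.uniform(0, 15_000, size=(652, 2))                      # easting/northing, m
depth = 10 + 0.001 * XY.sum(axis=1) + rng.normal(0, 1.5, 652)   # toy rock depths, m

model = make_pipeline(StandardScaler(),
                      SVR(kernel="rbf", C=10.0, epsilon=0.5))
model.fit(XY, depth)
print("predicted depth at (5000, 5000):",
      model.predict([[5000.0, 5000.0]])[0])
```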

Relevance: 20.00%

Abstract:

The core aim of machine learning is to make a computer program learn from experience. Learning from data is usually defined as the task of learning regularities or patterns in data in order to extract useful information, or to learn the underlying concept. An important sub-field of machine learning is multi-view learning, where the task is to learn from multiple data sets or views describing the same underlying concept. A typical example of such a scenario would be to study a biological concept using several biological measurements like gene expression, protein expression and metabolic profiles, or to classify web pages based on their content and the contents of their hyperlinks. In this thesis, novel problem formulations and methods for multi-view learning are presented. The contributions include a linear data fusion approach during exploratory data analysis, a new measure to evaluate different kinds of representations for textual data, and an extension of multi-view learning to novel scenarios where the correspondence of samples in the different views or data sets is not known in advance. In order to infer the one-to-one correspondence of samples between two views, a novel concept of multi-view matching is proposed. The matching algorithm is completely data-driven and is demonstrated in several applications such as matching of metabolites between humans and mice, and matching of sentences between documents in two languages.
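One minimal way to make the matching idea concrete (this is only an illustration of the general recipe, not the algorithm developed in the thesis): project the two views into a shared space, then solve a one-to-one assignment problem on cross-view distances. All data below are synthetic:

```python
# Minimal sketch of data-driven matching between two views: project both
# views into a shared space with CCA, then recover a one-to-one sample
# correspondence by solving an assignment problem on cross-view distances.
import numpy as np
from scipy.optimize import linear_sum_assignment
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(1)
Z = rng.normal(size=(50, 4))               # shared latent samples
view_a = Z @ rng.normal(size=(4, 10))      # view 1, e.g. human measurements
view_b = Z @ rng.normal(size=(4, 12))      # view 2, e.g. mouse measurements

cca = CCA(n_components=3)
A, B = cca.fit_transform(view_a, view_b)   # shared space learned from paired data

perm = rng.permutation(len(B))             # now pretend correspondence was lost
B_shuffled = B[perm]
cost = ((A[:, None, :] - B_shuffled[None, :, :]) ** 2).sum(-1)
rows, cols = linear_sum_assignment(cost)   # optimal one-to-one matching
print("fraction of samples correctly matched:", (perm[cols] == rows).mean())
```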

Relevance: 20.00%

Abstract:

The error introduced in depolarisation measurements due to the convergence of the incident beam has been investigated theoretically as well as experimentally for the case of colloid scattering, where the particles are not small compared to the wavelength of light. Assuming the scattering particles to be anisotropic rods, it is shown that, when the incident unpolarised light is condensed by means of a lens with a circular aperture, the observed depolarisation ratio ϱ_u is given by ϱ_u = ϱ_u0 + (5/3)θ², where ϱ_u0 is the true depolarisation for incident parallel light, and θ the semi-angle of convergence. Appropriate formulae are derived when the incident beam is polarised vertically and horizontally. Experiments performed on six typical colloids support the theoretical conclusions. Other immediate consequences of the theory are also discussed.
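For scale, a quick numerical illustration with an assumed convergence angle (the value of θ here is invented, not taken from the paper):

```latex
% With an assumed semi-angle of convergence \theta = 5^\circ \approx 0.0873 rad:
\varrho_u = \varrho_{u0} + \tfrac{5}{3}\theta^{2}
          \approx \varrho_{u0} + \tfrac{5}{3}(0.0873)^{2}
          \approx \varrho_{u0} + 0.0127 .
```

So a true ratio of, say, 0.050 would be observed as roughly 0.063, a relative error of about 25%.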

Relevance: 20.00%

Abstract:

Based on the Aristotelian criterion referred to as 'abductio', Peirce suggests a method of hypothetical inference which operates in a different way from the deductive and inductive methods. “Abduction is nothing but guessing” (Peirce, 7.219). This principle is of extreme value for the study of our understanding of mathematical self-similarity in both of its typical presentations: relative and absolute. In the first case, abduction incarnates the quantitative/qualitative relationships of a self-similar object or process; in the second case, abduction makes the statistical treatment of self-similarity understandable, 'guessing' the continuation of geometric features to infinity through the use of a systematic stereotype (for instance, the assumption that the general shape of the Sierpiński triangle continues identically into its particular shapes). The metaphor coined by Peirce, of an exact map containing the same exact map (a map of itself), is not only the most important precedent of Mandelbrot’s problem of measuring the boundary of a continuous irregular surface with a logarithmic ruler, but is also still a useful abstraction for conceptualising relative and absolute self-similarity and its mechanisms of implementation. It is useful, too, for explaining some of the most basic geometric ontologies as mental constructions: the notion of infinite convergence of points in the corners of a triangle, or the intuition for defining two parallel straight lines as two lines in a plane that 'never' intersect.
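The 'logarithmic ruler' remark has a standard quantitative anchor: for a strictly self-similar set made of N copies of itself, each scaled down by a factor s, the similarity dimension is as below (a textbook fact, added here only as a worked illustration):

```latex
% Similarity dimension of a strictly self-similar set: N copies at scale 1/s.
% For the Sierpinski triangle, N = 3 and s = 2:
D = \frac{\log N}{\log s} = \frac{\log 3}{\log 2} \approx 1.585 .
```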

Relevance: 20.00%

Abstract:

Growth and Convergence: The Case of China. Since the initiation of economic reforms in 1978, China has become one of the world’s fastest-growing economies. The rapid growth, however, has not been shared equally across the different regions of China. The prominent feature of substantial differences in incomes and growth rates across the Chinese regions has attracted the attention of many researchers. This book focuses on issues related to economic growth and convergence across the Chinese regions over the past three decades. The book has eight chapters. Apart from an introductory chapter and a concluding chapter, each of the remaining chapters deals with a particular aspect of the central issue of regional growth and convergence across China over the past three decades. The whole book is organized as follows. Chapter 1 provides an introduction to the basic issues involved in this book. Chapter 2 tests economic growth and convergence across 31 Chinese provinces during 1981-2005, based on the theoretical framework of the Solow growth model. Chapter 3 investigates the relationship between openness to foreign economic activities, such as foreign trade and foreign direct investment, and regional economic growth in the case of China during 1981-2005. Chapter 4, based on data for 31 Chinese provinces over the period 1980-2004, presents new evidence on the effects of structural shocks and structural transformation on growth and convergence among the Chinese regions. Chapter 5, by building an empirical model that takes account of different potential effects of foreign direct investment, focuses on the impacts of foreign direct investment on China’s regional economic performance and growth. Chapter 6 reconsiders the growth and convergence problem of the Chinese regions in an alternative theoretical framework with endogenous saving behavior and capital mobility across regions. Chapter 7, by building a theoretical model concerning comparative advantage and transaction efficiency, focuses on one of the potential mechanisms through which China has achieved its fast economic growth over the past few decades. Chapter 8 concludes the book by summarizing the results from the previous chapters and suggesting directions for further studies.
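For readers unfamiliar with convergence tests of the kind Chapter 2 performs, the standard cross-section beta-convergence regression implied by the Solow framework is shown below (the book's exact specification may differ):

```latex
% Standard cross-section beta-convergence test: average growth of region i
% over T years regressed on its initial income level; beta < 0 means
% poorer regions grow faster, i.e. the regions converge.
\frac{1}{T}\,\ln\!\frac{y_{i,T}}{y_{i,0}}
  = \alpha + \beta \ln y_{i,0} + \varepsilon_i ,
\qquad \beta < 0 \;\Longleftrightarrow\; \text{convergence}.
```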

Relevance: 20.00%

Abstract:

Artificial neural networks (ANNs) have shown great promise in modeling circuit parameters for computer-aided design applications. Leakage currents, which depend on process parameters, supply voltage and temperature, can be modeled accurately with ANNs. However, the complex nature of the ANN model, with the standard sigmoidal activation functions, does not allow analytical expressions for its mean and variance. We propose the use of a new activation function that allows us to derive an analytical expression for the mean and a semi-analytical expression for the variance of the ANN-based leakage model. To the best of our knowledge this is the first result in this direction. Our neural network model also includes the voltage and temperature as input parameters, thereby enabling voltage- and temperature-aware statistical leakage analysis (SLA). All existing SLA frameworks are closely tied to the exponential polynomial leakage model and hence fail to work with sophisticated ANN models. In this paper, we also set up an SLA framework that can efficiently work with these ANN models. Results show that the cumulative distribution function of the leakage current of ISCAS'85 circuits can be predicted accurately, with the errors in mean and standard deviation, compared to Monte Carlo-based simulations, being less than 1% and 2% respectively across a range of voltage and temperature values.
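The role of the activation function can be illustrated with a toy single-neuron case (the exponential form below is an assumption chosen for illustration, not necessarily the paper's activation): if the pre-activation is Gaussian, an exponential output has a closed-form mean via the lognormal identity, whereas a sigmoid output does not.

```python
# Toy illustration (not the paper's activation): for y = exp(w.x + b)
# with Gaussian inputs x, the mean is analytic via the lognormal
# identity E[exp(Z)] = exp(mu + 0.5*var) for Z ~ N(mu, var).
import numpy as np

rng = np.random.default_rng(2)
w = np.array([0.3, -0.2, 0.1])        # hypothetical weights
b = -1.0
mu_x = np.zeros(3)                    # Gaussian process-parameter statistics
cov_x = np.diag([0.04, 0.09, 0.01])

# Closed form: w.x + b ~ N(w.mu_x + b, w' Cov w)
m = w @ mu_x + b
v = w @ cov_x @ w
analytic_mean = np.exp(m + 0.5 * v)

# Monte Carlo check
x = rng.multivariate_normal(mu_x, cov_x, size=200_000)
mc_mean = np.exp(x @ w + b).mean()
print(f"analytic: {analytic_mean:.6f}  monte-carlo: {mc_mean:.6f}")
```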

Relevance: 20.00%

Abstract:

The role of convergence feedback in the stability of a coupled ocean-atmosphere system is studied using model III of Hirst (1986). It is shown that the unstable coupled mode found by Hirst is greatly modified by the convergence feedback. If the convergence feedback strength exceeds a critical value, several new unstable intraseasonal modes are also introduced. These modes have very weak dependence on the wave number. These results may explain the behaviour of some coupled models and, to some extent, provide a mechanism for the observed aperiodicity of El Niño and Southern Oscillation (ENSO) events.
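The notion of a critical feedback strength can be sketched with a deliberately schematic two-variable linear system. This toy is not Hirst's model III; the matrix and numbers are invented purely to show how a growing feedback term pushes an eigenvalue across the stability boundary:

```python
# Schematic toy (not Hirst's model III): largest real part of the
# eigenvalues of dX/dt = M(c) X as the feedback strength c grows;
# instability appears once max Re(lambda) crosses zero.
import numpy as np

def growth_rate(c):
    # Hypothetical coupled two-component Jacobian with feedback c.
    M = np.array([[-0.5, 1.0],
                  [c,   -0.3]])
    return np.linalg.eigvals(M).real.max()

for c in [0.0, 0.1, 0.15, 0.2, 0.3]:
    print(f"feedback c = {c:.2f}  max growth rate = {growth_rate(c):+.3f}")
```

In this toy the largest growth rate crosses zero at c = 0.15, the analogue of the critical feedback strength.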

Relevance: 20.00%

Abstract:

Modern sample surveys started to spread after statisticians at the U.S. Bureau of the Census in the 1940s had developed a sampling design for the Current Population Survey (CPS). A significant factor was also that digital computers became available to statisticians. In the beginning of the 1950s, the theory was documented in textbooks on survey sampling. This thesis is about the development of statistical inference for sample surveys. The idea of statistical inference was first enunciated by a French scientist, P. S. Laplace. In 1781, he published a plan for a partial investigation in which he determined the sample size needed to reach the desired accuracy in estimation. The plan was based on Laplace's Principle of Inverse Probability and on his derivation of the Central Limit Theorem. These were published in a memoir in 1774, which is one of the origins of statistical inference. Laplace's inference model was based on Bernoulli trials and binomial probabilities. He assumed that populations were changing constantly; this was depicted by assuming a priori distributions for the parameters. Laplace's inference model dominated statistical thinking for a century. Sample selection in Laplace's investigations was purposive. At the 1894 meeting of the International Statistical Institute, the Norwegian Anders Kiaer presented the idea of the Representative Method for drawing samples. Its idea was that the sample would be a miniature of the population, and it is still prevailing. The virtues of random sampling were known, but practical problems of sample selection and data collection hindered its use. Arthur Bowley realized the potential of Kiaer's method and, in the beginning of the 20th century, carried out several surveys in the UK. He also developed the theory of statistical inference for finite populations; it was based on Laplace's inference model. R. A. Fisher's contributions in the 1920s constitute a watershed in statistical science: he revolutionized the theory of statistics. In addition, he introduced a new statistical inference model which is still the prevailing paradigm. The essential idea is to draw samples repeatedly from the same population, together with the assumption that the population parameters are constants. Fisher's theory did not include a priori probabilities. Jerzy Neyman adopted Fisher's inference model and applied it to finite populations, with the difference that Neyman's inference model does not include any assumptions about the distributions of the study variables. Applying Fisher's fiducial argument, he developed the theory of confidence intervals. Neyman's last contribution to survey sampling presented a theory for double sampling. This gave statisticians at the U.S. Census Bureau the central idea for developing the complex survey design for the CPS. An important criterion was to have a method whose data-collection costs were acceptable and which provided approximately equal interviewer workloads, besides sufficient accuracy in estimation.

Relevance: 20.00%

Abstract:

A systematic structure analysis of the correlation functions of statistical quantum optics is carried out. From a suitably defined auxiliary two-point function we are able to identify the excited modes in the wave field. The relative simplicity of the higher-order correlation functions emerges as a byproduct, and the conditions under which these are made pure are derived. These results depend in a crucial manner on the notion of coherence indices and of unimodular coherence indices. A new class of approximate expressions for the density operator of a statistical wave field is worked out based on discrete characteristic sets. These are even more economical than the diagonal coherent state representations. An appreciation of the subtleties of quantum theory is obtained. Certain implications for the physics of light beams are cited.
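For reference, the diagonal coherent-state representation mentioned above is, in its standard single-mode form (a textbook expression, stated here only for context):

```latex
% Glauber-Sudarshan diagonal coherent-state (P) representation of the
% density operator of a single mode, with coherent states |alpha>:
\hat{\rho} = \int P(\alpha)\,\lvert\alpha\rangle\langle\alpha\rvert\; d^{2}\alpha .
```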

Relevance: 20.00%

Abstract:

The absorption produced by the audience in concert halls is considered a random variable. Beranek's proposal [L. L. Beranek, Music, Acoustics and Architecture (Wiley, New York, 1962), p. 543] that audience absorption is proportional to the area the audience occupies and not to its number is subjected to a statistical hypothesis test. A two-variable linear regression model of the absorption, with audience area and residual area as regressor variables, is postulated for concert halls without added absorptive materials. Since Beranek's contention amounts to the statement that audience absorption is independent of the seating density, the test of the hypothesis lies in categorizing halls by seating density and examining for significant differences among the slopes of the regression planes of the different categories. Such a test shows that Beranek's hypothesis can be accepted. It is also shown that the audience area is a better predictor of the absorption than the audience number. The absorption coefficients and their 95% confidence limits are given for the audience and residual areas. A critique of the regression model is presented.
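The postulated model is a plane through the origin whose slopes are the two absorption coefficients; a minimal least-squares sketch follows. The hall areas and absorptions below are invented for illustration, not Beranek's or the paper's data:

```python
# Hedged sketch of the postulated regression: total absorption A modeled
# as A = alpha_a * S_a + alpha_r * S_r, so the fitted slopes are the
# audience and residual absorption coefficients. Data are made up.
import numpy as np

S_a = np.array([ 800., 1100., 1500., 1900., 2400.])  # audience area, m^2
S_r = np.array([1200., 1500., 2100., 2600., 3200.])  # residual area, m^2
A   = np.array([ 900., 1210., 1700., 2130., 2700.])  # total absorption

X = np.column_stack([S_a, S_r])
coef, residuals, rank, sv = np.linalg.lstsq(X, A, rcond=None)
alpha_audience, alpha_residual = coef
print(f"audience absorption coefficient ~ {alpha_audience:.2f}")
print(f"residual absorption coefficient ~ {alpha_residual:.2f}")
```

The paper's actual test then compares such fitted slopes across seating-density categories for significant differences.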

Relevance: 20.00%

Abstract:

This study investigates the process of producing interactivity in a converged media environment. The study asks whether more media convergence equals more interactivity. The research object is approached through semi-structured interviews with prominent decision makers within the Finnish media. The main focus of the study is on the three big ones of the traditional media, radio, television and the printing press, and their ability to adapt to the changing environment. The study develops theoretical models for the analysis of interactive features and convergence. Case studies are formed from the interview data and evaluated against the models. As a result the cases are plotted and compared on a four-fold table. The cases are Radio Rock, NRJ, Big Brother, Television Chat, Olivia and Sanoma News. It is found that the theoretical models can accurately forecast the results of the case studies. The models are also able to distinguish different aspects of both interactivity and convergence, so that a case which at first glance seems not to be very interactive is in the end found to receive the second-highest scores in the analysis. The highest scores are received by Big Brother and Sanoma News. Through the theory and the analysis of the research data it is found that the concepts of interactivity and convergence are intimately intertwined and in many cases very hard to separate from each other. Hence the answer to the main question of this study is yes: convergence does promote interactivity and audience participation. The main theoretical background for the analysis of interactivity follows the work of Carrie Heeter, Spiro Kiousis and Sally McMillan. Heeter's six-dimensional definition of interactivity is used as the basis for operationalizing interactivity. The actor-network theory is used as the main theoretical framework to analyze convergence. The definition and operationalization of the actor-network theory into a model of convergence follows the work of Michel Callon, Bruno Latour and especially John Law and Felix Stalder.