862 results for stochastic regression, consistency


Relevance:

20.00%

Publisher:

Abstract:

Analytical curves are normally obtained from discrete data by least squares regression. The least squares regression of data involving significant error in both x and y values should not be implemented by ordinary least squares (OLS). In this work, the use of orthogonal distance regression (ODR) is discussed as an alternative approach in order to take into account the error in the x variable. Four examples are presented to illustrate deviation between the results from both regression methods. The examples studied show that, in some situations, ODR coefficients must substitute for those of OLS, and, in other situations, the difference is not significant.
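As a rough illustration of the difference discussed above, a straight line can be fitted both by OLS (vertical residuals only) and by total least squares, the simplest orthogonal-distance fit, using only NumPy's SVD. The data and function names below are our own synthetic example, not taken from the paper:

```python
import numpy as np

def ols_slope(x, y):
    """Ordinary least squares: minimizes vertical residuals only."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return (xc @ yc) / (xc @ xc)

def tls_slope(x, y):
    """Orthogonal (total) least squares: minimizes perpendicular
    distances, accounting for error in both x and y."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    A = np.column_stack([x - x.mean(), y - y.mean()])
    # The right singular vector of the smallest singular value is the
    # normal of the best-fit line through the centroid.
    _, _, Vt = np.linalg.svd(A)
    a, b = Vt[-1]
    return -a / b

# Synthetic calibration-style data with error in both x and y
rng = np.random.default_rng(0)
x_true = np.linspace(0.0, 10.0, 50)
y_true = 2.0 * x_true + 1.0
x = x_true + rng.normal(0, 0.3, 50)
y = y_true + rng.normal(0, 0.3, 50)
```

When the x-error is small relative to the spread of x, the two slopes nearly coincide; as the x-error grows, OLS attenuates the slope while the orthogonal fit does not, which is the deviation the four examples in the paper illustrate.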

Relevance:

20.00%

Publisher:

Abstract:

The Practical Stochastic Model is a simple and robust method for describing coupled chemical reactions. The connection between this stochastic method and a deterministic method was first established to understand how the parameters and variables describing concentration in the two methods are related. Two main concepts were needed to make this connection: the filling of compartments, or dilutions, and the rate of reaction enhancement. The parameters, variables, and time of the stochastic method were scaled with the size of the compartment and compared with the deterministic method, which was employed as an initial reference to achieve a consistent stochastic result. Finally, an independent, robust stochastic method was obtained, which could then be compared with the Stochastic Simulation Algorithm developed by Gillespie (1977). The Practical Stochastic Model produced the absolute values essential for describing non-linear chemical reactions with a simple structure, and allowed a correct description of the chemical kinetics.
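For reference, a minimal version of Gillespie's Stochastic Simulation Algorithm for a single irreversible reaction A → B can be sketched as below. This is the textbook SSA used as the comparison point, not the Practical Stochastic Model itself, and the rate constant and copy numbers are invented:

```python
import numpy as np

def gillespie_decay(n0, k, t_end, rng):
    """Minimal Gillespie SSA for the irreversible reaction A -> B
    with rate constant k. Returns event times and copy numbers of A."""
    t, n = 0.0, n0
    times, counts = [t], [n]
    while n > 0 and t < t_end:
        a = k * n                      # total propensity
        t += rng.exponential(1.0 / a)  # exponential waiting time
        n -= 1                         # one A molecule reacts
        times.append(t)
        counts.append(n)
    return np.array(times), np.array(counts)

rng = np.random.default_rng(1)
times, counts = gillespie_decay(1000, 0.5, 10.0, rng)
# For large copy numbers the stochastic path tracks the
# deterministic solution n0 * exp(-k * t).
```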

Relevance:

20.00%

Publisher:

Abstract:

This thesis consists of three main theoretical themes: quality of data, success of information systems, and metadata in data warehousing. Loosely defined, metadata is descriptive data about data; in this thesis, master data means reference data about customers, products, and so on. The objective of the thesis is to contribute to the implementation of a metadata management solution for an industrial enterprise. The metadata system incorporates a repository, integration, delivery, and access tools, as well as semantic rules and procedures for master data maintenance. It aims to improve the maintenance processes and the quality of hierarchical master data in the case company's information systems. That should benefit the whole organization through improved information quality, especially cross-system data consistency, and through more efficient and effective data management processes. As the result of this thesis, the requirements for the metadata management solution in the case were compiled, and the success of the new information system and of the implementation project was evaluated.
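One of the quality targets named above, cross-system data consistency, can be illustrated with a toy check that flags master-data keys on which two systems disagree. All names and records here are invented for illustration, not taken from the case company:

```python
def cross_system_consistency(systems):
    """Toy master-data consistency check: compare the attribute value
    each system stores under the same master-data key and report keys
    on which the systems disagree. A missing key (None) is not
    counted as a conflict."""
    all_keys = set().union(*(s.keys() for s in systems.values()))
    conflicts = {}
    for key in sorted(all_keys):
        values = {name: s.get(key) for name, s in systems.items()}
        if len({v for v in values.values() if v is not None}) > 1:
            conflicts[key] = values
    return conflicts

# Hypothetical customer master data held in two systems
erp = {"C001": "Acme Oy", "C002": "Beta Ltd"}
crm = {"C001": "Acme Oy", "C002": "Beta Limited", "C003": "Gamma AB"}
conflicts = cross_system_consistency({"erp": erp, "crm": crm})
```

A real metadata repository would also track ownership, update timestamps, and the semantic rules deciding which system wins a conflict; this sketch only shows the detection step.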

Relevance:

20.00%

Publisher:

Abstract:

Consistency and change in Finnish broadcasting policy: the implementation of digital television and a comparison with Canada. The dissertation examines how the Finnish television system changed at the end of the 1990s from a national institution into a dualistic system characterized by strong market orientation. The aim of the dissertation is to understand how such a rapid change could occur and to analyze the institutional factors behind the development. At the theoretical level, the thesis of a close connection between state political institutions and the institutions of broadcasting is discussed. The dissertation consists of two case studies. The first focuses on the early years of the digitalization of Finnish television, a process that started with strong industrial-nationalist motivations. The analysis, based on public documents, extends to autumn 2001, when digital television broadcasts began and the government bill on the new Communications Market Act was submitted to parliament. These policy processes are analyzed as a "marketization" of the traditional governing principles and ideas of Finnish broadcasting. A comparison between the national broadcasting policies of Finland and Canada makes it possible to connect the conclusions to international developments. The comparison shows how communications policy in the two countries has come to converge even though the countries' television systems and their governance arrangements are very different. The example of Canada shows that the particular technology is not what matters, but rather the commercial interests behind it, which policymakers readily conceal in nationalist rhetoric. The study shows that it is important to consider the weight that policy gives to the two sides of broadcasting: transmission technology and broadcasting as a particular cultural form.
The nation state's room for maneuver in this field shrinks if the goal is to succeed in the competition of the new international economy. According to the neoliberal principles that the political system at large has adopted, this is desirable; but it was also entirely domestic institutional traditions and practices, followed in the Finnish digitalization process, that furthered the development which led to the collapse of nearly all of the original national aims.

Relevance:

20.00%

Publisher:

Abstract:

Any inconsistent theory whose underlying logic is classical encompasses all the sentences of its own language. As it denies everything it asserts, it is useless for explaining or predicting anything. Nevertheless, paraconsistent logic has shown that it is possible to live with contradictions and still avoid the collapse of the theory. The main point of this paper is to show that even if it is formally possible to isolate the contradictions and to live with them, this cohabitation is neither desired by working scientists nor desirable for the progress of science. Several cases from the recent history of physics and cosmology are analyzed.

Relevance:

20.00%

Publisher:

Abstract:

It is generally accepted that the development of the modern sciences is rooted in experiment. Yet for a long time, experimentation occupied no prominent role in either the philosophy or the history of science. With the 'practical turn' in studying the sciences and their history, this has begun to change. This paper is concerned with systems and cultures of experimentation and the consistencies that are generated within such systems and cultures. The first part of the paper exposes the forms of historical and structural coherence that characterize the experimental exploration of epistemic objects. In the second part, a particular experimental culture in the life sciences is briefly described as an example. A survey is given of what it means, and what it takes, to analyze biological functions in the test tube.

Relevance:

20.00%

Publisher:

Abstract:

In any decision making under uncertainty, the goal is usually to minimize the expected cost, which is done by optimization. For simple models, the optimization can easily be done using deterministic methods. However, many models in practice contain complex and varying parameters that cannot easily be taken into account using the usual deterministic methods of optimization. Thus, it is important to look for other methods that can give insight into such models. The Markov chain Monte Carlo (MCMC) method is one practical method that can be used for the optimization of stochastic models under uncertainty. It is based on simulation and provides a general methodology that can be applied to nonlinear and non-Gaussian state models. MCMC is important for practical applications because it is a unified estimation procedure that estimates parameters and state variables simultaneously, computing their distribution given the measured data; it is also fast in terms of computing time when compared to other optimization methods. This thesis discusses the use of MCMC methods for the optimization of stochastic models under uncertainty. The thesis begins with a short discussion of Bayesian inference, MCMC, and stochastic optimization methods. An example is then given of how MCMC can be applied to maximize production at minimum cost in a chemical reaction process. It is observed that this method performs well in optimizing the given cost function, with a very high certainty.
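The core of such an MCMC-style optimization can be sketched with a random-walk Metropolis chain targeting exp(-cost/T): the chain concentrates in low-cost regions, so its best visited point approximates the minimizer even for noisy or non-smooth costs. The cost function and all tuning values below are invented for illustration, not taken from the thesis:

```python
import numpy as np

def metropolis_minimize(cost, x0, n_iter, step, temp, rng):
    """Random-walk Metropolis targeting exp(-cost(x)/temp).
    Returns the best point visited and its cost."""
    x, c = np.array(x0, float), cost(x0)
    best_x, best_c = x.copy(), c
    for _ in range(n_iter):
        prop = x + rng.normal(0.0, step, x.shape)
        c_prop = cost(prop)
        # Accept downhill moves always; uphill moves with
        # probability exp(-(c_prop - c) / temp).
        if rng.random() < np.exp(min(0.0, -(c_prop - c) / temp)):
            x, c = prop, c_prop
            if c < best_c:
                best_x, best_c = x.copy(), c
    return best_x, best_c

# Hypothetical cost surface with its optimum at (2, -1)
cost = lambda z: (z[0] - 2.0) ** 2 + (z[1] + 1.0) ** 2
rng = np.random.default_rng(2)
best_x, best_c = metropolis_minimize(cost, [0.0, 0.0], 5000, 0.3, 0.05, rng)
```

Unlike a point estimate from a deterministic optimizer, the chain's samples also describe the distribution around the optimum, which is what makes the approach attractive under uncertainty.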

Relevance:

20.00%

Publisher:

Abstract:

Approximately a quarter of the electrical power consumption in the pulp and paper industry is used in pumping systems, so improving pumping system efficiency is a considerable way to reduce the energy consumption of various processes. Pumping of wood pulp at different consistencies is common in the pulp and paper industry. Earlier, centrifugal pumps were used to pump pulp only at low consistencies, but the development of MC technology has made it possible to pump medium consistency pulp as well. Pulp is a non-Newtonian fluid whose flow characteristics differ significantly from those of water. This thesis examines the energy efficiency of pumping medium consistency (MC) pulp with a centrifugal pump. The factors affecting the pumping of MC pulp are presented, and the energy efficiency of pumping in practice is examined through a case study. Data obtained from the case study are used to evaluate the effects of pump rotational speed and pulp consistency on energy efficiency. Additionally, the losses caused by the control valve and the validity of the affinity laws in pulp pumping are evaluated. The results of this study can be used to demonstrate the energy consumption of MC pumping processes and to find ways to improve their energy efficiency.
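The affinity laws whose validity the case study evaluates are easy to state in code: for a speed change, flow scales with the speed ratio, head with its square, and shaft power with its cube. The duty-point numbers below are hypothetical, and for MC pulp the laws hold only approximately, which is exactly what the study examines:

```python
def affinity_scale(q, h, p, n_old, n_new):
    """Classic centrifugal pump affinity laws for a rotational speed
    change n_old -> n_new: flow ~ r, head ~ r^2, power ~ r^3."""
    r = n_new / n_old
    return q * r, h * r ** 2, p * r ** 3

# Hypothetical duty point: 500 l/s, 60 m head, 400 kW at 1450 rpm
q2, h2, p2 = affinity_scale(500.0, 60.0, 400.0, 1450.0, 1160.0)
# Running 20 % slower cuts the power demand roughly in half
# (0.8 ** 3 = 0.512), which is why speed control beats valve
# throttling for energy efficiency.
```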

Relevance:

20.00%

Publisher:

Abstract:

The increasing demand of consumer markets for the welfare of birds in poultry houses has motivated much scientific research into monitoring and classifying welfare according to the production environment. Given the complexity of the interaction between the birds and the environment of the aviary, the correct interpretation of behavior becomes an important way to estimate the welfare of these birds. This study obtained multiple logistic regression models capable of estimating the welfare of broiler breeders in relation to the environment of the aviaries and the behaviors expressed by the birds. In the experiment, several behaviors expressed by breeders housed in a climatic chamber under controlled temperatures and three different ammonia concentrations in the air, monitored daily, were observed. From the analysis of the data, two logistic regression models were obtained: the first uses the measured value of the ammonia concentration, and the second uses a binary classification of the ammonia concentration assigned by a person through olfactory perception. The analysis showed that both models classified the broiler breeders' welfare successfully.
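A logistic regression of the kind used in the study can be sketched with plain gradient descent in NumPy. The data below are synthetic stand-ins for the climatic-chamber measurements, and the 20 ppm threshold is invented purely for the example:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, n_iter=2000):
    """Gradient-descent logistic regression (a sketch of the kind of
    model in the study; features should be standardized first)."""
    X = np.column_stack([np.ones(len(X)), X])  # intercept column
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ w))       # predicted probability
        w -= lr * X.T @ (p - y) / len(y)       # gradient step
    return w

def predict(w, X):
    X = np.column_stack([np.ones(len(X)), X])
    return (1.0 / (1.0 + np.exp(-X @ w)) > 0.5).astype(int)

# Synthetic example: welfare degrades as ammonia (ppm) rises,
# with a noisy threshold around an invented 20 ppm
rng = np.random.default_rng(3)
ammonia = rng.uniform(0, 40, 200)
poor_welfare = (ammonia + rng.normal(0, 5, 200) > 20).astype(int)
z = (ammonia - ammonia.mean()) / ammonia.std()  # standardize
w = fit_logistic(z[:, None], poor_welfare)
acc = (predict(w, z[:, None]) == poor_welfare).mean()
```

The study's second model simply replaces the measured concentration with a 0/1 olfactory judgment; the fitting code is identical, only the feature changes.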

Relevance:

20.00%

Publisher:

Abstract:

The broiler rectal temperature (t rectal) is one of the most important physiological responses for classifying the thermal comfort of the animal. Therefore, the aim of this study was to adjust regression models to predict the rectal temperature of broiler chickens under different thermal conditions based on age (A) and either a meteorological variable (air temperature, t air), a thermal comfort index (temperature and humidity index, THI, or black globe humidity index, BGHI), or a physical quantity, enthalpy (H). In addition, by inverting these models against the expected t rectal intervals for each age, the comfort limits of t air, THI, BGHI, and H for the chicks in the heating phase were determined, aiding the validation of the equations and providing preliminary limits for H. The experimental data used to adjust the mathematical models were collected on two commercial poultry farms, with Cobb chicks from 1 to 14 days of age. By applying the four adjusted models it was possible to predict t rectal and to determine the lower and upper comfort thresholds of broilers satisfactorily, as well as to invert the models to predict the environmental H for the chicks' first 14 days of life.
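The model-inversion step can be illustrated with a quadratic regression fitted by `np.polyfit` and inverted with `np.roots` to recover the air temperatures bounding a target rectal temperature. The coefficients and the 28 °C comfort optimum below are invented for illustration, not taken from the study:

```python
import numpy as np

# Hypothetical relation: rectal temperature rises as air temperature
# moves away from an invented 28 degC comfort optimum
rng = np.random.default_rng(4)
t_air = np.linspace(20.0, 36.0, 30)
t_rectal = 39.0 + 0.004 * (t_air - 28.0) ** 2 + rng.normal(0, 0.01, 30)

coef = np.polyfit(t_air, t_rectal, 2)  # quadratic regression model

def invert_for_t_air(coef, target):
    """Invert the fitted model: which air temperatures yield a given
    rectal temperature? This mirrors deriving comfort limits of
    t air from an expected t rectal interval."""
    c = np.array(coef, float)
    c[-1] -= target                    # solve p(t_air) = target
    roots = np.roots(c)
    return np.sort(roots[np.abs(roots.imag) < 1e-9].real)

lo, hi = invert_for_t_air(coef, 39.1)  # comfort band for 39.1 degC
```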

Relevance:

20.00%

Publisher:

Abstract:

Singular Value Decomposition (SVD), Principal Component Analysis (PCA), and Multiple Linear Regression (MLR) are some of the mathematical preliminaries discussed prior to explaining the PLS and PCR models. Both PLS and PCR are applied to real spectral data, and their differences and similarities are discussed in this thesis. The challenge lies in establishing the optimum number of components to include in either model, but this is overcome using the various diagnostic tools suggested in the thesis. Correspondence analysis (CA) and PLS were also applied to ecological data; the idea of CA was to correlate the macrophyte species and lakes. The differences between the PLS model for ecological data and the PLS model for spectral data are noted and explained in this thesis.
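A principal component regression of the kind compared in the thesis can be sketched in a few lines of NumPy: project centered X onto its top-k principal components, then regress y on the scores. The "spectra" below are synthetic, generated from two latent factors, and choosing k is exactly the model-selection step the thesis's diagnostic tools address:

```python
import numpy as np

def pcr_fit(X, y, k):
    """Principal component regression: SVD of centered X, ordinary
    regression of centered y on the top-k component scores, then
    mapping the coefficients back to the original variables."""
    Xm, ym = X.mean(axis=0), y.mean()
    U, s, Vt = np.linalg.svd(X - Xm, full_matrices=False)
    T = (X - Xm) @ Vt[:k].T                   # scores on top-k PCs
    b = np.linalg.lstsq(T, y - ym, rcond=None)[0]
    beta = Vt[:k].T @ b                       # back to X-space
    return beta, Xm, ym

def pcr_predict(model, X):
    beta, Xm, ym = model
    return (X - Xm) @ beta + ym

# Synthetic "spectra": 100 samples, 50 collinear channels driven by
# two latent factors (a stand-in for the real spectral data)
rng = np.random.default_rng(5)
latent = rng.normal(size=(100, 2))
loadings = rng.normal(size=(2, 50))
X = latent @ loadings + 0.01 * rng.normal(size=(100, 50))
y = latent @ np.array([1.0, -2.0]) + 0.01 * rng.normal(size=100)

model = pcr_fit(X, y, k=2)
rmse = np.sqrt(np.mean((y - pcr_predict(model, X)) ** 2))
```

PLS differs in that its components are chosen to explain y as well as X, so it typically needs the same number of components or fewer; the fitting skeleton is otherwise similar.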

Relevance:

20.00%

Publisher:

Abstract:

Quite often, the construction of a pulp mill involves establishing the size of the tanks that will accommodate the material from the various processes, in which case estimating the right tank size a priori is vital. Hence, simulating the whole production process is worthwhile, and there is a need to develop mathematical models that mimic the behavior of the output from the various production units of the pulp mill to serve as simulators. Markov chain models, autoregressive moving average (ARMA) models, mean reversion models with ensemble interaction, and Markov regime switching models are proposed for that purpose.
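One of the proposed building blocks, a mean-reversion model, can be sketched as a discretized Ornstein-Uhlenbeck process standing in for the output of one production unit. All parameter values below are invented, not estimated from mill data:

```python
import numpy as np

def simulate_mean_reversion(x0, mu, kappa, sigma, n, dt, rng):
    """Euler-discretized mean-reverting process
    dx = kappa * (mu - x) dt + sigma dW: the process is pulled back
    toward the long-run level mu at rate kappa."""
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = (x[i - 1]
                + kappa * (mu - x[i - 1]) * dt
                + sigma * np.sqrt(dt) * rng.normal())
    return x

rng = np.random.default_rng(6)
# Hypothetical unit output: starts at 80, reverts to a mean of 100
flow = simulate_mean_reversion(80.0, 100.0, 0.5, 2.0, 5000, 0.1, rng)
# In the long run the level hovers around mu with stationary
# variance sigma**2 / (2 * kappa); feeding such paths into a tank
# balance gives the distribution of levels the tank must absorb.
```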

Relevance:

20.00%

Publisher:

Abstract:

Stochastic approximation methods for stochastic optimization are considered. The main methods of stochastic approximation are reviewed: the stochastic quasi-gradient (SQG) algorithm, the Kiefer-Wolfowitz algorithm with adaptive rules, and the simultaneous perturbation stochastic approximation (SPSA) algorithm. A model and solution of the retailer's profit optimization problem are suggested, and an application of the SQG algorithm is considered for optimization problems with objective functions given in the form of an ordinary differential equation.
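SPSA, one of the methods reviewed, estimates the full gradient from only two (possibly noisy) loss evaluations per iteration, regardless of dimension. A minimal sketch on an invented quadratic cost, standing in for the retailer's profit problem, might look like this (gain-decay exponents are the standard SPSA defaults):

```python
import numpy as np

def spsa_minimize(loss, x0, n_iter, a=0.1, c=0.1, rng=None):
    """Simultaneous perturbation stochastic approximation: at each
    step, perturb all coordinates at once along a random +/-1
    direction and form a two-evaluation gradient estimate."""
    rng = rng or np.random.default_rng()
    x = np.array(x0, float)
    for k in range(1, n_iter + 1):
        ak = a / k ** 0.602                 # step-size gain decay
        ck = c / k ** 0.101                 # perturbation decay
        delta = rng.choice([-1.0, 1.0], size=x.shape)
        g = (loss(x + ck * delta) - loss(x - ck * delta)) / (2 * ck * delta)
        x -= ak * g
    return x

# Invented cost with its minimum at (1, 3); a profit maximization
# would simply minimize the negative profit
cost = lambda z: (z[0] - 1.0) ** 2 + 2.0 * (z[1] - 3.0) ** 2
rng = np.random.default_rng(7)
x_opt = spsa_minimize(cost, [5.0, -5.0], 5000, rng=rng)
```

A Kiefer-Wolfowitz scheme would instead perturb one coordinate at a time, needing 2n evaluations per step in n dimensions; SPSA's two-evaluation trick is its main practical advantage.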

Relevance:

20.00%

Publisher:

Abstract:

A stochastic differential equation (SDE) is a differential equation in which some of the terms, and hence its solution, are stochastic processes. SDEs play a central role in modeling physical systems in finance, biology, and engineering, among other fields. In the modeling process, the computation of the trajectories (sample paths) of solutions to SDEs is very important. However, the exact solution to an SDE is generally difficult to obtain, because realizations of Brownian motion are nowhere differentiable. Approximation methods for solutions of SDEs therefore exist; the solutions are continuous stochastic processes representing diffusive dynamics, a common modeling assumption for financial, biological, physical, and environmental systems. This Master's thesis is an introduction to, and survey of, numerical solution methods for stochastic differential equations. Standard numerical methods, local linearization methods, and filtering methods are described, and the root mean square errors of the methods are computed, from which a better numerical scheme is proposed. Stochastic differential equations can also be formulated from a given ordinary differential equation. In this thesis, we describe two kinds of formulation, parametric and non-parametric, based on an epidemiological SEIR model. These methods tend to increase the number of parameters in the constructed SDEs and hence require more data. We compare the two techniques numerically.
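The simplest of the standard schemes surveyed, the Euler-Maruyama method, can be sketched and sanity-checked against geometric Brownian motion, whose exact moments are known. The drift, diffusion, and grid below are invented for the example:

```python
import numpy as np

def euler_maruyama(f, g, x0, t_grid, rng):
    """Euler-Maruyama scheme for dX = f(X, t) dt + g(X, t) dW:
    each step adds the drift times dt plus the diffusion times a
    Gaussian increment of variance dt."""
    x = np.empty(len(t_grid))
    x[0] = x0
    for i in range(1, len(t_grid)):
        dt = t_grid[i] - t_grid[i - 1]
        dw = rng.normal(0.0, np.sqrt(dt))
        x[i] = (x[i - 1]
                + f(x[i - 1], t_grid[i - 1]) * dt
                + g(x[i - 1], t_grid[i - 1]) * dw)
    return x

# Geometric Brownian motion dX = mu*X dt + sigma*X dW, whose exact
# mean E[X_T] = x0 * exp(mu * T) lets us check the scheme
mu, sigma = 0.05, 0.2
t = np.linspace(0.0, 1.0, 1001)
rng = np.random.default_rng(8)
paths = np.array([euler_maruyama(lambda x, s: mu * x,
                                 lambda x, s: sigma * x,
                                 1.0, t, rng)
                  for _ in range(300)])
mean_end = paths[:, -1].mean()
```

Computing the root mean square error against the known exact GBM solution on the same Brownian increments, and repeating for finer grids, is exactly how the thesis compares schemes.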

Relevance:

20.00%

Publisher:

Abstract:

Software plays an important role in our society and economy. Software development is an intricate process comprising many different tasks: gathering requirements, designing new solutions that fulfill those requirements, and implementing the designs in a programming language as a working system. As a consequence, the development of high-quality software is a core problem in software engineering. This thesis focuses on the validation of software designs. The analysis of designs is of great importance, since errors originating in designs may appear in the final system, and it is considered economical to rectify problems as early in the software development process as possible. Practitioners often create and visualize designs using modeling languages, one of the more popular being the Unified Modeling Language (UML). The analysis of designs can be done manually, but for large systems the need arises for mechanisms that analyze the designs automatically. In this thesis, we propose an automatic approach to analyzing UML-based designs using logic reasoners. The approach first translates UML-based designs into a language understandable by reasoners, in the form of logic facts, and then shows how to use the logic reasoners to infer the logical consequences of these facts. We have implemented the proposed translations as a tool that can be used with any standard-compliant UML modeling tool. Moreover, we validate the proposed approach by automatically checking hundreds of UML-based designs, consisting of thousands of model elements, available in an online model repository. The proposed approach is limited in scope, but is fully automatic and does not require any expertise in logic languages from the user. We exemplify the approach with two applications: the validation of domain-specific languages and the validation of web service interfaces.
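The flavor of such a translation can be conveyed with a toy example: direct UML generalizations become logic-style facts, a reasoner-like transitive closure is computed over them, and any class that ends up a subclass of itself flags an inconsistent design. The closure code and class names are our own illustration, not the thesis's actual translation:

```python
def inheritance_facts(generalizations):
    """Translate UML generalizations into subclass facts, close them
    under transitivity as a reasoner would, and report classes that
    become subclasses of themselves (a generalization cycle)."""
    facts = set(generalizations)            # direct subclass facts
    changed = True
    while changed:                          # naive transitive closure
        changed = False
        for (a, b) in list(facts):
            for (c, d) in list(facts):
                if b == c and (a, d) not in facts:
                    facts.add((a, d))
                    changed = True
    inconsistent = sorted({a for (a, b) in facts if a == b})
    return facts, inconsistent

# Hypothetical design containing a generalization cycle
facts, bad = inheritance_facts([("Car", "Vehicle"),
                                ("Vehicle", "Asset"),
                                ("Asset", "Car")])
```

A real logic reasoner handles far more than generalization (associations, multiplicities, OCL-like constraints), but the pattern is the same: facts in, inferred consequences out, with unsatisfiable classes signaling design errors.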