887 results for Matemática aplicada


Relevance:

60.00%

Publisher:

Abstract:

In this work, we present a text on numerical sets that uses human social needs as a tool for constructing new numbers. This material is intended to provide a text that reconciles the correct teaching of mathematics with the clarity needed for good learning.

Relevance:

60.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

60.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

60.00%

Publisher:

Abstract:

On-line process control by attributes consists of inspecting a single item out of every m produced. If the examined item is conforming, production continues; otherwise, the process is stopped for adjustment. In many practical situations, however, the interest lies in monitoring the number of non-conformities among the examined items. In this case, if the number of non-conformities exceeds an upper control limit, the process must be stopped and adjusted. The contribution of this paper is to propose a control system based on the number of non-conformities in the inspected item. Employing properties of an ergodic Markov chain, an expression for the expected cost per item of the control system is obtained and minimized with respect to two parameters: the sampling interval and the upper control limit on the number of non-conformities in the examined item. Numerical examples illustrate the proposed procedure.
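The trade-off behind the two design parameters can be sketched numerically. The fragment below is a minimal illustration, not the paper's model: it assumes a Poisson law for the non-conformity count and hypothetical costs `c_insp` and `c_adj`, and prices only inspection and adjustment (the paper's full cost model also accounts for undetected shifts).

```python
import math

def stop_probability(lam, ucl):
    """P(X > ucl) for X ~ Poisson(lam): the chance that an inspected item
    shows more non-conformities than the upper control limit allows."""
    cdf = sum(math.exp(-lam) * lam**k / math.factorial(k) for k in range(ucl + 1))
    return 1.0 - cdf

def expected_cost_per_item(lam, m, ucl, c_insp=1.0, c_adj=50.0):
    """Simplified expected cost per produced item: one inspection every m items,
    plus an adjustment cost whenever the count exceeds the limit.
    All cost values here are hypothetical."""
    return (c_insp + c_adj * stop_probability(lam, ucl)) / m

cost = expected_cost_per_item(lam=0.2, m=10, ucl=2)
```

In the paper the analogous expression is derived from an ergodic Markov chain and then minimized jointly over m and the control limit.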

Relevance:

60.00%

Publisher:

Abstract:

This work studies the asymptotic behaviour of the Pearson (1900) statistic, the theoretical basis of the well-known chi-squared test, also usually denoted the χ2 test. We first study the behaviour of the distribution of Pearson's chi-squared statistic for a sample {X1, X2, ..., Xn} as n → ∞ with pi = pi0 for all n. We then detail the arguments used in Billingsley (1960), which prove the convergence in distribution of a statistic, similar to Pearson's, based on a sample from a stationary, ergodic Markov chain with finite state space S.
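As a concrete instance of the statistic under study, the sketch below computes Pearson's χ2 for a fair-die hypothesis; the observed counts are made up for illustration.

```python
def pearson_chi2(observed, expected):
    """Pearson (1900) statistic: sum of (O - E)^2 / E over the cells.
    Under H0 it converges in distribution to chi-squared with k - 1 df."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# 600 rolls of a die, expected 100 per face (illustrative counts).
stat = pearson_chi2([95, 102, 98, 105, 100, 100], [100] * 6)
# stat == 0.58; compare with the chi-squared(5 df) 5% critical value, 11.07
```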

Relevance:

60.00%

Publisher:

Abstract:

In this work we study Hidden Markov Models with both finite and general state spaces. In the finite case, the forward and backward algorithms are considered and the probability of a given observed sequence is computed. Next, we use the EM algorithm to estimate the model parameters. In the general case, kernel estimators are used to build a sequence of estimators that converges in L1-norm to the density function of the observable process.
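The forward recursion for the finite-state case fits in a few lines. The two-state transition and emission matrices below are hypothetical numbers, not taken from the work.

```python
def forward(obs, pi, A, B):
    """Forward algorithm for a finite-state HMM: returns P(observed sequence).
    pi[i]: initial probs, A[i][j]: transition probs, B[i][o]: emission probs."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]           # alpha_1(i)
    for o in obs[1:]:                                          # alpha_{t+1}(j)
        alpha = [B[j][o] * sum(alpha[i] * A[i][j] for i in range(n))
                 for j in range(n)]
    return sum(alpha)

# Hypothetical two-state model with two observation symbols.
pi = [0.6, 0.4]
A = [[0.7, 0.3], [0.4, 0.6]]
B = [[0.9, 0.1], [0.2, 0.8]]
p = forward([0, 1, 0], pi, A, B)
```

The backward algorithm is the mirror-image recursion, and the EM (Baum-Welch) step re-estimates `pi`, `A`, `B` from the forward and backward variables.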

Relevance:

60.00%

Publisher:

Abstract:

In this work we studied the consistency of a class of kernel estimates of f(.) for Markov chains with general state space E ⊂ Rd. The study is divided into two parts: in the first, f(.) is a stationary density of the chain; in the second, f(x)ν(dx) is the limit distribution of a geometrically ergodic chain.
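A minimal sketch of such a kernel estimate: a Gaussian kernel applied to a sample path of an AR(1) chain, a Markov chain on R whose stationary law is N(0, 1/(1 - φ²)). The bandwidth, sample size and chain below are illustrative choices only.

```python
import math, random

def kde(x, sample, h):
    """Gaussian kernel density estimate at x: (1/(n*h)) * sum K((x - Xi)/h)."""
    gauss = lambda u: math.exp(-0.5 * u * u) / math.sqrt(2 * math.pi)
    return sum(gauss((x - xi) / h) for xi in sample) / (len(sample) * h)

# AR(1) chain X_{t+1} = 0.5 X_t + N(0, 1); its stationary density
# is N(0, 1/(1 - 0.25)), which at 0 evaluates to about 0.345.
rng = random.Random(0)
chain, x = [], 0.0
for _ in range(5000):
    x = 0.5 * x + rng.gauss(0, 1)
    chain.append(x)

est = kde(0.0, chain, h=0.3)
```

The consistency results in the work concern exactly this kind of estimator: conditions on the chain and on the bandwidth sequence under which `est` converges to the true stationary density.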

Relevance:

60.00%

Publisher:

Abstract:

Genetic Algorithms (GA) and Simulated Annealing (SA) are algorithms built to find the maximum or minimum of a function representing some characteristic of the process being modelled. Both have mechanisms to escape local optima; however, the two algorithms evolve over time in completely different ways. In its search, SA works with a single point, always generating from it a new solution that is tested and may or may not be accepted, whereas GA works with a set of points, called a population, from which it generates another population that is always accepted. What the two algorithms have in common is that the way the next point or the next population is generated obeys stochastic properties. In this work we show that the mathematical theory describing the evolution of these algorithms is the theory of Markov chains: the GA is described by a homogeneous Markov chain, while the SA is described by a non-homogeneous Markov chain. Finally, some computational examples comparing the performance of the two algorithms are presented.
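A sketch of the SA side of the comparison: because the temperature changes every step, the acceptance rule changes too, which is exactly why the underlying Markov chain is non-homogeneous. The objective function, cooling schedule and proposal scale below are illustrative choices, not those of the work.

```python
import math, random

def simulated_annealing(f, x0, steps=20000, t0=1.0, seed=1):
    """Minimize f by SA: propose a neighbour, always accept improvements,
    accept worse moves with probability exp(-delta/T).  T depends on the
    step index, so the transition law is not time-homogeneous."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    for k in range(1, steps + 1):
        t = t0 / math.log(k + 1)           # classical logarithmic cooling
        y = x + rng.gauss(0, 0.5)          # Gaussian neighbour proposal
        fy = f(y)
        if fy < fx or rng.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy
    return x, fx

# Multimodal test function: local minima, global minimum near x = -0.3.
f = lambda x: x * x + 2 * math.sin(5 * x) + 2
xbest, fbest = simulated_annealing(f, x0=4.0)
```

A GA run, in contrast, would carry a whole population forward with fixed selection/crossover/mutation operators, giving a homogeneous chain on the space of populations.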

Relevance:

60.00%

Publisher:

Abstract:

In this work, we present an application of risk theory in the following scenario: in each period of time the capital of the insurance company changes, and the outcome of a two-state Markov chain establishes whether the company pays a benefit to one of its policyholders or receives a premium c > 0 paid by someone buying a new policy. Finally, using the recursive equation for the expectation, we determine the ruin time of the company.
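The dynamics described can be sketched by simulation (the abstract computes the ruin time analytically via a recursive equation for the expectation; simulation is just the simplest way to illustrate the model). Every numerical value below — initial capital, benefit, premium, transition matrix — is hypothetical.

```python
import random

def ruin_time(u0, b, c, P, horizon=10_000, seed=0):
    """Simulate the surplus: each period a two-state Markov chain decides
    whether the insurer pays a benefit b (state 0) or receives a premium c
    (state 1).  Returns the first period the surplus goes negative, or None
    if ruin does not occur within the horizon."""
    rng = random.Random(seed)
    u, state = u0, 0
    for t in range(1, horizon + 1):
        u += -b if state == 0 else c
        if u < 0:
            return t
        state = 0 if rng.random() < P[state][0] else 1
    return None

# Hypothetical chain: benefit periods tend to cluster less than premium periods.
P = [[0.5, 0.5], [0.2, 0.8]]
t = ruin_time(u0=10.0, b=3.0, c=1.0, P=P)
```

With these numbers the stationary drift of the surplus is negative, so ruin is certain and `t` is finite with overwhelming probability.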

Relevance:

60.00%

Publisher:

Abstract:

In Percolation Theory, quantities such as the probability that a given site belongs to the infinite cluster, the average cluster size, etc. are described through power laws and critical exponents. This dissertation uses a method called Finite Size Scaling to provide estimates of those exponents. The dissertation is divided into four parts. The first briefly presents the main results of site percolation theory in dimension d = 2; in addition, some quantities that are important for determining the critical exponents and for understanding the phase transitions are defined. The second gives an introduction to the fractal concept, dimension and classification. With the basis of our study established, the third part presents the scaling theory, which relates the critical exponents to the quantities described in Chapter 2. In the last part, using the Finite Size Scaling method, we determine two of the critical exponents and, based on them, use the scaling relations of the previous chapter to determine the remaining critical exponents.
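A sketch of the finite-size behaviour involved: for site percolation on an L×L square lattice (critical probability known numerically to be about 0.5927), the probability of a top-to-bottom spanning cluster jumps from near 0 to near 1 across p_c, and Finite Size Scaling studies how this jump sharpens as L grows. Grid sizes and trial counts below are arbitrary.

```python
import random

def spans(L, p, rng):
    """Open each site with probability p, then search (DFS) from the open
    sites of the top row; report whether the cluster reaches the bottom row."""
    open_ = [[rng.random() < p for _ in range(L)] for _ in range(L)]
    stack = [(0, j) for j in range(L) if open_[0][j]]
    seen = set(stack)
    while stack:
        i, j = stack.pop()
        if i == L - 1:
            return True
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < L and 0 <= nj < L and open_[ni][nj] and (ni, nj) not in seen:
                seen.add((ni, nj))
                stack.append((ni, nj))
    return False

def spanning_prob(L, p, trials=200, seed=0):
    """Monte Carlo estimate of the spanning probability on an L x L lattice."""
    rng = random.Random(seed)
    return sum(spans(L, p, rng) for _ in range(trials)) / trials
```

Repeating this for several L and fitting the collapse of the curves against (p - p_c) * L^(1/ν) is the essence of the Finite Size Scaling estimates in the dissertation.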

Relevance:

60.00%

Publisher:

Abstract:

We considered prediction techniques based on accelerated failure time models with random effects for correlated survival data. Besides the Bayesian approach through the empirical Bayes estimator, we also discussed the use of a classical predictor, the Empirical Best Linear Unbiased Predictor (EBLUP). To illustrate the use of these predictors, we considered applications to a real data set from the oil industry. More specifically, the data set involves the mean time between failures of petroleum-well equipment of the Bacia Potiguar. The goal of this study is to predict the risk/probability of failure in order to support a preventive maintenance program. The results show that both methods are suitable for predicting future failures, supporting good decisions on the deployment and economy of resources for preventive maintenance.

Relevance:

60.00%

Publisher:

Abstract:

In this work, we studied the strong consistency of a class of estimates of the transition density of a Markov chain with general state space E ⊂ Rd. The strong consistency of the estimates of the transition density is obtained from the strong consistency of the kernel estimates of both the marginal density p(.) of the chain and the joint density q(., .). The Markov chain is assumed to be homogeneous, uniformly ergodic and to possess a stationary density p(.).

Relevance:

60.00%

Publisher:

Abstract:

In production lines, the entire process is subject to unexpected events that may degrade production quality and thus cause losses to the manufacturer. Identifying such causes and removing them is the task of process management. The on-line control system consists of the periodic inspection of every m-th produced item. Once any of those items is classified as non-conforming, it is assumed that a change in the conforming fraction of the items has occurred, and the process is then stopped for adjustment. This work is an extension of Quinino & Ho (2010) and its main objective is to monitor a process through on-line quality control of the number of non-conformities in the inspected item. The decision strategy for verifying whether the process is under control is directly associated with the limits of the control chart for the non-conformities of the process. A policy of preventive adjustments is incorporated in order to enlarge the conforming fraction of the process. With the help of the R software, a sensitivity analysis of the proposed model is carried out, showing in which situations it is most advantageous to perform the preventive adjustment.

Relevance:

60.00%

Publisher:

Abstract:

Two-level factorial designs are widely used in industrial experimentation. However, a design with many factors requires a large number of runs, and many replications of the treatments may not be feasible given limitations of resources and time, making the experiment expensive. In such cases, unreplicated designs are used. With only one replicate, however, there is no internal estimate of experimental error with which to judge the significance of the observed effects. One possible solution to this problem is to use normal plots or half-normal plots of the effects. Many experimenters use the normal plot, while others prefer the half-normal plot, often, in both cases, without justification. The controversy about the use of these two graphical techniques motivates this work, since there is no record of a formal procedure or statistical test that indicates which one is best. The choice between the two plots seems to be a subjective issue. The central objective of this master's thesis is therefore to perform an experimental comparative study of the normal plot and half-normal plot in the context of the analysis of unreplicated 2k factorial experiments. This study involves the construction of simulated scenarios, in which the performance of the plots in detecting significant effects and identifying outliers is evaluated in order to answer the following questions: Can one plot be better than the other? In which situations? What information does one plot add to the analysis of the experiment that might complement that provided by the other? What are the restrictions on the use of the plots?
With this, the work aims to confront the two techniques, examining them simultaneously in order to identify similarities, differences or relationships that contribute to building a theoretical reference to justify, or to aid, the experimenter's decision about which of the two graphical techniques to use and the reason for that use. The simulation results show that the half-normal plot is better for assisting in the judgement of the effects, while the normal plot is recommended for detecting outliers in the data.
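Both plots start from the same effect estimates. The sketch below computes the effects of a hypothetical unreplicated 2^3 design and the coordinates of its half-normal plot (ordered |effect| against half-normal quantiles); the plotting itself, and the normal-plot variant, are left out.

```python
from statistics import NormalDist

def effects_2k(y):
    """Effect estimates for an unreplicated 2^k design, responses y in
    standard order.  For the factor subset coded by bitmask s, the effect is
    (2/n) * sum_r y[r] * (product of the +/-1 levels of the factors in s)."""
    n = len(y)
    k = n.bit_length() - 1
    eff = {}
    for s in range(1, n):
        contrast = 0.0
        for r, yr in enumerate(y):
            sign = 1
            for i in range(k):
                if (s >> i) & 1:
                    sign *= 1 if (r >> i) & 1 else -1
            contrast += sign * yr
        eff[s] = 2.0 * contrast / n
    return eff

def half_normal_coords(eff):
    """Half-normal plot coordinates: the i-th ordered |effect| against the
    plotting quantile Phi^{-1}(0.5 + 0.5 * (i - 0.5) / m)."""
    vals = sorted(abs(e) for e in eff.values())
    m = len(vals)
    q = [NormalDist().inv_cdf(0.5 + 0.5 * (i - 0.5) / m) for i in range(1, m + 1)]
    return list(zip(q, vals))

# Hypothetical responses in which only factor A (bitmask 1) is active.
y = [6 if r & 1 == 0 else 14 for r in range(8)]
coords = half_normal_coords(effects_2k(y))
```

On such a plot the seven points fall on a line through the origin except for the one active effect, which stands out — the pattern whose interpretation the thesis compares across the two plots.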

Relevance:

60.00%

Publisher:

Abstract:

In survival analysis, the response is usually the time until the occurrence of an event of interest, called the failure time. The main characteristic of survival data is the presence of censoring, which is a partial observation of the response. Associated with this information, some models occupy an important position by properly fitting several practical situations, among which we can mention the Weibull model. Marshall-Olkin extended form distributions offer a generalization that enables greater flexibility in fitting lifetime data. This paper presents a simulation study comparing the gradient test and the likelihood ratio test using the Marshall-Olkin extended form of the Weibull distribution. The results show only a small advantage for the likelihood ratio test.
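The two statistics being compared can be illustrated on a much simpler member of the Weibull family — the exponential, i.e. a Weibull with shape fixed at 1 — where both have closed forms. This is a toy stand-in for exposition, not the Marshall-Olkin extended Weibull of the paper.

```python
import math, random

def lr_and_gradient_stats(x, lam0):
    """Test H0: rate = lam0 for an exponential sample (Weibull with shape 1).
    LR statistic:       W = 2 * (l(lam_hat) - l(lam0))
    Gradient statistic: S = U(lam0) * (lam_hat - lam0), score U(lam) = n/lam - sum(x)
    Both are asymptotically chi-squared with 1 df under H0."""
    n, s = len(x), sum(x)
    lam_hat = n / s                      # MLE of the exponential rate
    loglik = lambda lam: n * math.log(lam) - lam * s
    w = 2.0 * (loglik(lam_hat) - loglik(lam0))
    grad = (n / lam0 - s) * (lam_hat - lam0)
    return w, grad

# Simulated sample under H0 (rate 2.0), purely for illustration.
rng = random.Random(42)
x = [rng.expovariate(2.0) for _ in range(200)]
w, grad = lr_and_gradient_stats(x, lam0=2.0)
# compare each statistic with the chi-squared(1 df) 5% critical value, 3.84
```

The paper's simulation study does the analogous comparison with the Marshall-Olkin extended Weibull likelihood, where neither statistic has such a simple closed form.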