925 results for CNPQ::CIENCIAS EXATAS E DA TERRA::PROBABILIDADE E ESTATISTICA::ESTATISTICA


Relevance: 100.00%

Abstract:

In production lines, the process is subject to unexpected events that may degrade production quality and thus cause losses to the manufacturer. Identifying such causes and removing them is the task of process management. The on-line control system consists of the periodic inspection of every m-th produced item. Once one of those items is qualified as not fit, it is assumed that a change in the fraction of conforming items has occurred, and the process is stopped for adjustment. This work extends Quinino & Ho (2010); its main objective is to monitor a process through on-line quality control of the number of non-conformities in the inspected item. The decision rule used to verify whether the process is in control is directly associated with the control limits of the chart for the number of non-conformities of the process. A policy of preventive adjustments is incorporated in order to increase the conforming fraction of the process. With the help of the R software, a sensitivity analysis of the proposed model is carried out, showing in which situations it is most advantageous to execute the preventive adjustment.
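
The decision rule above hinges on control limits for the count of non-conformities. A minimal sketch of standard 3-sigma Shewhart c-chart limits (an assumption for illustration; the dissertation's exact limits may differ):

```python
import math

def c_chart_limits(c_bar):
    """Standard 3-sigma Shewhart limits for a count-of-nonconformities (c) chart.
    Under the Poisson assumption, c_bar is both the center line and the variance."""
    ucl = c_bar + 3 * math.sqrt(c_bar)
    lcl = max(0.0, c_bar - 3 * math.sqrt(c_bar))
    return lcl, c_bar, ucl

# Example: in-control mean of 4 non-conformities per inspected item.
lcl, cl, ucl = c_chart_limits(4.0)
```

A count above `ucl` would trigger the stop-and-adjust decision described in the abstract.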

Relevance: 100.00%

Abstract:

In survival analysis, the response is usually the time until the occurrence of an event of interest, called the failure time. The main characteristic of survival data is the presence of censoring, that is, a partial observation of the response. Among the models that properly fit several practical situations, the Weibull model occupies an important position. The Marshall-Olkin extended form distributions offer a basic generalization that enables greater flexibility in fitting lifetime data. This paper presents a simulation study that compares the gradient test and the likelihood ratio test under the Marshall-Olkin extended form of the Weibull distribution. As a result, only a small advantage is found for the likelihood ratio test.
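
For reference, the Marshall-Olkin extension tilts a baseline survival function S0 by a parameter alpha, S(t) = alpha * S0(t) / (1 - (1 - alpha) * S0(t)); a small sketch with a Weibull baseline (the shape/scale parameterization is an assumption):

```python
import math

def mo_weibull_survival(t, alpha, shape, scale):
    """Marshall-Olkin extended Weibull survival:
    S(t) = alpha * S0(t) / (1 - (1 - alpha) * S0(t)),
    where S0(t) = exp(-(t/scale)^shape). alpha = 1 recovers the plain Weibull."""
    s0 = math.exp(-((t / scale) ** shape))
    return alpha * s0 / (1.0 - (1.0 - alpha) * s0)
```

At alpha = 1 the baseline Weibull is recovered, which is why the family is a genuine generalization.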

Relevance: 100.00%

Abstract:

In this work we study accelerated failure-time generalized gamma regression models under a unified approach. The models estimate simultaneously the effects of covariates on the acceleration/deceleration of the timing of a given event and on the surviving fraction. The method is implemented in the free statistical software R. Finally, the model is applied to a real dataset on the time until disease recurrence in patients diagnosed with breast cancer.
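
The accelerated failure-time structure means covariates act by rescaling time rather than the hazard. A minimal sketch of that structure, using a hypothetical Weibull baseline in place of the generalized gamma for brevity:

```python
import math

def aft_survival(t, x, beta, baseline_survival):
    """Accelerated failure-time structure: covariates rescale time,
    S(t | x) = S0(t * exp(-beta * x)). For beta > 0, larger x decelerates failure."""
    return baseline_survival(t * math.exp(-beta * x))

# Hypothetical Weibull baseline standing in for the generalized gamma.
weibull = lambda t: math.exp(-((t / 2.0) ** 1.5))
```

With x = 0 the baseline is recovered; a positive covariate effect slows the effective clock and raises survival at any fixed t.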

Relevance: 100.00%

Abstract:

The problem treated in this dissertation is to establish the boundedness of the iterates of an iterative algorithm that uses, at each step, an orthogonal projection onto a straight line indexed in a (possibly infinite) family of lines, allowing an arbitrary order in applying the projections. This problem was analyzed in a 1994 paper by Barany et al., which found a necessary and sufficient condition in the case d = 2 and further analyzed the case d > 2 under some technical conditions. However, that paper uses non-trivial intuitive arguments and its proofs lack sufficient rigor. In this dissertation we discuss and strengthen the results of that paper, in order to complete and simplify its proofs.
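
The elementary step of such an algorithm, the orthogonal projection of a point onto a line in the plane, can be sketched as follows (an illustrative helper, not the paper's notation):

```python
def project_onto_line(p, a, d):
    """Orthogonal projection of point p onto the affine line {a + t*d},
    where d is a unit direction vector; p, a, d are (x, y) tuples."""
    t = (p[0] - a[0]) * d[0] + (p[1] - a[1]) * d[1]
    return (a[0] + t * d[0], a[1] + t * d[1])

# Projecting (1, 1) onto the x-axis (a line through the origin, direction (1, 0)).
proj = project_onto_line((1.0, 1.0), (0.0, 0.0), (1.0, 0.0))
```

The boundedness question concerns what happens when such projections, drawn from a family of lines, are composed in an arbitrary order.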

Relevance: 100.00%

Abstract:

This paper proposes a new control chart to monitor a process mean, employing a combined npx-X̄ control chart. Basically, the procedure consists of splitting the sample of size n into two sub-samples of sizes n1 and n2, determined by an optimization search. The sampling occurs in two stages. In the first stage, the units of sub-sample n1 are evaluated by attributes and plotted on the npx control chart. If this chart signals, the units of the second sub-sample are measured and the monitored statistic is plotted on the X̄ control chart (second stage). If both control charts signal, the process is stopped for adjustment. The possibility of not inspecting all n items may reduce not only the cost but also the time spent examining the sampled items. The performances of the current proposal and of the individual X̄ and npx control charts are compared. In this study, the proposed procedure presents many competitive options relative to the X̄ control chart for a given sample size n and shift from the target mean. The average time to signal (ATS) of the current proposal being lower than the values calculated for an individual X̄ control chart points out that the combined control chart is an efficient tool for monitoring the process mean.
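
The two-stage decision logic described above can be sketched as follows (names and limits are illustrative; the optimization search for n1 and n2 is omitted):

```python
def combined_chart_signal(attr_count, np_limit, xbar, lcl, ucl):
    """Two-stage combined-chart decision. Stage 1: the attribute count from
    sub-sample n1 is compared against its np-chart limit. Only if stage 1
    signals is sub-sample n2 measured and its mean checked against the
    X-bar limits (stage 2). The process stops only when BOTH charts signal."""
    stage1 = attr_count > np_limit
    if not stage1:
        return False          # no measurement of n2 needed: cheaper inspection
    return xbar < lcl or xbar > ucl
```

The early exit in stage 1 is what saves measurement cost and time relative to measuring all n items by variables.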

Relevance: 100.00%

Abstract:

Survival models deal with the modeling of time-to-event data. However, in some situations part of the population may no longer be subject to the event. Models that take this fact into account are called cure rate models. There are few studies about hypothesis tests in cure rate models. Recently a new test statistic, the gradient statistic, has been proposed. It shares the same asymptotic properties with the classic large-sample tests: the likelihood ratio, score and Wald tests. Some simulation studies have been carried out to explore the behavior of the gradient statistic in finite samples and to compare it with the classic statistics in different models. The main objective of this work is to study and compare the performance of the gradient test and the likelihood ratio test in cure rate models. We first describe the models and present the main asymptotic properties of the tests. We then perform a simulation study based on the promotion time model with Weibull distribution to assess the performance of the tests in finite samples. An application is presented to illustrate the studied concepts.
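
In the promotion time model used in the simulation study, the population survival is S_pop(t) = exp(-theta * F(t)), where F is the latent cdf (assumed Weibull here, as in the abstract) and exp(-theta) is the cure fraction; a minimal sketch:

```python
import math

def promotion_time_survival(t, theta, shape, scale):
    """Population survival of the promotion time cure model,
    S_pop(t) = exp(-theta * F(t)), with a Weibull latent cdf F.
    As t grows, S_pop(t) tends to exp(-theta): the cured fraction."""
    F = 1.0 - math.exp(-((t / scale) ** shape))
    return math.exp(-theta * F)
```

The plateau exp(-theta) > 0 is what distinguishes cure rate models from ordinary survival models, whose survival functions decay to zero.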

Relevance: 100.00%

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior

Relevance: 100.00%

Abstract:

In this work we present a mathematical and computational model of electrokinetic phenomena in an electrically charged porous medium. We consider the porous medium as composed of three different scales (nanoscopic, microscopic and macroscopic). On the microscopic scale the domain is composed of a porous matrix and a solid phase. The pores are filled with an aqueous phase consisting of fully diluted ionic solutes, and the solid matrix consists of electrically charged particles. Initially we present the mathematical model that governs the electrical double layer, in order to quantify the electric potential, electric charge density, ion adsorption and chemical adsorption at the nanoscopic scale. Then we derive the microscopic model, where the adsorption of ions due to the electrical double layer, the protonation/deprotonation reactions and the zeta potential obtained in the nanoscopic modeling appear at the microscopic scale through interface conditions in the Stokes and Nernst-Planck equations, which respectively govern the movement of the aqueous solution and the transport of ions. We develop the upscaling of the nano/microscopic problem using the homogenization technique for periodic structures, deducing the macroscopic model together with its respective cell problems for the effective parameters of the macroscopic equations. Considering a clayey porous medium consisting of kaolinite clay plates distributed in parallel, we rewrite the macroscopic model in a one-dimensional version. Finally, using a sequential algorithm, we discretize the macroscopic model via the finite element method, together with the iterative Picard method for the nonlinear terms. Numerical simulations in the transient regime with variable pH in the one-dimensional case are obtained, aiming at the computational modeling of the electroremediation process of contaminated clay soils.
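
The Picard iteration used for the nonlinear terms is a plain fixed-point loop, x <- g(x), repeated until the update is small; a generic scalar sketch (illustrative, not the discretized finite element system):

```python
import math

def picard(g, x0, tol=1e-10, max_iter=200):
    """Picard (fixed-point) iteration: repeatedly apply x <- g(x) until the
    update falls below tol. In nonlinear FEM solves, each such application
    stands for one linearized solve with coefficients frozen at the previous iterate."""
    x = x0
    for _ in range(max_iter):
        x_new = g(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Classic contraction example: the fixed point of cos(x), roughly 0.739085.
root = picard(math.cos, 1.0)
```

Convergence relies on g being a contraction near the solution, which is why Picard is typically paired with small steps or damping in stiff nonlinear problems.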

Relevance: 100.00%

Abstract:

This paper has two objectives: (i) to conduct a literature review on the criteria for uniqueness of solution for initial value problems of ordinary differential equations; (ii) to propose a modification of Euler's method that seems to be able to converge to a solution of the problem even when the solution is not unique.
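
A classic example of non-uniqueness is x' = 2*sqrt(|x|), x(0) = 0, whose solutions include both x(t) = 0 and x(t) = t^2. Plain forward Euler started exactly at zero can only follow the trivial branch, which is what motivates modifying the method; a minimal sketch (the modification itself is not reproduced here):

```python
def euler(f, x0, t0, t1, n):
    """Plain forward Euler for x' = f(t, x) on [t0, t1] with n steps."""
    h = (t1 - t0) / n
    t, x = t0, x0
    for _ in range(n):
        x += h * f(t, x)
        t += h
    return x

# Non-uniqueness example: f(t, 0) = 0, so Euler from x0 = 0 never leaves
# the trivial solution x(t) = 0, even though x(t) = t^2 also solves the IVP.
final = euler(lambda t, x: 2.0 * abs(x) ** 0.5, 0.0, 0.0, 1.0, 1000)
```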

Relevance: 100.00%

Abstract:

In general, an inverse problem corresponds to finding a value of an element x in a suitable vector space, given a vector y that measures it in some sense. When we discretize the problem, it usually boils down to solving an equation system f(x) = y, where f : U ⊆ R^m → R^n represents the step function on some domain U of the appropriate R^m. As a general rule, we arrive at an ill-posed problem. The resolution of inverse problems has been widely researched over the last decades, because many problems in science and industry consist of determining unknowns that we try to recover by observing their effects through certain indirect measures. The general subject of this dissertation is the choice of the Tikhonov regularization parameter for a poorly conditioned linear problem, as discussed in Chapter 1, focusing on the three most popular methods in the current literature of the area. Our more specific focus consists of the simulations reported in Chapter 2, which aim to compare the performance of the three methods in the recovery of images measured with the Radon transform and perturbed by the addition of Gaussian i.i.d. noise. We chose a difference operator as the regularizer of the problem. The contribution we try to make mainly consists of the discussion of the numerical simulations we execute, as exposed in Chapter 2. We understand that the significance of this dissertation lies much more in the questions it raises than in saying something definitive about the subject: partly for being based on numerical experiments with no new mathematical results associated with them, partly for being about numerical experiments made with a single operator. On the other hand, we obtained some observations on the performed simulations which seemed interesting to us, considering the literature of the area.
In particular, we highlight the observations we summarize at the conclusion of this work about the different vocations of methods like GCV and the L-curve, and also about the tendency, observed in the L-curve method, of the optimal parameters to group themselves in a small gap, strongly correlated with the behavior of the generalized singular value decomposition curve of the operators involved, under reasonably broad regularity conditions on the images to be recovered.
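
In the SVD basis, Tikhonov regularization acts by damping each solution coefficient with a filter factor; a minimal sketch with the identity regularizer (a simplification of the difference operator used in the dissertation):

```python
def tikhonov_diagonal(sigmas, coeffs, lam):
    """Tikhonov-filtered solution in the SVD basis: each naive coefficient
    (u_i^T y / sigma_i) is damped by the filter factor
    sigma_i^2 / (sigma_i^2 + lam^2). Small singular values, which amplify
    noise the most in the naive inverse, are suppressed the most."""
    return [
        (s * s / (s * s + lam * lam)) * c
        for s, c in zip(sigmas, coeffs)
    ]

# A large and a small singular value: the second coefficient is nearly wiped out.
filtered = tikhonov_diagonal([2.0, 0.1], [1.0, 5.0], 1.0)
```

Parameter-choice rules such as GCV and the L-curve differ precisely in how they pick `lam` from the data.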

Relevance: 100.00%

Abstract:

We present a dependent risk model to describe the surplus of an insurance portfolio, based on the article "A ruin model with dependence between claim sizes and claim intervals" (Albrecher and Boxma [1]). An exact expression for the Laplace transform of the survival function of the surplus is derived. The results obtained are illustrated by several numerical examples, and the case in which we ignore the dependence structure present in the model is investigated. For phase-type claim sizes, we study the survival probability, since this is a class of distributions that is computationally tractable and quite general.
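
A Monte Carlo sketch of a surplus process with a simplified version of this dependence, in which the waiting time until the next claim depends on the previous claim size (all distributions and parameters here are illustrative assumptions, not those of Albrecher and Boxma):

```python
import random

def ruin_probability(u, c, horizon, n_paths, seed=0):
    """Monte Carlo estimate of the finite-horizon ruin probability for a
    surplus process U(t) = u + c*t - (claims so far), where, as a toy version
    of the dependence structure, the arrival rate of the next claim
    increases with the size of the previous claim."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(n_paths):
        t, surplus, prev_claim = 0.0, u, 1.0
        while t < horizon:
            rate = 1.0 + prev_claim            # dependence: big claim -> next one sooner
            wait = rng.expovariate(rate)
            t += wait
            if t >= horizon:
                break
            surplus += c * wait                # premiums accrue until the claim
            claim = rng.expovariate(1.0)       # exponential claim sizes (assumption)
            surplus -= claim
            prev_claim = claim
            if surplus < 0:
                ruined += 1
                break
    return ruined / n_paths
```

Such simulations give a crude check on exact results obtained via the Laplace transform.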

Relevance: 100.00%

Abstract:

The central objective of the study of non-homogeneous Markov chains is the concept of weak and strong ergodicity. A chain is weakly ergodic if the dependence on the initial distribution vanishes with time, and it is strongly ergodic if it is weakly ergodic and converges in distribution. Most theoretical results on strong ergodicity assume some knowledge of the limit behavior of the stationary distributions. In this work, we collect some general results on weak and strong ergodicity for chains with countable state space, and also study the asymptotic behavior of the stationary distributions of a particular type of Markov chain with finite state space, called Markov Chains with Rare Transitions.
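
Weak ergodicity can be checked numerically by evolving two different initial distributions under the same transition kernels and watching their total variation distance contract; a small sketch with a hypothetical 2-state kernel (kept time-homogeneous for brevity):

```python
def step(dist, P):
    """One step of a Markov chain: row vector times transition matrix."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

def tv_distance(p, q):
    """Total variation distance between two probability distributions."""
    return 0.5 * sum(abs(a - b) for a, b in zip(p, q))

# Hypothetical 2-state kernel; in the non-homogeneous case P would vary with time.
P = [[0.5, 0.5], [0.2, 0.8]]
p, q = [1.0, 0.0], [0.0, 1.0]   # two extreme initial distributions
for _ in range(20):
    p, q = step(p, P), step(q, P)
```

The distance shrinking to zero for every pair of initial distributions is exactly the loss of memory that defines weak ergodicity.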

Relevance: 100.00%

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior

Relevance: 100.00%

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior

Relevance: 100.00%

Abstract:

A research project is being developed by PPGG/UFRN and PETROBRAS in the Xaréu Oil Field, located in the Ceará Basin, Northeastern Brazil. The objective of the research is to characterize a fractured carbonate reservoir, the Trairi Limestone, in order to drill a borehole with two horizontal legs that takes advantage of the natural fracture system to enhance oil recovery. The present master's thesis is part of this research, and its contribution is to estimate fault orientation from unoriented cores using the method proposed by Hesthammer & Henden (2000). In order to orient a fault cutting a bed observed in the core, the bed must be oriented first. As an additional constraint to orient the bed, we use the regional bedding orientation obtained from structure maps of the Trairi Limestone. Because the number of cores drilled from the Trairi Limestone was too small, we analyzed all cores from the field. As a geologic constraint, we assume that all faults were formed as a result of the separation of South America and Africa, in the context of regional dextral strike-slip faulting. In this context, secondary faults are mainly T and R faults according to Riedel's classification. We analyzed 236.5 m of cores. The dip of bedding varies from 0° to 8°, the most frequent value being 2°. We interpret this result as evidence that the deformation process was mainly brittle. Seventy-seven faults were identified in the cores. These faults strike mainly NW and NE, with dips generally in the interval 70° to 90°. We suggest that the horizontal legs of the borehole be oriented NW and NE in order to improve the probability of intercepting open fractures and faults.