19 results for kernel estimators

at Universidade Federal do Rio Grande do Norte (UFRN)


Relevance: 60.00%

Publisher:

Abstract:

In this work we study the asymptotic unbiasedness and the strong and uniform strong consistency of a class of kernel estimators fn as estimators of a density function f defined on a k-dimensional sphere.
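As an illustration of the general idea (on the real line rather than on a sphere), a kernel density estimator fn can be sketched in a few lines of Python; the Gaussian kernel, bandwidth, and simulated sample below are illustrative choices, not the estimator class studied in the thesis:

```python
import math
import random

def kde(x, sample, h):
    """Gaussian kernel density estimate: f_n(x) = (1/(n*h)) * sum_i K((x - X_i)/h)."""
    n = len(sample)
    return sum(math.exp(-0.5 * ((x - xi) / h) ** 2) / math.sqrt(2 * math.pi)
               for xi in sample) / (n * h)

random.seed(0)
sample = [random.gauss(0.0, 1.0) for _ in range(5000)]  # X_i ~ N(0, 1)
h = 0.2  # bandwidth; consistency requires h -> 0 with n*h -> infinity

# As n grows and h shrinks suitably, f_n(0) approaches the true N(0,1) density at 0.
est = kde(0.0, sample, h)
true = 1.0 / math.sqrt(2 * math.pi)  # standard normal density at 0
```

The asymptotic-unbiasedness property studied in the thesis corresponds to E[f_n(x)] converging to f(x) under these bandwidth conditions.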

Relevance: 60.00%

Publisher:

Abstract:

In this work we study Hidden Markov Models with finite as well as general state spaces. In the finite case, the forward and backward algorithms are considered and the probability of a given observed sequence is computed. Next, we use the EM algorithm to estimate the model parameters. In the general case, kernel estimators are used to build a sequence of estimators that converges in L1-norm to the density function of the observable process.
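The forward algorithm mentioned above, which computes the probability of an observed sequence for a finite-state HMM, can be sketched as follows; the two-state model and its parameters are hypothetical values chosen for illustration:

```python
def forward(obs, pi, A, B):
    """Forward algorithm: returns P(obs) for an HMM with initial distribution pi,
    transition matrix A and emission matrix B (all row-indexed by state)."""
    n = len(pi)
    # alpha[i] = P(obs[0..t], state_t = i), initialized at t = 0
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    for t in range(1, len(obs)):
        alpha = [sum(alpha[j] * A[j][i] for j in range(n)) * B[i][obs[t]]
                 for i in range(n)]
    return sum(alpha)

# Hypothetical 2-state model with binary observations.
pi = [0.6, 0.4]
A = [[0.7, 0.3], [0.4, 0.6]]
B = [[0.9, 0.1], [0.2, 0.8]]
p = forward([0, 1, 0], pi, A, B)
```

In the EM (Baum-Welch) step, the same forward quantities are combined with the backward recursion to re-estimate pi, A and B.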

Relevance: 20.00%

Publisher:

Abstract:

In this work, the paper of Campos and Dorea [3] is examined in detail. In that article, a kernel estimator was applied to a sequence of independent and identically distributed random variables with general state space. In Chapter 2, the estimator's properties, such as asymptotic unbiasedness, consistency in quadratic mean, strong consistency, and asymptotic normality, are verified. In Chapter 3, using the R software, numerical experiments are carried out to give a visual idea of the estimation process.

Relevance: 10.00%

Publisher:

Abstract:

At present, public organizations are increasingly employing information technology (IT) solutions in order to offer more transparency and better services to all citizens. Integrated systems are IT solutions that carry at their core features of integration and the use of a single database. These systems bring several benefits but also face obstacles that make their adoption difficult. The conversion to an integrated system may take years, and thus the study of the adoption of this IT in public sector organizations becomes very stimulating, due to certain peculiarities of this sector and to the features of this technology. First, information about the particular integrated system under study and about its conversion process is presented. Then, the researcher describes the configuration of the conversion process that is the aim of this study: the agents involved, the moments, and the tools used to support the process, in order to elaborate the methodology of the conversion process, understood as the set of procedures and tools used throughout the conversion. After this, the researcher identifies, together with all the members of the conversion team, the negative and positive factors observed during the project. Finally, these factors are analyzed through the lens of Hospitality Theory, which, in the researcher's opinion, was very useful for understanding the elements, events, and moments that interfered with the project. The results empirically consolidated the presumptions of Hospitality Theory, while also revealing a limitation of this theory in the case under study.

Relevance: 10.00%

Publisher:

Abstract:

Brazil is the third largest producer of cashew nuts in the world. Despite the social and economic importance of the cashew nut, its production is still carried out artisanally. One of the main problems encountered in the cashew production chain is the condition under which the roasting of the nut occurs to obtain the kernel from the shell. In the present study, a biomonitoring of the genotoxic and cytotoxic effects associated with the elements released by cashew nut roasting was conducted in João Câmara - RN, a semi-arid region of Brazil. Genotoxicity was assessed using the micronucleus (MN) bioassay in Tradescantia pallida. In addition, a comparison was performed between Tradescantia pallida and clone KU-20, and other biomarkers of DNA damage, such as nucleoplasmic bridges (NPB) and nuclear fragments (NF), were quantified. The levels of particulate matter (PM1.0, PM2.5, PM10) and black carbon (BC) were also measured; the inorganic chemical composition of the collected PM2.5 was determined by X-ray fluorescence spectrometry, and cytotoxicity was assessed by the MTT assay and the trypan blue exclusion method. For this purpose, two sites were chosen: the Amarelão community, where the roasting occurs, and the Santa Luzia farm, an area without influence of this process. The mean values of PM2.5 (Jan 2124.2 μg/m3; May 1022.2 μg/m3; Sep 1291.9 μg/m3) and BC (Jan 363.6 μg/m3; May 70.0 μg/m3; Sep 69.4 μg/m3), as well as the concentrations of the elements Al, Si, P, S, Cl, K, Ca, Ti, Cr, Mn, Fe, Ni, Cu, Zn, Se, Br and Pb, were significantly higher at Amarelão than at the Santa Luzia farm. The genotoxicity tests with T. pallida indicated a significant increase in the number of MN, NPB and NF, and a negative correlation was found between the frequency of these biomarkers and rainfall. Concentrations of 200 μg/mL and 400 μg/mL of PM2.5 were cytotoxic to MRC-5 cells.
Altogether, the results indicate genotoxicity and cytotoxicity for the community of Amarelão, with the high levels of PM2.5 considered a potential contributor to this effect, mainly because of the high presence of transition metals, especially Fe, Ni, Cu, Cr and Zn, elements which have the potential to cause DNA damage. Other nuclear alterations, such as NPBs and NFs, may be used as effective biomarkers of DNA damage in tetrads of Tradescantia pallida. The results of this study enabled the identification of a serious occupational problem. Accordingly, preventive measures and better practices should be adopted to improve both the activity and the quality of life of the population. These measures are of fundamental importance for the sustainable development of this activity.

Relevance: 10.00%

Publisher:

Abstract:

This work describes the study and implementation of vector speed control for a three-phase bearingless induction machine with a divided winding, 4 poles and 1.1 kW, using neural rotor flux estimation. The vector speed control operates together with the radial positioning controllers and with the stator phase winding current controllers. For the radial positioning, the forces controlled by the internal magnetic fields of the machine are used. For the optimization of the radial forces, a special rotor winding with independent circuits, which allows a low influence on the rotational torque, was used. The neural flux estimation applied to the vector speed control aims to compensate for the parameter dependence of conventional estimators with respect to variations of the machine parameters caused by temperature increases or by rotor magnetic saturation. The implemented control system allows a direct comparison between the responses of the speed and radial positioning controllers when the machine is oriented by the neural rotor flux estimator and by the conventional flux estimator. The entire control system is executed by a program developed in ANSI C. The DSP resources used by the system are the analog/digital converter channels, the PWM outputs, and the parallel and RS-232 serial interfaces, which are responsible, respectively, for DSP programming and for data capture through the supervisory system.

Relevance: 10.00%

Publisher:

Abstract:

The use of maps obtained from orbital remote sensing images submitted to digital processing has become fundamental to optimize conservation and monitoring actions for coral reefs. However, the accuracy reached in the mapping of submerged areas is limited by variations of the water column, which degrade the signal received by the orbital sensor and introduce errors into the final classification result. The limited capacity of traditional methods based on conventional statistical techniques to solve problems related to inter-class confusion motivated the search for alternative strategies in the area of Computational Intelligence. In this work, an ensemble of classifiers was built based on the combination of Support Vector Machines and a Minimum Distance Classifier, with the objective of classifying remotely sensed images of a coral reef ecosystem. The system is composed of three stages, through which the progressive refinement of the classification process takes place: patterns that receive an ambiguous classification at a given stage of the process are re-evaluated at the subsequent stage, and an unambiguous prediction for all the data is reached through the reduction or elimination of false positives. The images were classified into five bottom types: deep water, underwater corals, inter-tidal corals, algae, and sandy bottom. The highest overall accuracy (89%) was obtained with an SVM with a polynomial kernel. The accuracy of the classified image was compared, by means of an error matrix, with the results obtained by other classification methods based on a single classifier (a neural network and the k-means algorithm). Finally, the comparison of the results achieved demonstrated the potential of ensembles of classifiers as a tool for classifying images of submerged areas subject to the noise caused by atmospheric effects and by the water column.
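One of the two base classifiers combined in the ensemble, the Minimum Distance Classifier, can be sketched in pure Python as a nearest-centroid rule; the two-feature "sand"/"coral" toy data below are invented for illustration and are not the thesis's imagery:

```python
def centroids(X, y):
    """Compute the mean feature vector (centroid) of each class."""
    sums, counts = {}, {}
    for xi, yi in zip(X, y):
        counts[yi] = counts.get(yi, 0) + 1
        prev = sums.get(yi, [0.0] * len(xi))
        sums[yi] = [a + b for a, b in zip(prev, xi)]
    return {c: [v / counts[c] for v in sums[c]] for c in sums}

def min_distance_classify(x, cents):
    """Assign x to the class whose centroid is nearest (squared Euclidean distance)."""
    return min(cents, key=lambda c: sum((a - b) ** 2 for a, b in zip(x, cents[c])))

# Hypothetical 2-band pixel features for two bottom types.
X = [[0.1, 0.2], [0.0, 0.1], [0.9, 1.0], [1.1, 0.8]]
y = ["sand", "sand", "coral", "coral"]
cents = centroids(X, y)
label = min_distance_classify([0.95, 0.9], cents)
```

In the staged ensemble described above, a pattern would only fall back to a rule like this when the SVM stage leaves its class ambiguous.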

Relevance: 10.00%

Publisher:

Abstract:

Most algorithms for state estimation based on the classical model are adequate only for use in transmission networks. Few algorithms have been developed specifically for distribution systems, probably because of the small amount of data available in real time: most overhead feeders have only current and voltage measurements at the medium voltage bus-bar at the substation. Consequently, classical algorithms are difficult to implement, even considering off-line acquired data as pseudo-measurements. However, the necessity of automating the operation of distribution networks, mainly with regard to the selectivity of protection systems and to the possibility of load transfer maneuvers, is changing network planning policy. Thus, equipment incorporating telemetry and command modules has been installed in order to improve operational features, increasing the amount of measurement data available in real time at the System Operation Center (SOC). This encourages the development of a state estimator model involving real-time information and pseudo-measurements of loads, built from typical power factors and utilization factors (demand factors) of distribution transformers. This work reports the development of a new state estimation method, specific for radial distribution systems. The main algorithm of the method is based on the power summation load flow. The estimation is carried out piecewise, section by section of the feeder, going from the substation to the terminal nodes. For each section, a measurement model is built, resulting in a nonlinear overdetermined set of equations, whose solution is achieved by the Gaussian normal equations. The estimated variables of one section are used as pseudo-measurements for the next section.
In general, the measurement set for a generic section consists of pseudo-measurements of power flows and nodal voltages obtained from the previous section (or real-time measurements, where they exist), besides pseudo-measurements of injected powers for the power summations, whose functions are the load flow equations, assuming that the network can be represented by its single-phase equivalent. The great advantage of the algorithm is its simplicity and low computational effort. Moreover, the algorithm is very efficient with regard to the accuracy of the estimated values. Besides the power summation state estimator, this work shows how other algorithms could be adapted to provide state estimation for medium voltage substations and networks, namely Schweppe's method and an algorithm based on current proportionality, which is usually adopted for network planning tasks. Both estimators were implemented not only as alternatives to the proposed method, but also to obtain results that support its validation. Since in most cases no power measurement is performed at the beginning of the feeder, and this is required by the power summation estimation method, a new algorithm for estimating the network variables at the medium voltage bus-bar was also developed.
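The per-section solution via the Gaussian normal equations can be sketched for a tiny linear measurement model; the 3x2 matrix H and measurement vector z below are hypothetical, and a real implementation would embed this solve inside the nonlinear power summation iteration:

```python
def normal_equation_solve(H, z):
    """Least-squares solution of the overdetermined system H x ~ z via the
    normal equations (H^T H) x = H^T z, using Gaussian elimination
    with partial pivoting (adequate for the small per-section systems)."""
    m, n = len(H), len(H[0])
    HtH = [[sum(H[k][i] * H[k][j] for k in range(m)) for j in range(n)]
           for i in range(n)]
    Htz = [sum(H[k][i] * z[k] for k in range(m)) for i in range(n)]
    A = [row[:] + [b] for row, b in zip(HtH, Htz)]  # augmented matrix
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n + 1):
                A[r][c] -= f * A[col][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (A[i][n] - sum(A[i][j] * x[j] for j in range(i + 1, n))) / A[i][i]
    return x

# Hypothetical section: three redundant measurements of two state variables.
H = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
z = [1.02, 1.98, 3.00]
x = normal_equation_solve(H, z)
```

The estimated x of one section would then feed the next section's measurement set as pseudo-measurements, as described above.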

Relevance: 10.00%

Publisher:

Abstract:

This work describes the study and implementation of speed control for a three-phase induction motor of 1.1 kW and 4 poles using neural rotor flux estimation. The vector speed control operates together with the stator phase winding current controller. The neural flux estimation applied to the vector speed control aims to compensate for the parameter dependence of conventional estimators with respect to variations of the machine parameters caused by temperature increases or by rotor magnetic saturation. The implemented control system allows a direct comparison between the responses of the speed control when the machine is oriented by the neural rotor flux estimator and by the conventional flux estimator. The entire control system is executed by a program developed in ANSI C. The main DSP resources used by the system are the analog/digital converter channels, the PWM outputs, and the parallel and RS-232 serial interfaces, which are responsible, respectively, for DSP programming and for data capture through the supervisory system.

Relevance: 10.00%

Publisher:

Abstract:

Support Vector Machines (SVM) have attracted increasing attention in the machine learning area, particularly for classification and pattern recognition. However, in some cases it is not easy to determine accurately the class to which a given pattern belongs. This thesis involves the construction of an interval pattern classifier using SVM in association with interval theory, in order to model with precision the separation of a pattern set into distinct classes, aiming to obtain an optimized separation capable of treating the imprecision contained in the initial data and generated during the computational processing. The SVM is a linear machine; in order to allow it to solve real-world problems (usually nonlinear), it is necessary to transform the pattern set, known as the input set, from a nonlinear to a linear problem. Kernel machines are responsible for this mapping. To create the interval extension of the SVM, for both linear and nonlinear problems, it was necessary to define an interval kernel and to extend Mercer's theorem (which characterizes a kernel function) to interval functions.
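A minimal sketch of what an interval kernel evaluation might look like, using the polynomial kernel computed with interval arithmetic; the Interval class and the example points are illustrative assumptions, not the thesis's actual construction:

```python
class Interval:
    """Closed interval [lo, hi] with the arithmetic needed for a polynomial kernel."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __mul__(self, other):
        # Interval product: take the min and max over all endpoint products.
        ps = [self.lo * other.lo, self.lo * other.hi,
              self.hi * other.lo, self.hi * other.hi]
        return Interval(min(ps), max(ps))

def interval_poly_kernel(x, y, degree):
    """K(x, y) = (<x, y> + 1)^degree, with every operand an Interval."""
    dot = Interval(0.0, 0.0)
    for xi, yi in zip(x, y):
        dot = dot + xi * yi
    base = dot + Interval(1.0, 1.0)
    out = base
    for _ in range(degree - 1):
        out = out * base
    return out

# Imprecise versions of the crisp points (1, 2) and (1, 0).
x = [Interval(0.9, 1.1), Interval(1.9, 2.1)]
y = [Interval(0.9, 1.1), Interval(-0.1, 0.1)]
k = interval_poly_kernel(x, y, 2)
# The crisp kernel value (1*1 + 2*0 + 1)^2 = 4 lies inside [k.lo, k.hi].
```

The point of the interval extension is exactly this enclosure property: the interval result is guaranteed to contain the crisp kernel value of any points within the input intervals.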

Relevance: 10.00%

Publisher:

Abstract:

This study presents the experimental results of the analysis of the thermal performance of a composite material made from a polyurethane plant matrix derived from castor bean (mamona) kernel oil (COF) and a load of the clay mineral expanded vermiculite. Test specimens with loads of 10%, 15% and 20% by weight were made to determine the thermal properties: conductivity (k), diffusivity (α) and heat capacity (C). For comparison purposes, the same properties were also measured for the castor oil polyurethane without load and for oil-based polyurethane (PU), both already used in thermal insulation. Plates of 0.25 m of the analyzed material were manufactured for use as insulation in a chamber for thermal performance tests of roofing. Thermocouples were distributed on the surface of the cover, inside the material and inside the test chamber, which in turn was subjected to artificial heating provided by a bank of incandescent lamps of 3000 W. The results obtained with the composite materials were compared with data from similar tests conducted with the chamber insulated with: (a) oil-based PU, (b) COF, (c) glass wool, and (d) rock wool. Heat resistance tests were performed with these composites, yielding temperature limits for use in the range of 100 °C to 130 °C. Based on the analysis of the performance results and thermal properties, it was possible to conclude that COF composites loaded with expanded vermiculite behave very similarly to commercial insulation materials.

Relevance: 10.00%

Publisher:

Abstract:

Currently there is still a high demand for quality control in the manufacturing processes of mechanical parts. This keeps alive the need for inspection of final products, ranging from dimensional analysis to the chemical composition of products. Usually this task may be done through various nondestructive and destructive methods that ensure the integrity of the parts. The results generated by these modern inspection tools end up not being able to define the real damage geometrically and, therefore, cannot be properly displayed on a computer screen. Virtual 3D visualization may help identify damage that would hardly be detected by other methods. There is commercial software that seeks to address the stages of design and simulation of mechanical parts in order to predict possible damage and diminish potentially undesirable events. However, the challenge of developing software capable of integrating the various design activities, product inspection, results of non-destructive testing, as well as the simulation of damage, still needs the attention of researchers. This was the motivation to conduct a methodological study for the implementation of a versatile CAD/CAE computational kernel capable of helping programmers develop software applied to the design and simulation of mechanical parts under stress. This research presents interesting results obtained from the use of the developed kernel, showing that it was successfully applied to design case studies involving parts with specific geometries, namely: mechanical prostheses, heat exchangers, and oil and gas piping. Finally, conclusions are presented regarding the experience of merging CAD and CAE theories to develop the kernel, resulting in a tool adaptable to various applications of the metalworking industry.

Relevance: 10.00%

Publisher:

Abstract:

We investigate several diffusion equations which extend the usual one by considering the presence of nonlinear terms or a memory effect in the diffusive term. We also consider a spatially and time-dependent diffusion coefficient. For these equations we have obtained new classes of solutions and studied their connection with anomalous diffusion processes. We start by considering a nonlinear diffusion equation with a spatially and time-dependent diffusion coefficient. The solutions obtained for this case generalize the usual one and can be expressed in terms of the q-exponential and q-logarithm functions present in the generalized thermostatistics context (Tsallis formalism). Afterwards, a nonlinear external force is considered. For this case the solutions can also be expressed in terms of the q-exponential and q-logarithm functions. However, by a suitable choice of the nonlinear external force, we may have an exponential behavior, suggesting a connection with standard thermostatistics. This fact reveals that these solutions may present an anomalous relaxation process and then reach an equilibrium state of the Boltzmann-Gibbs kind. Next, we investigate a non-Markovian linear diffusion equation that presents a kernel leading to an anomalous diffusive process. In particular, our first choice leads to both the usual behavior and the anomalous behavior obtained through a fractional-derivative equation. The results obtained within this context correspond to a change in the waiting-time distribution for jumps in the random walk formalism. These modifications directly influence the solutions, which turn out to be expressed in terms of the Mittag-Leffler or Fox H functions. In this way, the second moment associated with these distributions leads to an anomalous spreading of the distribution, in contrast to the usual situation where one finds a linear increase with time.
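The q-exponential function that appears in these solutions reduces to the ordinary exponential as q approaches 1, which can be checked numerically; this is a small sketch and the test values are arbitrary:

```python
import math

def q_exp(x, q):
    """Tsallis q-exponential: exp_q(x) = [1 + (1-q)x]^(1/(1-q)) when the
    bracket is positive, 0 otherwise; recovers exp(x) in the limit q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

# As q -> 1, exp_q(-0.5) converges to exp(-0.5).
vals = [q_exp(-0.5, 1.0 + eps) for eps in (0.1, 0.01, 0.001)]
```

In the Tsallis formalism, q measures the departure from standard (Boltzmann-Gibbs) behavior; the solutions discussed above interpolate between the two regimes through this parameter.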

Relevance: 10.00%

Publisher:

Abstract:

In this work, we study the survival cure rate model proposed by Yakovlev et al. (1993), based on a structure of competing risks concurring to cause the event of interest, and the approach proposed by Chen et al. (1999), where covariates are introduced to model the risk amount. We focus on covariates subject to measurement error, considering the use of the corrected score method in order to obtain consistent estimators. A simulation study is carried out to evaluate the behavior of the estimators obtained by this method for finite samples. The simulation aims to identify the impact not only on the regression coefficients of the covariates measured with error (Mizoi et al. 2007) but also on the coefficients of covariates measured without error. We also verify the adequacy of the piecewise exponential distribution for the cure rate model with measurement error. Finally, applications of the model involving real data are presented.

Relevance: 10.00%

Publisher:

Abstract:

In this work we study the survival cure rate model proposed by Yakovlev (1993), considered in a competing risks setting. Covariates are introduced for modeling the cure rate, and we allow some covariates to have missing values. We consider only the cases in which the missing covariates are categorical, and we implement the EM algorithm via the method of weights for maximum likelihood estimation. We present a Monte Carlo simulation experiment to compare the properties of the estimators based on this method with those of the estimators under the complete-case scenario. We also evaluate, in this experiment, the impact on the parameter estimates when we increase the proportion of immune and censored individuals among the non-immune ones. We demonstrate the proposed methodology with a real data set involving the time until graduation for the undergraduate Statistics course of the Universidade Federal do Rio Grande do Norte.