103 results for "Amostragem intervalar" (interval sampling)
Abstract:
In this work we use interval mathematics to establish interval counterparts for the main tools of digital signal processing. More specifically, the approach developed here covers signals, systems, sampling, quantization, coding and Fourier transforms. A detailed study of interval arithmetics that handle complex numbers is provided, namely: rectangular complex interval arithmetic, circular complex arithmetic, and interval arithmetic for polar sectors. This leads us to investigate properties that are relevant for the development of a theory of interval digital signal processing. It is shown that the sets IR and R(C), endowed with any correct arithmetic, are not algebraic fields, meaning that those sets do not behave like the real and complex numbers. An alternative notion of interval complex width is also provided, and the Kulisch-Miranker order is used to write complex numbers in interval form, enabling operations on endpoints. The use of interval signals and systems is made possible by the representation of complex values in floating-point systems. That is, if a number x ∈ R is not representable in a floating-point system F, it is mapped to an interval [x̲; x̄] such that x̲ is the largest number in F smaller than x and x̄ is the smallest number in F greater than x. This interval representation is the starting point for the definitions of interval signals and systems taking real or complex values, and it extends notions such as causality, stability, time invariance, homogeneity, additivity and linearity to interval systems. The process of quantization is extended to its interval counterpart, and interval versions of quantization levels, quantization error and encoded signal are provided. It is shown that the interval quantization levels represent complex quantization levels and that the classical quantization error ranges over the interval quantization error.
An estimate of the interval quantization error and an interval version of the Z-transform (and hence of the Fourier transform) are provided. Finally, the results of a Matlab implementation are given.
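The floating-point-to-interval mapping described in this abstract can be sketched as follows. This is a minimal illustration, assuming IEEE double precision and Python's round-to-nearest conversion; `enclosing_interval` is an illustrative name, not taken from the thesis.

```python
from fractions import Fraction
import math

def enclosing_interval(x):
    """Tightest double-precision interval [lo, hi] containing the exact rational x."""
    f = float(x)                 # round-to-nearest approximation of x
    if Fraction(f) == x:         # x is exactly representable: degenerate interval
        return (f, f)
    if Fraction(f) < x:          # f was rounded down: f is the lower endpoint
        return (f, math.nextafter(f, math.inf))
    return (math.nextafter(f, -math.inf), f)

lo, hi = enclosing_interval(Fraction(1, 10))  # 1/10 is not a binary float
assert lo < Fraction(1, 10) < hi              # lo and hi are adjacent floats
```

The two endpoints are consecutive representable numbers, so the interval is the tightest machine enclosure of the unrepresentable real.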
Abstract:
A decentered critique for common sense: review of MUNIZ, Euzébia Maria de Pontes Targino; DANTAS, Juliana Bulhões Alberto; ALBANO, Sebastião Guilherme (eds.), Amostragem da reflexão acerca da comunicação contemporânea realizada na Universidade Federal do Rio Grande do Norte. Natal, RN: EDUFRN, 2012.
Abstract:
This work presents an interval approach for dealing with images that contain uncertainty, as well as for treating this uncertainty through morphological operations. Two interval models are presented. For the first, an algebraic space with three values is introduced, built on Łukasiewicz's three-valued logic. With this algebraic structure, the theory of interval binary images, which extends the classical binary model by including uncertainty information, is introduced. It can represent binary images whose pixels carry uncertainty originated, for example, during image acquisition. The lattice structure of these images allows the definition of morphological operators in which uncertainty is treated locally. The second model extends the classical model of gray-level images: the functions representing these images map into a finite set of interval values. This algebraic structure belongs to the class of complete lattices, which also allows the definition of the elementary operators of mathematical morphology, dilation and erosion, for such images. Thus, an interval theory applied to mathematical morphology is established to deal with problems of uncertainty in images.
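A hedged sketch of what interval-valued morphology can look like: each pixel is a pair (lo, hi), and flat dilation/erosion take the endpoint-wise supremum/infimum over a 3x3 neighborhood. The endpoint-wise order used here is one natural lattice order on intervals; the thesis's exact operators may differ.

```python
def neighbors(img, r, c):
    """Pixels in the 3x3 neighborhood of (r, c), clipped at the border."""
    h, w = len(img), len(img[0])
    return [img[i][j]
            for i in range(max(0, r - 1), min(h, r + 2))
            for j in range(max(0, c - 1), min(w, c + 2))]

def dilate(img):
    """Interval dilation: endpoint-wise supremum over the neighborhood."""
    return [[(max(p[0] for p in neighbors(img, r, c)),
              max(p[1] for p in neighbors(img, r, c)))
             for c in range(len(img[0]))] for r in range(len(img))]

def erode(img):
    """Interval erosion: endpoint-wise infimum over the neighborhood."""
    return [[(min(p[0] for p in neighbors(img, r, c)),
              min(p[1] for p in neighbors(img, r, c)))
             for c in range(len(img[0]))] for r in range(len(img))]
```

Because max and min act on each endpoint separately, the uncertainty (interval width) at each pixel is propagated locally, as the abstract describes.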
Abstract:
The idea of allowing imprecision in probabilities is old, beginning with the work of George Boole, who in 1854 sought to reconcile classical logic, which allows the modeling of complete ignorance, with probabilities. In 1921, John Maynard Keynes in his book made explicit use of intervals to represent imprecision in probabilities. But only with the work of Walley in 1991 were principles established that should be respected by a probability theory that deals with imprecision. With the emergence of the theory of fuzzy sets by Lotfi Zadeh in 1965, another way of dealing with uncertainty and imprecision of concepts appeared. Soon, several ways of bringing Zadeh's ideas into probability were proposed, to deal with imprecision either in the events associated with the probabilities or in the probability values themselves. In particular, from 2003 James Buckley began developing a probability theory in which the probability values are fuzzy numbers. This fuzzy probability follows principles analogous to Walley's imprecise probabilities. On the other hand, the use of real numbers between 0 and 1 as truth degrees, as originally proposed by Zadeh, has the drawback of using very precise values to deal with uncertainty (how can one distinguish an element that satisfies a property to degree 0.423 from one that satisfies it to degree 0.424?). This motivated the development of several extensions of fuzzy set theory that include some kind of imprecision. This work considers the extension proposed by Krassimir Atanassov in 1983, which adds an extra degree of uncertainty to model the hesitation at the moment of assigning the membership degree: one value indicates the degree to which the object belongs to the set, while the other indicates the degree to which it does not belong. In Zadeh's fuzzy set theory, this non-membership degree is, by default, the complement of the membership degree.
Thus, in this approach the non-membership degree is somewhat independent of the membership degree, and the difference between the non-membership degree and the complement of the membership degree reveals the hesitation at the moment of assigning a membership degree. This extension is today called Atanassov's intuitionistic fuzzy set theory. It is worth noting that the term "intuitionistic" here has no relation to the term as known in the context of intuitionistic logic. In this work two proposals for interval probability are developed: the restricted interval probability and the unrestricted interval probability. Two notions of fuzzy probability are also introduced: the constrained fuzzy probability and the unconstrained fuzzy probability. Finally, two notions of intuitionistic fuzzy probability are introduced: the restricted intuitionistic fuzzy probability and the unrestricted intuitionistic fuzzy probability.
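The basic object behind interval probability can be sketched as follows: each outcome of a finite partition receives a probability interval [lo, hi], an assignment is coherent when some precise distribution fits inside all the intervals, and the complement of an event flips and reflects its interval. The names and the simple coherence test below are illustrative, not the thesis's definitions of restricted/unrestricted interval probability.

```python
def is_coherent(intervals):
    """True when some exact distribution lies within the given intervals.

    For a finite partition this reduces to: every interval inside [0, 1]
    and sum of lower bounds <= 1 <= sum of upper bounds.
    """
    if any(not (0.0 <= lo <= hi <= 1.0) for lo, hi in intervals):
        return False
    return sum(lo for lo, _ in intervals) <= 1.0 <= sum(hi for _, hi in intervals)

def complement(interval):
    """Interval probability of the complementary event: [1 - hi, 1 - lo]."""
    lo, hi = interval
    return (1.0 - hi, 1.0 - lo)
```

Note how the complement preserves the interval width: the imprecision about an event and about its negation is the same, unlike in the precise (degenerate-interval) case where both collapse to points.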
Abstract:
Support Vector Machines (SVM) have attracted increasing attention in the machine learning area, particularly for classification and pattern recognition. However, in some cases it is not easy to determine accurately the class to which a given pattern belongs. This thesis involves the construction of an interval pattern classifier using SVM in association with interval theory, in order to model the separation of a pattern set into distinct classes with precision, aiming at an optimized separation capable of handling the imprecision contained in the initial data and generated during computational processing. The SVM is a linear machine. To allow it to solve real-world problems (usually nonlinear), it is necessary to transform the pattern set, known as the input set, from a nonlinear to a linear problem. Kernel machines are responsible for this mapping. To create the interval extension of the SVM, for both linear and nonlinear problems, it was necessary to define interval kernels and to extend Mercer's theorem (which characterizes kernel functions) to interval functions.
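One standard way to lift a kernel to interval-valued patterns is to evaluate it with interval arithmetic, as sketched below for the linear kernel: interval multiplication takes the min and max of the four endpoint products, and the interval dot product sums the per-feature intervals. This is a generic construction, not necessarily the interval Mercer kernels defined in the thesis.

```python
def imul(a, b):
    """Interval multiplication: [min of the endpoint products, max of them]."""
    products = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(products), max(products))

def iadd(a, b):
    """Interval addition: endpoint-wise sum."""
    return (a[0] + b[0], a[1] + b[1])

def interval_linear_kernel(x, y):
    """K(x, y) = <x, y> computed with interval arithmetic.

    x and y are lists of feature intervals (lo, hi); the result encloses
    every dot product of precise vectors drawn from those intervals.
    """
    acc = (0.0, 0.0)
    for xi, yi in zip(x, y):
        acc = iadd(acc, imul(xi, yi))
    return acc
```

A classifier built on such a kernel returns an interval score, so a pattern whose score interval straddles the decision boundary can be flagged as genuinely ambiguous instead of being forced into a class.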
Abstract:
The aim of this study is to quantify the presence of major and minor elements in the sediments of the Potengi estuary. Four georeferenced sampling points were used, at which sediment samples were collected in the river channel and on the right and left banks. In addition, dissolved oxygen, salinity and water conductivity were measured in situ at the time of sample collection. The percentage of organic matter, determined by gravimetry, and the granulometric analysis of the sediment samples were carried out in the laboratory. To quantify the major and minor elements, a preliminary sample-digestion test was conducted with the NIST 1646a estuarine sediment standard to choose the best methodology. The sediment samples were microwave-digested with nitric acid and hydrochloric acid, according to the methodology proposed by US EPA 3051A. Quantitative analyses of the elements Al, Fe, Cd, Cr, Cu, Mn, Ni, Pb and Zn were conducted by inductively coupled plasma optical emission spectrometry (ICP-OES). The results showed that the partial concentrations of the analyzed elements are below average worldwide shale levels, the standard described by Turekian and Wedepohl (1961).
Abstract:
Interval mathematics is a mathematical theory that originated in the 1960s to address questions of accuracy and efficiency arising in the practice of scientific computing and in the solution of numerical problems. Classical approaches to computability theory deal with discrete problems (for example, over the natural numbers, the integers, strings over a finite alphabet, graphs, etc.). However, fields of pure and applied mathematics deal with problems involving real and complex numbers; this happens, for example, in numerical analysis, dynamical systems, computational geometry and optimization theory. Thus, a computational approach to continuous problems is desirable, or even necessary, to deal formally with analog computations and with scientific computing in general. The literature contains different approaches to computability over the real numbers; an important difference between them lies in how the real number is represented. There are basically two lines of study of computability over the continuum. In the first, an approximation of the output with arbitrary precision is computed from a reasonable approximation of the input [Bra95]. The other line of research on real computability was developed by Blum, Shub and Smale [BSS89]. In this approach, the so-called BSS machines, a real number is viewed as a finished entity, and the computable functions are generated from a class of basic functions (in a manner similar to the partial recursive functions). In this dissertation we study the BSS model, used to characterize a theory of computability over the real numbers, and extend it to model computability on the space of real intervals.
Thus, we present here an approach to interval computability that is epistemologically different from the one studied by Bedregal and Acióly [Bed96, BA97a, BA97b], in which a real interval is seen as the limit of rational intervals and the computability of a real interval function depends on the computability of a function over the rational intervals.
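The contrasted Bedregal-Acióly view (a real interval as the limit of rational intervals) can be illustrated with a small sketch: here sqrt(2) is enclosed by nested rational intervals produced by exact bisection. The BSS view studied in the dissertation would instead treat the endpoints as finished entities manipulated by exact basic operations; this example only illustrates the approximation side of the contrast.

```python
from fractions import Fraction

def sqrt2_interval(n):
    """n bisection steps of a rational enclosure of sqrt(2)."""
    lo, hi = Fraction(1), Fraction(2)      # 1^2 <= 2 <= 2^2
    for _ in range(n):
        mid = (lo + hi) / 2
        if mid * mid <= 2:                 # exact rational comparison
            lo = mid
        else:
            hi = mid
    return lo, hi
```

Each step halves the width exactly, so the sequence of rational intervals converges to the (irrational) point it represents, which is precisely the limit construction the abstract refers to.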
Abstract:
This work aims to develop modules that increase the computational power of the Java-XSC library, XSC being an acronym for "Language Extensions for Scientific Computation". This library is actually an extension of the Java programming language that provides standard functions and elementary mathematical routines useful for interval computation. In this study two modules were added to the library, namely the complex number module and the complex interval module, which, together with the original modules, are designed to allow numerical applications, for example in the engineering field, to run on devices executing Java programs.
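The kind of datatype a complex interval module exposes can be sketched as rectangular complex interval arithmetic: a complex interval is a pair (re, im) of real intervals, addition is endpoint-wise, and multiplication combines real-interval products following (ar + i·ai)(br + i·bi). This sketch ignores the outward rounding a real library such as Java-XSC must apply, and the function names are illustrative.

```python
def _iadd(x, y):
    return (x[0] + y[0], x[1] + y[1])

def _isub(x, y):
    return (x[0] - y[1], x[1] - y[0])

def _imul(x, y):
    p = [x[0] * y[0], x[0] * y[1], x[1] * y[0], x[1] * y[1]]
    return (min(p), max(p))

def add(a, b):
    """Sum of two rectangular complex intervals, component-wise."""
    (ar, ai), (br, bi) = a, b
    return (_iadd(ar, br), _iadd(ai, bi))

def mul(a, b):
    """(ar + i*ai)(br + i*bi) = (ar*br - ai*bi) + i*(ar*bi + ai*br)."""
    (ar, ai), (br, bi) = a, b
    return (_isub(_imul(ar, br), _imul(ai, bi)),
            _iadd(_imul(ar, bi), _imul(ai, br)))
```

On degenerate intervals this reduces to ordinary complex arithmetic, which is a convenient sanity check for such a module.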
Abstract:
Given the applications of the interval datatype in several areas, it is important to construct a reusable interval type, i.e., an interval constructor that can be applied to any datatype, yielding intervals of that datatype. Since an interval is, in a sense, a set of elements bounded by two endpoints, left and right, with a notion of order, it is reasonable that the interval constructor encompass datatypes with a partial order. On the other hand, what we want is to work with intervals of any datatype the way we work with the datatype itself; it is therefore important to guarantee the properties of the datatype when mapping it to intervals of that datatype. Thus, the interval constructor yields a theory of parametrized interval types, i.e., intervals with generic parameters (for example rational, real, complex). Sometimes applying the interval constructor to an algebra does not preserve its properties; for example, intervals of reals, which satisfy the field properties over the reals, do not satisfy the distributivity property. To overcome this problem, Santiago introduced the theory of local equality, which weakens the notion of strong equality and thereby allows some properties that would otherwise be discarded to be kept locally. The generalization of interval arithmetic aims to apply the interval constructor to ordered algebras weakened by local equality, with the purpose of preserving their properties. Since intervals are important in applications with continuous data, it is worthwhile to specify this theory using a specification language that supports system development with intervals in a disciplined, trustworthy and safe way. Currently, algebraic specification languages, based on mathematical models, are often used for this purpose. We chose CASL (Common Algebraic Specification Language) among other languages because CASL has several characteristics well suited to parametrized interval types, such as support for partiality and parametrization.
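The distributivity failure mentioned above is easy to demonstrate concretely. The sketch below builds intervals over the exact rationals (one instance of a parametrized interval type) and checks that interval multiplication only sub-distributes over addition: x*(y+z) is contained in x*y + x*z but need not equal it. The helper names are illustrative.

```python
from fractions import Fraction

def interval(lo, hi):
    assert lo <= hi
    return (lo, hi)

def iadd(a, b):
    return (a[0] + b[0], a[1] + b[1])

def imul(a, b):
    p = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(p), max(p))

def subset(a, b):
    return b[0] <= a[0] and a[1] <= b[1]

# x*(y+z) is contained in x*y + x*z, but may be strictly smaller:
x = interval(Fraction(-1), Fraction(1))
y = interval(Fraction(1), Fraction(2))
z = interval(Fraction(-2), Fraction(-1))
lhs = imul(x, iadd(y, z))            # (-1, 1)
rhs = iadd(imul(x, y), imul(x, z))   # (-4, 4)
assert subset(lhs, rhs) and lhs != rhs
```

The widening on the right-hand side comes from x being used twice as if it were two independent intervals; this dependency loss is exactly the kind of property breakage that local equality is introduced to manage.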
Abstract:
This paper proposes a new control chart to monitor a process mean, employing a combined npx-X control chart. Basically, the procedure consists of splitting the sample of size n into two sub-samples of sizes n1 and n2 determined by an optimization search. Sampling occurs in two stages. In the first stage, the units of sub-sample n1 are evaluated by attributes and plotted on the npx control chart. If this chart signals, the units of the second sub-sample are measured and the monitored statistic is plotted on the X control chart (second stage). If both control charts signal, the process is stopped for adjustment. The possibility of not inspecting all n items may reduce not only the cost but also the time spent examining the sampled items. The performance of the current proposal is compared with that of the individual X and npx control charts. In this study the proposed procedure presents many competitive options relative to the X control chart for a sample size n and a shift from the target mean. The average time to signal (ATS) of the current proposal, lower than the values calculated for an individual X control chart, points out that the combined control chart is an efficient tool for monitoring the process mean.
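The two-stage decision rule described above can be sketched as follows. Stage 1 classifies the n1 units by attributes against a discriminating limit and counts nonconforming ones (the npx statistic); only if that count exceeds its control limit is stage 2 run, measuring the n2 units and checking the sample mean against the X-chart limits. All limits here are illustrative parameters, not the optimized values from the paper.

```python
def npx_x_signal(stage1, stage2, w, np_limit, x_lcl, x_ucl):
    """Return True when both charts signal, i.e. the process is stopped.

    stage1: measurements of the n1 units, classified by attributes
            against the discriminating limit w;
    stage2: measurements of the n2 units, used only if stage 1 signals.
    """
    npx = sum(1 for v in stage1 if v > w)      # nonconforming count
    if npx <= np_limit:                        # stage 1 does not signal:
        return False                           # stage 2 is never inspected
    xbar = sum(stage2) / len(stage2)           # stage 2: X chart on the mean
    return not (x_lcl <= xbar <= x_ucl)
```

The early return is where the cost saving comes from: whenever the attribute count stays within its limit, the n2 variable measurements are skipped entirely.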
Abstract:
To compare the performance of nursing professionals and nursing students in carrying out the wound-dressing technique. Methodology: a descriptive study with a quantitative approach. The sample was chosen by convenience sampling; 14 nursing professionals participated in the research, observed from March to June 2009, along with 24 nursing students regularly enrolled in the last term of the Undergraduate Nursing Course of the Universidade Federal da Paraíba, observed between March and May 2010. Data collection was carried out through non-participant observation and the completion of a semi-structured form, after approval by the Research Ethics Committee of the Hospital Universitário Lauro Wanderley, under Protocol no. 011/09. Results: in most of the aspects analyzed, the students outperformed the professionals, mainly with regard to hand washing, patient guidance and the use of single strokes for wound cleaning. Conclusion: the professionals should improve their knowledge, seeking updates in this area, and the students should be supervised while performing the procedure.
Abstract:
This research falls within the field of organizational studies, focusing on organizational purchasing behavior and, specifically, on interorganizational trust in purchasing. This topic is current and relevant because it addresses the development of good buyer-supplier relations, which increases the exchange of information, lengthens relationships, reduces hierarchical controls and improves performance. Furthermore, although there is a vast literature on trust, scientific work dealing specifically with interorganizational trust still needs further research to synthesize and validate the variables that generate this phenomenon. In this sense, this investigation seeks to explain the antecedents of interorganizational trust through the relationships among the variables operational performance, organizational characteristics, shared values and interpersonal relationships in purchases by manufacturing industries, in order to develop a more robust and consensual literature that encompasses the current sociological and economic streams, considering the effect of interpersonal relationships on this phenomenon. This proposal constitutes a new view of the antecedents of interorganizational trust, drawn from the quantitative models of Morgan and Hunt (1994), Doney and Cannon (1997), Zhao and Cavusgil (2006) and Nyaga, Whipple and Lynch (2011), as well as from the qualitative analysis of Tacconi et al. (2011). With regard to methodological aspects, the study takes the form of a descriptive, causal survey of a theoretical and empirical nature. As to its nature, the investigation, explanatory in character, developed a quantitative approach using exploratory factor analysis and structural equation modeling (SEM), with the IBM SPSS Amos 18.0 software, the maximum likelihood method, and the bootstrapping technique.
The unit of analysis was the buyer-supplier relationship, in which the object under investigation was the supplier organization from the point of view of the purchasing company. 237 valid questionnaires were collected among key informants, using simple random sampling, in manufacturing industries (SIC 10-33) located in the city of Natal and its surrounding region. The first results of the descriptive analysis demonstrate the phenomenon of interorganizational trust, in which purchasing firms believe in, and feel secure about, the supplier. This was shown at high levels of intensity, predominantly among the vendors that supply the company with materials used directly in the production process. The exploratory and confirmatory factor analyses, performed on each variable separately, generated a more consistent set of observable and unobservable variables, giving rise to a model that needed to be re-specified. The re-specified model, consisting of positive trajectories, showed a good fit, satisfactory composite reliability and extracted variance, and demonstrated convergent and discriminant validity, with significant factor loadings and strong explanatory power. Given the findings that support the re-specified model, suggesting a high probability that it is better suited to the study population, the results support the explanation that interorganizational trust depends directly on interpersonal relationships, shared values and operational performance, and indirectly on personal relationships, social networks, organizational characteristics, and the physical and relational aspects of performance. It is concluded that this trust can be explained by a set of interactions among these three determinants, with the focus on interpersonal relationships, which had the largest path coefficient in the study.
Abstract:
This study presents the self-efficacy characteristics of Administration students who work and who do not work. The study was conducted through descriptive field research, addressed quantitatively using statistical procedures. The population studied comprised 394 students distributed across three higher education institutions in the metropolitan region of Belém, in the State of Pará. The sampling was non-probabilistic, by accessibility, with a sample of 254 subjects. The data collection instrument was a questionnaire divided into three sections: the first concerned sociodemographic data, the second was built to identify the respondent's work situation, and the third contained items from the General Perceived Self-Efficacy Scale proposed by Schwarzer and Jerusalem (1999). Sociodemographic data were processed using descriptive statistics, which made it possible to characterize the subjects of the sample. To identify the work situation, frequency and percentage analysis was used, allowing the respondents who worked and those who did not to be classified in percentage terms; the data related to the self-efficacy scale were processed quantitatively by multivariate statistics, using the Statistical Package for the Social Sciences for Windows (SPSS), version 17, through exploratory factor analysis. This procedure made it possible to characterize the students who worked and those who did not. The results were discussed on the basis of Social Cognitive Theory, through Albert Bandura's (1977) construct of self-efficacy. The results showed a young sample, composed mostly of single women with work experience, and indicated that the self-efficacy characteristics of students who work and of students who do not work are different.
The self-efficacy beliefs of students who do not work are based on psychological expectations, whereas students who work demonstrated that their efficacy beliefs are sustained by previous experience. Students who do not work proved confident in their ability to achieve successful performance in their activities, believing it easy to achieve their goals and to face difficult situations at work simply by investing the necessary effort and trusting their abilities. Those with work experience proved confident in their ability to conduct courses of action, while knowing that it is not easy to achieve their goals, and in unexpected situations showed the ability to solve difficult problems.
Abstract:
This study aimed to measure the maturity of project management in the state departments of Rio Grande do Norte through the perception of their managers. It argues that project management has been highlighted as a critical success factor for any organization, because projects are directly related to the set of activities that result in organizational innovation, such as products, services and processes, and because the improvement of project management is directly aligned with the main pillars of the New Public Management. Methodologically, this is quantitative research of a descriptive nature in which 161 forms were completed by coordinators and sub-coordinators of the state departments of Rio Grande do Norte, yielding a sampling error of less than 6% at 95% confidence according to finite sampling procedures. Tabulation and analysis were carried out using the Statistical Package for the Social Sciences (SPSS 18.0), with techniques such as mean, standard deviation, frequency distributions, cluster analysis and factor analysis. The results indicate that the levels of project management maturity in the state departments of Rio Grande do Norte are below the national average and that behavioral skills are the main obstacle to improving management in these departments. It was possible to detect the existence of two groups with different perceptions of project management, indicating, according to the managers, that there are islands of excellence in project management in some sectors of the state departments. It was also observed that eight factors affect maturity in project management: Planning and Control, Development of Management Skills, Project Management Environment, Acceptance of the Subject of Project Management, Stimulus to Performance, Project Evaluation and Learning, Project Management Office, and Visibility of Project Managers.
It concludes that project management in the state departments of Rio Grande do Norte does not show satisfactory levels of maturity, affecting the efficiency and effectiveness of the state apparatus, which shows that some of the assumptions guiding the New Public Management are not reaching the levels of excellence claimed by this management model.