40 results for "limiar auditivo" (hearing threshold)


Relevance: 10.00%

Abstract:

Rising healthcare costs are a central concern for supplementary health companies in Brazil. In 2011, these expenses consumed more than 80% of monthly health-insurance revenues in Brazil. Once administrative costs are added, the companies operating in this market work, on average, at the threshold between profit and loss. This paper presents the results of an investigation of the welfare costs of a health-plan company in Brazil, based on the KDD process and exploratory Data Mining. A variety of results is presented, such as data summarization, which provides compact descriptions of the data and reveals common features and intrinsic patterns. Among the key findings, it was observed that a small portion of the population is responsible for most of the demand on the resources devoted to health care.
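
The summarization step described above can be sketched with a toy concentration analysis. Everything here is illustrative: the log-normal cost distribution and the 10% cut are our assumptions, not figures from the study.

```python
import numpy as np

# Illustrative only: synthetic annual costs per member (the log-normal
# shape is an assumption, not the study's actual claims data).
rng = np.random.default_rng(42)
costs = rng.lognormal(mean=6.0, sigma=1.5, size=10_000)

# Sort members from most to least expensive and accumulate their share.
sorted_costs = np.sort(costs)[::-1]
cum_share = np.cumsum(sorted_costs) / sorted_costs.sum()

# Share of total spending attributable to the costliest 10% of members.
top10_share = cum_share[int(0.10 * len(costs)) - 1]
print(f"top 10% of members account for {top10_share:.0%} of total cost")
```

On heavy-tailed cost data such a summary typically shows a small fraction of members dominating total spending, which is the pattern the abstract reports.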

Relevance: 10.00%

Abstract:

In this work we present a new clustering method that groups the points of a data set into classes. The method is based on an algorithm that links auxiliary clusters obtained with traditional vector quantization techniques. Several approaches developed during the work are described, based on measures of distance or dissimilarity (divergence) between the auxiliary clusters. The method uses only two pieces of a priori information: the number of auxiliary clusters Na and a threshold distance dt used to decide whether or not to link the auxiliary clusters. The number of classes can be found automatically by the method, based on the chosen threshold distance dt, or it can be given as additional information to help in the choice of the correct threshold. Some analyses are carried out and the results are compared with traditional clustering methods. Different dissimilarity metrics are analyzed, and a new one is proposed based on the concept of negentropy. Besides grouping the points of a set into classes, a method is proposed to model the classes statistically, aiming to obtain an expression for the probability that a point belongs to one of the classes. Experiments with several values of Na and dt are run on test sets, and the results are analyzed in order to study the robustness of the method and to derive heuristics for the choice of the correct threshold. Throughout this work, aspects of information theory applied to the calculation of the divergences are explored, specifically the different measures of information and divergence based on the Rényi entropy. The results obtained with the different metrics are compared and discussed. The work also has an appendix presenting real applications of the proposed method.
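
A minimal sketch of the linkage idea, assuming plain Euclidean distance between centroids in place of the divergence-based dissimilarities the work actually studies (Na and dt are the two a priori parameters named above; the two-blob data set is our own):

```python
import numpy as np
from scipy.cluster.vq import kmeans2

# Sketch only: auxiliary clusters via vector quantization (k-means),
# then single-linkage merging of centroids closer than dt.
rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0, 0.3, (100, 2)),
                  rng.normal(3, 0.3, (100, 2))])

Na, dt = 8, 1.0                        # auxiliary clusters and link threshold
centroids, labels = kmeans2(data, Na, minit='points', seed=0)

# Union-find: link auxiliary clusters whose centroids are closer than dt.
parent = list(range(Na))
def find(i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i

for i in range(Na):
    for j in range(i + 1, Na):
        if np.linalg.norm(centroids[i] - centroids[j]) < dt:
            parent[find(i)] = find(j)

# Each point's final class is the root of its auxiliary cluster.
classes = np.array([find(l) for l in labels])
print("number of classes found:", len(set(classes)))
```

The number of final classes emerges from dt rather than being fixed in advance, which is the property the abstract highlights.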

Relevance: 10.00%

Abstract:

Image compression consists in representing an image with a small amount of data without loss of visual quality. Data compression is important when large images are used, for example satellite images. Full-color digital images typically use 24 bits to specify the color of each pixel, with 8 bits for each of the primary components: red, green and blue (RGB). Compressing an image with three or more bands (multispectral) is fundamental to reduce transmission, processing and storage time. Compressed image data matter for the many applications that need images: medical imaging, satellite imaging, sensing, etc. In this work a new method for compressing color images is proposed, based on a measure of the information in each band. The technique is called Self-Adaptive Compression (SAC), and each band of the image is compressed with a different threshold in order to preserve information and obtain a better result. SAC applies strong compression to highly redundant bands, that is, bands with less information, and soft compression to bands with a larger amount of information. Two image transforms are used in the technique: the Discrete Cosine Transform (DCT) and Principal Component Analysis (PCA). The first step converts the data into uncorrelated bands with PCA; the DCT is then applied to each band. Loss is introduced when a threshold discards coefficients. The threshold is computed from two elements: the PCA result and a user parameter that defines the compression rate. The system produces three different thresholds, one for each band of the image, proportional to its amount of information. For image reconstruction, the inverse DCT and PCA are applied. SAC was compared with the JPEG (Joint Photographic Experts Group) standard and with YIQ compression, and better results were obtained in terms of MSE (Mean Squared Error). Tests showed that SAC yields better quality under strong compression, with two advantages: (a) being adaptive, it is sensitive to the image type, that is, it presents good results for diverse kinds of images (synthetic, landscapes, people, etc.); and (b) it needs only one user parameter, so little human intervention is required.
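
The pipeline above (PCA decorrelation, per-band DCT, information-proportional thresholds, inverse transforms) can be sketched as follows. The exact threshold rule is not given in the abstract, so the scaling by PCA eigenvalues and the user parameter q are our assumptions:

```python
import numpy as np
from scipy.fft import dctn, idctn

# Minimal sketch of per-band adaptive thresholding, not the actual SAC rule.
rng = np.random.default_rng(1)
img = rng.random((32, 32, 3))              # stand-in RGB image in [0, 1]

# PCA across the three colour bands.
flat = img.reshape(-1, 3)
mean = flat.mean(axis=0)
cov = np.cov(flat - mean, rowvar=False)
eigval, eigvec = np.linalg.eigh(cov)       # eigenvalues in ascending order
pcs = (flat - mean) @ eigvec               # decorrelated bands

q = 0.5                                    # hypothetical user parameter
rec = np.zeros_like(pcs)
for b in range(3):
    band = pcs[:, b].reshape(32, 32)
    coef = dctn(band, norm='ortho')
    # Bands with less information (small eigenvalue) get a harsher threshold.
    thr = q * np.abs(coef).max() * (1 - eigval[b] / eigval.sum())
    coef[np.abs(coef) < thr] = 0.0
    rec[:, b] = idctn(coef, norm='ortho').ravel()

# Inverse PCA rebuilds the image; MSE measures the loss.
restored = (rec @ eigvec.T + mean).reshape(img.shape)
mse = np.mean((img - restored) ** 2)
print(f"reconstruction MSE: {mse:.4f}")
```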

Relevance: 10.00%

Abstract:

Among the main challenges in industrial beer production is supplying the market at the lowest cost and with high quality, in order to meet the expectations of customers and consumers. The fermentation stage represents approximately 70% of the total time needed to produce beer, demanding strict process controls so that it does not become a bottleneck in production. This stage is responsible for the formation of a series of by-products that compose the aroma/bouquet of beer; some of these by-products, if produced in larger quantities, confer unpleasant taste and odor to the final product. Among the by-products formed during fermentation, total vicinal diketones are the main concern, since they are limiting for transferring the product to the subsequent steps, have a low perception threshold for the consumer, and impart undesirable taste and odor. Given the unstable quality of the main raw materials and the process controls during fermentation, developing alternative forms of beer production without affecting total fermentation time and final product quality is a great challenge for breweries. In this work, a prior acidification of the yeast slurry was carried out with food-grade phosphoric acid, reducing the yeast pH from about 5.30 to 2.20 and changing its character from flocculent to powdery during fermentation. A six-fold increase was observed in the number of yeast cells in suspension in the second fermentation stage, compared with fermentations using yeast without prior acidification. By changing two input variables, the temperature curve and cell multiplication, with the goal of minimizing the maximum diketone values detected in the fermenter tank, the peak of diacetyl formation was reduced, contributing to shorter fermentation and total process times.
Several experiments were performed with these process changes in order to verify their influence on the total fermentation time and on the total vicinal diketone concentration at the end of fermentation. The best production result was a total fermentation time of 151 hours with a total vicinal diketone concentration of 0.08 ppm. The mass of yeast in suspension in the second phase of fermentation increased from 2.45 x 10^6 to 16.38 x 10^6 cells/mL, a fact that is key to a greater efficiency in reducing the total vicinal diketones in the medium. This confirms that prior yeast acidification, together with the control of temperature and yeast cell multiplication in the fermentation process, enhances diketone reduction and consequently reduces the total fermentation time while keeping the diketone concentration below the expected value (max. 0.10 ppm).

Relevance: 10.00%

Abstract:

Expanded Bed Adsorption (EBA) is an integrative process that combines concepts of chromatography and fluidization of solids. The many parameters involved and their synergistic effects complicate the optimization of the process. Fortunately, some mathematical tools have been developed to guide the investigation of EBA systems. In this work, the application of experimental design, phenomenological modeling and artificial neural networks (ANN) to the understanding of chitosanase adsorption on the ion-exchange resin Streamline® DEAE was investigated. The strain Paenibacillus ehimensis NRRL B-23118 was used for chitosanase production. EBA experiments were carried out in a column of 2.6 cm inner diameter and 30.0 cm height, coupled to a peristaltic pump; at the bottom of the column there was a 3.0 cm high distributor of glass beads. Residence time distribution (RTD) assays revealed a high degree of mixing; however, the Richardson-Zaki coefficients showed that the column was at the threshold of stability. Isotherm models fitted the adsorption equilibrium data in the presence of lyotropic salts. The results of the experimental design indicated that ionic strength and superficial velocity are important for the recovery and purity of the chitosanases. The molecular masses of the two chitosanases were approximately 23 kDa and 52 kDa, as estimated by SDS-PAGE. The phenomenological modeling aimed to describe the operations in batch and column chromatography; the simulations were performed in Microsoft Visual Studio. The kinetic rate-constant model fitted the kinetic curves efficiently under initial enzyme activities of 0.232, 0.142 and 0.079 UA/mL. The simulated breakthrough curves showed some differences with respect to the experimental data, especially regarding the slope. Sensitivity tests of the model on superficial velocity, axial dispersion and initial concentration agreed with the literature.
The neural network was built in MATLAB with the Neural Network Toolbox. Cross-validation was used to improve the generalization ability. The ANN parameters were tuned to the settings 6-6 (enzyme activity) and 9-6 (total protein), with the tansig transfer function and the Levenberg-Marquardt training algorithm. The neural network simulations, including all the steps of the cycle, showed good agreement with the experimental data, with a correlation coefficient of approximately 0.974. The effects of the input variables on the profiles of the loading, washing and elution stages were consistent with the literature.
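
A network of the shape described above (two small hidden layers with tansig, i.e. tanh, units) can be approximated outside MATLAB. This sketch uses scikit-learn, which has no Levenberg-Marquardt trainer, so 'lbfgs' stands in; the sigmoidal target curve is synthetic, not the thesis data:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Rough analogue of the 6-6 tanh network; everything synthetic.
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 200).reshape(-1, 1)
y = 1.0 / (1.0 + np.exp(-(t.ravel() - 5.0)))      # breakthrough-like curve

net = MLPRegressor(hidden_layer_sizes=(6, 6), activation='tanh',
                   solver='lbfgs', max_iter=2000, random_state=0)
net.fit(t, y)

# Agreement measured, as in the abstract, by a correlation coefficient.
r = np.corrcoef(y, net.predict(t))[0, 1]
print(f"correlation coefficient: {r:.3f}")
```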

Relevance: 10.00%

Abstract:

In this thesis we study some problems related to petroleum reservoirs using methods and concepts of Statistical Physics. The thesis can be divided in two parts. The first one introduces a study of the percolation problem in a random multifractal support, motivated by its potential application in the modelling of oil reservoirs. We developed a heterogeneous and anisotropic grid that follows a random multifractal distribution of its sites; we then determine the percolation threshold for this grid, the fractal dimension of the percolating cluster, and the critical exponents β and ν. In the second part, we propose an alternative systematic approach to modelling and simulating oil reservoirs. We introduce a statistical model based on a stochastic formulation of Darcy's Law; in this model, the distribution of permeabilities is locally equivalent to the basic model of bond percolation.


Relevance: 10.00%

Abstract:

The complex behavior of a wide variety of phenomena of interest to physicists, chemists, and engineers has been quantitatively characterized through the ideas of fractal and multifractal distributions, which correspond in a unique way to the geometrical shape and dynamical properties of the systems under study. In this thesis we present the Space of Fractals and the Hausdorff-Besicovitch, box-counting, and scaling methods for calculating the fractal dimension of a set. We also investigate percolation phenomena in multifractal objects that are built in a simple way. The central object of our analysis is a multifractal object that we call Qmf, in which the multifractality comes directly from the geometric tiling. We identify some differences between percolation in the proposed multifractals and in a regular lattice. There are basically two sources of these differences: the first is related to the coordination number, c, which changes along the multifractal; the second comes from the way the weight of each cell in the multifractal affects the percolation cluster. We use many samples of finite-size lattices and draw the histogram of percolating lattices against the site occupation probability p. Depending on a parameter ρ characterizing the multifractal and on the lattice size L, the histogram can have two peaks. We observe that the occupation probability at the percolation threshold, pc, is lower for the multifractal than for the square lattice. We compute the fractal dimension of the percolating cluster and the critical exponent β. Despite the topological differences, we find that percolation in a multifractal support is in the same universality class as standard percolation. The area and the number of neighbors of the blocks of Qmf show non-trivial behavior, and a general view of the object Qmf reveals an anisotropy. The value of pc is a function of ρ, which is related to this anisotropy.
We investigate the relation between pc and the average number of neighbors of the blocks, as well as the anisotropy of Qmf. In this thesis we likewise study the distribution of shortest paths in percolation systems at the percolation threshold in two dimensions (2D), considering paths from one given point to multiple other points. In oil-recovery terminology, the single given point can be mapped to an injection well (injector) and the multiple other points to production wells (producers). In the standard case of one injection well and one production well separated by Euclidean distance r, the distribution of shortest paths l, P(l|r), shows a power-law behavior with exponent g_l = 2.14 in 2D. Here we analyze the situation of one injector and an array A of producers. Symmetric arrays of producers lead to one peak in the distribution P(l|A), the probability that the shortest path between the injector and any of the producers is l, while asymmetric configurations lead to several peaks. We analyze configurations in which the injector is outside and inside the set of producers. The peak in P(l|A) for symmetric arrays decays faster than in the standard case. For very long paths, all the studied arrays exhibit a power-law behavior with exponent g ≈ g_l.
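
The histogram experiment (fraction of percolating samples versus occupation probability p) can be illustrated on the ordinary square lattice; reproducing the multifractal support Qmf is beyond this sketch:

```python
import numpy as np
from scipy.ndimage import label

# Square-lattice analogue of the spanning experiment described above.
def spans(p, L, rng):
    """True if an occupied cluster connects the top and bottom rows."""
    grid = rng.random((L, L)) < p
    labels, _ = label(grid)               # 4-connected clusters
    top, bottom = set(labels[0]) - {0}, set(labels[-1]) - {0}
    return bool(top & bottom)

rng = np.random.default_rng(7)
L, trials = 32, 200
for p in (0.45, 0.59, 0.75):
    frac = sum(spans(p, L, rng) for _ in range(trials)) / trials
    print(f"p = {p:.2f}: spanning fraction = {frac:.2f}")
```

The spanning fraction rises sharply near the square-lattice site threshold pc ≈ 0.593; the abstract's point is that on the multifractal support this crossover sits at a lower pc.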

Relevance: 10.00%

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior

Relevance: 10.00%

Abstract:

We study magnetic interface roughness in F/AF bilayers. Two kinds of roughness were considered. The first consists of isolated defects that divide the substrate into two regions, each with its own AF sub-lattice. The interface exchange coupling is considered uniform and changes abruptly at the defect line, favoring Néel wall nucleation. Our results show how the threshold thickness for the reorientation of the magnetization in the ferromagnetic film depends on the interface field. Angular profiles show the relaxation of the magnetization, from a Néel wall at the interface to the reoriented state at the surface. An external magnetic field perpendicular to the easy axis of the substrate favors the reoriented state. Depending on the intensity of the external magnetic field parallel to the easy axis of the AF, the magnetization profile at the surface can be parallel or perpendicular to the field direction. The second kind treats periodically distributed defects. The shape of the hysteresis curves, the exchange bias and the coercivity were characterized as functions of the interface field intensity and the roughness pattern. Our results show that dipolar effects decrease the exchange bias and the coercivity.

Relevance: 10.00%

Abstract:

BACKGROUND: Treadmill training with partial body weight support (BWS) has shown many benefits for patients after a stroke, but its effects are not well known when combined with biofeedback. OBJECTIVE: The purpose of this study was to evaluate the immediate effects of visual and auditory biofeedback, combined with treadmill training with BWS, on the walking functions of hemiplegic subjects. METHODS: We conducted a randomized controlled clinical trial with 30 subjects in the chronic stage of stroke, who underwent treadmill training with BWS alone (control), combined with visual biofeedback, given on the treadmill monitor through the symbolic appearance of feet as the subject stepped, or combined with auditory biofeedback, using a metronome set to 115% of the individual's cadence. The subjects were evaluated by kinematics, with data obtained by the Qualisys Motion Analysis System. To assess differences between groups and within each group after training, a 3 x 2 repeated-measures ANOVA was applied. RESULTS: There were no statistical differences between groups in any spatio-temporal or angular gait variable, but within each group there was an increase in walking speed and stride length after training. The visual biofeedback group increased the stance period and reduced the swing period and the symmetry ratio, and the auditory biofeedback group reduced the double-stance period. The range of motion of the knee and ankle and the plantar flexion increased in the visual biofeedback group. CONCLUSION: There are no differences between the immediate effects of gait training on a treadmill with BWS performed with or without visual or auditory biofeedback. However, visual biofeedback can promote changes in a larger number of spatio-temporal and angular gait variables.

Relevance: 10.00%

Abstract:

Phylogeny is one of the main activities of modern taxonomists and a way to reconstruct the history of life through comparative analysis of the sequences stored in genomes, seeking explanations for their origin and evolution. Among the sequences with a high level of conservation are the repair genes, which are important for the conservation and maintenance of genetic stability. Hence, variations in repair genes, such as the genes of nucleotide excision repair (NER), may indicate a possible gene transfer between species. This study aimed to examine the evolutionary history of the components of NER. For this, sequences of UVRA, UVRB, UVRC and XPB were obtained from GenBank by BLAST-p, with 10^-15 as the cutoff for building the database. Phylogenetic studies were carried out using algorithms from PAUP, BAYES and the PHYLIP package. Phylogenetic trees were built with the protein sequences and with 16S ribosomal RNA sequences for comparative analysis, using the parsimony, likelihood and Bayesian methods. The XPB tree shows that archaeal XPB helicases are similar to eukaryotic helicases; from these data, we infer that the eukaryotic nucleotide excision repair system appeared in Archaea. In the UVRA, UVRB and UVRC trees, a monophyletic group was found, formed by three species of the class Epsilonproteobacteria, three species of the class Mollicutes, and archaebacteria of the classes Methanobacteria and Methanococci. This information is supported by a tree obtained with the concatenated UVRA, UVRB and UVRC proteins. Thus, although there are arguments in the literature defending the horizontal transfer of the uvrABC system from bacteria to archaebacteria, the analysis made in this study suggests that a vertical transfer occurred, from archaebacteria, of both sets of NER genes: uvrABC and the XPs.
According to parsimony, this is the most likely scenario, given the occurrence of the monophyletic groups, the time of divergence of the classes, and the number of archaebacterial species with the uvrABC system.

Relevance: 10.00%

Abstract:

This research takes as its object of study the city of Caicó, Rio Grande do Norte State, between the mid-1920s and the beginning of the 1930s. It seeks to understand the projects conceived for the city of Caicó, as well as the challenges posed by the new ideas of modernity then circulating in the contemporary world. It thus constitutes an important historical exercise on the relation between history and space, insofar as it brings to the surface a city seen from diverse angles, whose perspectives can be read in several "fragments of memory", such as newspapers, criminal trials, reports, memoirs, books, etc., which show the tension between the traditional and the modern ways of life. In this context, there were attempts to transform the space, giving it new references inspired by what was happening in the large Brazilian cities and around the world, through the use of technology: electricity, the movies, the press, cars, medicine and so on. At the same time, it was necessary to deal with what persisted, such as the old manners and the droughts with their "flagelados" (drought victims). It is in this difficult phase that the city of Caicó sought legitimacy as the "Capital of the Seridó", on the threshold between refusal and seduction.

Relevance: 10.00%

Abstract:

This work presents a brief discussion of methods to estimate the parameters of the Generalized Pareto distribution (GPD). The following techniques are addressed: the Method of Moments, Maximum Likelihood (MLE), Biased Probability Weighted Moments (PWMB), Unbiased Probability Weighted Moments (PWMU), Minimum Density Power Divergence (MDPD), Median (MED), Pickands (PICKANDS), Maximum Penalized Likelihood (MPLE), Maximum Goodness-of-Fit (MGF) and the Maximum Entropy technique (POME), the focus of this manuscript. By way of illustration, the Generalized Pareto distribution was fitted to a sequence of intraplate earthquakes which occurred in the city of João Câmara, in the northeastern region of Brazil, monitored continuously for two years (1987 and 1988). MLE and POME were found to be the most efficient methods, giving essentially the same mean squared errors. Based on a threshold magnitude of 1.5, the seismic risk for the city was estimated, along with the return levels for earthquakes of magnitude 1.5, 2.0, 2.5, 3.0, and for the most intense earthquake ever registered in the city, which occurred in November 1986 with a magnitude of about 5.2.
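
As a hedged illustration of the peaks-over-threshold fit discussed above, the GPD can be adjusted by maximum likelihood with SciPy. The synthetic exceedances stand in for the João Câmara catalogue, which is not reproduced here; the true parameters below are arbitrary:

```python
import numpy as np
from scipy.stats import genpareto

# Synthetic exceedances over a threshold (illustrative parameters).
rng = np.random.default_rng(3)
excesses = genpareto.rvs(c=0.1, scale=0.4, size=500, random_state=rng)

# MLE fit with the location fixed at 0 (excesses over the threshold),
# as is usual in peaks-over-threshold analysis.
c_hat, loc_hat, scale_hat = genpareto.fit(excesses, floc=0)
print(f"shape = {c_hat:.3f}, scale = {scale_hat:.3f}")

# Return level: the excess exceeded once per N exceedances, on average.
N = 100
return_level = genpareto.ppf(1 - 1 / N, c_hat, loc=0, scale=scale_hat)
print(f"1-in-{N} exceedance level: {return_level:.2f}")
```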

Relevance: 10.00%

Abstract:

In this work we present the principal fractals, their characteristics, properties and classification, comparing them with the elements of Euclidean geometry. We show the importance of fractal geometry in the analysis of several elements of our society, and we emphasize the importance of an appropriate definition of dimension for these objects, because the definition we presently know does not seem satisfactory. As instruments to obtain these dimensions we present the box-counting method, the Hausdorff-Besicovitch method and the scaling method. We also study the percolation process in the square lattice, comparing it with percolation in the multifractal object Qmf, where we observe some differences between the two processes. We analyze the histogram of percolating lattices versus the site occupation probability p, among other numerical simulations. Finally, we show that we can estimate the fractal dimension of the percolation cluster and that percolation in a multifractal support is in the same universality class as standard percolation. We observe that the area of the blocks of Qmf is variable and that pc is a function of ρ, which is related to the anisotropy of Qmf.
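
The box-counting method mentioned above can be demonstrated on a standard self-similar set (the Sierpinski carpet, our example; the thesis applies such methods to Qmf):

```python
import numpy as np

def carpet(n):
    """0/1 image of the Sierpinski carpet after n subdivisions."""
    grid = np.ones((1, 1), dtype=int)
    for _ in range(n):
        grid = np.kron(grid, np.array([[1, 1, 1],
                                       [1, 0, 1],
                                       [1, 1, 1]]))
    return grid

img = carpet(5)                       # 243 x 243 image
sizes = [1, 3, 9, 27, 81]
counts = []
for s in sizes:
    # Count boxes of side s that contain at least one occupied cell.
    blocks = img.reshape(img.shape[0] // s, s, img.shape[1] // s, s)
    counts.append(int((blocks.sum(axis=(1, 3)) > 0).sum()))

# Slope of log N(s) against log(1/s) gives the dimension estimate.
dim = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)[0]
print(f"box-counting dimension: {dim:.3f}")   # theory: log 8 / log 3 = 1.893
```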