44 results for "Variance components"
at the Universidade Federal do Rio Grande do Norte (UFRN)
Abstract:
The objective is to analyze the relationship between risk and the number of stocks in a portfolio when an individual investor picks stocks under a "naive strategy". To this end, we carried out an experiment in which individuals selected stocks so that this relationship could be reproduced. 126 participants were told that the risk of the first choice would be the average of the standard deviations of all single-asset portfolios, and that the same procedure would be applied to portfolios of two, three, and so on, up to 30 stocks. They selected the assets they wanted in their portfolios without the support of any financial analysis. For comparison, we also ran a hypothetical simulation of 126 investors who selected shares from the same universe through a random number generator, so that each real participant was matched with a random hypothetical investor facing the same opportunity set. Patterns were observed in the participants' portfolios, characterizing the risk curves for the components of the samples. Because these groupings are somewhat arbitrary, a more objective measure of behavior was used: a simple linear regression for each participant, predicting portfolio variance as a function of the number of assets. In addition, we ran a pooled cross-sectional regression on all observations. The expected pattern holds on average but not for most individuals, many of whom effectively "de-diversify" when adding seemingly random stocks. Furthermore, the results are slightly worse when a random number generator is used. This finding challenges the belief that only a small number of stocks is needed for diversification, and shows that it holds only for a large sample. The implications are important, since many individual investors hold few stocks in their portfolios
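The per-participant regression described above (portfolio variance as a function of the number of assets) can be sketched as follows; the data are illustrative, not taken from the experiment:

```python
import numpy as np

def diversification_slope(n_assets, variances):
    """Fit variance = a + b * n_assets for one participant; a negative
    slope b means adding assets reduced portfolio variance on average,
    while a positive slope indicates "de-diversification"."""
    b, a = np.polyfit(n_assets, variances, 1)
    return b

# Illustrative data (not from the study): a participant whose portfolio
# variance shrinks roughly as 1/n, as classical diversification predicts.
n = np.arange(1, 31)
slope = diversification_slope(n, 0.02 + 0.08 / n)
```

The sign of the fitted slope is what separates "diversifiers" from "de-diversifiers" in the classification described above.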
Abstract:
The building envelope is the principal means of interaction between the indoor and outdoor environments, with a direct influence on the thermal and energy performance of the building. By intervening in the envelope with specific architectural elements, it is possible to promote passive conditioning strategies such as natural ventilation. Cross ventilation is recommended by NBR 15220-3 as the main bioclimatic strategy for the hot and humid climate of Natal/RN, offering, among other benefits, thermal comfort for the occupants. Analysis tools for natural ventilation, in turn, cover a variety of techniques, from simplified calculation methods to computational fluid dynamics, whose limitations are discussed in several papers, though without detailing the problems encountered. In this context, the present study evaluates the potential of wind catchers, envelope elements used to increase natural ventilation in buildings, through simplified CFD (Computational Fluid Dynamics) simulation, and seeks to quantify the limitations encountered during the analysis. The procedure adopted to evaluate the implementation and efficiency of these elements was CFD simulation with the DesignBuilder CFD software. A base case was defined, to which wind catchers with various settings were added, in order to compare them with each other and assess the differences in the air flows and air speeds obtained. Initially, sensitivity tests were run to become familiar with the software and observe simulation patterns, mapping the settings used and the simulation time for each case. The results show the limitations encountered during the simulation process, as well as an overview of the efficiency and potential of wind catchers: increased ventilation when catchers are used, differences in air flow patterns, and significant increases in indoor air speeds, besides changes arising from different element geometries.
The software used can help designers during preliminary analysis in the early stages of design
Abstract:
Untreated effluents that reach surface waters affect aquatic life and humans. This study evaluated the toxicity of wastewaters (municipal, industrial and shrimp-pond effluents) released into the Jundiaí-Potengi Estuarine Complex, Natal/RN, through quantitative and qualitative chronic toxicity tests using the test organism Mysidopsis juniae (Crustacea, Mysidacea) (Silva, 1979). A new methodology for observing chronic effects on M. juniae (single renewal) was used, based on an existing methodology for a very similar test organism, M. bahia (daily renewal). Seven-day toxicity tests were used to detect effects on the survival and fecundity of M. juniae. The median lethal concentration (LC50) was determined by the Trimmed Spearman-Karber method; the median inhibition concentration (IC50) for fecundity was determined by linear interpolation. One-way ANOVA tests (p = 0.05) were used to determine the No Observed Effect Concentration (NOEC) and the Lowest Observed Effect Concentration (LOEC). Effluent flows were measured and the toxic load of the effluents was estimated. Multivariate analyses, Principal Component Analysis (PCA) and Correspondence Analysis (CA), identified the physico-chemical parameters that best explain the toxicity patterns found in the survival and fecundity of M. juniae. We verified the feasibility of applying the single-renewal system in chronic tests with M. juniae. Most effluents proved toxic to the survival and fecundity of M. juniae, except for some shrimp-pond effluents. The most toxic effluents were ETE Lagoa Aerada (LC50, 6.24%; IC50, 4.82%), ETE Quintas (LC50, 5.85%), Giselda Trigueiro Hospital (LC50, 2.05%), CLAN (LC50, 2.14%) and COTEMINAS (LC50, 38.51%; IC50, 6.94%). The greatest toxic load originated from inefficient high-flow ETE effluents, textile effluents and CLAN. The organic load was related to the toxic effects of municipal and hospital effluents on the survival of M. juniae, as were heavy metals, total residual chlorine and phenols. In industrial effluents, toxicity was related to organic load, phenols, oils and greases, and benzene. The effects on fecundity were related, in turn, to chlorine and heavy metals. Toxicity tests using other organisms from different trophic levels, as well as sediment toxicity analyses, are recommended to confirm the patterns found with M. juniae. Nevertheless, the results indicate the need to implement and improve the sewage treatment systems discharging into the Potengi estuary
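The linear-interpolation estimate of the IC50 for fecundity can be sketched as below; the dilution series and responses are hypothetical, and the function is only in the spirit of the interpolation approach used for sublethal endpoints (the standard ICPIN procedure additionally smooths the responses monotonically):

```python
def icp_linear_interpolation(concentrations, responses, p=50):
    """Estimate ICp (concentration causing p% inhibition relative to the
    control) by piecewise linear interpolation.

    concentrations: increasing effluent concentrations (%), first entry
    being the control (0). responses: mean response (e.g. fecundity) at
    each concentration."""
    target = responses[0] * (1 - p / 100.0)  # p% below the control mean
    for i in range(len(responses) - 1):
        hi, lo = responses[i], responses[i + 1]
        if hi >= target >= lo:
            frac = (hi - target) / (hi - lo)
            return concentrations[i] + frac * (concentrations[i + 1] - concentrations[i])
    return None  # target effect not bracketed by the tested range

# Hypothetical dilution series: control fecundity of 20 offspring,
# declining with effluent concentration.
conc = [0, 2, 4, 8, 16]
fec = [20, 18, 14, 8, 2]
ic50 = icp_linear_interpolation(conc, fec)  # ~6.7% effluent
```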
Abstract:
Thermal processing of ceramic materials by microwave energy has been gaining importance, given its numerous applications, for example in mineral processing (heating of ores before grinding, drying, carbothermal reduction of mineral oxides, leaching, melting, pre-treatment of refractory gold ores and concentrates, carbon regeneration, etc., according to Kingman & Rowson, 1998). This technology stands out because of a series of potential advantages over conventional heating methods, such as shorter processing times, energy savings, reduction of the mean particle diameter, and improvement of technological properties in general. In this context, the general objective of this work is to identify and characterize new options of ceramic raw materials, such as clays, feldspars and kaolins, that are effective for formulating one or more bodies for the production of structural ceramic components with adequate physical, mechanical and aesthetic properties after conventional and microwave sintering, highlighting the advantages of the latter. Besides the technical and process requirements, the formulations presented must also meet price and supply-logistics expectations. In this study, specimens were shaped by extrusion and pressing and sintered in microwave and conventional furnaces, under firing cycles faster than those currently practiced.
The raw materials were characterized and analyzed by X-ray fluorescence (XRF), X-ray diffraction (XRD), differential thermal analysis (DTA), thermogravimetric analysis (TGA), particle-size analysis, scanning electron microscopy (SEM), water absorption (WA), apparent specific mass, apparent porosity, linear shrinkage, and flexural rupture strength. The results indicate that the target technological properties of water absorption and flexural rupture strength proposed in this work were successfully achieved and are well beyond the limits required by ABNT standards NBR 15.270/05 and 15.310/09
Abstract:
Kinanthropometric characteristics are used in sports science as criteria for selection and talent detection. This study compared the anthropometric profile, body composition, somatotype and vertical jumps of beach volleyball players. The sample consisted of 79 male beach volleyball players: forty-nine (n = 49) Brazilian participants of the National Circuit and thirty (n = 30) players from 15 countries participating in the XV Pan American Games. To analyze the vertical jumps of the Brazilians, the participants were allocated into two groups (G1 and G2) according to the national ranking of their teams. The vertical jump protocol developed by Smith and collaborators was used to evaluate the spike and block jumps. The Heath-Carter anthropometric technique (1990) was used to calculate the somatotype. Student's t test with Bonferroni adjustment was used to test the differences among the investigated variables. Multiple regression analysis was used to identify the contributions of the anthropometric variables to vertical jump performance, and multivariate analysis was used to test the differences among the somatotype components. The Brazilian athletes of G1 outperformed G2 in the spike jump (p < 0.01), block jump (p < 0.01) and block difference (p < 0.01). The prediction model for the spike jump of G2 included body mass and standing spike reach (adjusted R² = 0.77); body mass and standing block reach were likewise included in the block jump model (adjusted R² = 0.73). The regression model for G1 was not statistically significant. As for the somatotype, statistically significant differences were found between the Brazilians and the Pan Americans (Wilks' lambda = 0.498; p < 0.05). The Brazilian somatotype was classified as balanced mesomorph (2.7-4.3-3.0) and the Pan American somatotype as endomorphic mesomorph (3.5-4.6-2.4).
By specific position, the Brazilian somatotype was classified as balanced mesomorph for blockers (2.8-4.3-2.9) and defenders (2.6-4.4-3.0), while the Pan American somatotype for blockers (3.7-4.4-2.4) and defenders (3.4-4.9-2.3) was classified as endomorphic mesomorph. In conclusion, vertical jump height (spike and block) influences the performance of male Brazilian beach volleyball players. The physical type of the Brazilian blockers and defenders was similar with respect to somatotype. The Brazilian and Pan American beach volleyball players differ in their kinanthropometric characteristics. This work had a multidisciplinary character, with the participation of several departments and laboratories, such as the Physiotherapy Department, the Nutrition Department and the Physical Education Laboratory, thus corroborating the multidisciplinary nature of the research
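The Bonferroni-adjusted group comparisons described above can be sketched as follows; the jump heights, group sizes and endpoints are hypothetical, not the study's data:

```python
import numpy as np
from scipy import stats

def bonferroni_ttests(group1, group2, alpha=0.05):
    """Two-sample t-tests across several endpoints, declaring
    significance at the Bonferroni-adjusted level alpha / m,
    where m is the number of endpoints tested."""
    m = len(group1)
    results = {}
    for k in group1:
        t, p = stats.ttest_ind(group1[k], group2[k])
        results[k] = (p, p < alpha / m)
    return results

# Hypothetical jump heights (cm) for higher- (G1) and lower-ranked (G2)
# players; two endpoints, so the adjusted threshold is 0.05 / 2.
rng = np.random.default_rng(0)
g1 = {"spike": rng.normal(78, 5, 25), "block": rng.normal(70, 5, 25)}
g2 = {"spike": rng.normal(70, 5, 24), "block": rng.normal(63, 5, 24)}
res = bonferroni_ttests(g1, g2)
```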
Abstract:
Dyslipidemia and excess weight in adolescents, when combined, suggest a progression of risk factors for cardiovascular disease (CVD). In addition, dietary habits and lifestyle have also been considered unsuitable, affecting the development of chronic diseases. The objectives of this study were to: (1) estimate the prevalence of lipid-profile alterations and correlate them with body mass index (BMI), waist circumference (WC) and waist-to-height ratio (WHR) in adolescents, considering sexual maturation; (2) determine the sources of variance in the diet and the number of days needed to estimate the usual diet of adolescents; and (3) describe the dietary and lifestyle patterns of the adolescents, their family history of CVD and age, and correlate them with the patterns of CVD risk, adjusted for sexual maturation. A cross-sectional study was performed with 432 adolescents aged 10-19 years from public schools of the city of Natal, Brazil. Dyslipidemias were evaluated considering the lipid profile, the Castelli indices I (TC/HDL) and II (LDL/HDL) and non-HDL cholesterol. The anthropometric indicators were BMI, WC and WHR. The intake of energy and nutrients, including fiber, fatty acids and cholesterol, was estimated from two 24-hour recalls (24HR). The lipid-profile, anthropometric and clinical variables were used in Pearson correlation and linear regression models, considering sexual maturation. The variance ratio of the diet was calculated from the within- and between-person variance components, determined by analysis of variance (ANOVA). The number of days needed to estimate the usual intake of each nutrient was obtained by assuming a hypothetical correlation (r) ≥ 0.9 between observed and true nutrient intake. Principal component analysis was used as the method for extracting the factors that accounted for the dependent variables of known cardiovascular risk, obtained from the lipid profile, the Castelli indices I and II, non-HDL cholesterol, BMI, WC and WHR.
Dietary and lifestyle patterns were obtained from the independent variables, based on the nutrients consumed and weekly physical activity. In the principal component analysis (PCA) study, associations between the patterns of cardiovascular risk factors and the dietary and lifestyle patterns, age and positive family history of CVD were investigated through bivariate and multiple logistic regression adjusted for sexual maturation. Low HDL-C was the most prevalent dyslipidemia (50.5%) among the adolescents. Significant correlations were observed between hypercholesterolemia and positive family history of CVD (r = 0.19, p < 0.01), and between hypertriglyceridemia and BMI (r = 0.30, p < 0.01), WC (r = 0.32, p < 0.01) and WHR (r = 0.33, p < 0.01). The linear model built with sexual maturation, age and BMI explained about 1 to 10.4% of the variation in the lipid profile. The between-person sources of variance were greater for all nutrients in both sexes; the variance ratios were less than 1 for all nutrients and were higher in females. The results suggest that, to assess the diet of adolescents with adequate precision, two days of 24HR would be enough for energy, carbohydrates, fiber, and saturated and monounsaturated fatty acids, whereas three days would be recommended for protein, lipids, polyunsaturated fatty acids and cholesterol. Two cardiovascular risk factors were extracted in the PCA for the dependent variables, the "lipid profile pattern" (HDL-C and non-HDL cholesterol) and the "anthropometric indicator pattern" (BMI, WC, WHR), with an explanatory power of 75% of the variance of the original data. The factors extracted for the independent variables led to two dietary patterns, the "Western diet pattern" and the "protein diet pattern", and one lifestyle pattern, the "energy balance pattern". Together, these patterns provide an explanatory power of 67%.
After adjustment for sexual maturation, the following associations remained significant in males: between puberty and the anthropometric indicator pattern (OR = 3.32, CI 1.34 to 8.17), and between family history of CVD and the lipid profile pattern (OR = 2.62, CI 1.20 to 5.72). In female adolescents, associations were identified between post-pubertal age and the anthropometric pattern (OR = 3.59, CI 1.58 to 8.17) and the lipid profile pattern (OR = 0.33, CI 0.15 to 0.75). Conclusions: Low HDL-C was the most prevalent dyslipidemia regardless of the sex and nutritional status of the adolescents. Hypercholesterolemia was influenced by family history of CVD and sexual maturation; hypertriglyceridemia, in turn, was closely associated with anthropometric indicators. The between-person variance of the diet was greater for all nutrients, which was reflected in a variance ratio less than 1 and, consequently, a smaller number of days required to estimate the usual diet of adolescents by sex. Two dietary patterns were extracted, as well as a lifestyle pattern considered healthy. Associations were found between the CVD risk patterns, age and family history of CVD in the adolescents studied
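The number of recall days required can be derived from the ANOVA variance components with the classical formula D = (r² / (1 − r²)) · (Sw² / Sb²), where Sw² and Sb² are the within- and between-person variances. A minimal sketch, on synthetic intakes rather than the study's data:

```python
import numpy as np

def days_required(intakes, r=0.9):
    """Number of repeat 24-h recalls needed so that the correlation
    between observed and true mean intake reaches r, using
        D = (r^2 / (1 - r^2)) * (Sw^2 / Sb^2),
    with Sw^2 and Sb^2 estimated by one-way ANOVA on a
    subjects x days intake matrix."""
    n_subj, k = intakes.shape
    subj_means = intakes.mean(axis=1)
    grand = intakes.mean()
    ms_between = k * ((subj_means - grand) ** 2).sum() / (n_subj - 1)
    ms_within = ((intakes - subj_means[:, None]) ** 2).sum() / (n_subj * (k - 1))
    s_w2 = ms_within
    s_b2 = max((ms_between - ms_within) / k, 1e-12)  # guard against negative estimates
    return (r ** 2 / (1 - r ** 2)) * (s_w2 / s_b2)

# Synthetic example: between-person spread (sd 300 kcal) larger than
# day-to-day spread (sd 200 kcal), i.e. a variance ratio below 1,
# so only a couple of recall days are needed.
rng = np.random.default_rng(1)
true = rng.normal(2000, 300, size=(100, 1))       # usual energy intakes
obs = true + rng.normal(0, 200, size=(100, 4))    # 4 observed recall days
d = days_required(obs)
```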
Abstract:
Farming of marine shrimp is growing worldwide, and Litopenaeus vannamei is the most widely cultivated species. Shrimp is an attractive food for its nutritional value and sensory aspects, so maintaining these attributes throughout storage, which takes place largely under freezing, is essential. The aim of this research was to evaluate quality characteristics of L. vannamei shrimp during frozen storage and to verify the effect of adding rosemary (Rosmarinus officinalis). Considering the reuse of shrimp processing wastes, total carotenoid analyses were conducted on L. vannamei shrimp waste and on the flour obtained after drying. Monthly physicochemical and sensory analyses were carried out on shrimp stored at -28.3 ± 3.8 °C for 180 days. Samples were placed in polyethylene bags and categorized as whole shrimp (WS), peeled shrimp (PS), and PS with 0.5% dehydrated rosemary (RS). TBARS, pH, total carotenoid and sensory Quantitative Descriptive Analysis (QDA) were carried out. Total carotenoid analysis was conducted on fresh wastes and processed flour (day 0) and after 60, 120 and 180 days of frozen storage. After 180 days, RS had lower pH (p = 0.001) and TBARS (p = 0.001) values and higher carotenoids (p = 0.003), while WS showed higher carotenoid losses. Sensory analysis showed that WS was firmer, although rancid taste and smell were perceived with greater intensity (p = 0.001). Rancid taste was detected in RS only at 120 days, at significantly lower intensity (p = 0.001) than in WS and PS. Fresh wastes had 42.74 μg/g of total carotenoids and processed flour 98.51 μg/g. After 180 days of frozen storage, total carotenoids were significantly lower than on day 0 (p < 0.05). The addition of rosemary can improve the sensory quality of frozen shrimp and reduce nutritional losses during storage. The wastes and flour of L. vannamei shrimp showed considerable astaxanthin content; however, losses of this pigment were observed during storage
Abstract:
The Back School has been used to prevent and treat back pain since 1969; however, its effectiveness is still controversial in the literature. The objective of this study was to evaluate the effectiveness of a "Back School" program in patients with nonspecific chronic low back pain referred by rheumatologists and orthopedists to the Physiotherapy School Clinic of Universidade Potiguar (UnP), Natal/RN, Brazil, from May 2002 to December 2003. Seventy patients, aged 18 to 60 years, were randomized into two groups: the experimental group (group A), with 34 patients, subdivided into groups of 6 to 8 participants, which followed a theoretical and practical Back School program of 4 lessons, one per week, each lasting 60 minutes; and the control group (group B), with 36 patients, which was told it would remain on a waiting list for four months. Both groups underwent three evaluations by an independent observer blinded to group assignment: at baseline, and after four and sixteen weeks. The following variables were analyzed: pain intensity, measured by a visual analog scale (VAS); functional disability, measured by the Roland-Morris disability questionnaire; and spinal mobility, measured by Schöber's method. The statistical analysis used ANOVA, the Newman-Keuls multiple comparison test, and the Pearson correlation coefficient, with a significance level of p < 0.05. Thirteen patients (18.6%) did not complete the evaluations (5 from the experimental group and 8 from the control group). In the end, 57 patients were studied (29 from the experimental group and 28 from the control group), and a statistically significant improvement was observed only in the experimental group for pain intensity (p = 0.0001) and functional disability
Abstract:
Combinatorial optimization problems have engaged a large number of researchers in the search for approximate solutions, since these problems are generally accepted to be unsolvable in polynomial time. Initially, such solutions were based on heuristics; currently, metaheuristics are preferred for this task, especially those based on evolutionary algorithms. The two main contributions of this work are: the creation of a so-called "Operon" heuristic for the construction of the information chains necessary for the implementation of transgenetic (evolutionary) algorithms, based mainly on statistical methodology (cluster analysis and principal component analysis); and the use of statistical analyses adequate for evaluating the performance of the algorithms developed to solve these problems. The aim of the Operon is to construct good-quality dynamic information chains to promote an "intelligent" search in the solution space. The Traveling Salesman Problem (TSP) is the intended application for a transgenetic algorithm known as ProtoG. A strategy is also proposed for renewing part of the chromosome population, triggered by adopting a minimum limit on the coefficient of variation of the fitness function of the individuals, calculated over the population. Statistical methodology is used to evaluate the performance of four algorithms: the proposed ProtoG, two memetic algorithms and a simulated annealing algorithm. Three performance analyses of these algorithms are proposed. The first uses logistic regression, based on the probability that the algorithm under test finds an optimal solution for a TSP instance. The second uses survival analysis, based on the probability distribution of the execution time observed until an optimal solution is reached.
The third uses non-parametric analysis of variance, considering the Percent Error of the Solution (PES), the percentage by which the solution found exceeds the best solution available in the literature. Six experiments were conducted on sixty-one instances of the Euclidean TSP with sizes of up to 1,655 cities. The first two experiments deal with the tuning of four parameters of the ProtoG algorithm in an attempt to improve its performance. The last four evaluate the performance of ProtoG in comparison with the three algorithms adopted. For these sixty-one instances, statistical tests provide evidence that ProtoG outperforms the three other algorithms on fifty instances. In addition, for the thirty-six instances considered in the last three trials, in which performance was evaluated through PES, the average PES obtained with ProtoG was less than 1% in almost half of the instances, reaching its highest value, an average of 3.52%, for an instance of 1,173 cities. Therefore, ProtoG can be considered a competitive algorithm for solving the TSP, since it is not rare in the literature to find reported average PES values greater than 10% for instances of this size.
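The PES metric used above is a simple relative-excess measure; a minimal sketch (the tour lengths below are purely illustrative):

```python
def percent_error_of_solution(found, best_known):
    """PES: percentage by which the tour length found exceeds the best
    solution available in the literature."""
    return 100.0 * (found - best_known) / best_known

# e.g. a tour of length 21,500 against a hypothetical best-known
# value of 21,282 gives a PES of about 1.02%.
pes = percent_error_of_solution(21500, 21282)
```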
Abstract:
Given the current need of industry to integrate shop-floor data originating from several sources and to transform them into information useful for decision making, there is an ever-growing demand for information visualization systems that support this functionality. On the other hand, a common practice nowadays, due to the high competitiveness of the market, is the development of industrial systems with characteristics of modularity, distribution, flexibility, scalability, adaptability, interoperability, reusability and web access. These characteristics provide extra agility and make it easier to adapt to the frequent changes in market demand. Based on these arguments, this work specifies a component-based architecture, together with the development of a system based on that architecture, for the visualization of industrial data. The system was conceived to supply on-line information and, optionally, historical information on shop-floor variables. This work shows that the component-based architecture developed possesses the requirements necessary to obtain a robust, reliable and easily maintained system, in agreement with industrial needs. The architecture also allows components to be added, removed or updated at run time, through a web-based component manager, further streamlining the process of adapting and updating the system
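The run-time component management described above can be sketched as a registry that supports adding, hot-swapping and removing components while the system runs; all names here are hypothetical, not taken from the work:

```python
class ComponentRegistry:
    """Minimal sketch of a run-time component manager: visualization
    components can be registered, replaced or removed without
    restarting the system."""

    def __init__(self):
        self._components = {}

    def register(self, name, component):
        """Add a new component, or hot-swap an existing one."""
        self._components[name] = component

    def remove(self, name):
        self._components.pop(name, None)

    def get(self, name):
        return self._components.get(name)

registry = ComponentRegistry()
registry.register("trend-chart", lambda data: min(data))
registry.register("trend-chart", lambda data: max(data))  # updated at run time
```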
Abstract:
Nowadays, when market competition requires products of better quality and there is a constant search for cost savings and better use of raw materials, research into more efficient control strategies becomes vital. In Natural Gas Processing Units (NGPUs), as in most chemical processes, quality control is accomplished through the composition of the products. However, chemical composition analysis has a long measurement time, even when performed by instruments such as gas chromatographs. This fact hinders the development of control strategies that could provide better process yield. Natural gas processing is one of the most important activities in the petroleum industry, and the main economic product of an NGPU is liquefied petroleum gas (LPG). LPG is ideally composed of propane and butane; in practice, however, its composition includes contaminants such as ethane and pentane. In this work, an inferential system using neural networks is proposed to estimate the ethane and pentane mole fractions in LPG and the propane mole fraction in the residual gas. The goal is to provide the values of these estimated variables every minute using a single multilayer neural network, making it possible to apply inferential control techniques in order to monitor LPG quality and to reduce propane loss in the process. To develop this work, an NGPU composed of two distillation columns (deethanizer and debutanizer) was simulated in HYSYS® software. The inference is performed from the process variables of the PID controllers present in the instrumentation of these columns. To reduce the complexity of the inferential neural network, the statistical technique of principal component analysis is used to decrease the number of network inputs, thus forming a hybrid inferential system. A simple strategy is also proposed to correct the inferential system in real time, based on measurements from the chromatographs that may exist in the process under study
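The PCA input-reduction step of such a hybrid inferential system can be sketched as below; the process data are synthetic, and the dimensions (12 PID-loop variables compressed to 3 network inputs) are assumptions for illustration:

```python
import numpy as np

def pca_reduce(X, n_components):
    """Project process variables onto their first principal components,
    reducing the number of neural-network inputs."""
    Xc = X - X.mean(axis=0)                     # center each variable
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T             # principal-component scores

# Synthetic snapshot: 200 samples of 12 correlated PID-loop variables
# driven by 3 latent factors, compressed to 3 network inputs.
rng = np.random.default_rng(2)
latent = rng.normal(size=(200, 3))
X = latent @ rng.normal(size=(3, 12)) + 0.01 * rng.normal(size=(200, 12))
Z = pca_reduce(X, 3)
```

The reduced scores Z would then feed the multilayer network in place of the 12 raw variables.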
Abstract:
Blind Source Separation (BSS) refers to the problem of estimating original signals from observed linear mixtures, with no knowledge of the sources or of the mixing process. Independent Component Analysis (ICA) is a technique mainly applied to the BSS problem and, among the algorithms that implement this technique, FastICA is a high-performance iterative algorithm of low computational cost that uses non-gaussianity measures based on higher-order statistics to estimate the original sources. The great number of applications in which ICA has been found useful reflects the need to implement this technique in hardware, and the natural parallelism of FastICA favors its implementation on digital hardware. This work proposes the implementation of FastICA on a reconfigurable hardware platform to assess the viability of its use in blind source separation problems, more specifically in a hardware prototype embedded in a Field Programmable Gate Array (FPGA) board for the monitoring of beds in hospital environments. The implementations are carried out as Simulink models and synthesized with the DSP Builder software from Altera Corporation.
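A software reference model of FastICA (symmetric variant with the tanh nonlinearity), of the kind one might validate against before a hardware port, can be sketched as follows; the two-source mixture below is illustrative, not the hospital-monitoring signals:

```python
import numpy as np

def fastica_symmetric(X, n_iter=200):
    """Compact sketch of symmetric FastICA: whiten the mixtures, then
    iterate W <- E[g(WZ) Z^T] - diag(E[g'(WZ)]) W with symmetric
    decorrelation, using g = tanh."""
    X = X - X.mean(axis=1, keepdims=True)       # center
    d, E = np.linalg.eigh(np.cov(X))            # whitening transform
    Z = (E @ np.diag(d ** -0.5) @ E.T) @ X
    n = Z.shape[0]
    W = np.linalg.qr(np.random.default_rng(0).normal(size=(n, n)))[0]
    for _ in range(n_iter):
        WZ = np.tanh(W @ Z)
        W_new = WZ @ Z.T / Z.shape[1] - np.diag((1 - WZ ** 2).mean(axis=1)) @ W
        u, s, vt = np.linalg.svd(W_new)         # symmetric decorrelation:
        W = u @ vt                              # W <- (W W^T)^{-1/2} W
    return W @ Z                                # sources, up to order and scale

# Two synthetic sources (sine and square wave) mixed by a hypothetical
# 2x2 matrix, then blindly separated.
t = np.linspace(0, 8, 2000)
s1, s2 = np.sin(7 * t), np.sign(np.sin(3 * t))
A = np.array([[1.0, 0.5], [0.4, 1.0]])
Y = fastica_symmetric(A @ np.vstack([s1, s2]))
```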
Abstract:
This work considers the development of a filtering system composed of an intelligent algorithm that separates information and noise coming from sensors interconnected by a Foundation Fieldbus (FF) network. The algorithm is implemented through FF standard function blocks, with on-line training through OPC (OLE for Process Control), and embedded in a DSP (Digital Signal Processor) that interacts with the fieldbus devices. The technique chosen for this Blind Source Separation (BSS) process is ICA (Independent Component Analysis), which explores the possibility of separating mixed signals based on the fact that they are statistically independent. The algorithm, its implementations and the results are presented
Abstract:
We developed an assay methodology that uses temperature variation and scanning electron microscopy to quantify and characterize, respectively, the evolution of wear in three 46 LA internal-combustion two-stroke engines, with 7.64 cm³ displacement, 23.0 mm bore, 18.4 mm stroke, a service range from 2,000 to 16,000 rpm, 1.2 HP power and 272 g weight. The engine components investigated were: (1) the engine head (Al-Si alloy), (2) the piston (Al-Si alloy) and (3) the piston pin (AISI 52100 steel). The assays were carried out on a benchtop rig; engines 1 and 2 were assayed with no load, whereas in two assays of engine 3 a fan was added, with wind speed varying from 8.10 m/s to 11.92 m/s, in order to identify and compare the dynamic behavior of this engine with that of the engines assayed with no load. The temperatures of the engine surface and surroundings were measured by two type-K thermocouples connected to the assay device and registered throughout the assays on a microcomputer running data-recording, parameter-control and monitoring software. The worn surfaces of the components were analyzed by scanning electron microscopy (SEM) and EDS microanalysis. The study was complemented by shape-deformation and mass-measurement assays. The temperature variation was associated with the morphology of the oxides, and the wear mechanisms are discussed based on the relation between the thermo-mechanical effects and the responses obtained in the materials characterization