882 results for Relative complexity
Resumo:
To assess the quality of school education, much of educational research focuses on comparisons of test-score means or medians. In this paper, we shift this focus and explore test-score data by addressing some often neglected questions. In the case of Brazil, the mean Math test score for fourth-grade students declined by approximately 0.2 standard deviation in the late 1990s. But what about changes in the distribution of scores? It is unclear whether the decline was caused by deterioration in student performance in the upper and/or lower tails of the distribution. To answer this question, we propose the use of the relative distribution method developed by Handcock and Morris (1999). The advantage of this methodology is that it compares two distributions of test scores through a single distribution that synthesizes all the differences between them. Moreover, it is possible to decompose the total difference between two distributions into a level effect (changes in the median) and a shape effect (changes in the shape of the distribution). We find that the decline in average test scores is mainly caused by a worsening in the position of students throughout the distribution of scores and is not specific to any particular quantile.
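The relative distribution described above has a simple empirical form: each score from the comparison year is evaluated at the empirical CDF of the reference year, and the level/shape decomposition follows from median-matching. A minimal sketch in Python with NumPy, using simulated score data (not the actual Brazilian test scores):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical test-score samples: reference (earlier year) and comparison
# (later year); the comparison mean is shifted down ~0.2 SD for illustration
reference = rng.normal(250, 50, 5000)
comparison = rng.normal(240, 50, 5000)

def relative_ranks(comparison, reference):
    """Relative data: each comparison score evaluated at the reference ECDF.
    If nothing changed, these ranks are uniform on [0, 1]."""
    ref_sorted = np.sort(reference)
    return np.searchsorted(ref_sorted, comparison, side="right") / len(ref_sorted)

r = relative_ranks(comparison, reference)

# Level effect: shift the comparison sample so its median matches the
# reference median; the relative distribution of the shifted sample then
# isolates the shape effect.
shift = np.median(reference) - np.median(comparison)
r_shape = relative_ranks(comparison + shift, reference)

# A relative density below 1 in a decile means the comparison year has
# proportionally fewer students in that part of the reference distribution.
relative_density, _ = np.histogram(r, bins=10, range=(0, 1), density=True)
```

With a pure downward level shift, the median of `r` falls below 0.5 while the median of `r_shape` stays near 0.5, mirroring the paper's finding that the whole distribution moved down.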
Resumo:
Seismic risk assessment, which is fundamental to decisions about engineering structures and loss mitigation, fundamentally involves seismic hazard analysis. Computing the seismic hazard amounts to computing the probability that a certain level of a given intensity measure will be exceeded at a certain site over a certain period of time. Depending on the complexity of the geological activity, these estimates can be quite sophisticated. In regions of low seismicity, as is the case of Brazil, the short (geological) observation time and the scarcity of information are sources of many uncertainties and of difficulty for the more classical and well-known methods of analysis, which generally define seismic zones based on expert opinion. We discuss some smoothing techniques and their foundations as alternative methods to zoning, and then illustrate their application to the Brazilian case.
Resumo:
The paper aims at considering the issue of relative efficiency measurement in the context of the public sector. In particular, we consider the efficiency measurement approach provided by Data Envelopment Analysis (DEA). The application considers the main Brazilian federal universities for the year 1994. Given the large number of inputs and outputs, this paper advances the idea of using factor analysis to explore common dimensions in the data set. This procedure made possible a meaningful application of DEA, which finally provided a set of efficiency scores for the universities considered.
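The dimensionality-reduction step described above can be sketched briefly. For simplicity the sketch approximates factor analysis with principal components obtained via SVD (a close cousin of the technique the paper names); the data are random stand-ins, not the 1994 university indicators:

```python
import numpy as np

# Hypothetical indicators for 6 universities (rows) x 5 correlated variables
rng = np.random.default_rng(2)
X = rng.random((6, 5))

# Reduce the many correlated inputs/outputs to a few common dimensions
# before running DEA, so the efficiency frontier remains meaningful
Z = X - X.mean(axis=0)                       # center each indicator
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / np.sum(s**2)              # variance share per component
k = int(np.searchsorted(np.cumsum(explained), 0.90) + 1)  # keep ~90% variance
factors = Z @ Vt[:k].T                       # university scores on k dimensions
```

The `factors` matrix (one row per university, one column per retained dimension) would then feed the DEA linear programs in place of the raw indicator set.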
Resumo:
The Brazilian start-up Local Wander plans to enter the tourism sector with a mobile application aiming to enable a new form of travel research. A web-based survey was sent out to the start-up’s target audience (n = 236) in order to gain information relevant to the design of Local Wander’s market entry strategy. By applying the diffusion of innovation theory, this thesis detected five different adopter categories, originally described by Rogers (1962), among Local Wander’s target audience based on their adoption intention. The Early Market was observed to be significantly bigger than the theory predicted. The research revealed four characteristics with a significant impact on adoption intention: Relative Perceived Product Advantage, Perceived Product Complexity, Compatibility with digital travel research sources, and the adopter’s Innovativeness towards mobile applications. Specific characteristics for identifying Local Wander’s earliest users, the so-called Innovators, were detected, giving indications for further necessary company market research. The findings showed that the diffusion of innovation framework is a helpful tool for start-ups’ prospective decision making and market entry strategy planning.
Resumo:
Starting from the idea that economic systems fall within complexity theory, in which many agents interact with each other without central control and these interactions can change the future behavior of the agents and of the entire system, similarly to a chaotic system, we extend the model of Russo et al. (2014) to carry out three experiments focusing on the interaction between Banks and Firms in an artificial economy. The first experiment concerns Relationship Banking, where, according to the literature, interaction over time between Banks and Firms can produce mutual benefits, mainly due to the reduction of the information asymmetry between them. The second experiment is related to information heterogeneity in the credit market, where the larger the bank, the higher its visibility in the credit market, increasing the number of consultations for new loans. Finally, the third experiment is about the effects on the credit market of the heterogeneity of the prices that Firms face in the goods market.
Resumo:
This manuscript describes the development and validation of an ultra-fast, efficient, and high-throughput analytical method based on ultra-high performance liquid chromatography (UHPLC) equipped with a photodiode array (PDA) detection system, for the simultaneous analysis of fifteen bioactive metabolites in wines: gallic acid, protocatechuic acid, (−)-catechin, gentisic acid, (−)-epicatechin, syringic acid, p-coumaric acid, ferulic acid, m-coumaric acid, rutin, trans-resveratrol, myricetin, quercetin, cinnamic acid, and kaempferol. A 50-mm column packed with 1.7-μm particles operating at elevated pressure (UHPLC strategy) was selected to attain ultra-fast analysis and highly efficient separations. In order to reduce the complexity of the wine extract and improve the recovery efficiency, a reversed-phase solid-phase extraction (SPE) procedure was performed prior to UHPLC–PDA analysis, using as sorbent a new macroporous copolymer made from a balanced ratio of two monomers, the lipophilic divinylbenzene and the hydrophilic N-vinylpyrrolidone (Oasis™ HLB). The calibration curves of the bioactive metabolites showed good linearity within the established range. Limits of detection (LOD) and quantification (LOQ) ranged from 0.006 μg mL−1 to 0.58 μg mL−1, and from 0.019 μg mL−1 to 1.94 μg mL−1, for gallic and gentisic acids, respectively. The average recoveries ± SD for the three concentration levels tested (n = 9) in red and white wines were, respectively, 89 ± 3% and 90 ± 2%. The repeatability, expressed as relative standard deviation (RSD), was below 10% for all the metabolites assayed. The validated method was then applied to red and white wines from different geographical origins (Azores, Canary and Madeira Islands). The most abundant component in the analysed red wines was (−)-epicatechin, followed by (−)-catechin and rutin, whereas in white wines syringic and p-coumaric acids were found to be the major phenolic metabolites.
The method was fully validated, providing a sensitive analysis for the detection of bioactive phenolic metabolites and showing satisfactory data for all the parameters tested. Moreover, it proved to be an ultra-fast approach, allowing the separation of the fifteen bioactive metabolites investigated with high resolving power within 5 min.
Resumo:
The aim of this study was to determine the relative potency of racemic ketamine and S(+)-ketamine for the hypnotic effect and to evaluate the clinical anesthesia produced by equianesthetic doses of these two substances in dogs. One hundred and eight dogs were allocated to groups R2, R2.5, R3, R6, R9, R12, S2, S2.5, S3, S6, S9, and S12, to receive by intravenous route 2, 2.5, 3, 6, 9, and 12 mg/kg of ketamine or S(+)-ketamine, respectively. A dose-effect curve was drawn with the logarithm of the dose and the percentage of dogs presenting hypnosis in each group. The curve was used to obtain a linear regression, in order to determine the effective doses 100 (ED100) and the potency ratio. In another experimental phase, eight groups of five dogs received 3, 6, 9, and 12 mg/kg of ketamine or S(+)-ketamine to evaluate the periods of latency, hypnosis, and total recovery. The times at which the dogs reached the sternal position, attempted to stand up for the first time, recovered the standing position, and started to walk were also recorded. The hypnotic dose for ketamine was 9.82 +/- 3.02 (6.86-16.5) mg/kg and for S(+)-ketamine was 7.76 +/- 2.17 (5.86-11.5) mg/kg. The time of hypnosis was longer in R3, and the first attempt to stand up occurred earlier in R6, when compared with S3 and S6, respectively. When R9 (100% hypnosis with ketamine) and S6 [100% hypnosis with S(+)-ketamine] were compared (1:1.5 ratio), the time to sternal position (12 +/- 2.5 and 20.2 +/- 5.6 min, respectively) and the total recovery time (45 +/- 5.5 and 60.2 +/- 5.2 min, respectively) were significantly shorter with S(+)-ketamine. It was concluded that the potency ratio between ketamine and S(+)-ketamine in dogs is smaller than that reported in other species, and that the dose obtained after a 50% reduction, as usually performed in humans, would not be enough to obtain equianesthetic effects in dogs.
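The ED100 procedure described above amounts to a linear regression of the hypnosis percentage on log dose, solved for the dose giving a 100% response. A sketch with invented response percentages (illustrative only, not the study's data):

```python
import numpy as np

# Hypothetical percentage of dogs hypnotized at each dose (mg/kg)
doses = np.array([2, 2.5, 3, 6, 9, 12])
pct_ketamine = np.array([0, 10, 20, 55, 100, 100])
pct_sketamine = np.array([0, 15, 30, 100, 100, 100])

def ed100(doses, pct):
    """Regress response on log10(dose) and solve for the 100% response dose."""
    slope, intercept = np.polyfit(np.log10(doses), pct, 1)
    return 10 ** ((100 - intercept) / slope)

# Potency ratio: how much more racemic ketamine is needed than S(+)-ketamine
potency_ratio = ed100(doses, pct_ketamine) / ed100(doses, pct_sketamine)
```

With the invented data above, the S(+) enantiomer reaches full response at lower doses, so its ED100 is smaller and the potency ratio exceeds 1.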
Resumo:
To evaluate the relative development of the live weight (LW) components, the commercial cuts, and the carcass tissues, 40 Saanen kids were used. The animals were slaughtered upon reaching 5.0, 12.5, 20.0, 27.5, and 35.0 kg of LW, and the carcass was sectioned into shoulder, neck, 1st to 5th ribs, 6th to 13th ribs, breast/flank, loin, and leg. The leg was dissected into bone, muscle, and fat. The allometric equation Y = aX^b was used to estimate relative development. Bone tissue showed early growth, muscle tissue intermediate growth, and fat late growth, since subcutaneous fat is deposited later. The commercial cuts showed isogonic allometric coefficients, with the exception of the 6th to 13th ribs and the breast/flank. The development of the carcass and of the non-carcass components followed the empty body weight. Kids at 35 kg of LW have an adequate muscle proportion and muscle:bone ratio, but show a higher fat proportion than that observed in animals slaughtered at 20 kg of LW.
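The allometric equation Y = aX^b is usually estimated by linearizing it to log Y = log a + b log X and fitting ordinary least squares. A minimal sketch with synthetic tissue weights generated from known coefficients (not the study's measurements):

```python
import numpy as np

# Hypothetical tissue weight Y (kg) at the study's slaughter weights X (kg),
# generated from a known allometric law for illustration
a_true, b_true = 0.05, 1.3      # b > 1: late-developing tissue (e.g. fat)
X = np.array([5.0, 12.5, 20.0, 27.5, 35.0])
Y = a_true * X ** b_true

# Linearize: log Y = log a + b * log X, then ordinary least squares
b_hat, log_a_hat = np.polyfit(np.log(X), np.log(Y), 1)
a_hat = np.exp(log_a_hat)

# Interpretation of the allometric coefficient:
#   b > 1  late (tardio) growth, b = 1 isogonic, b < 1 early (precoce) growth
```

Because the synthetic data satisfy the model exactly, the fit recovers a and b to machine precision; with real dissection data the same code yields the estimated coefficients.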
Resumo:
Frequency Selective Surfaces (FSS) are periodic structures in one or two dimensions that act as spatial filters. They can be formed by elements of the conducting-patch or aperture type, functioning as band-stop or band-pass filters, respectively. Interest in the study of FSS has grown over the years because such structures meet specific requirements such as low cost and reduced dimensions and weight, in addition to the possibility of integration with other microwave circuits. The most varied applications of such structures have been investigated, for example radomes, antenna systems for airplanes, electromagnetic filters for reflector antennas, and absorber structures. Several methods have been used for the analysis of FSS, among them the Wave Concept Iterative Procedure (WCIP). Various element shapes can be used in FSS, for example fractal elements, which present a relative geometric complexity. The main objective of this work is to propose a geometric simplification procedure for a fractal FSS, based on the analysis of the influence of details (gaps) of its geometry on the behavior of the resonance frequency. In addition, a simple method is shown for adjusting the resonance frequency through the analysis of an FSS that uses a square basic cell into which two reentrances are inserted; the dimensions of these reentrances are varied, making it possible to adjust the frequency. In both cases, the structures are analyzed numerically using the WCIP and later characterized experimentally, and the results obtained are compared. The influence of the electric and magnetic fields is also evaluated, the latter through the electric current density vector. A bibliographic study of the subject is presented, along with suggestions for the continuation of this work.
Resumo:
The petroleum industry, as a consequence of intense exploration and production activity, is responsible for a great part of the generation of residues that are considered toxic and polluting to the environment. Among these is oil sludge, produced during the production, transportation, and refining phases. The purpose of this work was to develop a process to recover the oil present in oil sludge, in order to use the recovered oil as fuel or return it to the refining plant. From preliminary tests, the most important independent variables were identified: temperature, contact time, and solvent and acid volumes. Initially, a series of parameters was determined to characterize the oil sludge. A special extractor was designed to work with oily waste. Two experimental designs were applied: fractional factorial and Doehlert. The tests were carried out in batch mode under the conditions of the experimental designs applied. The efficiency obtained in the oil extraction process was 70% on average. The oil sludge is composed of 36.2% oil, 16.8% ash, 40% water, and 7% volatile constituents. However, the statistical analysis showed that the quadratic model did not fit the process well, with a relatively low determination coefficient (60.6%). This occurred due to the complexity of the oil sludge. To obtain a model able to represent the experiments, an artificial neural network (ANN) model was used, generated initially with 2, 4, 5, 6, 7, and 8 neurons in the hidden layer, 64 experimental results, and 10,000 presentations (iterations). The smallest dispersions between the experimental and calculated values were verified using 4 neurons, considering the proportion of experimental points to estimated parameters. The analysis of the average deviations of the test set divided by the respective training set showed that 2,150 presentations resulted in the best parameter values. For the new model, the determination coefficient was 87.5%, which is quite satisfactory for the studied system.
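The hidden-layer model described above can be sketched with a minimal one-hidden-layer network trained by batch gradient descent. The design points, the target function, and the 4-neuron tanh architecture below are illustrative stand-ins, not the study's fitted model:

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical design points: temperature, contact time, solvent and acid
# volumes (scaled to [0, 1]) mapped to an invented extraction efficiency
X = rng.random((64, 4))
y = (0.7 * X[:, 0] + 0.2 * X[:, 1] * X[:, 2] + 0.1 * X[:, 3]).reshape(-1, 1)

# One hidden layer with 4 tanh neurons, linear output
W1 = rng.standard_normal((4, 4)) * 0.5
b1 = np.zeros((1, 4))
W2 = rng.standard_normal((4, 1)) * 0.5
b2 = np.zeros((1, 1))
lr = 0.1

losses = []
for _ in range(2000):                      # "presentations" of the data set
    H = np.tanh(X @ W1 + b1)               # hidden activations
    pred = H @ W2 + b2
    err = pred - y
    losses.append(float(np.mean(err**2)))
    # Backpropagation of the mean-squared-error gradient
    gW2 = H.T @ err / len(X)
    gb2 = err.mean(axis=0, keepdims=True)
    gH = (err @ W2.T) * (1 - H**2)
    gW1 = X.T @ gH / len(X)
    gb1 = gH.mean(axis=0, keepdims=True)
    W2 -= lr * gW2
    b2 -= lr * gb2
    W1 -= lr * gW1
    b1 -= lr * gb1
```

Tracking `losses` over the presentations is what allows the kind of model selection the abstract describes (choosing the neuron count and stopping point from training/test deviations).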
Resumo:
The wavelet transform is used to reduce the high frequency multipath of pseudorange and carrier phase GPS double differences (DDs). This transform decomposes the DD signal, thus separating the high frequencies due to multipath effects. After the decomposition, the wavelet shrinkage is performed by thresholding to eliminate the high frequency component. Then the signal can be reconstructed without the high frequency component. We show how to choose the best threshold. Although the high frequency multipath is not the main multipath error component, its correction provides improvements of about 30% in pseudorange average residuals and 24% in carrier phases. The results also show that the ambiguity solutions become more reliable after correcting the high frequency multipath.
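The decompose, threshold, reconstruct cycle described above can be sketched with a single-level Haar transform and soft thresholding. This is a minimal stand-in, with a synthetic signal, for the full wavelet shrinkage the paper applies to GPS double differences:

```python
import numpy as np

def haar_shrink(signal, threshold):
    """One-level Haar decomposition, soft-threshold the detail
    (high-frequency) coefficients, then reconstruct.
    Signal length must be even."""
    a = (signal[0::2] + signal[1::2]) / np.sqrt(2)   # approximation (low freq.)
    d = (signal[0::2] - signal[1::2]) / np.sqrt(2)   # detail (high freq.)
    d = np.sign(d) * np.maximum(np.abs(d) - threshold, 0.0)  # soft threshold
    out = np.empty_like(signal)
    out[0::2] = (a + d) / np.sqrt(2)
    out[1::2] = (a - d) / np.sqrt(2)
    return out

rng = np.random.default_rng(1)
t = np.arange(512)
dd = np.sin(2 * np.pi * t / 256)                # slowly varying DD residual
noisy = dd + 0.05 * rng.standard_normal(512)    # high-frequency "multipath"

# Universal threshold sigma * sqrt(2 log n) is one common threshold choice
thr = 0.05 * np.sqrt(2 * np.log(512))
clean = haar_shrink(noisy, thr)
```

Because the underlying signal is smooth, its detail coefficients are small; thresholding removes mostly the injected high-frequency component, so `clean` sits closer to `dd` than `noisy` does.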
Resumo:
The quality of the vertical distribution measurements of humidity in the atmosphere is very important in meteorology due to the crucial role that water vapor plays in the earth's energy budget. The radiosonde is the humidity measurement device that provides the best vertical resolution. Also, radiosondes are the operational devices that are used to measure the vertical profile of atmospheric water vapor. The World Meteorological Organization (WMO) has carried out several intercomparison experiments at different climatic zones in order to identify the differences between the available commercial sensors. This article presents the results of an experiment that was carried out in Brazil in 2001 in which major commercial radiosonde manufacturers [e.g., Graw Radiosondes GmbH & Co., KG (Germany); MODEM (France); InterMet Systems (United States); Sippican, Inc. (United States); and Vaisala (Finland)] were involved. One of the main goals of this experiment was to evaluate the performance of the different humidity sensors in a tropical region. This evaluation was performed for different atmospheric layers and distinct periods of the day. It also considers the computation of the integrated water vapor (IWV). The results showed that the humidity measurements achieved by the different sensors were quite similar in the low troposphere (the bias median value regarding the RS80 was around 1.8%) and were quite dispersed in the superior layers (the median rms regarding the RS80 was around 14.9%).
Resumo:
Considering that the percentage of CD4 T lymphocytes can add prognostic information for patients infected with HIV, the aim of this study was to evaluate the percentage values of CD4+ T lymphocytes from 81 patients determined by flow cytometry and estimated by flow cytometry in conjunction with a hematology counter. Means were compared using Student's t-test. Pearson's correlation was determined, and the agreement between results was tested by the Bland-Altman method. The level of significance was P < 0.05. A significantly higher mean difference between the relative values of CD4+ T lymphocytes was found for the hematology counter (P < 0.05) in all strata studied. Positive and significant correlations (P < 0.01) were found for the strata CD4 < 200 cells/mL (r = 0.93), between 200 and 500 cells/mL (r = 0.65), and > 500 cells/mL (r = 0.81). The limits of agreement were 1.0 +/- 3.8% for the stratum CD4 < 200 cells/mL, approximately 2.2 +/- 13.5% for the stratum CD4 between 200 and 500 cells/mL, and approximately 6.2 +/- 20.4% for the stratum > 500 cells/mL. The differences in the percentages of CD4+ T lymphocytes obtained by the different methodologies could lead to conflict when used in clinical decisions related to the treatment and care of people infected with HIV.
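The Bland-Altman agreement analysis used above reduces to the bias (mean paired difference) and its 95% limits of agreement. A sketch with hypothetical paired CD4+ percentages (not the study's patient data):

```python
import numpy as np

def bland_altman(method_a, method_b):
    """Return the bias (mean difference) and the 95% limits of agreement
    between two measurement methods applied to the same subjects."""
    diff = np.asarray(method_a) - np.asarray(method_b)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired CD4+ percentages: flow cytometry alone vs.
# flow cytometry combined with a hematology counter
flow = np.array([12.0, 18.5, 25.0, 31.2, 40.8, 22.4])
combined = np.array([11.1, 17.0, 24.2, 29.0, 38.5, 21.0])

bias, (lower, upper) = bland_altman(flow, combined)
```

A positive bias with limits of agreement that widen in the higher strata is exactly the pattern the abstract reports, and is what flags the two methods as not interchangeable for clinical decisions.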