912 results for Data distribution
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Graduate Program in Oral Rehabilitation - FOAR
Abstract:
Graduate Program in Agronomy (Energy in Agriculture) - FCA
Abstract:
To characterize the frequency distribution of reference evapotranspiration in the region of Piracicaba, SP, and to evaluate the usual practices for estimating water requirements for the purpose of sizing irrigation systems, 30 years of evapotranspiration data for the month of September were used, with the month divided into periods of 5, 10, 15 and 30 days. The Beta and Normal distributions were fitted to the data and both proved suitable to represent them. For occurrence probabilities of 60% or higher, the reference evapotranspiration value was found to increase as the period length decreased. Adopting as parameters a maximum water-demand period of 2 to 3 weeks and the reference evapotranspiration at the 75% probability level, it was found that using the monthly mean evapotranspiration to size irrigation systems leads to undersizing, whereas adopting the maximum daily evapotranspiration value leads to oversizing.
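A minimal sketch of the fitting step described above, assuming hypothetical September reference evapotranspiration (ETo) data: both the Normal and Beta distributions are fitted with scipy and the design value at the 75% probability level is read off each fit. All data values are illustrative.

```python
# Fit Normal and Beta distributions to period-mean ETo and read off the
# 75% probability design value, as in the criterion described above.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
eto = rng.normal(loc=4.0, scale=0.6, size=30)  # 30 years of September ETo, mm/day (illustrative)

# Normal fit and its 75% probability value.
mu, sigma = stats.norm.fit(eto)
eto_norm_75 = stats.norm.ppf(0.75, loc=mu, scale=sigma)

# The Beta distribution lives on [0, 1], so rescale to a slightly padded
# observed range first (padding keeps observations off the boundary).
lo, hi = eto.min() - 0.1, eto.max() + 0.1
a, b, _, _ = stats.beta.fit((eto - lo) / (hi - lo), floc=0, fscale=1)
eto_beta_75 = lo + (hi - lo) * stats.beta.ppf(0.75, a, b)

print(f"Normal 75% design ETo: {eto_norm_75:.2f} mm/day")
print(f"Beta   75% design ETo: {eto_beta_75:.2f} mm/day")
```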
Abstract:
The Box-Cox transformation is a technique commonly used to make the probability distribution of time series data approximately normal, which helps statistical and neural models produce more accurate forecasts. However, it introduces a bias when the transformation is reversed on the predicted data. Statistical methods for bias-free reversion necessarily assume that the transformed data distribution is Gaussian, which is rarely the case in real-world time series. The aim of this study was therefore to provide an effective method for removing the bias introduced when the Box-Cox transformation is reversed. The developed method is based on a focused time-lagged feedforward neural network, which requires no assumption about the transformed data distribution. To evaluate the performance of the proposed method, numerical simulations were conducted; the Mean Absolute Percentage Error, the Theil Inequality Index and the Signal-to-Noise Ratio of 20-step-ahead forecasts of 40 time series were compared, and the results indicate that the proposed reversion method is valid and justifies further studies. (C) 2014 Elsevier B.V. All rights reserved.
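A minimal sketch of the problem the study addresses (not of its neural reversion method): for skewed data, naively inverting a forecast made in Box-Cox space yields a biased estimate of the original-scale mean. The lognormal series and all parameters below are assumptions for illustration.

```python
# Demonstrate the bias introduced by naive reversion of the Box-Cox transformation.
import numpy as np
from scipy import stats, special

rng = np.random.default_rng(0)
y = rng.lognormal(mean=1.0, sigma=0.5, size=5000)  # skewed "time series" data

z, lam = stats.boxcox(y)                  # forward transform, lambda fitted by MLE
z_pred = z.mean()                         # stand-in for a forecast in transformed space

naive = special.inv_boxcox(z_pred, lam)   # naive reversion of the point forecast
target = y.mean()                         # the original-scale mean we want

print(f"lambda = {lam:.3f}")
print(f"naive reversion: {naive:.3f}  vs  target mean: {target:.3f}")
# For right-skewed data the naive reversion is biased low (Jensen's inequality).
```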
Abstract:
This study aimed to analyze the scientific literature on information management and knowledge management in the most relevant journals in the Information Science field, noting their relevance and impact on the field. It is a qualitative study that examined the literature on this subject published in online journals in the Information Science area classified in the Qualis system of the Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES), specifically those assessed at levels 'A' and 'B', covering a total of 26 journal titles. We applied Bradford's Law to the analysis of this scientific production, especially the distribution of articles in terms of proximity or distance variables. We observed that the terms 'information management' and 'knowledge management' are contemporary and have gained relevance over time. We also found that the general rule that few produce much and many produce little holds, given the characteristics of the data distribution studied.
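A minimal sketch of a Bradford's Law analysis, with hypothetical article counts per journal: ranking journals by productivity and splitting them into three zones that each hold a third of the articles should yield zone sizes of roughly 1 : n : n².

```python
# Split ranked journals into Bradford zones of equal article counts.
import numpy as np

# Hypothetical article counts per journal, ranked from most to least productive.
counts = np.array([100, 30, 28, 22, 20, 15, 12, 10, 9, 8, 8,
                   7, 6, 5, 5, 4, 3, 3, 2, 2, 1])

third = counts.sum() / 3
cum = np.cumsum(counts)
cuts = np.searchsorted(cum, [third, 2 * third]) + 1  # journals per cumulative third

z1, z2, z3 = cuts[0], cuts[1] - cuts[0], len(counts) - cuts[1]
print(f"zone sizes: {z1} : {z2} : {z3} journals")  # here 1 : 4 : 16, i.e. 1 : n : n^2
```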
Abstract:
Ankle sprains are among the most common injuries in sports and in basketball. In this context, the use of ankle bracing and the analysis of the strength capacity of the ankle evertor and invertor muscles have been suggested as preventive measures and as important tools for identifying risk factors associated with ankle sprains. However, questions persist as to the effect of ankle bracing on biomechanical variables related to ankle stability. For this reason, this study aimed to analyze the effect of ankle bracing on the peak torque (PT) of the ankle evertor and invertor muscles and on the eccentric evertor/concentric invertor torque ratio (EVEECC/INVCON) during a simulated basketball match. Ten healthy college basketball players without mechanical or functional ankle instability performed a laboratory-based protocol representative of the work rates observed during basketball match-play, in two situations: with and without ankle bracing. The test consisted of a succession of intermittent physical efforts equally distributed over four periods of 10 minutes each, reflecting the mechanical and physiological demands of a basketball match. Before the start of the trial (Evaluation 1) and after the 2nd (Evaluation 2) and 4th (Evaluation 3) periods, the subjects performed five maximal isokinetic concentric and eccentric contractions of the ankle invertor and evertor muscles, separated by two minutes of rest, at 60°/s and 120°/s. After testing the normality of the data distribution with the Shapiro-Wilk test, a two-factor repeated-measures ANOVA with post-hoc Bonferroni tests was used to compare variables between assessments, with p < 0.05 adopted as the significance level. There was no significant difference in PT or in the EVEECC/INVCON torque ratio between assessments. There was a decrease in EVEECC PT at 60°/s and 120°/s for the ... (complete abstract available via the electronic access below)
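A minimal sketch of the statistical pipeline named in the abstract: a Shapiro-Wilk normality check followed by a two-factor repeated-measures ANOVA, here via statsmodels' AnovaRM as a stand-in for whatever package the authors used. Subject counts, factor levels and torque values are simulated assumptions.

```python
# Shapiro-Wilk normality test, then two-factor repeated-measures ANOVA.
import numpy as np
import pandas as pd
from scipy import stats
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(1)
rows = [
    {"subject": s, "evaluation": e, "brace": b,
     "peak_torque": rng.normal(40, 5)}  # Nm, illustrative
    for s in range(10) for e in (1, 2, 3) for b in ("with", "without")
]
df = pd.DataFrame(rows)

# Normality of the data distribution (Shapiro-Wilk).
w, p = stats.shapiro(df["peak_torque"])
print(f"Shapiro-Wilk: W = {w:.3f}, p = {p:.3f}")

# Repeated-measures ANOVA with two within-subject factors (evaluation x brace).
res = AnovaRM(df, depvar="peak_torque", subject="subject",
              within=["evaluation", "brace"]).fit()
print(res)
```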
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
We introduce a new kind of likelihood function based on the sequence of moments of the data distribution. Both binned and unbinned data samples are discussed, and the multivariate case is also derived. Building on this approach we lay out the formalism of shape analysis for signal searches. In addition to moment-based likelihoods, standard likelihoods and approximate statistical tests are provided. Enough material is included to make the paper self-contained from the perspective of shape analysis. We argue that the moment-based likelihoods can advantageously replace unbinned standard likelihoods for the search of nonlocal signals, by avoiding the step of fitting Monte Carlo generated distributions. This benefit increases with the number of variables simultaneously analyzed. The moment-based signal search is exemplified and tested in various 1D toy models mimicking typical high-energy signal-background configurations. Moment-based techniques should be particularly appropriate for the searches for effective operators at the LHC.
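A minimal toy sketch of the idea, under assumptions of my own (a 1D exponential background with a small Gaussian bump, four moments): sample moments of the data are compared with their background-only expectation through a Gaussian likelihood, yielding a chi-square-like statistic without fitting Monte Carlo distributions bin by bin.

```python
# Toy moment-based test statistic for a 1D signal search.
import numpy as np

rng = np.random.default_rng(7)
K = 4  # number of moments used

def moments(x, K):
    return np.array([np.mean(x ** k) for k in range(1, K + 1)])

# Background-only expectation and covariance of the sample moments,
# estimated from Monte Carlo pseudo-experiments.
toys = np.array([moments(rng.exponential(1.0, size=500), K) for _ in range(2000)])
mu_bkg, cov = toys.mean(axis=0), np.cov(toys, rowvar=False)

# "Observed" data: background plus a small nonlocal signal bump.
data = np.concatenate([rng.exponential(1.0, size=475),
                       rng.normal(2.5, 0.2, size=25)])
d = moments(data, K) - mu_bkg

# -2 log of the Gaussian moment likelihood (chi-square-like statistic).
q = d @ np.linalg.solve(cov, d)
print(f"moment-based test statistic: q = {q:.1f} for K = {K} moments")
```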
Abstract:
Background: Caesarean section rates in Brazil have been steadily increasing. In 2009, for the first time, the number of children born by this procedure was greater than the number of vaginal births. Caesarean section is associated with a series of adverse effects on women and newborns, and recent evidence suggests that the increasing rates of prematurity and low birth weight in Brazil are associated with the increasing rates of Caesarean section and labour induction. Methods: Nationwide hospital-based cohort study of postnatal women and their offspring with follow-up at 45 to 60 days after birth. The sample was stratified by geographic macro-region, type of municipality and type of hospital governance. The number of postnatal women sampled was 23,940, distributed across 191 municipalities throughout Brazil. Two electronic questionnaires were applied to the postnatal women: a baseline face-to-face interview and a follow-up telephone interview. Two other questionnaires were filled in with information from patients' medical records and to assess hospital facilities. The primary outcome was the percentage of Caesarean sections (total, elective and according to Robson's groups). Secondary outcomes were: post-partum pain; breastfeeding initiation; severe/near-miss maternal morbidity; reasons for maternal mortality; prematurity; low birth weight; use of oxygen after birth and mechanical ventilation; admission to neonatal ICU; stillbirths; neonatal mortality; hospital readmission; use of surfactant; asphyxia; and severe/near-miss neonatal morbidity. The associations between variables were investigated using bivariate, stratified and multivariate model analyses. Statistical tests were chosen according to the data distribution and the homogeneity of variances of the groups to be compared. All analyses took the complex sample design into account. Discussion: This study, for the first time, depicts a national panorama of labour and birth outcomes in Brazil. Regardless of socioeconomic level, demand for Caesarean section appears to be based on the belief that the quality of obstetric care is closely associated with the technology used in labour and birth. Within this context, a nationwide study was justified to understand the reasons that lead pregnant women to submit to Caesarean sections and to verify any association between this type of birth and its consequences for postnatal health.
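A minimal sketch of how a test can be chosen "according to the data distribution and the homogeneity of variances", as the Methods state; the thresholds, the simulated groups, and the specific tests (Shapiro-Wilk, Levene, t-test, Mann-Whitney) are illustrative assumptions, and the complex survey-design weighting is omitted.

```python
# Choose a two-group comparison test from normality and variance checks.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
group_a = rng.normal(3200, 450, size=120)  # e.g. birth weight in grams (simulated)
group_b = rng.normal(3050, 500, size=130)

normal = all(stats.shapiro(g).pvalue > 0.05 for g in (group_a, group_b))
equal_var = stats.levene(group_a, group_b).pvalue > 0.05

if normal:
    # Student's t-test if variances are homogeneous, Welch's otherwise.
    test = stats.ttest_ind(group_a, group_b, equal_var=equal_var)
else:
    # Nonparametric fallback when normality is rejected.
    test = stats.mannwhitneyu(group_a, group_b)

print(f"normal={normal}, equal_var={equal_var}, p={test.pvalue:.4f}")
```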
Abstract:
Background: Regardless of the regulatory function of microRNAs (miRNAs), their differential expression pattern has been used to define miRNA signatures and to disclose disease biomarkers. To address the question of whether patients presenting the different types of diabetes mellitus can be distinguished on the basis of their miRNA and mRNA expression profiles, we obtained peripheral blood mononuclear cell (PBMC) RNA from 7 type 1 (T1D), 7 type 2 (T2D), and 6 gestational diabetes (GDM) patients, which was hybridized to Agilent miRNA and mRNA microarrays. Data quantification and quality control were performed with the Feature Extraction software, and the data distribution was normalized using the quantile function implemented in the aroma.light package. Differentially expressed miRNAs/mRNAs were identified using Rank products, comparing T1D x GDM, T2D x GDM and T1D x T2D. Hierarchical clustering was performed using the average linkage criterion with Pearson uncentered distance as the metric. Results: The use of the same microarray platform permitted the identification of sets of shared or specific miRNA/mRNA interactions for each type of diabetes. Nine miRNAs (hsa-miR-126, hsa-miR-1307, hsa-miR-142-3p, hsa-miR-142-5p, hsa-miR-144, hsa-miR-199a-5p, hsa-miR-27a, hsa-miR-29b, and hsa-miR-342-3p) were shared among T1D, T2D and GDM, and additional specific miRNAs were identified for T1D (20 miRNAs), T2D (14) and GDM (19) patients. ROC curves allowed the identification of specific and relevant (greater AUC values) miRNAs for each type of diabetes, including: i) hsa-miR-1274a, hsa-miR-1274b and hsa-let-7f for T1D; ii) hsa-miR-222, hsa-miR-30e and hsa-miR-140-3p for T2D; and iii) hsa-miR-181a and hsa-miR-1268 for GDM. Many of these miRNAs targeted mRNAs associated with diabetes pathogenesis. Conclusions: These results indicate that PBMCs can be used as reporter cells to characterize the miRNA expression profiling disclosed by the different diabetes mellitus manifestations. Shared miRNAs may characterize diabetes as a metabolic and inflammatory disorder, whereas specific miRNAs may represent biological markers for each type of diabetes, deserving further attention.
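A minimal sketch of the quantile normalization step named above, on a toy genes-by-samples matrix (the study used the implementation in the aroma.light package; this stand-alone version is for illustration only).

```python
# Quantile normalization: force every sample (column) to share one distribution.
import numpy as np

def quantile_normalize(X):
    order = np.argsort(X, axis=0)          # per-sample ranks
    ref = np.sort(X, axis=0).mean(axis=1)  # reference distribution (mean of sorted columns)
    out = np.empty_like(X, dtype=float)
    for j in range(X.shape[1]):
        out[order[:, j], j] = ref          # assign reference values by rank
    return out

X = np.array([[5.0, 4.0, 3.0],
              [2.0, 1.0, 4.0],
              [3.0, 4.0, 6.0],
              [4.0, 2.0, 8.0]])            # toy genes x samples matrix
print(quantile_normalize(X))
```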
Abstract:
Many research fields are pushing the engineering of large-scale, mobile, and open systems towards the adoption of techniques inspired by self-organisation: pervasive computing, but also distributed artificial intelligence, multi-agent systems, social networks, peer-to-peer and grid architectures exploit adaptive techniques to make global system properties emerge in spite of the unpredictability of interactions and behaviour. Such a trend is also visible in coordination models and languages, whenever a coordination infrastructure needs to cope with managing interactions in highly dynamic and unpredictable environments. As a consequence, self-organisation can be regarded as a feasible metaphor for defining a radically new conceptual coordination framework. The resulting framework defines a novel coordination paradigm, called self-organising coordination, based on the idea of spreading coordination media over the network and charging them with services that manage interactions based on local criteria, so that desired and fruitful global coordination properties of the system emerge. Features like topology, locality, time-reactiveness, and stochastic behaviour play a key role both in the definition of this conceptual framework and in the consequent development of self-organising coordination services. Following this framework, the thesis presents several self-organising coordination techniques developed during the PhD course, mainly concerning data distribution in tuple-space-based coordination systems. Some of these techniques have also been implemented in ReSpecT, a coordination language for tuple spaces based on logic tuples and reactions to events occurring in a tuple space. In addition, the key role played by simulation and formal verification has been investigated, analysing how automatic verification techniques such as probabilistic model checking can be exploited to formally prove the emergence of desired behaviours in coordination approaches based on self-organisation. To this end, a concrete case study is presented and discussed.
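A minimal toy sketch of the self-organising coordination idea (not ReSpecT itself): each tuple space reacts to an inserted tuple by probabilistically spreading copies to its neighbours, a purely local rule from which a global data distribution emerges. The class, topology and spreading probability are all assumptions for illustration.

```python
# Toy network of tuple spaces with a local, probabilistic spreading reaction.
import random

class TupleSpace:
    def __init__(self, name):
        self.name, self.tuples, self.neighbours = name, [], []

    def out(self, tup, spread_p=0.5):
        """Insert a tuple; a local reaction may replicate it to neighbours."""
        self.tuples.append(tup)
        for n in self.neighbours:
            if tup not in n.tuples and random.random() < spread_p:
                n.out(tup, spread_p)  # the same local rule fires downstream

# A small ring topology of coordination media.
nodes = [TupleSpace(f"ts{i}") for i in range(6)]
for i, node in enumerate(nodes):
    node.neighbours = [nodes[(i - 1) % 6], nodes[(i + 1) % 6]]

random.seed(4)
nodes[0].out(("sensor", "temp", 21.5))
print([f"{n.name}:{len(n.tuples)}" for n in nodes])  # emergent distribution
```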
Abstract:
The wide diffusion of cheap, small, and portable sensors integrated in an unprecedentedly large variety of devices, together with the availability of almost ubiquitous Internet connectivity, makes it possible to collect an unprecedented amount of real-time information about the environment we live in. These data streams, if properly and timely analyzed, can be exploited to build new intelligent and pervasive services with the potential to improve people's quality of life in a variety of cross-cutting domains such as entertainment, health care, or energy management. The large heterogeneity of application domains, however, calls for a middleware-level infrastructure that can effectively support their different quality requirements. In this thesis we study the challenges related to the provisioning of differentiated quality of service (QoS) during the processing of data streams produced in pervasive environments. We analyze the trade-offs between guaranteed quality, cost, and scalability in stream distribution and processing by surveying existing state-of-the-art solutions and identifying and exploring their weaknesses. We propose an original model for QoS-centric distributed stream processing in data centers and present Quasit, its prototype implementation, offering a scalable and extensible platform that researchers can use to implement and validate novel QoS-enforcement mechanisms. To support our study, we also explore an original class of weaker quality guarantees that can reduce costs when application semantics do not require strict quality enforcement. We validate the effectiveness of this idea in a practical use-case scenario that investigates partial fault-tolerance policies in stream processing, through a large experimental study on the prototype of our novel LAAR dynamic replication technique. Our modeling, prototyping, and experimental work demonstrates that, by providing the data distribution and processing middleware with application-level knowledge of the different quality requirements associated with different pervasive data flows, it is possible to improve system scalability while reducing costs.
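A minimal sketch of a generic partial fault-tolerance policy of the kind investigated above (this is not the LAAR algorithm itself): replicate only the stream operators whose loss would hurt result quality most, subject to a resource budget. Operator names, importance weights and the budget are assumptions.

```python
# Greedy partial replication: protect high importance-per-cost operators first.
from dataclasses import dataclass

@dataclass
class Operator:
    name: str
    importance: float  # contribution of this operator to result quality
    cost: float        # extra resources needed to run a replica

def choose_replicas(operators, budget):
    chosen, spent = [], 0.0
    for op in sorted(operators, key=lambda o: o.importance / o.cost, reverse=True):
        if spent + op.cost <= budget:
            chosen.append(op.name)
            spent += op.cost
    return chosen

ops = [Operator("filter", 0.9, 1.0), Operator("join", 0.8, 3.0),
       Operator("aggregate", 0.6, 2.0), Operator("enrich", 0.2, 2.5)]
print(choose_replicas(ops, budget=4.0))  # -> ['filter', 'aggregate']
```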
Abstract:
Exposimeters are increasingly applied in bioelectromagnetic research to determine personal radiofrequency electromagnetic field (RF-EMF) exposure. The main advantages of exposimeter measurements are their convenient handling for study participants and the large amount of personal exposure data, which can be obtained for several RF-EMF sources. However, the large proportion of measurements below the detection limit is a challenge for data analysis. With the robust ROS (regression on order statistics) method, summary statistics can be calculated by fitting an assumed distribution to the observed data. We used a preliminary sample of 109 weekly exposimeter measurements from the QUALIFEX study to compare summary statistics computed by robust ROS with a naïve approach, where values below the detection limit were replaced by the value of the detection limit. For the total RF-EMF exposure, differences between the naïve approach and the robust ROS were moderate for the 90th percentile and the arithmetic mean. However, exposure contributions from minor RF-EMF sources were considerably overestimated with the naïve approach. This results in an underestimation of the exposure range in the population, which may bias the evaluation of potential exposure-response associations. We conclude from our analyses that summary statistics of exposimeter data calculated by robust ROS are more reliable and more informative than estimates based on a naïve approach. Nevertheless, estimates of source-specific medians or even lower percentiles depend on the assumed data distribution and should be considered with caution.
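A minimal sketch contrasting the naive substitution of non-detects with a simple regression-on-order-statistics (ROS) estimate, assuming a lognormal exposure distribution and a simulated data set; the QUALIFEX analysis used the robust ROS method, of which this is only a bare-bones approximation.

```python
# Naive substitution vs. a simple lognormal ROS estimate for non-detects.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
true = rng.lognormal(mean=-2.0, sigma=1.0, size=200)  # exposures (arbitrary units)
dl = 0.05                                             # detection limit
detected = true[true >= dl]
n, k = len(true), len(detected)

# Naive approach: replace every non-detect with the detection limit.
naive = np.concatenate([detected, np.full(n - k, dl)])

# ROS-style approach: regress log-detects on normal quantiles of their
# plotting positions, then impute the censored observations from the fit.
pp_det = (np.arange(n - k + 1, n + 1) - 0.375) / (n + 0.25)
slope, icept, *_ = stats.linregress(stats.norm.ppf(pp_det), np.log(np.sort(detected)))
pp_cens = (np.arange(1, n - k + 1) - 0.375) / (n + 0.25)
ros = np.concatenate([detected, np.exp(icept + slope * stats.norm.ppf(pp_cens))])

print(f"true mean {true.mean():.4f}  naive {naive.mean():.4f}  ROS {ros.mean():.4f}")
```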
Abstract:
Free-space optical (FSO) communication links can experience extreme signal degradation due to atmospheric-turbulence-induced spatial and temporal irradiance fluctuations (scintillation) in the laser wavefront. In addition, turbulence can cause the laser beam centroid to wander, resulting in power fading and sometimes complete loss of the signal. Beam spreading and jitter are also artifacts of atmospheric turbulence. To accurately predict the signal fading that occurs in a laser communication system, and to get a true picture of how this affects crucial performance parameters like bit error rate (BER), it is important to analyze the probability density function (PDF) of the integrated irradiance fluctuations at the receiver. In addition, it is desirable to find a theoretical distribution that accurately models these fluctuations under all propagation conditions. The PDF of integrated irradiance fluctuations is calculated from numerical wave-optics simulations of a laser propagated through atmospheric turbulence, to investigate the evolution of the distribution as the aperture diameter is increased. The simulation data distribution is compared to theoretical gamma-gamma and lognormal PDF models under a variety of scintillation regimes, from weak to very strong. Our results show that the gamma-gamma PDF provides a good fit to the simulated data distribution for all aperture sizes studied, from weak through moderate scintillation. In strong scintillation, the gamma-gamma PDF is a better fit to the distribution for point-like apertures, and the lognormal PDF is a better fit for apertures the size of the atmospheric spatial coherence radius ρ0 or larger. In addition, the PDF of received power from a Gaussian laser beam that has been adaptively compensated at the transmitter before propagation to the receiver of an FSO link in the moderate scintillation regime is investigated. The complexity of the adaptive optics (AO) system is increased in order to investigate the changes in the distribution of the received power and how this affects the BER. For the 10 km link, due to the non-reciprocal nature of the propagation path, the optimal beam to transmit is unknown. These results show that a low level of AO complexity provides a better estimate of the optimal beam to transmit than a higher order for non-reciprocal paths. For the 20 km link distance it was found that, although the gains were minimal, all AO complexity levels provided an equivalent improvement in BER, and that no AO complexity level provided the correction needed for the optimal beam to transmit. Finally, the temporal power spectral density of received power from an FSO communication link is investigated. Simulated and experimental results for the coherence time calculated from the temporal correlation function are presented. Results for both simulation and experimental data show that the coherence time increases as the receiving-aperture diameter increases. For finite apertures the coherence time also increases as the communication link distance is increased. We conjecture that this is due to the increasing speckle size within the pupil plane of the receiving aperture for an increasing link distance.
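A minimal sketch of the model comparison described above, on synthetic irradiance samples: gamma-gamma samples are generated as the product of two Gamma variates, a lognormal PDF is fitted to them, and the two models are scored by log-likelihood. The parameters alpha and beta and the sample generation are assumptions, and the gamma-gamma likelihood is evaluated at the generating parameters rather than refitted.

```python
# Compare gamma-gamma and lognormal PDF models on synthetic irradiance data.
import numpy as np
from scipy import stats, special

def gamma_gamma_logpdf(I, alpha, beta):
    """Log of the gamma-gamma irradiance PDF (unit mean irradiance)."""
    z = 2 * np.sqrt(alpha * beta * I)
    return (np.log(2) + 0.5 * (alpha + beta) * np.log(alpha * beta)
            - special.gammaln(alpha) - special.gammaln(beta)
            + (0.5 * (alpha + beta) - 1) * np.log(I)
            + np.log(special.kv(alpha - beta, z)))

rng = np.random.default_rng(9)
alpha, beta = 4.0, 2.0  # large- and small-scale scintillation parameters
I = rng.gamma(alpha, 1 / alpha, 20000) * rng.gamma(beta, 1 / beta, 20000)

shape, loc, scale = stats.lognorm.fit(I, floc=0)
ll_logn = stats.lognorm.logpdf(I, shape, loc, scale).sum()
ll_gg = gamma_gamma_logpdf(I, alpha, beta).sum()
print(f"log-likelihood: gamma-gamma {ll_gg:.0f}, lognormal {ll_logn:.0f}")
```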