940 results for Transform statistics
Abstract:
The structural characteristics of raw coal and hydrogen peroxide (H2O2)-oxidized coals were investigated using scanning electron microscopy, X-ray diffraction (XRD), Raman spectroscopy, and Fourier transform infrared (FT-IR) spectroscopy. The results indicate that the aromaticity of the H2O2-oxidized derivative coals improves noticeably, first increasing and then decreasing, with the highest aromaticity at 24 h. The stacking layer number of crystalline carbon decreases, and the aspect ratio (width versus stacking height) increases, with increasing oxidation time. The crystalline carbon content shows the same trend as the aromaticity measured by XRD. The hydroxyl bands of the oxidized coals become much stronger owing to an increase in soluble fatty acids and alcohols produced by oxidation of the aromatic and aliphatic C-H bonds. In addition, the derivative coals display first a decrease and then an increase in the intensity of the aliphatic C-H bond, and a diametrically opposite tendency in the aromatic C-H bonds, with increasing oxidation time. These changes agree well with the changes in aromaticity and crystalline carbon content measured by XRD and Raman spectroscopy. The particle size of the oxidized coals (<200 nm in width) decreases significantly compared with that of raw coal (~1 μm). This study reveals that the optimal oxidation time for improving the aromaticity and crystalline carbon content of H2O2-oxidized coals is ~24 h. The process can yield superfine crystalline carbon materials similar in structure to graphite.
Abstract:
The practice of statistics is the focus of the world in which professional statisticians live. To understand meaningfully what this practice is about, students need to engage in it themselves. Acknowledging the limitations of a genuine classroom setting, this study attempted to expose four classes of year 5 students (n = 91) to an authentic experience of the practice of statistics. Setting an overall context of people's habits that are considered environmentally friendly, the students sampled their class and set criteria for being environmentally friendly based on questions from the Australian Bureau of Statistics CensusAtSchool site. They then analysed the data and made decisions, acknowledging their degree of certainty, about three populations based on their criteria: their class, year 5 students in their school, and year 5 students in Australia. The next step was to collect a random sample the size of their class from an Australian Bureau of Statistics 'population', analyse it, and again make a decision about Australian year 5 students. At the end, they suggested what further research they might do. The analysis of students' responses gives insight into primary students' capacity to appreciate and understand decision making, and to participate in the practice of statistics, a topic that has received very little attention in the literature. Of a total possible score of 23 from student workbook entries, 80% of students achieved a score of at least 11.
Abstract:
Many statistical forecast systems are available to interested users. To be useful for decision-making, these systems must be based on evidence of underlying mechanisms. Once causal connections between the mechanism and its statistical manifestation have been firmly established, the forecasts must also provide some quantitative evidence of 'quality'. However, the quality of statistical climate forecast systems (forecast quality) is an ill-defined and frequently misunderstood property. Often, providers and users of such forecast systems are unclear about what 'quality' entails and how to measure it, leading to confusion and misinformation. Here we present a generic framework to quantify aspects of forecast quality using an inferential approach to calculate nominal significance levels (p-values), obtained either by directly applying non-parametric statistical tests such as Kruskal-Wallis (KW) or Kolmogorov-Smirnov (KS) or by using Monte Carlo methods (in the case of forecast skill scores). Once converted to p-values, these forecast quality measures provide a means to objectively evaluate and compare temporal and spatial patterns of forecast quality across datasets and forecast systems. Our analysis demonstrates the importance of providing p-values rather than adopting arbitrarily chosen significance levels such as p < 0.05 or p < 0.01, which is still common practice. This is illustrated by applying non-parametric tests (KW and KS) and skill-scoring methods (LEPS and RPSS) to the 5-phase Southern Oscillation Index classification system using historical rainfall data from Australia, the Republic of South Africa and India. The selection of quality measures is based solely on their common use and does not constitute endorsement. We found that non-parametric statistical tests can be adequate proxies for skill measures such as LEPS or RPSS.
The framework can be implemented anywhere, regardless of dataset, forecast system or quality measure. Eventually such inferential evidence should be complemented by descriptive statistical methods in order to fully assist in operational risk management.
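The Monte Carlo route to nominal p-values mentioned in this abstract can be illustrated with a small permutation test. The data below are synthetic, and the two "SOI phase" groups, sample sizes and gamma parameters are illustrative assumptions, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic seasonal rainfall (mm) for two hypothetical SOI phases.
phase_a = rng.gamma(2.0, 40.0, size=80)
phase_b = rng.gamma(2.0, 55.0, size=80)

def perm_pvalue(x, y, n_perm=5000, rng=rng):
    """Monte Carlo p-value for the difference in means under the null
    hypothesis that x and y are drawn from a single population."""
    observed = abs(x.mean() - y.mean())
    pooled = np.concatenate([x, y])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = abs(pooled[:x.size].mean() - pooled[x.size:].mean())
        count += diff >= observed
    # Add-one correction keeps the nominal p-value strictly positive.
    return (count + 1) / (n_perm + 1)

p = perm_pvalue(phase_a, phase_b)
print(f"Monte Carlo p-value: {p:.4f}")
```

Reporting the p-value itself, rather than only "significant at p < 0.05", preserves the graded evidence that the abstract argues for.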
Abstract:
In this paper, we generalize the existing rate-one space-frequency (SF) and space-time frequency (STF) code constructions. The objective of this exercise is to provide a systematic design of full-diversity STF codes with high coding gain. Under this generalization, STF codes are formulated as linear transformations of data. Conditions on these linear transforms are then derived so that the resulting STF codes achieve full diversity and high coding gain with moderate decoding complexity. Many of these conditions involve channel parameters like delay profile (DP) and temporal correlation. When these quantities are not available at the transmitter, the design of codes that exploit full diversity on channels with arbitrary DP and temporal correlation is considered. A complete characterization of a class of such robust codes is provided and their bit error rate (BER) performance is evaluated. On the other hand, when the channel DP and temporal correlation are available at the transmitter, the linear transforms are optimized to maximize the coding gain of full-diversity STF codes. The BER performance of such optimized codes is shown to be better than that of existing codes.
Abstract:
The power of projects has been demonstrated by the growth in their use across an increasing range of industries and workplaces in recent years. Not only has the number of people involved in project management increased, but the qualifications and backgrounds of those people have also broadened, with engineering no longer being the only path to project management (PM). Predicting career trajectories in PM has become more important both for organisations employing project managers and for project managers building career portfolios. Our research involved interviewing more than 75 project officers and project managers across a range of industries to explore their career journeys. We used Wittgenstein's family resemblance theory to analyse the interview transcripts and identify the extent to which participants' roles fit the commonly accepted definition of project management. Findings demonstrate the diversity of project manager backgrounds and experiences, and the relational competencies across these backgrounds that form and shape PM careers.
Abstract:
This paper presents the architecture and the VHDL design of an integer 2-D DCT used in H.264/AVC. The 2-D DCT computation is performed by exploiting its orthogonality and separability properties. The symmetry of the forward and inverse transforms is used in this implementation. To reduce the computation overhead of the addition, subtraction and multiplication operations, we analyze the suitability of a carry-free, position-independent residue number system (RNS) for the implementation of the 2-D DCT. The implementation has been carried out in VHDL for an Altera FPGA. We used the negative-number representation in RNS, bit-width analysis of the transforms, and the dedicated registers present in the logic elements of the FPGA to optimize the area. The complexity and efficiency analysis shows that the proposed architecture can provide higher throughput.
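The separability property this abstract exploits means the 2-D transform factors into two passes of a 1-D transform, Y = C·X·Cᵀ. A minimal numerical sketch using the standard H.264/AVC 4×4 forward integer core transform matrix (the RNS arithmetic and VHDL details of the paper are not reproduced here):

```python
import numpy as np

# H.264/AVC 4x4 forward integer core transform matrix (standard-defined).
C = np.array([[1,  1,  1,  1],
              [2,  1, -1, -2],
              [1, -1, -1,  1],
              [1, -2,  2, -1]])

def forward_2d(X):
    """Separable 2-D transform: a 1-D transform of the rows followed by
    a 1-D transform of the columns, i.e. Y = C @ X @ C.T."""
    return C @ X @ C.T

X = np.arange(16).reshape(4, 4)   # sample 4x4 residual block
Y = forward_2d(X)
```

Because the first row of C is all ones, Y[0, 0] is simply the sum of the block (the DC term), a handy sanity check for any hardware implementation.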
Abstract:
Climate variability and change are risk factors for climate-sensitive activities such as agriculture. Managing these risks requires "climate knowledge", i.e. a sound understanding of the causes and consequences of climate variability and knowledge of potential management options that are suitable in light of the climatic risks posed. Often such information about prognostic variables (e.g. yield, rainfall, run-off) is provided in probabilistic terms (e.g. via cumulative distribution functions, CDFs), and the quantitative assessment of alternative management options is based on such CDFs. Sound statistical approaches are needed to assess whether differences between such CDFs are intrinsic features of system dynamics or chance events (i.e. to quantify evidence against an appropriate null hypothesis). Statistical procedures that rely on such a hypothesis-testing framework are referred to as "inferential statistics", in contrast to descriptive statistics (e.g. mean, median, variance of population samples, skill scores). Here we report on extensions of some existing inferential techniques that provide more relevant and adequate information for decision making under uncertainty.
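Comparing two CDFs as described above can be sketched with the two-sample Kolmogorov-Smirnov statistic, the largest vertical gap between two empirical CDFs. The yield samples and the two "management option" labels below are synthetic illustrations, not data from the study:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic yield samples (t/ha) under two hypothetical management options.
option_a = rng.normal(3.0, 0.8, size=200)
option_b = rng.normal(3.4, 0.8, size=200)

def ks_statistic(x, y):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum absolute
    difference between the two empirical CDFs, evaluated on the pooled
    sample points."""
    grid = np.sort(np.concatenate([x, y]))
    cdf_x = np.searchsorted(np.sort(x), grid, side="right") / x.size
    cdf_y = np.searchsorted(np.sort(y), grid, side="right") / y.size
    return np.max(np.abs(cdf_x - cdf_y))

d = ks_statistic(option_a, option_b)
print(f"KS statistic: {d:.3f}")
```

A permutation (Monte Carlo) scheme over the pooled sample would then convert d into the kind of nominal p-value the abstract calls for, quantifying evidence against the null hypothesis that both options share one yield distribution.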
Abstract:
The National Health Interview Survey - Disability supplement (NHIS-D) provides information that can be used to understand myriad topics related to health and disability. The survey provides comprehensive information on multiple disability conceptualizations that can be identified using information about health conditions (both physical and mental), activity limitations, and service receipt (e.g. SSI, SSDI, Vocational Rehabilitation). This provides flexibility for researchers in defining populations of interest. This paper provides a description of the data available in the NHIS-D and information on how the data can be used to better understand the lives of people with disabilities.
Abstract:
An in-depth knowledge of the characteristics of lightning-generated currents will facilitate evaluation of the interception efficacy of lightning protection systems. In addition, it would aid the extraction of valuable statistics (from measured current data) on local lightning parameters. However, present-day knowledge of the characteristics of lightning-induced currents in typical lightning protection systems is rather limited. This is particularly true of closely interconnected protection systems, like the one employed in Indian Satellite Launch Pad-II, which is taken as a specific example in the present study. Various aspects suggest that theoretical modelling would be the best possible approach for the intended work. From a survey of the pertinent literature, it is concluded that electromagnetic modelling of the lightning return stroke with a current source at the channel base is best suited for this study. A numerical electromagnetic code was used for the required electromagnetic field solution, and Fourier transform techniques were employed for computing time-domain results. The numerical modelling is validated by laboratory experiments on a reduced-scale model of the system. Apart from ascertaining the influence of various parameters, salient characteristics of tower base currents for different kinds of events are deduced. This knowledge can be used to identify the type of event, as well as its approximate location. A method for estimating the injected stroke current is also proposed.
Abstract:
Management of the commercial harvest of kangaroos relies on quotas set annually as a proportion of regular estimates of population size. Surveys to generate these estimates are expensive and, in the larger states, logistically difficult; a cheaper alternative is desirable. Rainfall is a disappointingly poor predictor of kangaroo rate of increase in many areas, but harvest statistics (sex ratio, carcass weight, skin size and animals shot per unit time) potentially offer cost-effective indirect monitoring of population abundance (and therefore trend) and status (i.e. under- or overharvest). Furthermore, because harvest data are collected continuously and throughout the harvested areas, they offer the promise of more intensive and more representative coverage of harvest areas than aerial surveys do. To be useful, harvest statistics would need to have a close and known relationship with either population size or harvest rate. We assessed this using long-term (11-22 years) data for three kangaroo species (Macropus rufus, M. giganteus and M. fuliginosus) and common wallaroos (M. robustus) across South Australia, New South Wales and Queensland. Regional variation in kangaroo body size, population composition, shooter efficiency and selectivity required separate analyses in different regions. Two approaches were taken. First, monthly harvest statistics were modelled as a function of a number of explanatory variables, including kangaroo density, harvest rate and rainfall. Second, density and harvest rate were modelled as a function of harvest statistics. Both approaches incorporated a correlated error structure. Many but not all regions had relationships with sufficient precision to be useful for indirect monitoring. However, there was no single relationship that could be applied across an entire state or across species.
Combined with rainfall-driven population models and applied at a regional level, these relationships could be used to reduce the frequency of aerial surveys without compromising decisions about harvest management.
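The first modelling approach described above, regressing a monthly harvest statistic on explanatory variables, can be sketched in miniature. The data, variable names and coefficients below are illustrative assumptions, and plain least squares is used here, ignoring the correlated error structure the study actually incorporated:

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic monthly series: a harvest statistic (mean carcass weight, kg)
# modelled as a linear function of kangaroo density and rainfall.
n = 120
density = rng.uniform(5, 25, size=n)       # animals per km^2 (illustrative)
rainfall = rng.gamma(2.0, 20.0, size=n)    # mm per month (illustrative)
weight = 18.0 + 0.15 * density + 0.02 * rainfall + rng.normal(0, 0.5, n)

# Ordinary least squares fit of weight ~ intercept + density + rainfall.
A = np.column_stack([np.ones(n), density, rainfall])
coef, *_ = np.linalg.lstsq(A, weight, rcond=None)
fitted = A @ coef
```

With a fitted relationship of known precision, an observed harvest statistic can be inverted into an indirect index of density between aerial surveys, which is the monitoring use the abstract proposes.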
Abstract:
The simultaneous state and parameter estimation problem for a linear discrete-time system with unknown noise statistics is treated as a large-scale optimization problem. The a posteriori probability density function is maximized directly with respect to the states and parameters, subject to the constraint of the system dynamics. The resulting optimization problem is too large for any of the standard non-linear programming techniques, and hence a hierarchical optimization approach is proposed. It turns out that the states can be computed at the first level for given noise and system parameters; these, in turn, are modified at the second level. The states are computed from a large system of linear equations, and two solution methods are considered for solving these equations, limiting the horizon to a suitable length. The resulting algorithm is a filter-smoother, suitable for off-line as well as on-line state estimation for given noise and system parameters. The second-level problem is split into two: one for modifying the noise statistics and the other for modifying the system parameters. An adaptive relaxation technique is proposed for modifying the noise statistics, and a modified Gauss-Newton technique is used to adjust the system parameters.
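The first-level computation, maximizing the posterior over the states for given noise and system parameters, can be sketched for a scalar linear-Gaussian system. Maximizing the (log-)posterior is a quadratic problem, so the smoothed states solve one large tridiagonal linear system, as in the abstract. The model, parameter values and noise statistics below are illustrative assumptions; the second-level parameter updates are omitted:

```python
import numpy as np

rng = np.random.default_rng(4)
# Scalar system x[k+1] = a*x[k] + w,  y[k] = x[k] + v (illustrative values).
a, q, r, N = 0.95, 0.1, 1.0, 200
x = np.zeros(N)
for k in range(1, N):
    x[k] = a * x[k - 1] + rng.normal(0.0, np.sqrt(q))
y = x + rng.normal(0.0, np.sqrt(r), N)

# Minimize J = sum (y[k]-x[k])^2/r + sum (x[k+1]-a*x[k])^2/q over all x:
# setting the gradient to zero yields the tridiagonal system H x_hat = b.
H = np.zeros((N, N))
b = np.zeros(N)
for k in range(N):
    H[k, k] += 1.0 / r            # measurement term
    b[k] += y[k] / r
for k in range(N - 1):            # dynamics terms
    H[k, k] += a * a / q
    H[k + 1, k + 1] += 1.0 / q
    H[k, k + 1] += -a / q
    H[k + 1, k] += -a / q
x_hat = np.linalg.solve(H, b)     # smoothed state estimates

mse_meas = np.mean((y - x) ** 2)
mse_map = np.mean((x_hat - x) ** 2)
```

A banded (tridiagonal) solver with a limited horizon would turn this batch solve into the on-line filter-smoother the abstract describes.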
Abstract:
A very general and numerically quite robust algorithm has been proposed by Sastry and Gauvrit (1980) for system identification. The present paper takes it up and examines its performance on a real test example. The example considered is the lateral dynamics of an aircraft. This is used as a vehicle for demonstrating the performance of various aspects of the algorithm in several possible modes.
Abstract:
Fourier transform (FT) near-infrared spectroscopy (NIRS) was investigated as a non-invasive technique for estimating the percentage (%) dry matter of whole intact 'Hass' avocado fruit. Partial least squares (PLS) calibration models were developed from the diffuse reflectance spectra to predict % dry matter, taking into account the effects of seasonal variation. Seasonal variability was found to have a significant effect on model predictive performance for dry matter in avocados. The robustness of the calibration model, which in general limits the application of the technique, was found to increase across years (seasons) when more seasonal variability was included in the calibration set. The Rv² and RMSEP for the single-season models predicting on an independent season ranged from 0.09 to 0.61 and 2.63 to 5.00, respectively, while for the two-season models predicting on the third, independent season they ranged from 0.34 to 0.79 and 2.18 to 2.50, respectively. The bias for single-season models predicting on an independent season was as high as 4.429, but ≤1.417 for the two-season combined models. The calibration model encompassing fruit from three consecutive years yielded predictive statistics of Rv² = 0.89 and RMSEP = 1.43% dry matter, with a bias of -0.021, over the range 16.1-39.7% dry matter for a validation population of independent fruit from the three consecutive years. Relevant spectral information for all calibration models was obtained primarily from oil, carbohydrate and water absorbance bands clustered in the 890-980, 1005-1050, 1330-1380 and 1700-1790 nm regions. These results indicate the potential of FT-NIRS, in diffuse reflectance mode, to non-invasively predict the % dry matter of whole 'Hass' avocado fruit, and the importance of developing a calibration model that incorporates seasonal variation. Crown Copyright (c) 2012 Published by Elsevier B.V. All rights reserved.
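The PLS calibration workflow this abstract describes can be sketched with the classical NIPALS PLS1 algorithm. The "spectra" below are synthetic (two latent absorbance factors plus noise), and all dimensions and coefficients are illustrative assumptions rather than values from the study:

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic data: 120 fruit x 50 wavelengths; % dry matter driven by
# two latent absorbance factors plus noise (all values illustrative).
latent = rng.normal(size=(120, 2))
loadings = rng.normal(size=(2, 50))
X = latent @ loadings + 0.05 * rng.normal(size=(120, 50))
y = latent @ np.array([3.0, -1.5]) + 25.0 + 0.1 * rng.normal(size=120)

def pls1_fit(X, y, n_components):
    """NIPALS PLS1: returns centering info and coefficients b such that
    y_hat = (X - x_mean) @ b + y_mean."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)          # weight vector
        t = Xc @ w                      # scores
        tt = t @ t
        p = Xc.T @ t / tt               # X loadings
        q.append(yc @ t / tt)           # y loading
        W.append(w); P.append(p)
        Xc = Xc - np.outer(t, p)        # deflate X and y
        yc = yc - q[-1] * t
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    b = W @ np.linalg.solve(P.T @ W, q)
    return x_mean, y_mean, b

x_mean, y_mean, b = pls1_fit(X, y, n_components=2)
y_hat = (X - x_mean) @ b + y_mean
rmsec = np.sqrt(np.mean((y - y_hat) ** 2))
```

In the study's setting, the calibration set would span multiple seasons and the model would be validated on an independent season's fruit, which is exactly where the reported Rv² and RMSEP figures come from.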