898 results for 340402 Econometric and Statistical Methods
Abstract:
The United States Supreme Court has handed down a once-in-a-generation patent law decision that will have important ramifications for the patentability of non-physical methods, both internationally and in Australia. In Bilski v Kappos, the Supreme Court considered whether an invention must either be tied to a machine or apparatus, or transform an article into a different state or thing, to be patentable. It also considered for the first time whether business methods are patentable subject matter. The decision will be of particular interest to practitioners who followed the litigation in Grant v Commissioner of Patents, a Federal Court decision in which a Brisbane-based inventor was denied a patent over a method of protecting an asset from the claims of creditors.
Abstract:
The need for a house rental model in Townsville, Australia is addressed. Models developed for predicting house rental levels are described. An analytical model is built upon a priori selected variables and parameters of rental levels. Regression models are generated to provide a comparison to the analytical model. Issues in model development and performance evaluation are discussed. A comparison of the models indicates that the analytical model performs better than the regression models.
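The model comparison described above can be illustrated with a toy sketch: an "analytical" model whose coefficients are fixed a priori versus a deliberately simple least-squares regression, scored by RMSE. All variables, coefficients, and data below are hypothetical, not drawn from the Townsville study.

```python
import math
import random

def rmse(pred, actual):
    """Root mean squared error between predictions and observations."""
    return math.sqrt(sum((p - a) ** 2 for p, a in zip(pred, actual)) / len(actual))

# Hypothetical weekly rents driven by bedrooms (b) and distance to the CBD in km (d)
random.seed(1)
data = [(b, d) for b in range(1, 5) for d in (2, 5, 10, 20)]
rent = [150 + 80 * b - 3 * d + random.gauss(0, 10) for b, d in data]

# "Analytical" model: coefficients chosen a priori from domain knowledge
analytical = [140 + 85 * b - 2.5 * d for b, d in data]

# One-variable least-squares regression on bedrooms only -- a deliberately
# simple stand-in for the abstract's regression models
xs = [b for b, _ in data]
mx, my = sum(xs) / len(xs), sum(rent) / len(rent)
slope = sum((x - mx) * (y - my) for x, y in zip(xs, rent)) / sum((x - mx) ** 2 for x in xs)
intercept = my - slope * mx
regression = [intercept + slope * b for b, _ in data]

err_analytical = rmse(analytical, rent)
err_regression = rmse(regression, rent)
```

Comparing the two RMSE values on held-out data would mirror the performance evaluation the abstract describes.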
Abstract:
Parametric and generative modelling methods are ways of making computer models more flexible and of formalising domain-specific knowledge. At present, no open standard exists for the interchange of parametric and generative information. The Industry Foundation Classes (IFC), an open standard for interoperability in building information models, are presented as the base for an open standard in parametric modelling. The advantage of allowing parametric and generative representations is that the early design process can accommodate more iteration, and changes can be implemented more quickly than with traditional models. This paper begins with a formal definition of what constitutes parametric and generative modelling methods and then proceeds to describe an open standard in which the interchange of components could be implemented. As an illustrative example of generative design, Frazer’s ‘Reptiles’ project from 1968 is reinterpreted.
Abstract:
1. Autonomous acoustic recorders are widely available and can provide a highly efficient method of species monitoring, especially when coupled with software to automate data processing. However, the adoption of these techniques is restricted by a lack of direct comparisons with existing manual field surveys. 2. We assessed the performance of autonomous methods by comparing manual and automated examination of acoustic recordings with a field-listening survey, using commercially available autonomous recorders and custom call detection and classification software. We compared the detection capability, time requirements, areal coverage and weather condition bias of these three methods using an established call monitoring programme for a nocturnal bird, the little spotted kiwi (Apteryx owenii). 3. The autonomous recorder methods had very high precision (>98%) and required <3% of the time needed for the field survey. They were less sensitive, with visual spectrogram inspection recovering 80% of the total calls detected and automated call detection 40%, although this recall increased with signal strength. The areal coverage of the spectrogram inspection and automatic detection methods was 85% and 42% of the field survey. The methods using autonomous recorders were more adversely affected by wind and did not show the positive association between ground moisture and call rates that was apparent from the field counts. However, all methods produced the same results for the most important conservation information from the survey: the annual change in calling activity. 4. Autonomous monitoring techniques incur different biases from those of manual surveys and so can yield different ecological conclusions if sampling is not adjusted accordingly. Nevertheless, the sensitivity, robustness and high accuracy of automated acoustic methods demonstrate that they offer a suitable and extremely efficient alternative to field observer point counts for species monitoring.
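The precision and recall figures above follow mechanically from standard detection counts. A minimal sketch, using hypothetical counts chosen only to echo the reported percentages (not the study's data):

```python
def precision_recall(true_positives, false_positives, false_negatives):
    """Standard detection metrics for comparing survey methods.
    Precision: fraction of reported detections that are real calls.
    Recall: fraction of real calls that the method recovered."""
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    return precision, recall

# Hypothetical counts for an automated detector scored against a field survey:
# 100 true calls in total, of which the detector recovered 40, with 1 false alarm
p, r = precision_recall(true_positives=40, false_positives=1, false_negatives=60)
# high precision (few false detections) can coexist with modest recall
```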
Abstract:
Quantitative market data has traditionally been used throughout marketing and business as a tool to inform and direct design decisions. However, in our changing economic climate, businesses need to innovate and create products their customers will love. Deep customer insight methods move beyond merely questioning customers and aim to provoke true emotional responses in order to reveal new opportunities that go beyond functional product requirements. This paper explores traditional market research methods and compares them to methods used to gain deep customer insights. This study reports on a collaborative research project with seven small to medium enterprises and four multi-national organisations. Firms were introduced to a design led innovation approach, and were taught the different methods to gain deep customer insights. Interviews were conducted to understand the experience and outcomes of pre-existing research methods and deep customer insight approaches. The findings concluded that deep customer insights were unlikely to be revealed through traditional market research techniques. The theoretical outcome of this study is a complementary methods matrix, providing guidance on appropriate research methods in accordance with a project’s timeline.
Abstract:
The methodology undertaken, the channel model, and the system model created for developing a novel adaptive equalization method and a novel channel tracking method for the uplink of MU-MIMO-OFDM systems are presented in this paper. The results show that the channel tracking method works with 97% accuracy, while the training-based initial channel estimation method shows comparatively poor performance in estimating the actual channel.
Abstract:
Objectives: To examine factors associated with the uptake of i) long-acting reversible, ii) permanent and iii) traditional contraceptive methods among Australian women. Methods: Participants in the Australian Longitudinal Study on Women's Health born in 1973–78 reported on their contraceptive use at three surveys: 2003, 2006 and 2009. The participants were 5,849 women aged 25–30 in 2003 randomly sampled from Medicare. The main outcome measure was current contraceptive method at age 28–33 years categorised as long-acting reversible methods (implant, IUD, injection), permanent (tubal ligation, vasectomy), and traditional methods (oral contraceptive pills, condoms, withdrawal, safe period). Results: Compared to women living in major cities, women in inner regional areas were more likely to use long-acting (OR=1.26, 95%CI 1.03–1.55) or permanent methods (OR=1.43, 95%CI 1.17–1.76). Women living in outer regional/remote areas were more likely than women living in cities to use long-acting (OR=1.65, 95%CI 1.31–2.08) or permanent methods (OR=1.69, 95%CI 1.43–2.14). Conclusions: Location of residence is an important factor in women's choices about long-acting and permanent contraception in addition to the number and age of their children. Implications: Further research is needed to understand the role of geographical location in women's access to contraceptive options in Australia.
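Odds ratios with 95% confidence intervals like those reported above can be computed from a 2×2 table using the standard log-odds (Woolf) interval. A minimal sketch with hypothetical counts, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI (Woolf log-odds method) from a 2x2 table:
    a = exposed with outcome,   b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# Hypothetical counts: long-acting method use by region of residence
or_, lower, upper = odds_ratio_ci(a=120, b=480, c=300, d=1800)
```

An interval that excludes 1 (as the reported ORs do) indicates an association between location and method choice at the 5% level.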
Abstract:
There is a wide range of potential study designs for intervention studies to decrease nosocomial infections in hospitals. The analysis is complex due to competing events, clustering, multiple timescales and time-dependent period and intervention variables. This review considers the popular pre-post quasi-experimental design and compares it with randomized designs. Randomization can be done in several ways: randomization of the cluster [intensive care unit (ICU) or hospital] in a parallel design; randomization of the sequence in a cross-over design; and randomization of the time of intervention in a stepped-wedge design. We introduce each design in the context of nosocomial infections and discuss the designs with respect to the following key points: bias, control for nonintervention factors, and generalizability. Statistical issues are discussed. A pre-post-intervention design is often the only choice that will be informative for a retrospective analysis of an outbreak setting. It can be seen as a pilot study with further, more rigorous designs needed to establish causality. To yield internally valid results, randomization is needed. Generally, the first choice in terms of the internal validity should be a parallel cluster randomized trial. However, generalizability might be stronger in a stepped-wedge design because a wider range of ICU clinicians may be convinced to participate, especially if there are pilot studies with promising results. For analysis, the use of extended competing risk models is recommended.
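The designs above differ mainly in what gets randomized. A hedged sketch of a stepped-wedge schedule, where the time of crossover from control to intervention is randomized across clusters (the even spacing of crossover times is an illustrative assumption, not a requirement of the design):

```python
import random

def stepped_wedge_schedule(clusters, periods, seed=0):
    """Randomize the period at which each cluster crosses from control to
    intervention; by the final period every cluster is exposed."""
    rng = random.Random(seed)
    order = clusters[:]
    rng.shuffle(order)  # the randomization step: who crosses over when
    # spread crossover times across periods 1..periods-1 (illustrative choice)
    steps = [1 + i * (periods - 1) // len(order) for i in range(len(order))]
    crossover = dict(zip(order, steps))
    return {c: ["control" if t < crossover[c] else "intervention"
                for t in range(periods)]
            for c in clusters}
```

A parallel cluster design would instead randomize each ICU to a single fixed arm, and a cross-over design would randomize the sequence of arms.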
Abstract:
"First published in 1988, Ecological and Behavioral Methods for the Study of Bats is widely acknowledged as the primary reference for both amateur and professional bat researchers. Bats are the second most diverse group of mammals on the earth. They live on every continent except Antarctica, ranging from deserts to tropical forests to mountains, and their activities have a profound effect on the ecosystems in which they live. Despite their ubiquity and importance, bats are challenging to study. This volume provides researchers, conservationists, and consultants with the ecological background and specific information essential for studying bats in the wild and in captivity. Chapters detail many of the newest and most commonly used field and laboratory techniques needed to advance the study of bats, describe how these methods are applied to the study of the ecology and behavior of bats, and offer advice on how to interpret the results of research. The book includes forty-three chapters, fourteen of which are new to the second edition, with information on molecular ecology and evolution, bioacoustics, chemical communication, flight dynamics, population models, and methods for assessing postnatal growth and development. Fully illustrated and featuring contributions from the world’s leading experts in bat biology, this reference contains everything bat researchers and natural resource managers need to know for the study and conservation of this wide-ranging, ecologically vital, and diverse taxon."--Publisher website
Abstract:
This paper reports on two lengthy studies in physical education teacher education (PETE) conducted independently but which are epistemologically and methodologically linked. The paper describes how personal construct theory (PCT) and its associated methods provided a means for PETE students to reflexively construct their ideas about teaching physical education over an extended period. Data are drawn from each study in the form of a single participant's story to indicate how this came about. Furthermore, we suggest that PCT might be both a useful research strategy and an effective approach to facilitate professional development in a teacher education setting.
Abstract:
The relationship between mathematics and statistical reasoning frequently receives comment (Vere-Jones 1995; Moore 1997); however, most of the research into the area tends to focus on mathematics anxiety. Gnaldi (2003) showed that in a statistics course for psychologists, the statistical understanding of students at the end of the course depended on students’ basic numeracy, rather than the number or level of previous mathematics courses the students had undertaken. As part of a study into the development of statistical thinking at the interface between secondary and tertiary education, students enrolled in an introductory data analysis subject were assessed regarding their statistical reasoning, basic numeracy skills, mathematics background and attitudes towards statistics. This work reports on some key relationships between these factors, and in particular the importance of numeracy to statistical reasoning.
Abstract:
The effects of tillage practices and the methods of chemical application on atrazine and alachlor losses through run-off were evaluated for five treatments: conservation (untilled) and surface (US), disk and surface, plow and surface, disk and preplant-incorporated, and plow and preplant-incorporated treatments. A rainfall simulator was used to create 63.5 mm h⁻¹ of rainfall for 60 min and 127 mm h⁻¹ for 15 min. Rainfall simulation occurred 24–36 h after chemical application. There was no significant difference in the run-off volume among the treatments, but the untilled treatment significantly reduced erosion loss. The untilled treatments had the highest herbicide concentration, and the disk treatments were higher than the plow treatments. The surface treatments showed a higher concentration than the incorporated treatments. The concentration of herbicides in the water decreased with time. Among the experimental sites, the one with sandy loam soil produced the greatest losses, both in terms of the run-off volume and herbicide loss. The US treatments had the highest loss, and the herbicide incorporation treatments had smaller losses through run-off, as the residue cover was effective in preventing herbicide losses. Incorporation might be a favorable method of herbicide application to reduce herbicide losses by run-off.
Abstract:
We consider the development of statistical models for prediction of constituent concentration of riverine pollutants, which is a key step in load estimation from frequent flow rate data and less frequently collected concentration data. We consider how to capture the impacts of past flow patterns via the average discounted flow (ADF), which discounts the past flux based on the time elapsed - more recent fluxes are given more weight. However, the effectiveness of ADF depends critically on the choice of the discount factor, which reflects the unknown environmental cumulating process of the concentration compounds. We propose to choose the discount factor by maximizing the adjusted R² values or the Nash-Sutcliffe model efficiency coefficient. The R² values are also adjusted to take account of the number of parameters in the model fit. The resulting optimal discount factor can be interpreted as a measure of constituent exhaustion rate during flood events. To evaluate the performance of the proposed regression estimators, we examine two different sampling scenarios by resampling fortnightly and opportunistically from two real daily datasets, which come from two United States Geological Survey (USGS) gaging stations located in the Des Plaines River and Illinois River basins. The generalized rating-curve approach produces biased estimates of the total sediment loads by -30% to 83%, whereas the new approaches produce relatively much lower biases, ranging from -24% to 35%. This substantial improvement in the estimates of the total load is due to the fact that predictability of concentration is greatly improved by the additional predictors.
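One plausible reading of the average discounted flow is a normalized exponentially weighted sum of past flows, with the discount factor chosen by a grid search over goodness-of-fit. Both the exact recursion and the use of plain R² in place of adjusted R² below are simplifying assumptions, not the paper's definitions:

```python
def average_discounted_flow(flows, delta):
    """Exponentially discounted average of past flows: more recent flows get
    more weight. One plausible reading of the abstract's ADF, normalized so
    the weights (1, delta, delta^2, ...) sum to the divisor."""
    adf = []
    s = 0.0  # discounted sum of flows
    w = 0.0  # discounted sum of weights, for normalization
    for q in flows:
        s = q + delta * s
        w = 1.0 + delta * w
        adf.append(s / w)
    return adf

def best_discount(flows, target, candidates):
    """Grid-search the discount factor whose ADF best predicts the target
    concentrations, scored by plain R^2 (a stand-in for adjusted R^2)."""
    def r2(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sxx = sum((a - mx) ** 2 for a in x)
        syy = sum((b - my) ** 2 for b in y)
        return sxy * sxy / (sxx * syy)
    return max(candidates, key=lambda d: r2(average_discounted_flow(flows, d), target))
```

A discount factor near 0 makes the ADF track instantaneous flow, while a factor near 1 makes it a long-run average; the optimum is then read as the constituent exhaustion rate the abstract describes.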