1000 results for Statistical principles


Relevance:

20.00%

Publisher:

Abstract:

Clearing of native vegetation is a major threat to biodiversity in Australia. In Queensland, clearing has resulted in extensive ecosystem transformation, especially in the more fertile parts of the landscape. In this paper, we examine Queensland, Australian and some overseas evidence of the impact of clearing and related fragmentation effects on terrestrial biota. The geographic focus is the semi-arid regions, although we recognise that coastal regions have been extensively cleared. The evidence reviewed here suggests that the reduction of remnant vegetation to 30% will result in the loss of 25-35% of vertebrate fauna, with the full impact not realised for another 50-100 years, or even longer. Less mobile species, habitat specialists and rare species appear to be particularly at risk. We propose three broad principles for effective biodiversity conservation in Queensland: (i) regional native vegetation retention thresholds of 50%; (ii) regional ecosystem thresholds of 30%; and (iii) landscape design and planning principles that protect large remnants, preferably > 2000 ha, as core habitats. Under these retention thresholds, no further clearing would be permitted in extensively cleared biogeographic regions such as the Brigalow Belt and New England Tablelands. Some elements of the biota, however, will require more detailed knowledge and targeted retention and management to ensure their security. The resource sustainability and economic criteria outlined elsewhere in this volume should be applied to ensure that the biogeographic regions in the north and west of Queensland that are largely intact continue to provide extensive wildlife habitat.
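The proposed retention thresholds can be expressed as a simple permit check. This is an illustrative sketch only: the function name and the example region data are hypothetical, while the 50% regional vegetation threshold and 30% regional ecosystem threshold are the figures quoted in the abstract.

```python
# Hypothetical sketch of the paper's proposed retention thresholds as a
# clearing-permit rule. Only the 50% and 30% thresholds come from the text.
REGIONAL_VEGETATION_THRESHOLD = 0.50   # minimum native vegetation per region
ECOSYSTEM_THRESHOLD = 0.30             # minimum extent per regional ecosystem

def clearing_permitted(region_vegetation_fraction, ecosystem_fractions):
    """Return True only if further clearing would not breach either the
    regional vegetation threshold or any regional ecosystem threshold."""
    if region_vegetation_fraction <= REGIONAL_VEGETATION_THRESHOLD:
        return False
    return all(f > ECOSYSTEM_THRESHOLD for f in ecosystem_fractions.values())

# An extensively cleared region (hypothetical figures): no further clearing.
print(clearing_permitted(0.18, {"brigalow_woodland": 0.10}))  # False
```

Under this rule, regions already cleared below the thresholds, such as the Brigalow Belt, would automatically fail the check.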

Relevance:

20.00%

Publisher:

Abstract:

This paper proposes a template for modelling complex datasets that integrates traditional statistical modelling approaches with more recent advances in statistics and modelling through an exploratory framework. Our approach builds on the well-known and long-standing idea of 'good practice in statistics' by establishing a comprehensive framework for modelling that focuses on exploration, prediction, interpretation and reliability assessment, a relatively new idea that allows individual assessment of predictions. The integrated framework we present comprises two stages. The first involves the use of exploratory methods to help visually understand the data and identify a parsimonious set of explanatory variables. The second encompasses a two-step modelling process, in which non-parametric methods such as decision trees and generalized additive models are used to identify important variables and their relationship with the response before a final predictive model is considered. We focus on fitting the predictive model using parametric, non-parametric and Bayesian approaches. This paper is motivated by a medical problem where interest focuses on developing a risk stratification system for morbidity of 1,710 cardiac patients given a suite of demographic, clinical and preoperative variables. Although the methods we use are applied specifically to this case study, they can be applied in any field, irrespective of the type of response.
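The two-stage template described above can be sketched in a few lines. This is a minimal illustration on synthetic data, not the 1,710-patient cardiac dataset: a non-parametric model (a decision tree) screens for important variables, and a parametric predictive model (logistic regression) is then fitted on the selected set.

```python
# Minimal sketch of the two-stage template, assuming synthetic data in place
# of the cardiac dataset. Stage 2a: non-parametric variable screen;
# Stage 2b: parsimonious parametric predictive model.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))   # 6 candidate explanatory variables
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=500) > 0).astype(int)

# Stage 2a: decision tree flags the variables that matter.
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
selected = np.argsort(tree.feature_importances_)[::-1][:2]

# Stage 2b: final parametric predictive model on the selected variables.
model = LogisticRegression().fit(X[:, selected], y)
print("selected variables:", sorted(selected.tolist()))
print("training accuracy: %.2f" % model.score(X[:, selected], y))
```

In practice, Stage 1 exploratory plots would precede this, and the paper also considers Bayesian alternatives for the final model.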

Relevance:

20.00%

Publisher:

Abstract:

Computer Science is a subject which has difficulty in marketing itself. Further, pinning down a standard curriculum is difficult: there are many preferences that are hard to accommodate. This paper argues the case that part of the problem is the fact that, unlike more established disciplines, the subject does not clearly distinguish the study of principles from the study of artifacts. This point was raised in Curriculum 2001 discussions, and debate needs to start in good time for the next curriculum standard. This paper provides a starting point for debate by outlining a process by which principles and artifacts may be separated, and presents a sample curriculum to illustrate the possibilities. This sample curriculum has some positive points, though these are incidental to the need to start debating the issue. Other models, with a less rigorous ordering of principles before artifacts, would still gain from making it clearer whether a specific concept was fundamental or a property of a specific technology. (C) 2003 Elsevier Ltd. All rights reserved.

Relevance:

20.00%

Publisher:

Abstract:

The effect of the number of samples and the selection of data for analysis on the calculation of surface motor unit potential (SMUP) size in the statistical method of motor unit number estimates (MUNE) was determined in 10 normal subjects and 10 with amyotrophic lateral sclerosis (ALS). We recorded 500 sequential compound muscle action potentials (CMAPs) at three different stable stimulus intensities (10–50% of maximal CMAP). Estimated mean SMUP sizes were calculated using Poisson statistical assumptions from the variance of the 500 sequential CMAPs obtained at each stimulus intensity. Results from the full 500 data points were compared with smaller subsets (50–80% of the data) drawn from the same data set. The effect of restricting analysis to data between 5% and 20% of the CMAP, and to standard deviation limits, was also assessed. No differences in mean SMUP size were found with stimulus intensity or use of different ranges of data. Consistency was improved with a greater sample number. Data within 5% of CMAP size gave both increased consistency and reduced mean SMUP size in many subjects, but excluded valid responses present at that stimulus intensity. These changes were more prominent in ALS patients, in whom the presence of isolated SMUP responses was a striking difference from normal subjects. Noise, spurious data, and large SMUPs limited the Poisson assumptions. When these factors are considered, consistent statistical MUNE can be calculated from a continuous sequence of data points. A 2–2.5 SD window or a 10% window is a reasonable method of limiting data for analysis. Muscle Nerve 27: 320–331, 2003
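The Poisson estimator underlying the statistical MUNE method can be illustrated briefly. Under the Poisson assumption, if the number of units activated per stimulus is Poisson(λ) and each unit contributes a fixed potential s, the CMAP amplitudes have mean λs and variance λs², so s = variance / mean. The data below are simulated under idealized assumptions, not patient recordings.

```python
# Hedged sketch of the Poisson-based mean SMUP estimate: under Poisson
# firing, SMUP size = variance(CMAP) / mean(CMAP). Simulated data only.
import numpy as np

rng = np.random.default_rng(1)
true_smup = 40.0                           # hypothetical SMUP size (uV)
n_units = rng.poisson(lam=8.0, size=500)   # 500 sequential stimuli
cmap = n_units * true_smup                 # idealized CMAP amplitudes

mean_smup = cmap.var() / cmap.mean()       # Poisson estimator of SMUP size
print("estimated mean SMUP: %.1f uV" % mean_smup)
```

The paper's findings about noise, spurious data, and large SMUPs are precisely the ways real recordings depart from this idealized model.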

Relevance:

20.00%

Publisher:

Abstract:

Distance learners are self-directed learners traditionally taught via study books, collections of readings, and exercises to test understanding of learning packages. Despite advances in e-Learning environments and computer-based teaching interfaces, distance learners still lack opportunities to participate in exercises and debates available to classroom learners, particularly through non-text-based learning techniques. Effective distance teaching requires flexible learning opportunities. Using arguments developed in the interpretation literature, we argue that effective distance learning must also be Entertaining, Relevant, Organised, Thematic, Involving and Creative—E.R.O.T.I.C. (after Ham, 1992). We discuss an experiment undertaken with distance learners at The University of Queensland Gatton Campus, where we initiated an E.R.O.T.I.C. external teaching package aimed at engaging distance learners using multimedia, including but not limited to text-based learning tools. Student responses to non-text media were positive.

Relevance:

20.00%

Publisher:

Abstract:

Background: Provision of health information to people with aphasia is inadequate. Current practice in providing printed health education materials to people with aphasia does not routinely take into consideration their language and associated reading difficulties. Aims: This study aimed to investigate if people with aphasia can comprehend health information contained in printed health education materials and if the application of aphasia-friendly principles is effective in assisting them to comprehend health information. It was hypothesised that participants with aphasia would comprehend significantly more information from aphasia-friendly materials than from existing materials. Other aims included investigating if the effectiveness of the aphasia-friendly principles is related to aphasia severity, if people with aphasia are more confident in responding to health information questions after they have read the aphasia-friendly material, if they prefer to read the aphasia-friendly brochures, and if they prefer to read the brochure type that resulted in the greatest increase in their knowledge. Methods & Procedures: Twelve participants with mild to moderately severe aphasia were matched according to their reading abilities. A pre and post experimental design was employed with repeated measures ANOVA (p

Relevance:

20.00%

Publisher:

Abstract:

Despite its widespread use, the Coale-Demeny model life table system does not capture the extensive variation in age-specific mortality patterns observed in contemporary populations, particularly those of the countries of Eastern Europe and populations affected by HIV/AIDS. Although relational mortality models, such as the Brass logit system, can identify these variations, these models show systematic bias in their predictive ability as mortality levels depart from the standard. We propose a modification of the two-parameter Brass relational model. The modified model incorporates two additional age-specific correction factors (gamma(x) and theta(x)) based on mortality levels among children and adults, relative to the standard. Tests of predictive validity show deviations in age-specific mortality rates predicted by the proposed system to be 30-50 per cent lower than those predicted by the Coale-Demeny system and 15-40 per cent lower than those predicted using the original Brass system. The modified logit system is a two-parameter system, parameterized using values of l(5) and l(60).
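For readers unfamiliar with the baseline model being modified, the two-parameter Brass relational logit system maps a standard survivorship schedule ls(x) through Y(x) = alpha + beta * Ys(x), where Y(x) = 0.5 * ln((1 - l(x)) / l(x)). The sketch below implements only this original system; the paper's age-specific correction factors gamma(x) and theta(x) are not fully specified in the abstract and are therefore omitted, and the survivorship values are hypothetical.

```python
# Sketch of the original two-parameter Brass relational logit system that
# the modified model extends. gamma(x) and theta(x) are omitted (not
# specified in the abstract); l(x) values below are hypothetical.
import math

def logit(l):
    # Brass logit transform of a survivorship probability l(x)
    return 0.5 * math.log((1.0 - l) / l)

def expit(y):
    # Inverse of the Brass logit transform
    return 1.0 / (1.0 + math.exp(2.0 * y))

def brass_predict(l_standard, alpha, beta):
    """Map a standard l(x) schedule through the two-parameter Brass model."""
    return [expit(alpha + beta * logit(l)) for l in l_standard]

standard = [0.95, 0.85, 0.60]    # hypothetical standard l(x) at three ages
print(brass_predict(standard, alpha=0.0, beta=1.0))  # reproduces the standard
print(brass_predict(standard, alpha=0.2, beta=1.0))  # uniformly heavier mortality
```

With alpha = 0 and beta = 1 the standard is reproduced exactly; raising alpha lowers survivorship at every age, which is the kind of uniform shift the paper's correction factors are designed to improve on.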

Relevance:

20.00%

Publisher:

Abstract:

Evaluation of patients for rehabilitation after musculoskeletal injury involves identifying, grading and assessing the injury and its impact on the patient's normal activities. Management is guided by a multidisciplinary team, comprising the patient, doctor and physical therapist, with other health professionals recruited as required. Parallel interventions involving the various team members are specified in a customised management plan. The key component of the plan is active mobilisation utilising strengthening, flexibility and endurance exercise programs. Passive physical treatments (heat, ice, and manual therapy), as well as drug therapy and psychological interventions, are used as adjunctive therapy. Biomechanical devices or techniques (eg, orthotic devices) may also be helpful. Coexisting conditions such as depression and drug dependence are treated at the same time as the injury. Effective team communication, simulated environmental testing and, for those employed, contact with the employer facilitate a staged return to normal living, sports and occupational activities.

Relevance:

20.00%

Publisher:

Abstract:

A growing number of corporate failure prediction models has emerged since the 1960s. The economic and social consequences of business failure can be dramatic, so it is no surprise that the issue has been of growing interest in academic research as well as in business practice. The main purpose of this study is to compare the predictive ability of five models: three based on statistical techniques (Discriminant Analysis, Logit and Probit) and two based on Artificial Intelligence (Neural Networks and Rough Sets). The five models were applied to a dataset of 420 non-bankrupt firms and 125 bankrupt firms belonging to the textile and clothing industry, over the period 2003–09. Results show that all the models performed well, with an overall correct classification level higher than 90%, and a type II error always less than 2%. The type I error increases as we move away from the year prior to failure. Our models contribute to the discussion of the causes of corporate financial distress. Moreover, they can be used to assist the decisions of creditors, investors and auditors. Additionally, this research can contribute to the work of devisers of national economic policies that aim to reduce industrial unemployment.
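One of the five techniques compared above, the logit model, can be sketched as a binary classifier on financial ratios. This illustration uses synthetic data with the study's class sizes (420 non-bankrupt, 125 bankrupt); the actual firm-level ratios are not reproduced here, and the feature names are assumptions.

```python
# Minimal sketch of a logit bankruptcy classifier, assuming synthetic
# balance-sheet ratios in place of the study's textile-industry dataset.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n = 545                           # 420 non-bankrupt + 125 bankrupt, as in the study
X = rng.normal(size=(n, 3))       # e.g. liquidity, leverage, profitability ratios
y = np.zeros(n, dtype=int)
y[:125] = 1                       # bankrupt firms
X[y == 1] -= 1.5                  # bankrupt firms score lower on the ratios

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
logit = LogisticRegression().fit(X_tr, y_tr)
print("holdout accuracy: %.2f" % logit.score(X_te, y_te))
```

The study's comparison replaces this single model with five alternatives and evaluates type I and type II errors separately, since misclassifying a failing firm as healthy is typically the costlier mistake.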
