827 results for data-based reporting
Abstract:
This study examined elementary school teachers’ knowledge of their legislative and policy-based reporting duties with respect to child sexual abuse. Data were collected from 470 elementary school teachers from urban and rural government and nongovernment schools in 3 Australian states, which at the time of the study had 3 different legislative reporting duties for teachers. Teachers completed the 8-part Teacher Reporting Questionnaire (TRQ). Multinomial logistic regression analysis was used to determine factors associated with (a) teachers’ legislation knowledge and (b) teachers’ policy knowledge. Teachers with higher levels of knowledge had a combination of pre- and in-service training about child sexual abuse and more positive attitudes toward reporting, held administration positions in their school, and had reported child sexual abuse at least once during their teaching career. They were also more likely to work in the state with the strongest legislative reporting duty, which had been in place the longest.
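A minimal sketch of the multinomial logistic regression step described above, using statsmodels; every column name and coding here is hypothetical, since the TRQ items are not specified in the abstract:

```python
# Hypothetical sketch of the multinomial logistic regression analysis;
# column names and codings are assumptions, not the study's actual variables.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("trq_responses.csv")  # hypothetical data file

# Outcome: legislation knowledge, assumed coded 0 = low, 1 = medium, 2 = high
y = df["legislation_knowledge"]

# Predictors named in the abstract; 'state' dummy-coded for the three states
X = pd.get_dummies(
    df[["training", "attitude_score", "admin_position", "has_reported", "state"]],
    columns=["state"], drop_first=True, dtype=float)
X = sm.add_constant(X)

fit = sm.MNLogit(y, X).fit()
print(fit.summary())
print(np.exp(fit.params))  # relative risk ratios vs. the low-knowledge baseline
```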
Abstract:
Spatial data are now prevalent in a wide range of fields including environmental and health science. This has led to the development of a range of approaches for analysing patterns in these data. In this paper, we compare several Bayesian hierarchical models for analysing point-based data based on the discretization of the study region, resulting in grid-based spatial data. The approaches considered include two parametric models and a semiparametric model. We highlight the methodology and computation for each approach. Two simulation studies are undertaken to compare the performance of these models for various structures of simulated point-based data which resemble environmental data. A case study of a real dataset is also conducted to demonstrate a practical application of the modelling approaches. Goodness-of-fit statistics are computed to compare estimates of the intensity functions. The deviance information criterion is also considered as an alternative model evaluation criterion. The results suggest that the adaptive Gaussian Markov random field model performs well for highly sparse point-based data where there are large variations or clustering across the space, whereas the discretized log Gaussian Cox process produces a good fit for dense and clustered point-based data. In general, one should consider the nature and structure of the point-based data in order to choose the appropriate method for modelling discretized spatial point-based data.
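The preprocessing step shared by all the compared models is the discretization of point locations into grid-cell counts. A minimal sketch on synthetic data, assuming a square study region (the GMRF and LGCP priors themselves are not reproduced here):

```python
# Sketch of the grid discretization of point-based data; grid size and
# region extent are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
points = rng.uniform(0, 10, size=(500, 2))    # synthetic point-based data

nx = ny = 20                                   # 20 x 20 grid over the region
counts, xedges, yedges = np.histogram2d(
    points[:, 0], points[:, 1],
    bins=[nx, ny], range=[[0, 10], [0, 10]])

cell_area = (xedges[1] - xedges[0]) * (yedges[1] - yedges[0])
intensity_naive = counts / cell_area           # crude cell-wise intensity
# A GMRF or discretized LGCP model would instead place a smoothing prior over
# the log-intensities of neighbouring cells rather than estimating each freely.
```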
Abstract:
In recent years, concern has arisen over the effects of increasing carbon dioxide (CO2) in the earth's atmosphere due to the burning of fossil fuels. One way to mitigate the increase in atmospheric CO2 concentration and climate change is carbon sequestration into forest vegetation through photosynthesis. Comparable regional-scale estimates of the carbon balance of forests are therefore needed for scientific and political purposes. The aim of the present dissertation was to improve methods for quantifying and verifying inventory-based carbon pool estimates of boreal forests on mineral soils. Ongoing forest inventories provide data based on statistically sound sampling for estimating the level of carbon stocks and stock changes, but improved modelling tools and comparison of methods are still needed. In this dissertation, the entire inventory-based large-scale forest carbon stock assessment method was presented together with several separate methods for enhancing and comparing it. The enhancement methods presented here include ways to quantify the biomass of understorey vegetation as well as to estimate the litter production of needles and branches. In addition, the optical remote sensing method illustrated in this dissertation can be used for comparison with independent data. The forest inventory-based large-scale carbon stock assessment method demonstrated here provided reliable carbon estimates when compared with independent data. Future work to improve the accuracy of this method could consist of reducing the uncertainties regarding belowground biomass and litter production as well as the soil compartment. The methods developed will serve the needs of UNFCCC reporting and reporting under the Kyoto Protocol. The method is principally intended for analysts or planners interested in quantifying carbon over extensive forest areas.
Abstract:
This paper provides algorithms that use an information-theoretic analysis to learn Bayesian network structures from data. Based on our three-phase learning framework, we develop efficient algorithms that can effectively learn Bayesian networks, requiring only polynomial numbers of conditional independence (CI) tests in typical cases. We provide precise conditions that specify when these algorithms are guaranteed to be correct as well as empirical evidence (from real world applications and simulation tests) that demonstrates that these systems work efficiently and reliably in practice.
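The building block of such structure-learning algorithms is the CI test itself. A small illustrative sketch of one common information-theoretic choice, a conditional mutual information test on discrete data; the threshold and the exact statistic used in the paper are assumptions:

```python
# Illustrative CI test: accept X ⊥ Y | Z when conditional mutual information
# I(X; Y | Z) is below a small threshold. Not the paper's exact procedure.
import numpy as np
import pandas as pd

def conditional_mutual_information(df, x, y, z):
    """I(X; Y | Z) in nats, for discrete columns of a DataFrame."""
    n = len(df)
    cmi = 0.0
    for _, block in df.groupby(z):
        p_z = len(block) / n
        joint = pd.crosstab(block[x], block[y]).to_numpy() / len(block)
        px = joint.sum(axis=1, keepdims=True)
        py = joint.sum(axis=0, keepdims=True)
        nz = joint > 0
        cmi += p_z * np.sum(joint[nz] * np.log(joint[nz] / (px @ py)[nz]))
    return cmi

# Usage: drop the candidate edge X-Y if the variables look independent given Z
# independent = conditional_mutual_information(data, "X", "Y", ["Z"]) < 0.01
```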
Abstract:
This paper considers an extension of the skew-normal model through the inclusion of an additional parameter which can lead to both uni- and bi-modal distributions. The paper presents various basic properties of this family of distributions and provides a stochastic representation which is useful for obtaining theoretical properties and for simulating from the distribution. Moreover, the singularity of the Fisher information matrix is investigated, and maximum likelihood estimation for a random sample with no covariates is considered. The main motivation is thus to avoid using mixtures in fitting bimodal data, as these are well known to be complicated to deal with, particularly because of identifiability problems. Data-based illustrations show that such a model can be useful. Copyright (C) 2009 John Wiley & Sons, Ltd.
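As an illustration of why a stochastic representation is useful for simulation, the classical representation of the base skew-normal takes only a few lines; the extended two-parameter family of the paper is not reproduced here:

```python
# Simulation via the well-known stochastic representation of the skew-normal:
# X = delta*|U0| + sqrt(1 - delta^2)*U1 with U0, U1 iid N(0, 1).
import numpy as np

def rskewnormal(n, alpha, rng=None):
    """Draw n variates from the standard skew-normal SN(alpha)."""
    if rng is None:
        rng = np.random.default_rng()
    delta = alpha / np.sqrt(1.0 + alpha**2)
    u0 = np.abs(rng.standard_normal(n))      # half-normal component
    u1 = rng.standard_normal(n)
    return delta * u0 + np.sqrt(1.0 - delta**2) * u1

x = rskewnormal(10_000, alpha=4.0)           # right-skewed sample
```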
Abstract:
The objective of this study is to analyse the relationship between market analysts' forecast error regarding the profitability of companies listed on BM&FBOVESPA S.A. (Bovespa) and the disclosure requirements of the International Financial Reporting Standards (IFRS). This was done by regressing analysts' forecast error using panel data methodology for 2010, the year IFRS was adopted in Brazil, and, complementarily, for 2012 as a benchmark for these data. On this basis, the forecast error for companies listed on the Bovespa was determined from forecast and realized profitability data (earnings per share), available in the I/B/E/S Earnings Consensus Information databases, provided by the Thomson ONE Investment Banking platform and by Economática Pro®, respectively. The results indicate a negative relationship between forecast error and compliance with IFRS disclosure requirements; that is, the higher the quality of the information disclosed, the smaller the analysts' forecast error. These results therefore support the view that the degree of compliance with accounting standards is as important as, or more important than, the standards themselves. Additionally, it was found that when a company listed on BM&FBOVESPA is subject to a regulatory agency, its forecast error is unchanged. Finally, these results suggest the importance of improving firms' auditing mechanisms for compliance with normative disclosure requirements, such as penalties for non-compliance (enforcement), corporate governance structures, and internal and external audits.
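A minimal sketch of how the forecast-error variable and a simplified pooled regression might look, with hypothetical column names; the study's actual panel specification and IFRS-compliance measure are not reproduced:

```python
# Hypothetical sketch: forecast error from consensus vs. realized EPS, then a
# regression on an IFRS disclosure-compliance score with clustered errors.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("analyst_panel.csv")  # hypothetical firm-year panel

# Absolute forecast error scaled by realized EPS (one common definition)
df["forecast_error"] = (df["eps_actual"] - df["eps_forecast"]).abs() \
                        / df["eps_actual"].abs()

# A negative coefficient on ifrs_compliance would mirror the reported finding
fit = smf.ols("forecast_error ~ ifrs_compliance + firm_size + C(year)",
              data=df).fit(cov_type="cluster",
                           cov_kwds={"groups": df["firm_id"]})
print(fit.summary())
```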
Abstract:
Determination of the utility harmonic impedance based on measurements is a significant task for utility power-quality improvement and management. Compared with the well-established, accurate invasive methods, noninvasive methods are more desirable since they work with natural variations of the loads connected to the point of common coupling (PCC), so that no intentional disturbance is needed. However, the accuracy of these methods has to be improved. In this context, this paper first points out that the critical problem of the noninvasive methods is how to select the measurements that can be used with confidence for utility harmonic impedance calculation. The paper then presents a new measurement technique based on complex-data least-squares regression, combined with two techniques of data selection. Simulation and field test results show that the proposed noninvasive method is practical and robust, so that it can be used with confidence to determine utility harmonic impedances.
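A minimal sketch of the complex-data least-squares step on synthetic measurement snapshots, with a crude stand-in for the paper's two data-selection techniques:

```python
# At a given harmonic order, V_pcc = Z_utility * I_pcc + V_background; the
# impedance is estimated by least squares over selected snapshots.
import numpy as np

rng = np.random.default_rng(0)
n = 200
z_true = 0.5 + 2.0j                           # "true" utility harmonic impedance
i_pcc = rng.normal(1, 0.3, n) * np.exp(1j * rng.uniform(0, 2 * np.pi, n))
v_noise = 0.05 * rng.standard_normal(n) * np.exp(1j * rng.uniform(0, 2 * np.pi, n))
v_pcc = z_true * i_pcc + 1.0 + v_noise        # 1.0 ~ background harmonic voltage

# Data selection (stand-in for the paper's techniques): keep snapshots where
# load-side variation dominates, i.e. large PCC current magnitudes
mask = np.abs(i_pcc) > np.median(np.abs(i_pcc))

A = np.column_stack([i_pcc[mask], np.ones(mask.sum())])
coef, *_ = np.linalg.lstsq(A, v_pcc[mask], rcond=None)
print("estimated Z_utility:", coef[0])
```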
Abstract:
BACKGROUND: In industrialized countries vaccination coverage remains suboptimal, partly because of perception of an increased risk of asthma. Epidemiologic studies of the association between childhood vaccinations and asthma have provided conflicting results, possibly for methodologic reasons such as unreliable vaccination data, biased reporting, and reverse causation. A recent review stressed the need for additional, adequately controlled large-scale studies. OBJECTIVE: Our goal was to determine if routine childhood vaccination against pertussis was associated with subsequent development of childhood wheezing disorders and asthma in a large population-based cohort study. METHODS: In 6811 children from the general population born between 1993 and 1997 in Leicestershire, United Kingdom, respiratory symptom data from repeated questionnaire surveys up to 2003 were linked to independently collected vaccination data from the National Health Service database. We compared incident wheeze and asthma between children of different vaccination status (complete, partial, and no vaccination against pertussis) by computing hazard ratios. Analyses were based on 6048 children, 23 201 person-years of follow-up, and 2426 cases of new-onset wheeze. RESULTS: There was no evidence for an increased risk of wheeze or asthma in children vaccinated against pertussis compared with nonvaccinated children. Adjusted hazard ratios comparing fully and partially vaccinated with nonvaccinated children were close to one for both incident wheeze and asthma. CONCLUSION: This study provides no evidence of an association between vaccination against pertussis in infancy and an increased risk of later wheeze or asthma and does not support claims that vaccination against pertussis might significantly increase the risk of childhood asthma.
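Hazard ratios of this kind are typically estimated with a proportional hazards model. A minimal sketch using the lifelines library, with hypothetical column names; the study's exact model and covariate adjustment are assumptions here:

```python
# Hypothetical sketch of the hazard-ratio comparison using a Cox model.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("leicester_cohort.csv")   # hypothetical cohort extract
# Assumed columns: time_to_wheeze (years), wheeze (0/1 event indicator),
# vacc_partial, vacc_full (dummies vs. the unvaccinated reference group)

cph = CoxPHFitter()
cph.fit(df[["time_to_wheeze", "wheeze", "vacc_partial", "vacc_full"]],
        duration_col="time_to_wheeze", event_col="wheeze")
cph.print_summary()   # hazard ratios near 1 would match the reported result
```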
Abstract:
Changing factors (mainly traffic intensity and weather conditions) affecting road conditions require a suitable optimal speed at any time. To address this problem, variable speed limit (VSL) systems, as opposed to fixed limits, have been developed in recent decades. The term has come to cover a number of speed management systems, most notably dynamic speed limits (DSL). In order to avoid the indiscriminate use of both terms in the literature, this paper proposes a simple classification and offers a review of some experiences, how their effects are evaluated, and their results. This study also presents a key indicator that measures speed homogeneity, together with a methodology for obtaining the data based on floating cars and GPS technology, applying it to a case study on a section of the M30 urban motorway in Madrid (Spain).
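One plausible formalisation of a speed-homogeneity indicator from floating-car GPS records is the coefficient of variation of speeds per time window; the paper's exact indicator is not reproduced, and the column names are illustrative:

```python
# Hypothetical speed-homogeneity indicator from floating-car GPS records:
# coefficient of variation of observed speeds in five-minute windows.
import pandas as pd

gps = pd.read_csv("m30_floating_cars.csv", parse_dates=["timestamp"])
# Assumed columns: timestamp, vehicle_id, speed_kmh

windows = gps.set_index("timestamp").resample("5min")["speed_kmh"]
homogeneity = windows.std() / windows.mean()   # lower = more homogeneous flow
print(homogeneity.describe())
```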
Abstract:
Background: Parkinson's disease (PD) is an incurable neurological disease with approximately 0.3% prevalence. The hallmark symptom is gradual movement deterioration. Current scientific consensus about disease progression holds that symptoms will worsen smoothly over time unless treated. Accurate information about symptom dynamics is of critical importance to patients, caregivers, and the scientific community for the design of new treatments, clinical decision making, and individual disease management. Long-term studies characterize the typical time course of the disease as an early linear progression gradually reaching a plateau in later stages. However, symptom dynamics over durations of days to weeks remain unquantified. Currently, there is a scarcity of objective clinical information about symptom dynamics at intervals shorter than 3 months stretching over several years, but Internet-based patient self-report platforms may change this. Objective: To assess the clinical value of online self-reported PD symptom data recorded by users of the health-focused Internet social research platform PatientsLikeMe (PLM), in which patients quantify their symptoms on a regular basis on a subset of the Unified Parkinson's Disease Rating Scale (UPDRS). By analyzing these data, we aim to open a scientific window onto the nature of symptom dynamics for assessment intervals shorter than 3 months over durations of several years. Methods: Online self-reported data were validated against the gold-standard Parkinson's Disease Data and Organizing Center (PD-DOC) database, containing clinical symptom data at intervals greater than 3 months. The data were compared visually using quantile-quantile plots, and numerically using the Kolmogorov-Smirnov test. Using a simple piecewise linear trend estimation algorithm, the PLM data were smoothed to separate random fluctuations from continuous symptom dynamics. Subtracting the trends from the original data revealed random fluctuations in symptom severity. The average magnitude of fluctuations versus time since diagnosis was modeled using a gamma generalized linear model. Results: Distributions of ages at diagnosis and UPDRS in the PLM and PD-DOC databases were broadly consistent. The PLM patients were systematically younger than the PD-DOC patients and showed increased symptom severity in the PD off state. The average fluctuation in symptoms (UPDRS Parts I and II) was 2.6 points at the time of diagnosis, rising to 5.9 points 16 years after diagnosis. These fluctuations exceed the estimated minimal and moderate clinically important differences, respectively. Not all patients conformed to the current clinical picture of gradual, smooth changes: many patients had regimes where symptom severity varied in an unpredictable manner, or underwent large rapid changes in an otherwise more stable progression. Conclusions: This information about short-term PD symptom dynamics contributes new scientific understanding of disease progression that is currently very costly to obtain without self-administered Internet-based reporting. This understanding should have implications for the optimization of clinical trials of new treatments and for the choice of treatment decision timescales.
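A minimal sketch of the two analysis steps named in the Methods: piecewise-linear detrending of each patient's UPDRS series, then a gamma GLM of fluctuation magnitude against time since diagnosis. The breakpoint scheme and column names are assumptions:

```python
# Detrend each patient's series with a simple piecewise-linear fit, then model
# |residual| vs. time since diagnosis with a gamma GLM (log link).
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("plm_updrs.csv")  # hypothetical: patient_id, years_since_dx, updrs

def piecewise_residuals(t, y, n_segments=4):
    """Fit one line per equal-width time segment; return |residuals|."""
    edges = np.linspace(t.min(), t.max(), n_segments + 1)
    resid = np.zeros_like(y, dtype=float)
    for lo, hi in zip(edges[:-1], edges[1:]):
        m = (t >= lo) & (t <= hi)
        if m.sum() >= 2:
            b = np.polyfit(t[m], y[m], 1)
            resid[m] = y[m] - np.polyval(b, t[m])
    return np.abs(resid)

df["fluctuation"] = df.groupby("patient_id", group_keys=False).apply(
    lambda g: pd.Series(
        piecewise_residuals(g["years_since_dx"].to_numpy(),
                            g["updrs"].to_numpy()),
        index=g.index))

glm = sm.GLM(df["fluctuation"] + 1e-6,          # gamma response must be > 0
             sm.add_constant(df["years_since_dx"]),
             family=sm.families.Gamma(link=sm.families.links.Log())).fit()
print(glm.summary())
```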
Abstract:
The incredibly rapid growth to huge volumes of air travel, driven mainly by the jet airliners that took to the sky in the 1950s, created the need for systematic research into aviation safety and for collecting data about air traffic. Structured data can be analysed easily using database queries and by running the results through graphic tools. However, for analysing the narratives, which often give more accurate information about a case, mining tools are needed. The analysis of textual data with computers has only become possible as data mining tools have been developed, and their use, at least in aviation, is still at a moderate level. The research aims at discovering lethal trends in flight safety reports. The narratives of 1,200 flight safety reports from the years 1994-1996, written in Finnish, were processed with three text mining tools. One of them was totally language-independent, another had a specific configuration for Finnish, and the third was originally created for English, but encouraging results had been achieved with Spanish, which is why a Finnish test was undertaken as well. The global rate of accidents is stabilising and the situation can now be regarded as satisfactory, but because of the growth in air traffic, the absolute number of fatal accidents per year might increase if flight safety is not improved. Data collection and reporting systems have reached a mature level; the focal point in improving flight safety is now analysis. Air traffic has generally been forecast to grow 5-6 per cent annually over the next two decades. During this period, global air travel will probably double even under relatively conservative expectations of economic growth. This development confronts airline management with growing pressure from increasing competition, significant rises in fuel prices, and the need to reduce the incident rate in the face of the expected growth in air traffic volumes. All this emphasises the urgent need for new tools and methods. All three systems provided encouraging results, while also revealing challenges still to be overcome. Flight safety can be improved through the development and use of sophisticated analysis tools and methods, such as data mining, with their results supporting executives' decision making.
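As a generic stand-in for the commercial tools used in the study, a TF-IDF plus clustering pipeline illustrates the kind of theme discovery that text mining performs on safety-report narratives; file and column names are hypothetical:

```python
# Generic narrative-mining sketch: TF-IDF weighting plus k-means clustering
# to surface recurring themes. Not the tools actually used in the study.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

reports = pd.read_csv("safety_reports.csv")["narrative"]  # hypothetical export

vectorizer = TfidfVectorizer(max_features=2000)  # needs no language grammar
X = vectorizer.fit_transform(reports)

km = KMeans(n_clusters=10, n_init=10, random_state=0).fit(X)
terms = vectorizer.get_feature_names_out()
for c in range(10):                              # top terms per theme cluster
    top = km.cluster_centers_[c].argsort()[::-1][:8]
    print(c, [terms[i] for i in top])
```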
Abstract:
Developing an effective impact evaluation framework, managing and conducting rigorous impact evaluations, and developing a strong research and evaluation culture within development communication organisations present many challenges. This is especially so when both the community and organisational contexts are continually changing and the outcomes of programs are complex and difficult to clearly identify.

This paper presents a case study from a research project conducted from 2007-2010 that aims to address these challenges and issues, entitled Assessing Communication for Social Change: A New Agenda in Impact Assessment. Building on previous development communication projects which used ethnographic action research, this project is developing, trialling and rigorously evaluating a participatory impact assessment methodology for assessing the social change impacts of community radio programs in Nepal. The project is a collaboration between Equal Access – Nepal (EAN), Equal Access – International, local stakeholders and listeners, a network of trained community researchers, and a research team from two Australian universities. A key element of the project is the establishment of an organisational culture within EAN that values and supports the impact assessment process being developed, which is based on continuous action learning and improvement. The paper describes the situation related to monitoring and evaluation (M&E) and impact assessment before the project began, in which EAN was often reliant on time-bound studies and 'success stories' derived from listener letters and feedback. We then outline the various strategies used in an effort to develop stronger and more effective impact assessment and M&E systems, and the gradual changes that have occurred to date. These changes include a greater understanding of the value of adopting a participatory, holistic, evidence-based approach to impact assessment. We also critically review the many challenges experienced in this process, including:

• Tension between the pressure from donors to 'prove' impacts and the adoption of a bottom-up, participatory approach based on 'improving' programs in ways that meet community needs and aspirations.
• Resistance from the content teams to changing their existing M&E practices and to the perceived complexity of the approach.
• Lack of meaningful connection between the M&E and content teams.
• Human resource problems and lack of capacity in analysing qualitative data and reporting results.
• Contextual challenges, including extreme poverty, wide cultural and linguistic diversity, poor transport and communications infrastructure, and political instability.
• A general lack of acceptance of the importance of evaluation within Nepal, where problems are often accepted as fate or 'natural' rather than seen as requiring investigation.
Abstract:
Over 3000 cases of child sexual abuse are identified every year in Australia, but the real incidence is higher still. As a strategy to identify child sexual abuse, Australian States and Territories have enacted legislation requiring members of selected professions, including teachers, to report suspected cases. In addition, policy-based reporting obligations have been developed by professions, including the teaching profession. These legislative and industry-based developments have occurred in a context of growing awareness of the incidence and consequences of child sexual abuse. Teachers have frequent contact and close relationships with children, and possess expertise in monitoring changes in children’s behaviour. Accordingly, teachers are seen as being well-placed to detect and report suspected child sexual abuse. To date, however, there has been little empirical research into the operation of these reporting duties. The extent of teachers’ awareness of their duties to report child sexual abuse is unknown. Further, there is little evidence about teachers’ past reporting practice. Teachers’ duties to report sexual abuse, especially those in legislation, differ between States, and it is not known whether or how these differences affect reporting practice. This article presents results from the first large-scale Australian survey of teachers in three States with different reporting laws: New South Wales, Queensland, and Western Australia. The results indicate levels of teacher knowledge of reporting duties, reveal evidence about past reporting practice, and provide insights into anticipated future reporting practice and legal compliance. The findings have implications for reform of legislation and policy, training of teachers about the reporting of child sexual abuse, and enhancement of child protection.
Abstract:
This thesis investigates profiling and differentiating customers through the use of statistical data mining techniques. The business application of our work centres on examining individuals' seldom-studied yet critical consumption behaviour over an extensive time period within the context of the wireless telecommunication industry; consumption behaviour (as opposed to purchasing behaviour) is behaviour that has been performed so frequently that it becomes habitual and involves minimal intention or decision making. Key variables investigated are the activity-initialisation timestamp and cell tower location, as well as the activity type and usage quantity (e.g., a voice call with duration in seconds); the research focuses on customers' spatial and temporal usage behaviour. The main methodological emphasis is on the development of clustering models based on Gaussian mixture models (GMMs), which are fitted using the recently developed variational Bayesian (VB) method. VB is an efficient deterministic alternative to the popular but computationally demanding Markov chain Monte Carlo (MCMC) methods. The standard VB-GMM algorithm is extended by allowing component splitting, such that it is robust to initial parameter choices and can automatically and efficiently determine the number of components. The new algorithm we propose allows more effective modelling of individuals' highly heterogeneous and spiky spatial usage behaviour, or more generally human mobility patterns; the term spiky describes data patterns with large areas of low probability mixed with small areas of high probability. Customers are then characterised and segmented based on the fitted GMM, which corresponds to how each of them uses the products/services spatially in their daily lives; this essentially reflects their likely lifestyle and occupational traits. Other significant research contributions include fitting GMMs using VB to circular data, i.e., the temporal usage behaviour, and developing clustering algorithms suitable for high-dimensional data based on the use of VB-GMM.
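A minimal sketch of variational Bayesian GMM fitting on synthetic "spiky" location data, using scikit-learn's BayesianGaussianMixture as a readily available analogue; the thesis' split-based VB algorithm differs but addresses the same problem of determining the number of components:

```python
# VB fitting of a GMM on synthetic spiky spatial data: tight high-probability
# clusters (home, work) mixed with diffuse low-probability noise.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(1)
home = rng.normal([0, 0], 0.05, size=(300, 2))
work = rng.normal([5, 3], 0.05, size=(250, 2))
noise = rng.uniform(-2, 7, size=(50, 2))
X = np.vstack([home, work, noise])

vbgmm = BayesianGaussianMixture(
    n_components=10,                     # upper bound; VB shrinks unused ones
    weight_concentration_prior_type="dirichlet_process",
    max_iter=500, random_state=0).fit(X)
print("effective components:", np.sum(vbgmm.weights_ > 0.01))
```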
Abstract:
It is only in recent years that the critical role that spatial data can play in disaster management and in strengthening community resilience has been recognised. This importance is singularly evident from the fact that in Australia spatial data are considered soft infrastructure. In the aftermath of every disaster this importance is further strengthened, with state agencies paying greater attention to ensuring the availability of accurate spatial data based on the lessons learnt. For example, the major flooding in Queensland during the summer of 2011 resulted in a comprehensive review of responsibilities and accountability for the provision of spatial information during such natural disasters. A high-level commission of enquiry completed a comprehensive investigation of the 2011 Brisbane flood inundation event and made specific recommendations concerning the collection of, and accessibility to, spatial information for disaster management and for strengthening community resilience during and after a natural disaster. The lessons learnt and processes implemented were subsequently tested by natural disasters in later years. This paper provides an overview of the practical implementation of the commission of enquiry's recommendations. It focuses particularly on the measures adopted by the state agencies with primary responsibility for managing spatial data, and on the evolution of this role in the state of Queensland, Australia. The paper concludes with a review of the development of this role and the increasing importance of spatial data as an infrastructure for disaster planning and management that promotes the strengthening of community resilience.